SEO Audit Report for YourSite.com

The SEO audit for yoursite.com reveals an overall SEO score of 61%, indicating good performance but with significant room for improvement. Key action items include optimizing keyword focus, title tags, meta descriptions, and enhancing the quality and quantity of inbound links. The report provides detailed analyses, recommendations, and guidelines across various SEO elements to help improve the site's rankings and visibility.


2515 4th Avenue, Suite 708, Seattle, WA 98121 | 800.798.2430
Credit: Khuram Malik, SEBT

Yoursite.com (flstatewideinsurance.com)
Search Engine Optimization Audit

TABLE OF CONTENTS

SEO EXECUTIVE SUMMARY
Top Content Action Items
Top Indexing Action Items
Top Linking/Social Action Items
SEO SCORECARD
CONTENT OVERVIEW
Content Analysis
Content Score
KEYWORD FOCUS
Analysis
Recommendations
Guidelines
URL STRUCTURE
Analysis
Recommendations
Guidelines
TITLE TAGS
Analysis
Recommendations
Guidelines
META DESCRIPTION TAGS
Analysis
Recommendations
Guidelines
META KEYWORDS
Analysis & Recommendations
Guidelines
HEADING TAGS
Analysis
Recommendations
Guidelines
CONTENT
Analysis
Recommendations
Guidelines
INTERNAL LINKING & ANCHOR TEXT
Analysis
Recommendations
Guidelines
IMAGE NAMES & ALT TAGS
Analysis
Recommendations
Guidelines
NOFOLLOW ANCHOR TAGS
Analysis
Recommendations
Guidelines
INDEXING OVERVIEW
Top Indexing Action Items
Indexing Analysis
Indexing Score
INDEXING OPTIMIZATION ANALYSIS
PAGE EXCLUSIONS
Analysis
Recommendations
Guidelines
PAGE INCLUSIONS
Analysis
Recommendations
Guidelines
URL REDIRECTS
Analysis
Recommendations
Guidelines
DUPLICATE CONTENT
Analysis
Recommendations
Guidelines
BROKEN LINKS
Analysis
Recommendations
Guidelines
CODE VALIDATION
Analysis
Recommendations
Guidelines
PAGE LOAD SPEED
Analysis
Recommendations
Guidelines
LINKING ANALYSIS OVERVIEW
Linking Analysis
Linking Score
INBOUND FOLLOWED LINKS
Analysis
Recommendations
Guidelines
LINKING ROOT DOMAINS
Analysis
Recommendations
Guidelines
AUTHORITY & TRUST
Analysis
Recommendations
Guidelines
SOCIAL MEDIA MENTIONS & VISIBILITY
Analysis
Recommendations
Guidelines
COMPETITIVE LINK COMPARISON
Analysis
CONCLUSION
List of Supporting Documents
SEO EXECUTIVE SUMMARY
yoursite.com has scored 61% for overall SEO-ability. This is good, but still leaves significant room for improvement. The prioritized lists below show the key elements to optimize, from highest to lowest priority, for the three main areas of SEO: Content, Indexing, and Linking/Social.

Top Content Action Items

Keyword Focus
The site as a whole has a keyword focus, but it is built around less competitive keywords that won't drive much additional traffic; this can be further improved.

Title Tags
6 Title tags are too long, 2 are the same as the H1, 8 are duplicates, 3 are too short, and none are missing.
Meta Description Tags
0 pages have duplicate Meta descriptions, 3 are too long, 1 is too short, and 36 are missing.
Image Names and ALT Tags
8 images are missing ALT tags, and not all images had descriptive, keyword-focused file names.

Top Indexing Action Items

Page Exclusions
No pages are excluded from indexation.

Page Load Speed
Page load speed is 4.04 seconds. You can improve it by reducing page file size and the number of requests.

Code Validation
Code validation is good.

Top Linking/Social Action Items


Linking Root Domains
Increase both the quantity and quality of linking root domains.
Inbound Followed Links
Work to build out more keyword rich anchor text links for non-branded keywords
while also working to increase the overall quantity and quality of inbound links.

Simply start with #1 in each category and work your way down the list. By implementing the changes that we have recommended, the website will be well on its way to achieving better rankings for its chosen keyword phrases.

SEO SCORECARD
SEO ELEMENT SCORE
Keyword Focus 5
URL Structure 10
Title Tags 7
Meta Description Tags 5
Meta Keyword Tags 0
Heading Tags 6
Content 9
Internal Linking and Anchor Text 7
Image Names and ALTs 7
NoFollow Anchor Tags 8
On-site SEO Score (26% of Algorithm) 61%
Page Exclusions 10
Page Inclusions 10
URL Redirects 10
Duplicate Content 8
Broken Links 5
Code Validation 6
Page Load Speed 7
Indexing Score (12% of Algorithm) 83%
Inbound Followed Links 7
Linking Root Domains 7
Authority and Trust 6
Social Media Mentions and Visibility 8
Competitive Link Comparison -

Linking/Social Score (62% of Algorithm) 86%

OVERALL SEO SCORE 61%

A score of 10 is perfect execution and a score of 1 means that element is missing entirely. These areas are addressed in the following report. In each overview section the issues are color coded as follows:

Red = Major Issues (1-3) | Yellow = Some Issues (4-7) | Green = Minor Issues If
Any (8-10)

SCORE
61%
CONTENT OVERVIEW
Top Content Action Items
Keyword Focus
Title Tags
Meta Descriptions
Image Names and ALT Tags

Content Analysis
This analysis addresses all of the on-site SEO content related issues found on the
website. Implementing the recommended changes should provide valuable SEO
benefits. There are 10 main on-site SEO topics covered, as follows:

Keyword Focus
URL Structure
Title Tags
Meta Descriptions
Meta Keywords
Heading Tags
Content
Internal Linking and Anchor Text
Image Names and ALT Tags
NoFollow Anchor Tags

Red = Major Issues (1-3) | Yellow = Some Issues (4-7) | Green = Minor Issues If
Any (8-10)

For each main on-site SEO topic, there are 3 sub sections: Analysis,
Recommendations and Guidelines.

• The Analysis section details the findings from our analysis.
• The Recommendations section identifies our proposed action items.
• The Guidelines section gives specific instructions for implementing the required changes.

Content Score
The website has an On-Site SEO Score of 61%. This indicates above-average on-site SEO-ability. Correct implementation of the following guidelines will help to improve existing rankings, and to rank for more competitive keyword phrases in the future.



URL STRUCTURE

SCORE
10/10

Analysis
We found the URL structure to be good. A few pages had URLs that contained underscores (_) in them. For example:

URL | URL LENGTH

No URLs exceeded 100 characters or used uppercase letters, and no pages had duplicate URLs. Overall, this puts the URL structure in excellent shape.

Recommendations
Ideally, no URL should be longer than 100 characters in length. If possible, create
shorter URLs for any pages that exceed this limit (if you change a URL, 301
redirects will be necessary). The attached site crawl report will help you to identify
the URLs in need of optimization. That said, keep in mind that this is far from a
critical issue, and could safely be ignored.

Guidelines
A site’s URL structure is extremely important to both users and search engines. Poor
URL structure can hurt rankings, prevent pages from being indexed and lower your
click-through-rate (CTR).
It is extremely important that URLs be readable, user friendly, and that they contain
the keyword of the page. Always separate keywords with dashes, not underscores.
If relevant, a geo-qualifier (such as Vancouver, BC) should also be included. URL file
names should always be written in lowercase letters.
URLs should also be relatively short, with 100 characters in length being the current
SEO best practice. While longer URLs aren't necessarily bad, the shorter the URL the
less likely that URL is to truncate in search results and the more likely it is to have a
positive impact on SERP click-through-rates.
If query parameters are necessary for analytics or paid advertising reasons, make
sure to use Google and Bing Webmaster Tools to tell search spiders to ignore those
parameters from an indexing perspective. It won’t impact your tracking, but will
prevent any indexing issues related to parameters from occurring. Make sure that
parameter pages are not linked to internally.
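The naming rules above (lowercase, dash-separated keywords, under 100 characters) can be sketched as a small helper. This is our own illustration, not a tool used in this audit; the function name and the example title are hypothetical:

```python
import re

def seo_slug(title, max_len=100):
    """Build a lowercase, dash-separated URL slug per the guidelines above."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)   # non-alphanumeric runs become single dashes
    slug = slug.strip("-")                    # no leading/trailing dashes
    return slug[:max_len].rstrip("-")         # stay under the 100-character limit

print(seo_slug("Home Insurance Quotes in Vancouver, BC"))
# home-insurance-quotes-in-vancouver-bc
```

Note that the geo-qualifier ("Vancouver, BC") ends up in the slug naturally, per the guideline above.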

TITLE TAGS

SCORE
7/10

Analysis
We found 6 pages where the Title tags exceeded Google's truncation limit, 3 that are too short, 8 duplicates, and 0 pages that are missing a Title tag.
While existing Title tags were fairly well written overall, the home page Title tag epitomizes the main issue: although branded, it lacks keyword focus, and the Title tag is one of the most important places on a site for keywords to be used.

Recommendations
We recommend that optimized Title tags be re-written for each page that has an
overly long, short or otherwise un-optimized Title tag. We particularly recommend
writing a new home page Title tag.

Guidelines
After the URL, the second most important place on a page to have the keyword is
the Title tag. The Title tag is the first description of the page that search engine
users will read, and it is extremely important to both users and search engines that
it contain the keyword they are searching for. This will not only help to improve
rankings, but can significantly improve your click-through-rate as well.
A proper Title tag will:
• Be 15-60 characters in length (50-60 is ideal).*
• Be unique to that page (don't use the same Title tag on multiple pages).
• Use the keyword of that page twice if space permits (once at the start, followed by a separator such as a colon, hyphen, or pipe, and then once again in a call to action). If the character limit prevents the use of the keyword twice, use it once in a good call to action, with the keyword as close to the beginning of the Title tag as possible.
• If relevant, include a geo-qualifier (such as Washington or Seattle, WA).

* While you may hear some SEOs say to limit Titles to 70 characters, it's a
misconception. Google is actually using a pixel width limit, not a character limit.
Title tags appear in 12pt Arial font by default, with searched for keywords bolded,
and Google has a pixel width limit of 520 pixels.
Using 60 characters as your Title character limit will avoid truncation in the vast
majority of cases.
You can see if a Title will truncate by doing the following: simply use Excel, set the column width to 520px, set the column to wrap text, and set the font to Arial 12pt. Type in your Title, and bold the main keyword. If the line breaks, your Title tag will truncate. (You can also use this tool to check.)
One example of a proper Title tag structure might be:

<title>Keyword Phrase | Keyword Phrase Call to Action</title>
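The Title tag checks described in this section (length bounds, uniqueness, missing tags) can be sketched as follows. This is our own illustration; the URLs and titles below are hypothetical, and the 15-60 character thresholds are the ones stated in the guidelines above:

```python
def audit_titles(titles_by_url):
    """Classify Title tags as missing, too short, too long, or duplicated."""
    issues = {"missing": [], "too_short": [], "too_long": []}
    seen = {}
    for url, title in titles_by_url.items():
        t = (title or "").strip()
        if not t:
            issues["missing"].append(url)
            continue
        if len(t) < 15:
            issues["too_short"].append(url)
        elif len(t) > 60:
            issues["too_long"].append(url)
        seen.setdefault(t, []).append(url)   # group exact-duplicate titles
    issues["duplicates"] = [urls for urls in seen.values() if len(urls) > 1]
    return issues

report = audit_titles({
    "/": "Florida Statewide Insurance",        # fine: 27 characters
    "/about": "About",                          # too short
    "/contact": "Florida Statewide Insurance",  # duplicate of the home page title
    "/blog": "",                                # missing
})
```

A site crawler export (URL, title) feeds straight into this kind of check.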



SCORE
5/10

META DESCRIPTION TAGS

Analysis
36 pages were missing Meta description tags, 0 pages had duplicate Meta descriptions, 3 Meta descriptions were too long, 1 was too short, and 1 page on the site had two Meta description tags.

Recommendations
We recommend that unique, keyword- and user-targeted Meta description tags be written for each page that has two tags, or that is currently missing a description (nearly every page on the site). Every page on a site should have a unique, keyword-optimized Meta description tag.

Guidelines
While Meta description tags are a factor in the ranking algorithm, they are also used as the description searchers will see in the search engine results. Having the keyword used properly in the Meta description tag can increase the likelihood that users will click through to the page if the keyword usage matches their search query.

Meta descriptions should adhere to the following guidelines:


• Be unique and relevant to that page.
• Be written as descriptive ad text, with a call to action.
• Be no more than 160 characters in length, including spaces and punctuation (140-150 is ideal), but no less than 51 characters (Google considers 50 characters or less to be too short).
• Contain 1-2 complete sentences with correct punctuation, and no more than 5 commas.
• Use the keyword once per sentence, as close to the start of each sentence as possible.
• Include a geo-qualifier, such as "Seattle, WA", if relevant.

A proper Meta description tag example might be:

<meta name="description" content="Real estate signage options make it easy to advertise at your location. Hundreds of items are in stock at Slimline Warehouse for your business today!" />
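The length and punctuation rules above can be checked mechanically. The sketch below is our own illustration covering a subset of the rules (the 51/160-character bounds, the comma limit, and the sentence count come from this section; the sample text is hypothetical):

```python
def check_meta_description(desc):
    """Return a list of rule violations for one Meta description string."""
    problems = []
    if len(desc) < 51:
        problems.append("too short (under 51 characters)")
    if len(desc) > 160:
        problems.append("too long (over 160 characters)")
    if desc.count(",") > 5:
        problems.append("more than 5 commas")
    sentences = [s for s in desc.split(".") if s.strip()]
    if not 1 <= len(sentences) <= 2:
        problems.append("should be 1-2 complete sentences")
    return problems

good = ("Get free Florida home insurance quotes in minutes. "
        "Call Statewide Insurance today for a no-obligation review.")
print(check_meta_description(good))   # no violations for this sample
```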



SCORE
7/10

HEADING TAGS

Analysis
0 pages are missing H1 tags, 59 pages have duplicate H1 tags, and 6 H1 tags are too long. 63 pages have multiple heading tags. 66 pages on the site make use of H2 tags, and all 66 are duplicates.

Recommendations
You should consider adding H1 tags to the pages that are missing them, and you
should make sure that heading tags in general are unique & use keywords more
effectively where relevant.

Guidelines
Search engines weight text for SEO value based on text size and position on the
page. Heading tags are supposed to be larger than the other text on the page, and
should appear prominently on the page, thus the added benefit of having the
keyword in the heading tags.
Every page should have an H1 tag, as search engines look to the H1 to help determine the topic of a page. It should be the first thing in the body text of the page, and should appear prominently.
H1 tags should never contain images or logos, only text. The keyword of a page
needs to be used in the H1 tag, and in at least half of the total heading tags on a
page, if more than one heading tag is present.
From a usability perspective, paragraphs should never be longer than 5 lines of
text, and it is wise to break up a page every 2-3 paragraphs with a sub-heading in
the form of an H tag (H2 or H3) or an image. Testing has shown that when users are
faced with a large block of unbroken text, most either skim over the text or skip it
altogether, so content needs to be divided into usable chunks.
It is important that the keyword of a page be used in the H1 tag, as close to the
beginning of the H1 as possible. Ideally, there should be at least one additional H
tag on each page that contains the keyword, for added SEO value. Heading tags are
a nested element, and should be used in the correct order.
No H tag should be used if the preceding numerical tag has not been used (don't use an H2 if there is no H1, don't use an H5 if there is no H4, etc.).
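The nesting rule in the last paragraph (never use an H tag whose preceding level is absent) can be sketched as follows; the function is our own illustration:

```python
def heading_levels_valid(levels):
    """Check that a page's heading levels (e.g. [1, 2, 2, 3]) never skip a level.

    Each heading may be at most one level deeper than the deepest level
    seen so far on the page, and the sequence must start with an H1.
    """
    deepest = 0
    for level in levels:
        if level > deepest + 1:   # e.g. an H3 appearing with no H2 before it
            return False
        deepest = max(deepest, level)
    return True

print(heading_levels_valid([1, 2, 2, 3]))   # a valid H1 > H2 > H2 > H3 outline
print(heading_levels_valid([2, 3]))         # invalid: an H2 with no H1
```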



SCORE
8/10

CONTENT
Analysis
Currently on the website, the content situation is as follows:
Keyword Usage in Content
Keyword and variations used? One or two main keywords per page?
Content Amount/Quality
# of words, spelling, grammar, unique, useful
For some of the pages we looked at, there appeared to be a significant amount of unique, topically focused, and useful content. The content tends to be very well written.
Content Growth
New content added regularly? Existing content updated from time to time?
The site is static, and new content isn’t added very regularly. Some content might
be periodically updated, but the main focus isn’t on adding fresh content
frequently.
Media to Text
If Flash or images contain text, is it also available in readable html text?
While there are some images that contain text, ALT tags are generally used
correctly.

Recommendations
You should optimize your content and add more depth of text to the site. You should also consider building out more top-level pages.



Guidelines
One of the most important on-site SEO elements is text content, and search engines
want to see a significant amount of unique text on each page. Under ideal
circumstances we would recommend having 400-600 words of unique text per
page, though we believe the safe minimum is 300 words.
For usability reasons, no page should ever be longer than 800 words (with the
exception of blog posts). This content needs to contain the relevant keywords for
the page. Having unique, keyword rich text on a page can help to improve search
engine rankings significantly.
Duplicate content is viewed as a big negative. It can not only hurt rankings, but can
prevent a page from ranking. Search engines want to see unique content on a site,
and can penalize a site for using duplicate content. Using Flash or images in place of text is considered a big negative. While search engines like a variety of media, it must not take the place of text. Consider using Flash replacement text, such as sIFR.



SCORE
7/10
INTERNAL LINKING & ANCHOR TEXT
Analysis

We didn't find a significant amount of internal linking throughout the site, which is not ideal.
However, internal navigation is using a mix of relative URLs (/) and absolute URLs
(http://flstatewideinsurance.com/). This can sometimes cause indexing issues, and is
not advised. Only absolute URLs should be used for internal linking. (The Home link
is the only relative that we found.)
Some pages contain more than the recommended limit of 100 links per page, but this
is due largely to the comment system. While not ideal, this isn't a critical issue.
However, the site contains a number of site-wide links pointing to external domains (http://www.corelogic.com/, https://www.nachi.org/pb.htm, http://www.sun-sentinel.com/topic/politics/government/mary-l.-landrieu-PEPLT003755.topic), and this isn't ideal. Not only do site-wide links tend to be discounted, but by doing this you are essentially leaking authority and PageRank from every page on the site to these external domains.
Recommendations
We recommend that absolute URLs be used internally, not relative URLs. We also
recommend that the total number of links on each page be reduced as needed to
keep the total fewer than 100 links per page to stay in-line with SEO best practices.
This might require re-coding the comment system to prevent commenter names
from being links.
Also, minimize instances of external linking from within the site. If site-wide links
must be used, we recommend implementing NoFollow tags on all such links.



Guidelines
It is very important to cross-link within the pages of one's site using keyword-rich anchor text, though you should do so sparingly. Pages of similar topic should cross-link to each other using the keywords of those pages in the anchor text. We recommend 2-3 keyword-rich internal links on any given page.
In addition to linking from within the text of a page, keyword-relevant anchor text should be used in the main navigation elements. Where space prevents the use of a relevant keyword for the page being linked to in the navigation, it is important to include the title element in the anchor tag, as follows:
<a href="http://www.QuickSprout.com/" title="Neil Patel's Digital Marketing Blog">Home</a>
The same goes for links outside of the site. Approximately 15-30% of all inbound links from blogs, forums, press releases, or any other external link building should include the keyword of the page being linked to in the anchor text of the link (or a close variation).



SCORE
7/10
IMAGE NAMES & ALT TAGS

Analysis
While many of the main images throughout the site had ALT tags, relatively few of
the images found on the site had descriptive, keyword rich image names.

MISSING ALT TEXT

http://yoursite.com/wp-content/uploads/2015/04/FLsliders-21-627x358.png

http://yoursite.com/wp-content/uploads/2015/04/FLsliders-41-627x358.png

http://yoursite.com/wp-content/uploads/2015/04/FLsliders-61-627x358.png

Recommendations
We recommend that ALT tags be written for each image (including design elements
like borders, buttons, etc.) that currently doesn’t have ALT tags. Keywords should
be used in ALT tags.

Guidelines
For the benefit of search engines, code compliance, and visually impaired users, every image MUST have an ALT tag. The ALT tag should contain a keyword relevant to the page (but only if the keyword is relevant to the image as well).
Image file names should be descriptive words, not numbers or query strings. They
should accurately describe the image, and if relevant should also use the keyword.
If an image is used as a link, then the ALT tag functions in place of anchor text. A linked image should follow this structure:
<a href="http://www.targeturl.com/"><img src="http://www.domain.com/images/keyword-rich-image-name.jpg" alt="use a keyword if relevant" /></a>
By ensuring that all images are properly named and tagged, you will not only increase the
SEO value of those images, but you will increase the likelihood of receiving referral traffic
from image search results.
Also, for code compliance reasons, all images should also specify a height and width in
the image tag.
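One quick way to find images that are missing ALT text is to parse the page HTML. This sketch uses Python's standard html.parser and is our own illustration, not the crawler used for this audit (the second image name echoes one flagged above):

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> that has no (or an empty) alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):   # alt absent or empty string
                self.missing.append(attrs.get("src", "?"))

finder = MissingAltFinder()
finder.feed('<img src="logo.png" alt="Statewide Insurance logo">'
            '<img src="FLsliders-21-627x358.png">')
print(finder.missing)   # only the slider image lacks ALT text
```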



SCORE
8/10

NOFOLLOW ANCHOR TAGS

Analysis
NoFollow tags are being used in numerous places throughout the site, and the usage appears to be correct. However, there are additional places where we feel the NoFollow tag should be used.

Recommendations
You should consider adding NoFollow tags to some of the site-wide links that point
to other websites present on the site.

Guidelines
Google measures how different pages link together, and assigns a weight to those
links based on traffic, relevancy, age, size, content, and hundreds of other
components.
When pages that Google deems relevant link to other pages, some of that “Link
Juice” flows through that link to the site being linked to. A “followed” link is
essentially endorsing the page being linked to.
Enter the rel=”nofollow” tag. Google introduced this tag to help preserve the
relevancy of PageRank, which was being hurt by blog and forum comment
spammers. When the tag rel=”nofollow” is used in an anchor tag (link), Google will
usually pass 50-100% less "link juice" to the page being linked to. Using this tag is
like saying "this page is nice, but we don’t really endorse it."
NoFollow tags should be used on blog comments, site-wide external links, and on
any internal links pointing to low quality or otherwise user-worthless pages.



SCORE
83%

INDEXING OVERVIEW

Top Indexing Action Items

Page Exclusions
Page Load Speed
Code Validation

Indexing Analysis
This analysis addresses all of the SEO Indexing related issues found on the website.
Implementing the recommended changes should provide valuable SEO benefits.
There are 7 main SEO indexing topics covered, as follows:

Page Exclusions
Page Inclusions
URL Redirects
Duplicate Content
Broken Links
Code Validation
Page Load Speed

Red = Major Issues (1-3) | Yellow = Some Issues (4-7) | Green = Minor Issues If
Any (8-10)

For each main SEO indexing topic, there are 3 sub sections: Analysis,
Recommendations and Guidelines.
• The Analysis section details the findings from our analysis.
• The Recommendations section identifies our proposed action items.
• The Guidelines section gives specific instructions for implementing the required changes.

Indexing Score
The website has an Indexing Score of 83%. This indicates above-average index-ability. Correct implementation of the following guidelines will help to improve existing rankings, and to rank for more competitive keyword phrases in the future.



SCORE
83%
INDEXING OPTIMIZATION ANALYSIS

Often, we find large variances in the pages that each search engine decides to keep
in their index for their users. This is usually a symptom of one large or several small
indexing problems. Websites with smaller variances of pages indexed between the
search engines have very few if any indexing problems.
For the site, Google has 61 URLs indexed, while Bing has 0. However, our crawl found 66 static, index-able, non-parameterized URLs reachable by a link-to-link crawl.

This level of variance means that there are indexing issues present that are causing the search engines to maintain an incorrect index. The following recommendations will reduce the confusion of pages available to the search engines and help in the maintenance of the website. Every page indexed enables the domain to rank for the search terms contained on that page, which means an additional point of entry. This is why it is important to keep track of how many pages there are and how many are indexed.




SCORE
10/10
PAGE EXCLUSIONS

Analysis
A robots.txt file was found, and was very well optimized. Page level Meta robots
tags such as the NoODP and NoYDir tags were present and used correctly. In
addition, typical Wordpress /page/ issues were handled with NoIndex tags.

Recommendations
Everything looks good and nothing is required for now.

Guidelines
Effectively covering the function of the robots.txt file here isn’t possible. To learn
more about robots.txt best practices, simply visit this page:
http://www.robotstxt.org/robotstxt.html
If you ever have a page that you don't want Google to index, but that has links you do want Google to crawl, you would need to implement a page-level tag. Page-level Meta robots NoIndex tags would be particularly useful on any page of your site that you would not want a searcher to enter on, such as thank-you pages, privacy policy pages, or T&C pages.




SCORE
10/10
PAGE INCLUSIONS

Analysis
Page inclusions consist primarily of creating an XML sitemap and submitting it through your Google and Bing Webmaster Tools accounts. We found that the current sitemap is an HTML page on the site, which is hurting the site's index-ability.

Recommendations
Create a proper XML sitemap, then upload it and submit it through both Google Webmaster Tools and Bing Webmaster Tools.

Guidelines
An XML sitemap should be created for every website, and should be updated and
re-submitted whenever changes are made to the site. Additional information about
XML sitemaps can be found here:
http://www.google.com/support/webmasters/bin/answer.py?answer=156184
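For illustration, a minimal XML sitemap can be generated with Python's standard library. This is our own sketch with placeholder URLs; on a WordPress site a sitemap plugin would normally produce this file instead:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a minimal <urlset> sitemap document as a string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        # each <url> entry needs at least a <loc> child
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

xml_doc = build_sitemap([
    "http://flstatewideinsurance.com/",
    "http://flstatewideinsurance.com/contact/",
])
```

The resulting file is uploaded to the site root and its URL submitted in each webmaster tools account.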




SCORE
10/10
URL REDIRECTS

Analysis
The standard non-WWW to WWW redirects are in place. 301 redirects appeared to be used correctly. However, 3 redirects with a 302 response were also found.

Recommendations
No changes are needed at this time for the 301 redirects, but all of the 302 redirects should be removed and replaced with 301s.

Guidelines
Unless a redirect is truly temporary (such as for a time sensitive promotion), 302
redirects should never be used. 302 redirects don’t pass any link value, and are
essentially a dead end for SEO. In almost every scenario where a redirect is needed,
a 301 redirect should be used.
Any page that changes URLs or is deleted needs a 301 permanent redirect to tell
search engines and users that the page has moved/is gone. There should never be
more than one URL path to a page.
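Given a crawl export of (URL, status code) pairs, flagging the temporary redirects that should become permanent is straightforward. This is our own sketch with hypothetical URLs; 307 is included because it is also a temporary redirect:

```python
def find_temporary_redirects(crawl_results):
    """Return URLs answering with a 302/307, which should normally be 301s."""
    return [url for url, status in crawl_results if status in (302, 307)]

crawl = [
    ("http://flstatewideinsurance.com/old-page", 301),  # correct permanent redirect
    ("http://flstatewideinsurance.com/promo", 302),     # should become a 301
    ("http://flstatewideinsurance.com/", 200),
]
print(find_temporary_redirects(crawl))
```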



SCORE
8/10
DUPLICATE CONTENT

Analysis
We found some instances of duplicate content on your site. We did not find any
coding issues present that could create potential duplicate content problems that
were not already addressed correctly.

Recommendations
Try to minimize your duplicate content.

Guidelines
Search engines really don't like duplicate content, as it leads to a poor user
experience and other content quality issues. If you have duplicate content, you
need to do everything you can to eliminate it.
There are 4 main options for addressing duplicate content:
1. Fix the URL structure of the site to eliminate accidental duplicate content coming from URL issues, per the recommendations in the URL Redirects section and this section.
2. Re-write all duplicate text content to make it unique.
3. 301 redirect the duplicate content to one canonical page/site, if it is in your control.
4. Implement the rel="canonical" tag to identify the original source/root page to search engines.

You specify the canonical version of the URL using a tag in the head section of the page as follows:
<link rel="canonical" href="http://flstatewideinsurance.com" />

Google makes a pretty serious effort to find duplicate content and keep it out of its index, so this is not something
to take lightly.



SCORE
5/10
BROKEN LINKS
Analysis
Google Webmaster Tools for the site identified 0 error pages with a 404 response code and 0 no-response pages that would need to be 301 (permanently) redirected to the most appropriate page.

404 PAGES
No 404 pages found

Recommendations
Nothing is needed for now, but if 404 pages are found, implement a 301 redirect for each one, pointing to the most appropriate live URL. In the future, whenever any changes are made to the URL of a page that has previously been live, you will need to implement a 301 redirect pointing the old URL to the new URL. You will also need to change all internal links on the site that point to the old URL so that they point to the correct new URL.
Also, regularly monitor Google and Bing Webmaster Tools for crawl errors (broken
pages and other site errors) and assign someone to create 301 redirects for any
broken pages that are found.

Guidelines
Because Google and other search engines crawl the web link-to-link, broken links
can cause SEO problems for a website. When Google is crawling a site and hits a
broken link, the crawler immediately leaves the site. If Google encounters too many
broken links on a site it may deem that site a poor user experience, which can
cause a reduced crawl rate/depth, and both indexing and ranking problems.
Unfortunately, broken links can also happen due to someone outside of your site
linking in incorrectly. While these types of broken links can’t be avoided, they can
be easily fixed with a 301 redirect.
To avoid both user and search engine problems, you should routinely check Google and Bing Webmaster Tools for crawl errors, and run a tool like Xenu Link Sleuth or Screaming Frog on your site to make sure there are no crawlable broken links.
If broken links are found, you need to implement a 301 redirect per the guidelines in
the URL Redirect section. You can also use your Google Webmaster Tools account to
check for broken links that Google has found on your site.



SCORE
6/10
CODE VALIDATION

Analysis
We ran the home page through the W3C Validation Tool, and this is what we saw:

Errors are fairly common; 21 errors and 9 warnings mean your site could potentially have trouble in some browsers, and could potentially pose a problem for search engine spiders.

Recommendations
While it isn't strictly necessary, you would ideally want to make whatever code changes are needed to ensure that all pages on the site validate at 100% via the following validator, to err on the side of caution.
In addition to validation via W3.org, you will want to test the site on all major browser types to make sure that there are no cross-browser compatibility issues (you can do this with Adobe's Browser Lab tool).
Guidelines
The W3C markup validator service can be found at http://validator.w3.org/.
Because there are so many programming languages, and so many ways to accomplish
any one thing in each language, search engines rely on certain rules when they
read the content of a website.
Having code that adheres to these rules helps to minimize errors when parsing, or
separating, the code from the content of any one page.
Search engines such as Google have openly stated that following W3C standards is
what they suggest for making code easy for them to understand. We typically only
test the home page of the website, because many issues can be easily fixed across
the entire website using just its page templates.
SCORE
7/10
PAGE LOAD SPEED
Analysis
When we scanned your site using Pingdom Tools, we found the following:
The scan reported a page load speed of 4.04 seconds, which is already considered
slow, but we believe even that figure understates the problem. Based on the number
of requests and the page size, the true load speed is likely in the 5-second range;
Webmaster Tools or Google Analytics can provide a more accurate measurement. If the
load speed really is in the 5-second range as we believe, that is below average,
and slow enough that it is almost certainly impacting your conversions, especially
on mobile devices.
53 requests is a significant number, and the number of items loaded is a key factor
in load speed. We recommend that no page make more than 40 requests, preferably
fewer if possible. In addition, a page size of 3.1 MB is substantial; we recommend
keeping page size to 500 KB or less to avoid load speed issues. We also saw no use
of compression or minification to reduce file sizes.
Recommendations
There are a few things you can do to improve page load speed and reach Google’s
recommended page load speed of 1.4 seconds or less. We would recommend leveraging
browser caching, using CSS sprites for images where possible, and reducing image
file sizes as much as possible for images that can’t be sprited (choosing better
file types, removing unnecessary color channels, etc.).
We would also recommend reducing the total number of CSS and JavaScript files by
combining them into fewer files, and minimizing the file sizes by using compression
and minification where feasible.
You might also see benefits from using a content delivery network (CDN) for your
images.
W3 Total Cache is an excellent WordPress plug-in that can help with page load
speed issues, and a simple CDN can be set up via Amazon AWS for very little
money. You can learn how to do this here.
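As a sketch of the browser caching and compression recommendations above, here is what the relevant Apache .htaccess directives might look like. This assumes the mod_deflate and mod_expires modules are available; your server setup may differ:

```apache
# Compress text assets on the fly (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Leverage browser caching for static files (requires mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

With caching headers in place, repeat visitors reload far fewer of the 53 resources, which directly reduces perceived load time.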
Guidelines
On April 9, 2010, page load speed officially became a part of the Google search
ranking algorithm. For usability reasons, best practices dictate that a web page
should load within 1-2 seconds on a typical DSL connection. According to Google
Webmaster Tools data, however, a load time of 1.4 seconds is the threshold between
a fast page and a slow page. That means, ideally, that every page on your website
should load in 1.4 seconds or less to receive the maximum SEO benefit for
fast-loading pages.
Google gathers page load time data through actual user experience data collected
with the Google search toolbar, and may also be combining that with data collected
as Google crawls a website. As such, page load speed in terms of the ranking
algorithm is being measured using the total load time for a page, exactly as a user
would experience it.
SCORE
76%
LINKING ANALYSIS OVERVIEW
Top Linking Action Items
Linking Root Domains

Inbound Followed Links
Linking Analysis
This analysis addresses the key linking related issues found. Implementing the
recommended changes should provide valuable SEO benefits.
There are 5 main linking topics covered, as follows:
Inbound Followed Links
Linking Root Domains
Authority and Trust
Social Media Mentions and Visibility
Competitive Link Comparison
Red = Major Issues (1-3) | Yellow = Some Issues (4-7) | Green = Minor Issues If
Any (8-10)
For each main SEO linking topic, there are 3 sub sections: Analysis,
Recommendations and Guidelines.
The Analysis section details the findings from our analysis.
The Recommendations section identifies our proposed action items.
The Guidelines section gives specific instructions for implementing the required
changes.
Score
Linking Score: 76%
The website had an SEO Linking Score of 76%. This indicates reasonable SEO
link-ability, but with significant room for improvement. Correct implementation of
the following guidelines will help to improve its existing rankings wherever
necessary, and to rank for more competitive keyword phrases in the future.
32 | TOP LINKING/SOCIAL ACTION ITEMS
INBOUND FOLLOWED LINKS
SCORE
7/10
Analysis
One of the most important elements of the Google ranking algorithm is the quantity
and quality of external inbound links. As far as we know, roughly half of the
ranking algorithm is tied directly to traditional links. Google also looks at the
anchor text, surrounding text, page topic and linking site topic as signals
regarding what keywords a website should rank for.
When we analyzed the website using SEOmoz’s Open Site Explorer (one of the best
link indices available), we found that it currently has 70 external links pointing
to the site. This is not a significant number of links, and they come from only 4
root domains; in other words, all 70 links are from just 4 websites, and the site
is still far behind many of the ranking competitors in your space.
The anchor text coming in is widely varied, and uses a good mix of brand terms,
naked URLs, keyword-rich anchors and junk links ("click here", "here", no anchor,
etc.). The current best practice is to have 15-30% of your overall link profile
coming from exact or near-match anchor text links (having less than 15% has been
shown to correlate with lower-quality sites).
Recommendations
We recommend making an effort to build additional anchor text rich links to balance
out the link profile and to target sought after keywords. We don't recommend
creating too many keyword focused anchor text links (15-30% of the total is a very
safe number), as over-optimization of anchor text can result in ranking penalties.
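To illustrate the 15-30% guideline, here is a small Python sketch that estimates what share of a link profile uses exact or near-match anchor text. The anchor list and target keywords are hypothetical placeholders, not data from this audit:

```python
def exact_match_share(anchors, target_keywords):
    """Fraction of inbound links whose anchor text contains a target keyword."""
    if not anchors:
        return 0.0
    keywords = [k.lower() for k in target_keywords]
    matches = sum(1 for a in anchors if any(k in a.lower() for k in keywords))
    return matches / len(anchors)

# Hypothetical link profile for illustration
anchors = ["click here", "yoursite.com", "florida insurance quotes",
           "cheap florida insurance", "here"]
share = exact_match_share(anchors, ["florida insurance"])
# 2 of the 5 anchors match (40%), above the 15-30% band, so this
# hypothetical profile would need more brand and naked-URL links.
```

Running a check like this periodically makes it easy to see when keyword-focused anchors start drifting toward the over-optimization range.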
Guidelines
Google, and all search engines for that matter, look at links as a signal as to how
legitimate a website is. One of the key ranking factors is inbound links, and Google
looks at those for the following:
Quantity – The total number of links coming in.
Trust/Quality – Which sites the links are coming from, and where on those sites the
links appear.
Authority – The online authority of the linking website.
Topical Relevancy – How relevant the website and the page linking to you are to
your website.
Anchor Text – The text of the link pointing to you. They also look at the text
immediately surrounding a link for topical clues.
Velocity/Trajectory – The rate at which new links are created/found. Consistency is
key. Generally the more links the merrier, though there are exceptions to that rule.
SCORE
7/10
LINKING ROOT DOMAINS
Analysis
When we scanned the website using SEOmoz’s Open Site Explorer (one of the best
link indices available), we found very few linking root domains pointing in.
Your number of linking root domains is just 4, which is far too low when compared
to many of the top sites whose topics overlap with the website.
Recommendations
We recommend working to significantly increase the number of linking root
domains. There is still significant room for improvement.
There should be an emphasis on domain diversity (getting links from many different
domains) and domain authority (a mix of high and medium quality sites). The
creation of link worthy content, guest blog posts, and high profile press mentions
should be a key focus.
Guidelines
The very best links come from trusted domains (sites like the New York Times, Wall
Street Journal, Wired, Inc., TechCrunch, Huffington Post, Wikipedia, etc.). The more
links you can get from authoritative websites, the better, and guest blog posts and
press mentions are a great way to get those links.
One of the things that Google looks at and factors into the algorithm is domain
diversity. Essentially, the concept is that 10 links from 10 domains would be more
valuable as a ranking factor than 10 links from one domain.
From an SEO perspective, you usually want to see a domain diversity of no less than
10% (i.e. 100 links from 10 domains), though higher is usually better. All other
factors being equal, the site with the larger number of linking root domains would
almost always rank higher. That said, in the case of extremely high quality sites an
acceptable domain diversity could be as little as 2%.
SCORE
6/10
AUTHORITY & TRUST
Analysis
Currently, the best metrics available to measure Authority and Trust are from
SEOmoz, and are known as Domain Authority, mozRank and mozTrust. When we
scanned the website, we found:
The Domain Authority is on a 100 point scale, with any site above 40 being
considered a fairly good quality site, and sites above 70 being considered truly high
quality. mozRank and mozTrust are very similar to Google’s PageRank, and serve to
measure the distance of your site from trusted and authoritative websites.
These metrics place the website well above average in terms of trust and authority,
and well on your way to becoming a highly trusted site. However, you're still lagging
a bit behind other, similar sites.
Recommendations
By further increasing the number of high-quality inbound links, and especially the
number of high-quality linking root domains, the trust and authority will continue to
increase. By focusing link building efforts on ever higher quality websites these
metrics will increase faster than they would with links from average sites.
Guidelines
Pages earn mozRank based on the number and quality of other pages linking to
them. The higher the quality of the incoming links, the higher the mozRank.
mozTrust is determined by calculating link “distance” between a given page and a
seeded trust source on the Internet, such as .edu and .gov pages.
SCORE
8/10
SOCIAL MEDIA MENTIONS & VISIBILITY
Analysis
Here is what our analysis found:
Recommendations
We recommend increasing your social media presence; a stronger social presence will
help you reach your target audience and can bring in more customers.
Guidelines
When someone links to a website from their website, Google sees that as an
endorsement, and that endorsement increases a website’s ability to rank well.
Social signals have a similar effect.
When someone tweets or shares a link to your website, that is seen as an
endorsement much like a link. The “trust and authority” of the person who shared
that link is treated much like trust and authority for a website, and is based on
the authority of the social user. To make the most out of social, the key is to:
Make it easy for people to share your content socially, by integrating sharing
features throughout your website, blog posts, etc.
Create content that is worthy of sharing, and then reach out to people in that
space via social channels to ask for feedback about that content.
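As a sketch of the sharing-integration point above, Open Graph meta tags are one common way to make shared links render well on social networks. All values here are hypothetical placeholders:

```html
<!-- Open Graph tags help shared links display a title, description and
     image on Facebook and other networks (values are placeholders) -->
<meta property="og:title" content="Example Page Title" />
<meta property="og:description" content="A short, shareable summary of the page." />
<meta property="og:image" content="http://yoursite.com/images/share.png" />
<meta property="og:url" content="http://yoursite.com/example-page/" />
```

Links that unfurl with an image and description tend to get shared and clicked more often, which feeds the social signals described above.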
CONCLUSION
Overall, I found that the SEO situation on http://yoursite.com is average,
particularly with regard to off-site SEO. That said, there is still significant
room for improvement. By working to fix the issues identified in this audit, you’ll
be able to achieve higher rankings for more keywords, and for more competitive
keywords as well.
We’re confident that you will get there, and we’re here to help!
If you have any questions about your audit, feel free to contact me!
Khuram Malik