Search Engine Optimization
[IET Incubation Centre]
Mohammed Sharique
1st year CS ‘B’
March 16, 2013
Definition:-
An internet-based tool that searches an index of documents for a particular term, phrase or text
specified by the user. Commonly used to refer to large web-based search engines that search
through billions of pages on the Internet.
It is a tool that enables users to locate information on the World Wide Web. Search engines use
keywords entered by users to find Web sites which contain the information sought.
Variety
An Internet search can generate a variety of sources for information. Results from online
encyclopedias, news stories, university studies, discussion boards, and even personal blogs can
come up in a basic Internet search. This variety allows anyone searching for information to choose
the types of sources they would like to use, or to use a variety of sources to gain a greater
understanding of a subject.
Precision
Search engines do have the ability to provide refined or more precise results. Putting quotation marks around a set of words will bring up results containing that exact phrase, excluding others.
Some search engines, such as Google or Yahoo, enable you to specify the type of web sources to
be searched. Being able to search more precisely allows you to cut down on the amount of
information generated by your search.
Organization
Internet search engines help to organize the Internet and individual websites. Search engines aid in organizing the vast amount of information that is scattered across many different pages into an organized list that can be used more easily.
• The first part of a search engine is called the spider. The spider (sometimes called a
crawler or robot) is a program that moves around the World Wide Web visiting websites.
• It reads the webpages it finds and follows the links further down into the website.
• The spider returns from time to time and checks for changes. The pages that it finds are placed into the search engine's index (also called the catalog).
• This index contains a copy of each page that was collected by the spider.
• When a user requests keywords from a search engine, the search engine software sifts
through all the indexed pages to find matching keywords, then returns the results/hits to
the user.
Crawler-based search engines use automated software programs to survey and categorize web
pages. The programs used by the search engines to access your web pages are called ‘spiders’,
‘crawlers’, ‘robots’ or ‘bots’.
A spider will find a web page, download it and analyze the information presented on the web
page. This is a seamless process. The web page will then be added to the search engine’s database.
Then when a user performs a search, the search engine will check its database of web pages for
the key words the user searched on to present a list of link results.
Crawler-based search engines are constantly searching the Internet for new web pages and
updating their database of information with these new or altered pages.
Meta search engines take the results from other search engines and combine them into one large listing. Examples of Meta search engines include:
• Metacrawler (www.metacrawler.com)
• Dogpile (www.dogpile.com)
Hybrid search engines use a combination of both crawler-based results and directory results. More
and more search engines these days are moving to a hybrid-based model. Examples of hybrid
search engines are:
• Yahoo (www.yahoo.com)
• Google (www.google.com)
Human-powered directories
Human-powered directories, such as the Yahoo directory, Open Directory and LookSmart, depend
on human editors to create their listings. Typically, webmasters submit a short description to the
directory for their websites, or editors write one for the sites they review, and these manually
edited descriptions will form the search base. Therefore, changes made to individual web pages
will have no effect on how these pages get listed in the search results.
Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in a search engine's "natural" or un-paid ("organic" or "algorithmic") search results. In other words, it is the method of analyzing and constructing individual web pages, as well as entire sites, so that they can be discovered, analyzed, and then indexed by various search engines.
It can also be defined as the process of structuring a web page so that it is found, read, and indexed
by search engines in the most effective manner possible.
Targeted Traffic
A search engine optimization campaign can increase the number of visitors to your website for the targeted keyword(s) or phrase. Converting those visitors into potential customers is one of the arts of search engine optimization. SEO is the only kind of campaign that can drive such precisely targeted traffic to your website, and essentially, more targeted traffic equals more sales.
Increase Visibility
Once a website has been optimized, its visibility in search engines will increase. More people will visit your website, and it will bring international recognition to your products/services.
Cost-effective
One of the great benefits of search engine optimization is that it is cost effective and requires the
minimum amount of capital for the maximum exposure of your website.
Flexibility
It is possible to reach an audience of your own choosing through an SEO campaign. You can attract traffic according to your organizational strategy, tailored to your needs and requirements.
Measurable results
A unique quality of SEO campaigns is that their results can be quantified using search engine position reports, visitor conversion rates, and other metrics of this nature.
Root Domains
A root domain is the top level hierarchy of a domain. Root domains are purchased from registrars.
The following are examples of root domains:
example.com
seomoz.org
blogspot.com
Subdomains
A subdomain is a "third level" domain name that is part of a larger, top level domain. For example,
"blog.example.com" and "english.example.com" are both subdomains of the ".example.com" root
domain.
Hyphens
For readability's sake, a domain name that is longer than three words should have its words separated with hyphens. That said, use of hyphens also correlates with spamminess, so domain names with more than three words should be avoided.
Subdomains or Sub-folders
Since search engines keep different metrics for domains than they do for subdomains, it is recommended that webmasters place link-worthy content such as blogs in subfolders rather than subdomains (i.e. www.example.com/blog/ rather than blog.example.com). The notable exception to this is language-specific websites (e.g. en.example.com for English).
Free or very cheap hosting sounds like a great way to save money, but it can reduce your credibility with visitors and your visibility to search engines:
Bad domain names: Low cost or free hosts may not give you access to your own top level
domain name (TLD). You get a name like AAALumpkins.Bad-Cheap-Host.com or Bad-Cheap-
Host.com/~AAALumpkins/ instead of AAALumpkins.com. Your own TLD is easier to remember
and lends a more credible, professional look to your site.
It's worth paying a little more each month to get the most from your domain name!
Ads and frames: Make sure that your free host isn't placing your site inside its own frame or
placing ads or annoying pop up windows on your site. Often, a "free" hosting package means that
the host gets to advertise for free on your Web site. Some search engine spiders have problems
indexing framed sites: your host's frame could be turning spiders away from your page!
Looks like spam: Spammers often use free hosting services or very low cost hosts to create
duplicate sites and submit them to search engines. In response, search engine companies
sometimes try to stem the flood by refusing to index sites that share space with spam sites.
Download limits: Pay close attention to the download limits imposed by the host. Otherwise you
could be the victim of your own success. Some hosts impose monetary penalties on sites that
exceed download limits. If you don't promptly pay the extra cost, some hosts will take your site
down and hold it hostage until you cough up the ransom money.
Static IP address: If you are serious about getting good search engine rankings for your site, you
need to have a static IP address of your own. If you don't know whether you do or not, call or
email your web hosting company and find out whether your IP address for your site is static or
dynamic. If it is a static IP address, they should be able to tell you exactly what your static numeric IP address is. For example, the static numeric IP address for Words in a Row is 50.87.131.12. You should be able to substitute your numeric IP address after the
"http://______", type that into your browser and go directly to your website.
Keyword Research
Before you can start optimizing your site for the search engines, you must first know which terms
you want to target. A good start would be to choose 3 or 4 keywords you would like your website
to rank well for. With these keywords in your mind you can then set a goal to rank in the top 10
results on Google for each of them. When designing your site, start with a list of keywords that
you've researched, and make pages to focus on each keyword phrase. Google has the best free
keyword tool to help you select keywords, and it shows you an estimate of how many people
searched in the last month for each of the keywords that it suggests. Here's the link: Google
Keyword Tool (https://adwords.google.com/select/KeywordToolExternal). Honestly, this is all
most people will need to find the right keywords for their pages.
Another good free tool for discovering keywords you can use is located here:
Keyword Discovery (http://www.keyworddiscovery.com/search.html)
As a general rule, you should aim to fill up the following character limits within each of your meta
tags:
Page title – 70 characters
Meta description – 160 characters
Meta keywords – No more than 10 keyword phrases
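Taken together, these tags might look like the following in a page's HTML head (an illustrative sketch only; the store name, description and keyword phrases are made-up placeholders):

  <head>
    <!-- Page title: up to roughly 70 characters -->
    <title>Classic Mustang Tires | Example Auto Parts Store</title>
    <!-- Meta description: up to roughly 160 characters, written to earn the click -->
    <meta name="description"
          content="Shop classic Mustang tires for 1964-1973 models. Free shipping, expert fitment advice and easy returns from Example Auto Parts.">
    <!-- Meta keywords: no more than 10 phrases (major engines now largely ignore this tag) -->
    <meta name="keywords" content="classic mustang tires, vintage ford tires, mustang wheels">
  </head>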
Now, as you’re filling out your meta tags using the character limits shown above, you’ll want to
pay special attention to the way your target keyword phrases are included. Just because the
practice of keyword stuffing your meta tags has been long since devalued by the search engines
doesn’t mean that these phrases shouldn’t be included in your tags. It just means that you need to
be more strategic about how you introduce them.
Example:
A good sample page title tag for the Single Grain “About Us” page might look like:
“About Us | Single Grain Digital Marketing – SEO & CRO Industry Leaders”
A bad example of a sample tag for the same page would be:
“About Us – Single Grain SEO Agency, SEO Marketers, and SEO Consultants”
Although both sample title tags use all of the 70 characters allotted to this meta tag and use SEO
industry related keywords, the good title tag version is a clearly written, compelling option that
utilizes possible target keywords in a natural way.
Remember, your title tags and meta descriptions aren’t just fields that you’re optimizing in the
hopes of receiving some nebulous SEO boost. Instead, these fields often go on to form your
snippet in the natural search results, which means that they must be written to be as compelling as
possible!
Example:
Imagine encountering the two following snippets in the SERPs:
“How to Build Links in 2013
This article talks about link building techniques that will work well in 2013, including email link
prospecting, social media marketing and content marketing.”
Versus
“31 Ways to Easily Attract Backlinks in 2013
Are you using dated link building practices that could be harming your brand? Find out how to
effortlessly build links using these 2013-approved techniques.”
Writing good meta descriptions draws on the principles of copywriting as much as it does SEO
best practices. This can take practice, but the reward is a higher click-through rate, increased
natural search traffic to your website and potentially higher SERPs rankings if—as some SEO
experts believe—it’s true that your overall CTR contributes in some small way to your snippet’s
placement in the natural search listings.
If you aren’t yet an expert copywriter, consider the following guidelines when it comes to crafting
your title tags and meta descriptions:
Add a call to action. Asking people to do something (as in the case of “Find out how” in the
example above) often results in readers taking the action you’ve requested. Other possible calls to
action for your meta descriptions include “Discover how,” “Read more about,” “Click here,” or other related variations.
Use cliffhangers. The first meta description shown above gives everything away; there’s no real reason for the reader to click through to read the article, as its content is given away by the snippet. Instead, use cliffhangers in your meta descriptions to encourage viewers to click through for the full story.
Write your tags for yourself. Once you’ve come up with a possible meta tag, ask yourself,
“Would I click through based on this information?” If your tags don’t yet seem compelling,
rewrite them until you come up with something more enticing.
Put Key Words in Headings
When you're structuring the HTML coding for a Web page, it can look a little like an outline, with
main headings and subheadings. For best SEO results, it is important to place keywords in those
headings, within Heading tags.
Search engines pay special attention to the words in your headings because they expect
headings to include clues to the page’s main topics. You definitely want to include the page’s
keywords inside Heading tags.
Heading tags also provide your pages with an outline, with the heading defining the
paragraph that follows. They outline how your page is structured and organize the information.
The H1 tag indicates your most important topic, and the other H# tags create subtopics.
When assigning Heading tags, keep them in sequence in the HTML, which is how the search
engines can most easily read them. If you wanted to add an H3 tag, it would have to follow an H2
in the code. Similarly, if you had an H4 tag, it could only follow an H3 tag and not an H2.
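As an illustration, using the "Classic Mustang tires" example that appears later in this section, a properly sequenced set of headings might be coded like this (a sketch; the subtopics are invented):

  <!-- H1: the page's single most important topic -->
  <h1>Classic Mustang Tires</h1>
    <!-- H2s: subtopics of the H1 -->
    <h2>Bias-Ply Tires</h2>
      <!-- H3: follows an H2, never jumps straight from an H1 -->
      <h3>Whitewall Styles</h3>
    <h2>Radial Tires</h2>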
The words in each Heading tag should be unique and targeted to the page they're on. Unique
and targeted means that your Heading tag's content shouldn't be duplicated anywhere across the
site. If the heading on your tires page is "Classic Mustang tires," "Classic Mustang tires" shouldn't
be the H1 on any other page in your site.
Search engines look for uniqueness on your page. For example, if you have an H1 heading
of Ford Mustang Convertible at the top of two different pages, the search engine might read one of
the pages as redundant and not count it. Having unique Heading tags allows the search engine to
assign more weight to a heading, and headings are one of the most important things on the page
besides the Title tag.
Search engines cannot see what is inside an image, so for the purposes of optimizing your page for your targeted keywords we can make use of the ALT tags. When you formulate the descriptions for your images, make sure to include the keywords you are using to optimize your site. Also place a variety of these keywords' synonyms into the ALT tags of the images.
Limit your description, however, to a maximum of 20 words. This limit does not necessarily include stop words (the, and, is, of, for, in, at, or, etc.). Stop words are not read by search engines and are often ignored during a search query unless the user forces their inclusion with the plus sign (+) search operator.
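For example, keywords and a synonym can be worked into an image's ALT text like this (the file name and wording are placeholders):

  <img src="classic-mustang-tires.jpg"
       alt="Classic Mustang tires - vintage whitewall radial tires for early Ford Mustangs">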
Benefits:-
• If you cannot edit your webpage's Page Title or its meta tags then the next best thing
for you to do is to optimize your articles or blog posts for your targeted keywords.
And one way is to insert keywords in ALT tags for the search engines. This is
because the search engines only read text on any webpage. They can not read images
or videos or any embedded object like Java applets or Flash-based objects.
• Your images will also be optimized for Google’s or Yahoo’s image search if you place
keywords inside their ALT tags. An internet user who searches for images and clicks
on an image found on your site becomes a possible return visitor. The image's link
opens a new window where the page on which the image is found is opened. This
page in effect becomes a landing page. In fact any webpage on your website can
serve as a landing page.
Anchor text matters as well. An example in Google's SEO Starter Guide shows that anchor text should be directly related to the content of the destination page (a sketch of such a link appears below). If the surrounding text discussed collecting rare baseball cards, then a link whose anchor text named that topic would add even more to the destination site's SEO. A list of links reading "Click here" would add little to the destination site's SEO score.
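A sketch of that kind of link (the URL and wording are illustrative, not quoted from Google's guide):

  <!-- Descriptive anchor text tells spiders what the destination page is about -->
  <p>This guide to <a href="http://www.example.com/rare-baseball-cards.html">rare baseball cards</a>
  explains how the cards are graded and valued.</p>

  <!-- Generic anchor text adds little to the destination page's SEO -->
  <p>To read about rare baseball cards, <a href="http://www.example.com/rare-baseball-cards.html">click here</a>.</p>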
One function of the meta robots tag is that it tells search engine crawlers which links to follow and which to stop at. When you have a lot of links going out of your website, you should know that you lose some Google juice, and as a result your PageRank can drop. So what you may want to do is keep that juice to yourself for some of the links, by telling the search engine crawlers not to follow the links going out of your site; if they do follow them, they also take some of your Google juice with them.
If you don’t have a meta robots tag, though, don’t panic. By default, the search engine crawlers WILL index your site and WILL follow links. Let me make it clear that search engine crawlers following your links is not bad at all: losing some of your juice won’t affect your site much in exchange for getting the attention of other websites you’re linking out to. In fact, I don’t recommend using nofollow at all if you don’t have too many outbound links.
Basically, the meta robots tag can be broken down into four main functions for the search engine crawlers (illustrated after this list):
• FOLLOW – a command for the search engine crawler to follow the links in
that webpage
• INDEX – a command for the search engine crawler to index that webpage
• NOFOLLOW – a command for the search engine crawler NOT to follow the
links in that webpage
• NOINDEX – a command for the search engine crawler NOT to index that
webpage
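In HTML, these directives are combined in a single meta tag placed in the page's head; the tag can also be omitted entirely, in which case the defaults (INDEX, FOLLOW) apply:

  <!-- Default behaviour, equivalent to omitting the tag: index the page and follow its links -->
  <meta name="robots" content="index, follow">

  <!-- Index the page, but do not follow (or pass juice through) its outbound links -->
  <meta name="robots" content="index, nofollow">

  <!-- Keep the page out of the index entirely -->
  <meta name="robots" content="noindex, nofollow">

  <!-- Alternatively, nofollow can be applied to a single outbound link instead of the whole page -->
  <a href="http://www.example.com/" rel="nofollow">an external site</a>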
Off-page optimization :-
Off-page optimization techniques focus on improving a site’s online credibility with the major engines. External link credibility is the primary algorithmic factor that influences
major search engine rankings. The more quality links you have pointing to your site, the
higher your site will rank on the SERP (search engine results page).
Go to Google and type in ‘click here’ (without the quotes). You will see that the highest results are from Adobe Reader, Flash Player and Apple QuickTime. If you study these web pages, you’ll notice that the phrase ‘click here’ does not even appear on them. Hence the so-called ‘on-page’ optimization for this phrase cannot be found on these web pages at all.
The only reason these web pages rank high for the phrase ‘click here’ is that there are most likely tens of thousands of internet sites linking to them using the phrase ‘click here’ as the anchor text.
Fact :-
It is often said that around 80% of your SEO effort depends on off-page optimization. So, in order for the full SEO process to work properly, BOTH on-page and off-page techniques should be implemented.
How Does Off Page Optimization Work?
The main concept of off-page SEO is to obtain links from higher-ranked websites that are relevant to your niche and keywords. In order for this to work, on-page optimization must complement your off-page efforts with regard to relevance and keyword usage. Why?
• Search engines like Google determine a website's importance based on authority. Authority
is determined by both the contents present on your website, and by the inbound and
outbound links of your website.
• The number of inbound links a site has is a strong "popularity" clue for Google, because it shows that your site is valued by others, thus improving the chance of achieving a higher PageRank for your niche and keywords.
• Search engines will crawl higher ranked websites more often, so aim to achieve inbound
links from sites with better rankings so that your site can benefit as well.
• Your website QUALITY (what others think of your site) is a key factor in obtaining inbound
links and achieving a higher PageRank. Content-poor sites will have difficulty attracting
links, so make sure that you have unique and informative content.
Step 1: Backlinks/links
Hyperlinks
In addition to the steps already mentioned, backlinks are essential. These are links that refer from another website to your website. The point is not to get more visitors from these linking sites, or at least that is not the main point. The main purpose is that search engines rate websites with many backlinks as more important and serious than websites with fewer links. It is comparable to a recommendation.
E.g. Google uses this recommendation in a special algorithm to compute a PageRank. This is a number between 0 and 10 which is meant to indicate the importance of a website. Sites with a high PageRank will be listed in higher positions.
Anchortext
The anchor text is the visible, clickable text of a link; in the example sketched below it is “more information about SEO”. It is commendable to use such anchor text instead of “here”. On one hand the visitor knows where the link leads; on the other hand search engines take this text as a clue to the topic of the page the link refers to. There is no difference between internal and external links in this respect, so try to choose suggestive descriptions.
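A minimal sketch of the two variants (the URL is a placeholder):

  <!-- Descriptive anchor text: tells both visitors and search engines what the target page covers -->
  <a href="http://www.example.com/seo-guide.html">more information about SEO</a>

  <!-- Weak anchor text: carries no topical signal -->
  You can find more information about SEO <a href="http://www.example.com/seo-guide.html">here</a>.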
The anchor text of external links should be relevant to your targeted keyword phrase. For example,
if another website is linking back to your blog on social media optimization, make sure the anchor
text is “social media optimization.” (Example below)
WRONG: “For more information on social media optimization, click here.”
RIGHT: “For more information, read our blog post on social media optimization.”
This will tell search engines what your page is about according to other sources and not just you.
Use Google webmaster tools to determine what sites are linking to yours. You can also type “link:
(your URL)” into the Google search box to get a sample-sized list of these sites as well. Reach out to
other websites that are linking to your site and ask them to hyperlink your targeted keyword if they
are not already.
Step 2: Useful domain
• The domain should be, on one hand, as short as possible and, on the other hand, convincing.
• Moreover, it might be helpful to use self-explanatory identifiers for all subdirectories and files.
Social Media
Posting on social networks is one of the fastest ways to build backlinks to your site. In the past,
social networking links have not affected PageRank because of their nofollow attribute; however,
they will still increase site traffic and may indirectly build inbound links in the long run. Recently,
there have been several discussions in the SEO community around Google adding social
networking links to the algorithm, which would affect rankings, but this has not been confirmed
yet by Google.
Paid Links
You should be wary of paid links, as buying links is not a recommended off-page optimization technique. Google will penalize your site if you pay for inbound links that pass PageRank, limiting or even omitting your site from its index if its spam team detects it. Business owner Ryan Abood learned this the hard way when his website was banished from the Google search results page because of paid links his company had purchased.
Blogs
Blogs are beneficial if they are actively managed because search engines love fresh content. If a
search engine identifies that you post fresh content regularly, then they are going to crawl it more
often. It’s also important to keep fresh content on the blog so that visitors can comment on and
link to it from their own active sites.
External blog commenting can also be a great way to send traffic to your website. If you
comment on a blog with relevant content to your site and provide a useful link to your website,
this can help drive qualified traffic, which increases exposure and the likelihood of garnering more
external links. Guest writing on already established blogs has recently become popular as well.
Yahoo Answers
Yahoo Answers is a great way to drive traffic to your site. You can create a free account and then
use it to answer questions that are related to your website content. Yahoo Answers offers the
ability to provide a link in your response, which you can use to reference your site. It is a perfect way to drive quality traffic to your website from people who are more than likely to be interested in your content and to subscribe to your RSS feeds.
Duplicate content arises when two different URLs (for example, the www and non-www versions of the same address) show the same page with the same content. Search engines don’t know that both URLs refer to the same website; for them, there are two different pages with the same content. That is exactly what Duplicate Content or Double Content means.
For internet users it is pointless to get the same page twice. Because of this, search engines try to avoid double listings: they either choose one URL or none of them. In any case these website(s) will be ranked worse, and it will have a bad influence on the PageRank.
Method 2: Canonical-tags
The canonical tag should be included in the head of the page that should not appear in the Google ranking. Its href attribute is the URL of the page that should appear instead.
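A minimal sketch (both URLs are placeholders for the duplicate page and the preferred version):

  <!-- Placed in the <head> of the duplicate page, e.g. http://example.com/index.html -->
  <link rel="canonical" href="http://www.example.com/">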
The canonical tag is the best way to avoid this problem. Another option is a 301 redirect, with which pages are moved permanently. (There is also the 302 redirect, which marks only a temporary move.)
WebRank Toolbar
WebRank Toolbar is a toolbar that automatically shows you a website's rank and the number of pages indexed in various search engines. It is also a tool for SEO and website analysis: it reports Google PageRank, Alexa Rank, Compete Rank, Quantcast Rank, and the pages indexed in Google, Bing and Yahoo. You can bookmark the current website at Pick Me and other popular bookmarking websites such as Digg and Reddit, share it on social websites such as MySpace and Facebook, read and write website reviews at Wikia, and customize the toolbar the way you want.
MozBar Toolbar
Download the MozBar (available for Firefox and Chrome) and streamline your SEO. The MozBar provides easy access to powerful SEO tools and data while you surf the web, and lets you compare link metrics for Google, Yahoo! and Bing with its SERP overlay.
SeoQuake
SeoQuake is a Mozilla Firefox SEO extension aimed primarily at helping webmasters who deal with search engine optimization (SEO) and internet promotion of websites. SeoQuake lets you obtain and investigate many important SEO parameters of the project under study on the fly, save them for future work, and compare them with the results obtained for other, competing projects.
Beginner webmasters and SEOs will find SeoQuake useful for estimating the effectiveness and competitiveness of their SEO efforts. In the hands of a professional, SeoQuake becomes a powerful and indispensable tool for analyzing the optimization and promotion level of web projects.
SeoQuake is distinguished from similar projects by the wide range of parameters available for analysis, its simplicity of use, and a tool for creating custom parameters.
SeoQuake consists of two functional parts. The first is the SeoBar. The second is the computation of parameters for search results and their inclusion into the search engine result pages (SERPs) of the most popular search engines.
1. Commit yourself to the process. SEO isn’t a one-time event. Search engine algorithms change
regularly, so the tactics that worked last year may not work this year. SEO requires a long-term
outlook and commitment.
2. Be patient. SEO isn’t about instant gratification. Results often take months to see, and this is
especially true the smaller you are, and the newer you are to doing business online.
3. Build a great web site. I’m sure you want to show up on the first page of results. Ask yourself,
“Is my site really one of the 10 best sites in the world on this topic?” Be honest. If it’s not, make it
better.
4. Include a site map page. Spiders can’t index pages that can’t be crawled. A site map will help spiders find all the important pages on your site, and help the spider understand your site’s hierarchy. This is especially helpful if your site has a hard-to-crawl navigation menu. If your site is large, make several site map pages. (A sketch of a simple site map page appears after this list.)
5. Make SEO-friendly URLs. Use keywords in your URLs and file names, such as
yourdomain.com/red-widgets.html. Don’t overdo it, though. A file with 3+ hyphens tends to look
spammy and users may be hesitant to click on it. Related bonus tip: Use hyphens in URLs and file
names, not underscores. Hyphens are treated as a “space,” while underscores are not.
6. Do keyword research at the start of the project. If you’re on a tight budget, use the free
versions of Keyword Discovery or WordTracker, both of which also have more powerful paid
versions. Ignore the numbers these tools show; what’s important is the relative volume of one
keyword to another. Another good free tool is Google’s AdWords Keyword Tool, which doesn’t
show exact numbers.
7. Use a unique and relevant title and meta description on every page. The page title is the
single most important on-page SEO factor. It’s rare to rank highly for a primary term (2-3 words)
without that term being part of the page title. The meta description tag won’t help you rank, but it
will often appear as the text snippet below your listing, so it should include the relevant
keyword(s) and be written so as to encourage searchers to click on your listing. Related bonus tip:
You can ignore the Keywords meta tag, as no major search engine today supports it.
8. Write for users first. Google, Yahoo, etc., have pretty powerful bots crawling the web, but to
my knowledge these bots have never bought anything online, signed up for a newsletter, or picked
up the phone to call about your services. Humans do those things, so write your page copy with
humans in mind. Yes, you need keywords in the text, but don’t stuff each page like a Thanksgiving
turkey. Keep it readable.
9. Create great, unique content. This is important for everyone, but it’s a particular challenge for
online retailers. If you’re selling the same widget that 50 other retailers are selling, and everyone
is using the boilerplate descriptions from the manufacturer, this is a great opportunity. Write your
own product descriptions, using the keyword research you did earlier (see #6 above) to target
actual words searchers use, and make product pages that blow the competition away. Plus, retailer
or not, great content is a great way to get inbound links.
10. Use your keywords as anchor text when linking internally. Anchor text helps tell spiders what the linked-to page is about. Links that say “click here” do nothing for your search engine visibility.
11. Use press releases wisely. Developing a relationship with media covering your industry or
your local region can be a great source of exposure, including getting links from trusted media
web sites. Distributing releases online can be an effective link building tactic, and opens the door
for exposure in news search sites. Related bonus tip: Only issue a release when you have
something newsworthy to report. Don’t waste journalists’ time.
12. Start a blog and participate with other related blogs. Search engines, Google especially,
love blogs for the fresh content and highly-structured data. Beyond that, there’s no better way to
join the conversations that are already taking place about your industry and/or company. Reading
and commenting on other blogs can also increase your exposure and help you acquire new links.
Related bonus tip: Put your blog at yourdomain.com/blog so your main domain gets the benefit of
any links to your blog posts. If that’s not possible, use blog.yourdomain.com.
13. Use social media marketing wisely. If your business has a visual element, join the
appropriate communities on Flickr and post high-quality photos there. If you’re a service-oriented
business, use Quora and/or Yahoo Answers to position yourself as an expert in your industry. Any
business should also be looking to make use of Twitter and Facebook, as social information and
signals from these are being used as part of search engine rankings for Google and Bing. With any
social media site you use, the first rule is don’t spam! Be an active, contributing member of the
site. The idea is to interact with potential customers, not annoy them.
14. Take advantage of the tools the search engines give you. Sign up for Google Webmaster
Central, Bing Webmaster Tools and Yahoo Site Explorer to learn more about how the search
engines see your site, including how many inbound links they’re aware of.
15. Diversify your traffic sources. Google may bring you 70% of your traffic today, but what if
the next big algorithm update hits you hard? What if your Google visibility goes away tomorrow?
Newsletters and other subscriber-based content can help you hold on to traffic/customers no
matter what the search engines do. In fact, many of the DOs on this list—creating great content,
starting a blog, using social media and local search, etc.—will help you grow an audience of loyal
prospects and customers that may help you survive the whims of search engines.
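To close the loop on tip 4 above, a site map page can be as simple as a plain HTML page of crawlable links (a sketch with placeholder file names; many sites also submit an XML sitemap through the webmaster tools mentioned in tip 14):

  <!-- sitemap.html: one plain link to every important page on the site -->
  <h1>Site Map</h1>
  <ul>
    <li><a href="/index.html">Home</a></li>
    <li><a href="/products/red-widgets.html">Red Widgets</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/about-us.html">About Us</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>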