Thursday, October 4, 2007

Search Engine Optimization

Search Engine Optimization (SEO)

The process of optimizing a web page for high search engine rankings for a particular search term or set of search terms.

SEO TOPICS:
1. Search Visibility Factors
2. Basics of Search Friendly Design
3. Site Maintenance
4. Resources

Basic Page Components:

1. Text (Keyword) Component
Words and phrases that match what your target audience types into search engines
2. Link Component
Site navigation and URL structure that search crawlers can easily follow
3. Popularity Component
Are other sites linking to you?

Text: Keyword Density:


Choose one or two keywords or phrases to optimize for each page
Do not overuse them - avoid keyword stuffing
Overuse of keywords can result in being penalized or ignored
Incorporate other complementary words and phrases
Check Keyword Density:
http://www.webjectives.com/keyword.htm
http://www.keyworddensity.com/

Text: Keyword Prominence:

Search engines place “weight” on terms according to where they are used
Place Keywords in …
Title tags
Headings and emphasized text
Visible body text
Description meta tags
Alt text in images
The title tag and visible body text are the most important
Keywords in the URL are helpful, but not a significant ranking factor
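
A rough sketch of what those placements look like in the markup (the keyword “solar energy” and the file name are invented for illustration, not taken from the slides):

<title>Solar Energy Basics | Example Agency</title>
<meta name="description" content="An introduction to solar energy, solar panels, and home installation.">
<h1>Solar Energy Basics</h1>
<p>Solar energy can reduce a household’s electric bill ...</p>
<img src="solar-panels.jpg" alt="Rooftop solar energy panels">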

The Metadata Myth:

Quote: “Metadata improves search relevancy”
False. Except for title and description tags, web search engines ignore other metadata. However, metadata does matter to enterprise search.

Quote: “Using standard metadata is a best practice.”
False. It is only a best practice if the metadata is used by your agency for specific applications (e.g. enterprise search or content syndication). It is not a best practice among professional web designers.

Link Component:

Pages will not rank well if your site does not have a navigation scheme
Navigation scheme must please users and search engines
Create a site map, but also plan how pages link to each other
Avoid dangling pages (pages with no links back into the rest of your site)

Problem Navigation Schemes:


Poor HTML coding
Image maps
Frames
JavaScript
Dynamic Pages
Flash


Creating Links:


Keywords in links tell crawlers about the pages to which you are linking
Keywords in links influence relevancy of the page to which you are linking
Avoid “click here” links; instead, write links like the example below:
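For example (the page name and link text are invented for illustration):

Instead of: To apply for a research grant, <a href="grants.html">click here</a>
Write: <a href="grants.html">Apply for a research grant</a>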


Popularity Component:


Based on the number of pages that link to you
The more popular the pages that link to you, the higher your popularity
All search engines have different popularity algorithms
Google’s algorithm is called Page Rank
Every page on the Web is assigned a popularity value based on its inbound links


Popularity Factors:

Number and Popularity of Inbound Links
Get listed in Yahoo!, DMOZ
Network with other agency and industry sites
Make your site a link magnet
Anchor Text
Others’ use of keywords in a link to your site
Popularity is assigned per page, not for the entire site
Popularity is not inherited
Need to deliberately link internal pages to pass on PR

Outbound Links:

Your popularity is not determined by the sites you link to
Your outbound links affect the popularity of the sites you’re linking to
Internal links and inbound links have the most impact on your popularity
Outbound links help identify you as a hub


Robots Exclusion:

Meta-Tag Robots Exclusion
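For example, placed in a page’s head to keep that page out of the index and keep its links from being followed (unlike robots.txt, this must be added page by page):
<meta name="robots" content="noindex, nofollow">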

Robots.txt File
Place in server’s root directory
Two elements: User-agent, Disallow
Example:
User-agent: *
Disallow: /cgi-bin/
Disallow: /scripts/
Disallow: /images/

Robots Exclusion:


Not all search engines pay attention to robots.txt instructions
MSN and Yahoo! obey robots exclusion more often than Google
Never exclude msnbot … you won’t have site search
Blocking bots is contrary to OMB’s guidance

Basics of Search Friendly Design:

Basic Concepts:

Page Content: using content and text that target your audiences and attract search engines and links from other sites

Navigation: giving users and crawlers easy access to content

Design Considerations: make sure bells and whistles don’t undermine SEM efforts

Page Rank: link popularity

Managing Page Content:

Types of Pages:

Home Page
About
Contact
Site Map
News
Forms
Galleries
FAQs
Catalogs
Product pages
Shopping cart
Search Results




Text to Include:

Keywords
Use language of your audience
Use keyword selection tools
Yahoo!/Overture: http://inventory.overture.com/
Google: https://adwords.google.com/select/KeywordSandbox
Page Content
Make content appear focused
Title tag, headings, contextual links, cross-links
Body text should be visible, i.e., users should not have to take any action to view the main page text

Primary vs. Secondary Text

Primary Text:
Title tag
Body text
Text near the top of the page
Text in and around links (e.g. anchor text)

Secondary Text:
Alt text
Description tag
Domain name and URL elements

What Kind of Content to Include?

Write your own
Pros: Original, unique content
Cons: Time consuming, bureaucratic process
Use someone else’s
Pros: Easy way to bulk up content
Cons: Still need to worry about how to add value in order to attract links

Use Syndicated Content:

Tends to update frequently
Crawlers visit frequently-updated pages more often
Combine syndicated content to create a unique resource
Syndicate your own content (e.g. RSS - see the sketch after this list)
Increases site visibility and attracts traffic
Another opportunity for inbound links
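
A bare-bones sketch of what an RSS 2.0 feed looks like; the agency name and URLs are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Agency News</title>
    <link>http://www.example.gov/news/</link>
    <description>Press releases and program updates</description>
    <item>
      <title>New grant program announced</title>
      <link>http://www.example.gov/news/2007/grants.html</link>
    </item>
  </channel>
</rss>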


Managing Site Architecture:

Site Navigation Scheme:


Text Links
Very search engine friendly
Use for primary or secondary navigation
Problems with Text Links
Can negatively skew keyword density
Crawlers tend to read text links first


Site Navigation Scheme:


Navigation buttons
Okay as long as you include alt text
Avoid JavaScript, unless you can provide navigation crawlers can follow
Recommendation: use alt text and text navigation at the bottom of the page – allows you to put keywords in multiple places
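
A sketch of that recommendation – an image button with keyword-rich alt text, backed up by plain text links at the bottom of the page (file names and labels are invented):

<a href="grants.html"><img src="nav-grants.gif" alt="Small Business Grants"></a>
<!-- ... rest of the page ... -->
<p><a href="grants.html">Small Business Grants</a> | <a href="loans.html">Loan Programs</a> | <a href="contact.html">Contact Us</a></p>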


Site Navigation Scheme:

Image Maps
Crawlers ignore links inside image maps
Use text links or navigation buttons elsewhere
Pull Down Menus
Generally not crawler friendly because they need JavaScript or a CGI program
Always provide two forms of navigation: one for your users, and one for your crawlers

Help Crawlers Navigate:

Create a site map
Subscribe to Google Site Maps
A crawler enabling tool to assist the Google crawler
Analyzes information about your site’s architecture to improve crawling
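
A minimal sitemap file in the sitemaps.org format that Google Sitemaps reads; the URL and date are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.gov/grants/</loc>
    <lastmod>2007-10-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>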


Design Considerations:

Design Considerations:


Bells and whistles (Flash, JavaScript, animation) enhance user experience but can hurt search visibility
Implement features carefully in order to keep search engine ranking
Ensure design team understands SEM concepts

Use External JavaScript and CSS:

Using JavaScript on site navigation can greatly decrease site crawlability
Crawlers do not follow links embedded inside JavaScript code, or they limit the types of embedded links they will crawl
Inline JavaScript can increase page load time
Long download times may indicate the site is spam, and crawlers could ignore it
External files decrease page load time for visitors
External files decrease download time for crawlers
Remember to disallow crawling scripts in the robots.txt
External scripts are easier to re-use
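
A minimal sketch of moving code into external files (the file names are placeholders):

<link rel="stylesheet" type="text/css" href="/css/site.css">
<script type="text/javascript" src="/scripts/nav.js"></script>

The page the crawler downloads stays small, and browsers can cache the shared files across pages.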

Frames:

Do not design in frames
Crawlers have trouble getting from the frameset page to the actual web page
Frameset does not provide crawler with keyword rich text and links
Each page is indexed separately, so pages that only make sense as a frameset will be indexed individually in search engines
The noframes tag is ignored due to spam abuse


Frames Workarounds:

Add Navigation
Give all pages unique title and description tags
Put navigation links on your pages
Add JavaScript in your pages’ head tag

This will force the browser to always load the frameset
However, the browser will always load the home page, not the indexed page
The Back button will be disabled
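
A common version of that frame-forcing script, placed in the head of each content page; the frameset file name here is a placeholder:

<script type="text/javascript">
// if this page was loaded outside its frameset, load the frameset instead
if (top == self) {
  top.location.href = "index.html";
}
</script>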


Flash:

Few search engines crawl links embedded inside a Flash navigation scheme
Flash sites contain little text
If you include Flash …
Include a “Skip” link so both the user and crawler can go to the real homepage
Include title and description meta tags

Dynamic Pages:

Database-driven, created on the fly by asp, cfm, php, jsp or cgi scripts
Dynamic sites are made up of templates, usually without original content of their own
When a page is viewed, the template loads the content from the database
Parameters are added to the URL, which tells the template to load specific content

Example: http://smithsonianstore.com/catalog/product.jsp?productId=14273&parentCategoryId=3151&categoryId=3152
URLs such as this are difficult for search engines to index because they do not know the parameters that define a unique page
The more parameters, the less likely pages will be indexed
A database may continually feed data, crashing your server and scaring off the crawler


Search Friendly Dynamic Pages:

Create static HTML pages
Modify URLs so they don’t look like dynamic pages and carry fewer parameters
Use the URL re-write trick with Apache’s mod_rewrite (see the sketch below)
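
A hedged sketch of the mod_rewrite approach in an Apache .htaccess file at the document root; the directory name is invented, and the productId parameter follows the catalog example above:

RewriteEngine On
# serve /catalog/product/14273 from the real dynamic URL
RewriteRule ^catalog/product/([0-9]+)$ /catalog/product.jsp?productId=$1 [L]

Visitors and crawlers see the short, static-looking URL; the script behind it does not change.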

Session IDs:

The kiss of death if left unmanaged
Same content is delivered to the crawler but as unique URLs
Crawlers will ignore web pages with session IDs
Omit session IDs if the requestor is a crawler … but no cloaking!

Optimizing PDFs:

Make sure PDFs contain actual text, not images of text
Same rules for use of keywords and phrases apply
Put the most important text in the title, headlines
Minimize document size (keep it under roughly 100K)
Create optimized HTML pages for PDFs

Managing Page Rank:

Understanding Page Rank:


All search engines assign a value to your site based on inbound links
Google calls this relevancy factor “Page Rank” (PR) - synonymous with popularity ranking
An inbound link is a vote for your page
An outbound link is a vote for the page you’re linking to
A page is assigned PR as soon as it’s indexed
Make sure all your pages have at least one link back into your site
The page receiving the most inbound links gets the highest page rank
You can pass PR around to pages on your site
Controversial topic among SEOs
Supporting View: PR leaks in the sense that a page’s outbound links will decrease that page’s available PR for redistribution throughout the site
Opposing View: Search engines analyze inbound and outbound links to determine your authority as hub. No site is an island


Outbound Link Strategy:

Do not create pages with mostly outbound links
Link to quality, related sites – helps to establish you as an authority or hub
If a page contains several outgoing links, also include links to other pages on your site
Don’t be afraid to link to sites with low page rank, but quality content


Getting Links to Your Site:

Links from Yahoo! and DMOZ
Impact on popularity may vary
Helps to get noticed by crawlers
Establish reciprocal link arrangements with agencies covering similar topics
Reach out to state and local agencies
Syndicate your content
Reach out to professional communities of interest


Site Maintenance:

Re-Designing Your Site:

Plan BEFORE you embark on a re-design
Will the site architecture change?
Static to Dynamic?
How is the content changing?
Add SEO review as an activity to your project plan
Make sure your contractor has SEO skills
Poor planning and execution can kill your search rankings
Do not wait until after you have re-designed!
Do not ask the FirstGov Search Team to re-crawl your site!

Changes to Site Architecture:

Try to keep the same filenames and directory structure when redesigning
Follow MSN’s “What to do when your site moves”: http://search.msn.com/docs/siteowner.aspx?t=SEARCH_WEBMASTER_REF_Redirectcode.htm
Recommended: Set up HTTP 301 redirects that point to the new site or pages (see the example after this list)
Not Recommended: Add a meta-refresh tag to your page header. This won’t remove your original page from the MSN index – and therefore not from your site search either.
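
One way to set up the recommended 301 redirects on an Apache server, in .htaccess (the old and new paths are placeholders):

# permanently redirect a moved page to its new address
Redirect 301 /old-section/page.html http://www.example.gov/new-section/page.html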


Removing Content:

Avoid using default 404 pages
Custom 404 pages are more user-friendly
Submit all 404 URLs to the search engines using their Add URL form – quickest way to get the 404 pages out of the index
Remember to change the HTTP status code on custom error pages from 200 to 404 (see the sketch after this list)
Do not ask the FirstGov Search Team to re-crawl your site!
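
On Apache, a sketch of the custom error page setup mentioned above (the file path is a placeholder):

# a local path keeps the 404 status code; a full URL would turn the error into a redirect
ErrorDocument 404 /errors/not-found.html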


