
Many federal agencies' websites are invisible to search engine users.




Presentation Transcript


  1. Many federal agencies' websites are invisible to search engine users.

  2. This is because of barriers to search engine “crawlers”. What can make the information on a site effectively invisible:
     • Content “hidden” behind search forms
     • Non-HTML links
     • Outdated robots.txt crawling restrictions
     • Server errors (crawler times out when fetching content)
     • Orphaned URLs
     • Rich media: audio, video
     • Premium content
     [Diagram: Web, searchable; Deep Web, not searchable]
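As an illustration of the robots.txt barrier above, a single overly broad rule can hide an entire public section of a site from every crawler. The paths and domain below are hypothetical, not taken from any agency's actual configuration:

```
# Outdated, overly broad rule: blocks ALL crawlers from
# everything under /records/, including public pages.
User-agent: *
Disallow: /records/

# A narrower alternative: block only the legacy scripts,
# and point crawlers at the Sitemap instead.
# User-agent: *
# Disallow: /records/cgi-bin/
# Sitemap: https://www.example.gov/sitemap.xml
```

The `Sitemap:` directive shown in the alternative is part of the sitemaps.org protocol and lets crawlers discover the Sitemap automatically.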

  3. Search engine crawlers cannot navigate database search forms. [Diagram: when crawled, search results are invisible]

  4. The majority of citizens access government through search engines
     National Institutes of Health (nih.gov):
     • More than 70% of unique users in July 2006 were referred by commercial search engines (Google, Yahoo, MSN, AOL, Ask)
     • Only 4% of unique users came directly to nih.gov sites
     Source: comScore, 2006

  5. Because of the combination of these factors, many citizens are unknowingly bypassing federal agency websites every day.

  6. The good news: the Sitemap protocol provides a solution.
     • Makes all pages, documents, records, or other data on a federal agency website accessible to search engine crawlers, and therefore to search engine users
     • Does not require website redesign or redevelopment, just a comparatively modest time investment
     • Keeps the website owner in control
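To make the "modest time investment" point concrete, a basic Sitemap can be generated with a few lines of scripting. This is a hypothetical sketch assuming you already have a list of page URLs (the example.gov URLs are placeholders), not any agency's actual tooling:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org 0.9 protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a Sitemap 0.9 <urlset> document from a list of page URLs."""
    root = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page_url in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = page_url
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    # Placeholder URLs standing in for database records
    # that would otherwise sit behind a search form.
    pages = [
        "https://www.example.gov/records/1",
        "https://www.example.gov/records/2",
    ]
    print(build_sitemap(pages))
```

The output file would then be published at the site root (e.g. `/sitemap.xml`) and announced to crawlers.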

  7. One solution for all search engines. Sitemaps can be made available to any web search engine that supports the Sitemaps protocol.
     “The launch of Sitemaps is significant because it allows for a single, easy way for websites to provide content and metadata to search engines.” — Tim Mayer, Senior Director of Product Management, Yahoo Search
     “We are 100% behind this protocol -- this kind of collaboration will help improve the search experience for all of our customers.” — Ken Moss, General Manager, Live Search

  8. The Sitemap protocol: an open, industry standard for web search engine crawling
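For reference, a minimal Sitemap file following the sitemaps.org 0.9 schema looks like this; the URL and dates are placeholders, and only `<loc>` is required:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.gov/records/12345</loc>
    <lastmod>2006-07-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```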

  9. So, if many federal agencies' websites are invisible to search engine users, is yours one of them, and what can you do about it?

  10. Get informed
     • Sitemaps.org
     • Federal Sitemaps: http://tinyurl.com/3byhy7
     [Diagram: Web, searchable; Deep Web, searchable]
