What are Search Engine Sitemaps?

Search engine sitemaps are structured listings of a website's pages, used by search engines to index content more thoroughly. A search engine sitemap (meant for search engines rather than human visitors) is supported by Google, Yahoo!, and MSN, and lets webmasters suggest how often each page changes and how important it is relative to the rest of the site, so the engines can spider each page accordingly.
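In practice a sitemap is just an XML file that lists your URLs, with optional hints for each one. A minimal example following the Sitemap 0.9 protocol might look like the sketch below; the domain, dates, and values are only illustrative placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.yourdomain.com/</loc>
        <lastmod>2007-04-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.yourdomain.com/about.html</loc>
        <changefreq>monthly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>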

Google Sitemaps has become a popular tool among webmasters, as the service helps them continually submit fresh content to the search engine.

How to build Sitemaps

Creating a search engine sitemap is quite simple thanks to a free online tool at XML-Sitemaps.com that will automatically spider your website and create the sitemap for you. The free version spiders up to 500 pages.

Alternative tools:

  1. Simply create a sitemap with the free Search Engine Sitemap Generator and upload it to your server.
  2. The Google Sitemap Generator is an open-source Python script that you run on your own server to build a sitemap from scratch; it creates valid search engine sitemaps using the Sitemap protocol (a minimal sketch of the same idea follows this list).
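If you would rather write your own generator, the core idea is simple: collect your URLs and write them out in the Sitemap protocol's XML format. The short Python sketch below does that for a hand-maintained list of placeholder URLs; it is only an illustration of the idea, not Google's own script:

    # Minimal sitemap generator sketch -- not Google's sitemap_gen.py, just the same idea.
    from xml.sax.saxutils import escape

    # Placeholder URLs: replace with the pages of your own site.
    PAGES = [
        ("http://www.yourdomain.com/", "daily", "1.0"),
        ("http://www.yourdomain.com/about.html", "monthly", "0.5"),
    ]

    def build_sitemap(pages):
        """Return a sitemap XML string following the Sitemap 0.9 protocol."""
        lines = ['<?xml version="1.0" encoding="UTF-8"?>',
                 '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
        for loc, changefreq, priority in pages:
            lines.append("  <url>")
            lines.append("    <loc>%s</loc>" % escape(loc))
            lines.append("    <changefreq>%s</changefreq>" % changefreq)
            lines.append("    <priority>%s</priority>" % priority)
            lines.append("  </url>")
        lines.append("</urlset>")
        return "\n".join(lines)

    if __name__ == "__main__":
        # Write the result where your web server can serve it.
        with open("sitemap.xml", "w") as f:
            f.write(build_sitemap(PAGES))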

Upload the generated file (say, sitemap.xml) to your website, and tell Google how to find it. Note the file name and the URL it will be served from: if you upload it to the root of your domain, the URL will be http://www.yourdomain.com/sitemap.xml.

Now log into your Google Sitemaps account (Google Webmaster Tools) and point Google to the sitemap stored on your site, using the URL you noted above.
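If you prefer to notify Google from a script instead of (or in addition to) the web interface, you can also ping Google's sitemap ping endpoint with your sitemap's URL. The short Python sketch below assumes the placeholder sitemap address used earlier in this article:

    # Ping Google with a sitemap URL (sketch; the sitemap address is a placeholder).
    import urllib.parse
    import urllib.request

    SITEMAP_URL = "http://www.yourdomain.com/sitemap.xml"

    def ping_google(sitemap_url):
        """Ask Google to fetch the sitemap via its public ping endpoint."""
        ping = ("http://www.google.com/ping?sitemap="
                + urllib.parse.quote(sitemap_url, safe=""))
        with urllib.request.urlopen(ping) as response:
            return response.getcode()  # 200 means the ping was received

    if __name__ == "__main__":
        print(ping_google(SITEMAP_URL))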

Should you get stuck at any point, feel free to browse through Google's official documentation and tutorials on the subject at Using the Sitemap Generator.

Advantages of Search Engine Sitemaps:

The advantage of Search Engine Sitemaps (or XML sitemaps) over a normal “page of links” sitemap is that you can:

  1. Specify the priority of pages to be crawled and/or indexed (see the snippet after this list).
  2. Exclude lower-priority pages simply by leaving them out of the file.
  3. Ensure that search engines know about every page on your website.
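To make point 1 concrete: priority is a value between 0.0 and 1.0 (0.5 if omitted) that ranks a page relative to the other pages in your own sitemap, and a page is "excluded" simply by not listing it. The URLs below are placeholders:

    <!-- Homepage: the most important page of the site. -->
    <url>
      <loc>http://www.yourdomain.com/</loc>
      <priority>1.0</priority>
    </url>
    <!-- An old archive page: still listed, but marked as low priority. -->
    <url>
      <loc>http://www.yourdomain.com/archive/old-post.html</loc>
      <priority>0.2</priority>
    </url>
    <!-- Pages you leave out of the sitemap entirely are the "excluded" ones. -->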

Until recently, only Google, Yahoo!, and MSN supported this protocol; now Ask.com has joined the family as well.

Better still, you no longer have to submit your sitemap to each engine separately: Vanessa Fox of Google announced that all of these search engines have agreed to accept sitemap submissions through the robots.txt file on your server.
The robots.txt file is an industry-standard file and the very first file a legitimate search engine requests when it visits your website. Now you can simply add your sitemap URL to this file in the form:

Sitemap: http://www.mysite.com/sitemap.xml


Create a sitemap with the free Search Engine Sitemap Generator as described above and upload it to your server, then open the robots.txt file on your server and add the Sitemap line in the form shown above. This makes it as simple as ever to ensure that all search engines know about your site, which pages it contains, and which of those pages should be listed in the search results.
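For reference, a complete robots.txt that advertises a sitemap might look like the sketch below; the domain matches the earlier example and the Disallow path is just a placeholder:

    User-agent: *
    Disallow: /private/
    Sitemap: http://www.mysite.com/sitemap.xml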

How to Check Your Google Sitemap Reports:

Google will identify any errors in your sitemap and your site and present the results to you in the form of a report.

Steps to check your sitemap report:

  1. Visit the Google Webmaster Tools section of the site, at www.google.com/webmasters.
  2. Add your site to Google if you haven't already done so.
  3. Verify that you are the site's owner by either uploading an HTML file to your site or adding a meta tag to your homepage (an example meta tag follows this list).
  4. View the statistics and reports that Google has already generated about your sitemap, including when the spider last visited your pages. If your site is not new, chances are Google has already crawled it.
  5. Click the Add sitemap link to create a new Google sitemap.
  6. Enter your sitemap URL to tell Google about your pages.
  7. Visit again later to view the reports on your pages.
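If you pick the meta tag option in step 3, Google's verification page gives you a unique tag to paste into the head of your homepage. It looks roughly like the sketch below; the name and, above all, the content value are supplied by Google, and the token shown here is only a placeholder:

    <head>
      <!-- Paste the exact tag Google shows you; this token is a placeholder. -->
      <meta name="google-site-verification" content="PASTE-THE-TOKEN-GOOGLE-GIVES-YOU" />
    </head>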