4 Fundamentals About Website Sitemaps

What is a Sitemap?

A sitemap is a list of the pages of a website that can be accessed by search engine crawlers and by ordinary users. It can be a web page listing all the URLs available on the site, and it should mirror the site's main, actual navigation. Web crawlers also accept XML sitemaps, which additionally inform robots about the publication date and change frequency of each page.
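For illustration, here is a minimal sketch of an XML sitemap in the standard sitemap protocol format; the domain, dates, and values below are hypothetical placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page of the site -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2023-05-01</lastmod>      <!-- date the page content last changed -->
        <changefreq>weekly</changefreq>    <!-- how often the page tends to change -->
        <priority>1.0</priority>           <!-- relative importance within this site -->
      </url>
      <url>
        <loc>https://www.example.com/about</loc>
        <lastmod>2023-01-15</lastmod>
        <changefreq>yearly</changefreq>
        <priority>0.5</priority>
      </url>
    </urlset>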

  1. Why a Sitemap Is Necessary

You may own a very good website and certainly want it to be recognised by search engines; for that, you must have a sitemap. Without one, you run a real risk of not being indexed properly. A well-produced and well-maintained sitemap increases the ability of search engines such as Google, Yahoo, and Bing to locate, access, and index all the available URLs of your website. Best practices for building one are covered in another article, 'How to create best sitemap for your website'.

The first and foremost question to keep in mind is: 'Do you list all your URLs in your sitemap?' Telling search engines how many URLs your website has is not enough. Make sure you have also told them where your sitemap is located, either by listing it in robots.txt or by submitting it through each search engine's webmaster tools.
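As a minimal sketch, a robots.txt file at the root of the site can announce the sitemap location with a single Sitemap line; the URL below is a hypothetical placeholder:

    # robots.txt served from the site root
    User-agent: *
    Allow: /

    # Absolute URL of the sitemap; crawlers read this line to find it
    Sitemap: https://www.example.com/sitemap.xml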

  2. Do Not Use an Outdated Sitemap

Website owners generally keep changing and adding pages, generating new URLs, but forget to update their sitemap to match. When search engine robots crawl a sitemap that has not been updated for months, sometimes years, even though the website keeps changing, they find nothing new to index and return a no-change result. This happens because of a broken or un-updated sitemap. Stop referring to an old sitemap until you have fully updated it.

As an SEO best practice, regularly verify that the sitemaps referenced in your robots.txt and webmaster tools are the correct ones, and ensure the sitemap lists only the relevant URLs currently published on your site. Ideally, your sitemap should be regenerated automatically at least once a day. Complementary to sitemaps, you should also offer real-time RSS feeds announcing all of your fresh URLs, which lets search engines discover new content in a matter of minutes rather than up to 24 hours.
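For illustration, a minimal RSS 2.0 feed of the kind a crawler can poll for fresh URLs might look like the sketch below; the channel details and URLs are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Site - Latest Pages</title>
        <link>https://www.example.com/</link>
        <description>Newly published URLs on the site</description>
        <!-- One <item> per freshly published URL -->
        <item>
          <title>New product page</title>
          <link>https://www.example.com/products/new-widget</link>
          <pubDate>Mon, 01 May 2023 09:30:00 GMT</pubDate>
        </item>
      </channel>
    </rss>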

  3. Test Your Sitemap with “View Source”

Follow the sitemap creation guidelines provided by the search engines, and check that your sitemap is valid XML; in particular, make sure URLs containing ampersand characters are properly escaped. Use the 'View Source' feature in your browser to inspect your sitemap: it shows you the file exactly as the search engines see it. You can also use your browser's debug tools to find errors in the sitemap schema.
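As a sketch of the ampersand issue: a bare '&' inside a URL makes the XML invalid, so it must be escaped as the entity '&amp;'. The query-string URL below is hypothetical:

    <!-- Invalid: the bare ampersand breaks XML parsing -->
    <loc>https://www.example.com/search?q=shoes&color=red</loc>

    <!-- Valid: the ampersand is escaped as an XML entity -->
    <loc>https://www.example.com/search?q=shoes&amp;color=red</loc>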

  4. Keep Sitemap Attribute Values Correct

Search engines use algorithms to monitor and analyse URLs, including checking whether the sitemap was produced with proper attribute values. Do not include any attribute in your sitemap that does not carry a proper value. For example, do not set <lastmod> to the time you generated the sitemap; <lastmod> should be the date the content linked from your sitemap was last modified. If you are unsure of the full datetime format, the simple YYYY-MM-DD form is sufficient. Likewise, avoid setting the <changefreq> and <priority> values to the same value for every URL if you do not really know when the content will change and cannot differentiate priority between URLs.
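As a sketch of the difference, assuming a hypothetical page whose content last changed well before the sitemap was generated:

    <!-- Wrong: <lastmod> stamped with the sitemap generation date on every URL -->
    <url>
      <loc>https://www.example.com/blog/old-post</loc>
      <lastmod>2023-05-01</lastmod>  <!-- identical across all URLs: a red flag -->
    </url>

    <!-- Right: <lastmod> reflects when this page's content actually changed -->
    <url>
      <loc>https://www.example.com/blog/old-post</loc>
      <lastmod>2021-11-08</lastmod>
      <changefreq>yearly</changefreq>
      <priority>0.3</priority>
    </url>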
