I am highly skeptical about the usefulness of XML sitemaps.
My suspicion about sitemaps is that they were primarily created by Google to debug Googlebot. When webmasters submit a sitemap, they tell Google which pages are supposed to be on the site. Google’s engineers can compare that list with what Googlebot actually finds when it crawls, and then use the discrepancies to improve Googlebot. The Webmaster Tools are the carrot on the stick to get you to add a sitemap (but you don’t need the sitemap to get the carrot).
You can put a totally incorrect XML sitemap on a Web site and Google will not start de-indexing your site or lowering your rankings.
I just found a video where Vanessa Fox talks about sitemaps:
“It’s really not about the ranking; it’s more about crawling… Sitemaps doesn’t impact your ranking at all.
The only way it impacts ranking is that it helps with that very first obstacle of learning about all your pages, because if we don’t know about them we won’t index them and we won’t rank them. But other than that it has no impact on ranking.”
So as long as Google can index your pages (which it should be able to do if you have built your site correctly), sitemaps should not be needed.
I think there is also a common misconception about the <priority> element in XML sitemaps. It doesn’t tell search engines how important your pages are on the Web overall; it only tells them how important your pages are relative to each other. So giving all of your pages a priority of 0.5 is the same thing as giving all of your pages a priority of 1 (the highest value allowed). More info about that in the specification.
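To make that concrete, here is a minimal sitemap sketch (the example.com URLs are just placeholders) where every page carries the same <priority>:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- placeholder URLs for illustration only -->
      <url>
        <loc>http://www.example.com/</loc>
        <priority>0.5</priority>
      </url>
      <url>
        <loc>http://www.example.com/contact</loc>
        <priority>0.5</priority>
      </url>
    </urlset>

Swap every 0.5 for 1.0 and, because priority is only relative to your other pages, you have told the search engines nothing new.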
Does anyone have any data that shows that sitemaps have any use on sites that are already getting indexed well? I am highly skeptical of their usefulness and think that the time spent on them could be used for more effective tasks.