Google Blogger has rolled out a major update to the default sitemap and robots.txt of blogs running on the Blogger platform, including those publishing from their own Custom Domains. The change happened somewhere during the first week of December 2014, or possibly earlier, yet most blogs powered by Blogger appear to be unaware of it.
As usual, Blogger did not inform its users about the change; bloggers are left to discover it all by themselves.
This site does not use the Blogger platform, but I run a few sites on Blogger with Custom Domains, and I discovered this change accidentally while trying to resolve messages and errors in the HTML Improvements feature of Webmaster Tools, concerning DNS errors and duplicate title tags and descriptions caused by the ?m=1 and ?m=0 mobile redirects.
The new default sitemap is sitemap.xml, served at the root of your blog address — that is, your blog's URL followed by /sitemap.xml.
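A sitemap.xml file follows the standard Sitemaps XML protocol: a `<urlset>` of `<url>` entries, each with a `<loc>` holding a page URL. The short Python sketch below, using only the standard library, parses such a file and lists the page URLs it declares; the sample XML is illustrative and not taken from any real blog.

```python
import xml.etree.ElementTree as ET

# Illustrative sample of a sitemap.xml body (placeholder blog, not real output).
SAMPLE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.blogspot.com/2014/12/first-post.html</loc></url>
  <url><loc>http://example.blogspot.com/2014/12/second-post.html</loc></url>
</urlset>"""

# The Sitemaps protocol puts everything in this XML namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the <loc> values declared in a sitemap's <urlset>."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

print(sitemap_urls(SAMPLE_SITEMAP))
```

Fetching your own blog's /sitemap.xml and feeding it to this function is an easy way to confirm which posts the new default sitemap actually exposes.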
Those who have inserted a custom robots.txt via the Search Preferences feature in Blogger should update it for the new sitemap, leave it blank, or disable the feature. When the feature is disabled, Google uses the default robots.txt that Blogger serves. The updated robots.txt file can also be submitted to Google via the robots.txt Tester in Webmaster Tools.
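For reference, the default robots.txt Blogger now serves looks roughly like the following (yourblog.blogspot.com is a placeholder for your own blog address); note that it blocks the /search label and archive pages and advertises the new sitemap directly:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblog.blogspot.com/sitemap.xml
```

You can check what your own blog serves by opening /robots.txt on your blog address in a browser.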
This sitemap can also be submitted to Google through Webmaster Tools by appending sitemap.xml to your blog address in the Sitemaps feature under the Crawl section. It is not clear, however, what happens to the custom sitemaps that users have already submitted. I hope a Blogger engineer provides a clarification via the Google Product Forums.
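Because the default robots.txt advertises the sitemap itself (see the `Sitemap:` line above), crawlers can discover it even without a manual submission. A minimal Python sketch of that discovery step, pulling the `Sitemap:` directives out of a robots.txt body (the sample text is illustrative):

```python
# Illustrative robots.txt body with a Sitemap directive (placeholder blog).
SAMPLE_ROBOTS = """User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/sitemap.xml
"""

def sitemap_directives(robots_txt):
    """Return URLs declared by 'Sitemap:' lines (the key is case-insensitive)."""
    urls = []
    for line in robots_txt.splitlines():
        # Split only on the first colon, so the '://' in the URL is untouched.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls

print(sitemap_directives(SAMPLE_ROBOTS))
```

Running this against your blog's actual robots.txt shows exactly which sitemap URL Google will pick up by default.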