I find it ironic that Google and Bing ask webmasters to provide a sitemap.xml file.

This assumes either (1) that the webmaster hasn't properly linked all pages or (2) that their bot's algorithm is too inefficient to extract links, and they would rather someone else did their work for them.

I think it's option (2).

I'm not going to use my own bot to build such a tree for a couple of million URLs. That would take several months, by which time several hundred new URLs would have been added.
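For reference, the file they're asking for is just a flat XML list of URLs. Here's a rough Python sketch of producing one, assuming you already have the URL list on hand (say, from your database) rather than crawling for it; note the sitemaps.org protocol also caps each file at 50,000 URLs, so a couple of million URLs would additionally need a sitemap index pointing at dozens of these files.

```python
from xml.sax.saxutils import escape

def write_sitemap(urls, path="sitemap.xml"):
    """Write a sitemaps.org-format urlset for an already-known list of URLs.

    This sidesteps crawling entirely -- the point being that the file is
    trivial to emit only if you already have the URLs somewhere.
    Each file is limited to 50,000 URLs by the protocol.
    """
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

# Hypothetical example URLs, just to show the output shape.
write_sitemap(["https://example.com/", "https://example.com/about"])
```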