For eating up bandwidth. Apparently, Googlebot accounts for 60% of my visitor logs, and it crawls alllll of the time. I wonder how sites with large files cope with the bandwidth it eats up. It just seems like a bit of overkill on Google's part.
But then again, people can always use robots.txt
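For anyone wanting to try that route, a minimal robots.txt sketch might look like this (the paths here are just placeholders; note that Google ignores the unofficial `Crawl-delay` directive, though some other crawlers honor it):

```
# Placed at the site root as /robots.txt
User-agent: Googlebot
Disallow: /downloads/    # hypothetical directory of large files

User-agent: *
Crawl-delay: 10          # honored by some bots, but not by Googlebot
```

For Googlebot specifically, crawl rate is adjusted through Google's webmaster tools rather than robots.txt.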