
Google Clarifies 15MB Googlebot Limit – It Is A Huge Limit

The other day, I covered how Google added a line to its Googlebot documentation stating that Googlebot can crawl the first 15MB of content in an HTML file or supported text-based file; after that, it stops crawling. Then I was a bit shocked to see a large number of SEOs begin to panic.

For some reason, SEOs felt 15MB of raw HTML per page is not enough. In reality, 15MB is a massive amount of HTML on a per-URL basis. The limit does not include downloading videos, images, and so on; it applies only to the HTML source code. Again, it is a huge limit, and none of this was new. It was simply added to the documentation, but it has been in place at Google for a long time.
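To put the limit in perspective, here is a minimal sketch of a size check against the documented 15MB cap. The function name and median-size constant are illustrative (the ~30kB median figure comes from Gary Illyes' post quoted below), not part of any Google tooling:

```python
# Hypothetical helper: check whether a page's raw HTML fits inside
# Googlebot's documented 15 MB crawl limit. Only the HTML source counts
# toward the limit -- images, video, and other fetched resources do not.

GOOGLEBOT_HTML_LIMIT = 15 * 1024 * 1024  # 15 MB, per Google's documentation
MEDIAN_HTML_SIZE = 30 * 1024             # ~30 kB median page, per Gary Illyes

def within_crawl_limit(html: bytes) -> bool:
    """Return True if the raw HTML (including inline scripts and CSS)
    would be crawled in full before Googlebot stops reading."""
    return len(html) <= GOOGLEBOT_HTML_LIMIT

# The median page is hundreds of times smaller than the limit:
print(GOOGLEBOT_HTML_LIMIT // MEDIAN_HTML_SIZE)  # prints 512
```

The point of the arithmetic at the end is simply that a typical page would have to grow by roughly 500x before the limit mattered at all.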

So Google's Gary Illyes did his thing to clarify, posting a nicely titled post on the Google blog named Googlebot and the 15 MB thing. In short, Gary explains: "You, dear reader, are unlikely to be the owner of one, since the median size of a HTML file is about 500 times smaller: 30 kilobytes (kB). However, if you are the owner of an HTML page that's over 15 MB, perhaps you could at least move some inline scripts and CSS dust to external files, pretty please." He digs in more for those who are concerned, so go read it.

Then John Mueller of Google posted his own Twitter thread version of the clarification.

Are you still concerned about this 15MB limit?

Forum discussion at Twitter.