20 of Google’s limits you may not know exist

Don’t get caught off guard by limitations you were not aware of! Columnist Patrick Stox shares 20 Google limits that may impact your SEO efforts.

Google has a lot of different tools, and while they handle huge amounts of data, even Google has its limits. Here are some of the limits you’ll eventually run into.

1. 1,000 properties in Google Search Console

Per Google’s Search Console help documentation, “You can add up to 1,000 properties (websites or mobile apps) to your Search Console account.”
2. 1,000 rows in Google Search Console

Many of the data reports within Google Search Console are limited to 1,000 rows in the interface, but you can usually download more. That’s not true of all of the reports, though (like the HTML Improvements section, which doesn’t appear to have that limit).
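If you need more than the interface will show, pulling the data programmatically is one option. Below is a minimal sketch, assuming the Search Console Search Analytics API (webmasters v3), the google-api-python-client and google-auth libraries, a service account with access to the property, and a 25,000-row page size; the property URL, date range and row ceiling are assumptions to verify against current documentation, not something stated above.

```python
# Hedged sketch: pull more than 1,000 rows of Search Analytics data via the
# Search Console API (webmasters v3). Assumes google-api-python-client and
# google-auth are installed and a service account has access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"   # assumed property URL
KEY_FILE = "service-account.json"       # assumed credentials file

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("webmasters", "v3", credentials=credentials)

rows, start_row = [], 0
while True:
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            "startDate": "2024-01-01",  # assumed date range
            "endDate": "2024-01-31",
            "dimensions": ["query"],
            "rowLimit": 25000,          # API page size (assumed maximum)
            "startRow": start_row,
        },
    ).execute()
    batch = response.get("rows", [])
    rows.extend(batch)
    if len(batch) < 25000:              # last page reached
        break
    start_row += len(batch)

print(f"Fetched {len(rows)} rows")
```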
3. Google Search Console will show up to 200 sitemaps

The limit for the number submitted is higher, but you will only be shown 200. Each of these can be an index file as well, which appears to have a display limit of 400 sitemaps each. You could technically add every page of a website in its own sitemap file, bundle those into sitemap index files and be able to see indexing information for 80,000 individual pages in each property (200 index files × 400 sitemaps each)… not that I recommend this.
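As a rough illustration of that (not recommended) setup, here is a hedged Python sketch that bundles hypothetical one-URL-per-page sitemap URLs into index files of 400 each, stopping at the 200 index files the interface would display; the example.com URLs, file names and limits are placeholders based only on the figures above.

```python
# Hedged sketch of the per-page-sitemap trick described above: one sitemap per
# page, bundled into sitemap index files of 400 sitemaps each, capped at the
# 200 index files Search Console will display (200 x 400 = 80,000 pages).
# The example.com URLs and file names are hypothetical.
from xml.sax.saxutils import escape

MAX_INDEX_FILES = 200      # display limit for sitemaps per property (assumed)
SITEMAPS_PER_INDEX = 400   # display limit for sitemaps per index file (assumed)

page_urls = [f"https://www.example.com/page-{i}" for i in range(1, 1001)]

# One single-URL sitemap per page; here we only track their hypothetical URLs.
sitemap_urls = [f"https://www.example.com/sitemaps/page-{i}.xml"
                for i in range(1, len(page_urls) + 1)]

for start in range(0, len(sitemap_urls), SITEMAPS_PER_INDEX):
    index_number = start // SITEMAPS_PER_INDEX
    if index_number >= MAX_INDEX_FILES:
        break  # anything past 200 index files would not be shown
    chunk = sitemap_urls[start:start + SITEMAPS_PER_INDEX]
    entries = "\n".join(
        f"  <sitemap><loc>{escape(u)}</loc></sitemap>" for u in chunk
    )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>\n"
    )
    with open(f"sitemap-index-{index_number + 1}.xml", "w") as f:
        f.write(xml)
```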
4. Disavow file size has a limit of 2MB and 100,000 URLs

According to Search Engine Roundtable, this is one of the errors that you will receive when submitting a disavow file.
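As a quick sanity check before uploading, a short script can count the lines and bytes in a disavow file against those limits. This is a minimal sketch under the 2MB / 100,000-entry limits quoted above; the file name and the treatment of comment lines are assumptions.

```python
# Hedged sketch: check a local disavow file against the reported limits of
# 2MB and 100,000 URLs/domains. "disavow.txt" and the handling of comment
# lines (starting with '#') are assumptions for illustration.
import os

MAX_BYTES = 2 * 1024 * 1024   # 2MB (assumed interpretation of the limit)
MAX_ENTRIES = 100_000

path = "disavow.txt"
size = os.path.getsize(path)

with open(path, encoding="utf-8") as f:
    entries = [line.strip() for line in f
               if line.strip() and not line.startswith("#")]

print(f"{size} bytes ({'OK' if size <= MAX_BYTES else 'over 2MB limit'})")
print(f"{len(entries)} entries "
      f"({'OK' if len(entries) <= MAX_ENTRIES else 'over 100,000 limit'})")
```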
5. Render in Google Search Console cuts off at 10,000 pixels

Google Webmaster Trends Analyst John Mueller had mentioned that there was a cutoff for the “Fetch as Google” feature, and it appears that cutoff is 10,000 pixels, based on testing.
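If you want to know whether a page is tall enough to be affected, measuring its rendered height in a headless browser is one way to check. Below is a minimal sketch assuming Playwright for Python is installed; the URL and the use of scrollHeight as a proxy for rendered height are assumptions, not part of the original observation.

```python
# Hedged sketch: measure a page's rendered height in a headless browser and
# compare it to the ~10,000-pixel cutoff mentioned above. Assumes Playwright
# for Python is installed (pip install playwright; playwright install chromium).
from playwright.sync_api import sync_playwright

CUTOFF_PX = 10_000
URL = "https://www.example.com/"  # hypothetical page to test

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    height = page.evaluate("document.documentElement.scrollHeight")
    browser.close()

print(f"Rendered height: {height}px "
      f"({'below' if height <= CUTOFF_PX else 'above'} the {CUTOFF_PX}px cutoff)")
```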

6. 10 million hits per month per property in GA (Google Analytics)

Once you’ve reached this limit, you’ll either be sampled or need to upgrade.
7. Robots.txt max size is 500KB

As stated on Google’s Robots.txt Specifications page, “A maximum file size may be enforced per crawler. Content which is after the maximum file size may be ignored. Google currently enforces a size limit of 500 kilobytes (KB).”
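A quick way to see how close a robots.txt file is to that limit is to fetch it and check its size. The sketch below uses Python’s standard library; the URL is a placeholder, and the 500KB figure is taken from the quote above.

```python
# Hedged sketch: fetch a robots.txt file and compare its size to the 500KB
# limit Google says it enforces (content past the limit may be ignored).
# The URL is a placeholder for illustration.
from urllib.request import urlopen

LIMIT_BYTES = 500 * 1024  # 500KB (assumed interpretation: 500 x 1024 bytes)

with urlopen("https://www.example.com/robots.txt") as response:
    body = response.read()

print(f"robots.txt is {len(body)} bytes")
if len(body) > LIMIT_BYTES:
    print("Over 500KB: content past the limit may be ignored by Google.")
else:
    print("Within the 500KB limit.")
```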
8. Sitemaps are limited to 50MB (uncompressed) and 50,000 URLs

Per Google’s Search Console help documentation:

All formats limit a single sitemap to 50MB (uncompressed) and 50,000 URLs. If you have a larger file or more URLs, you will have to break it into multiple sitemaps. You can optionally create a sitemap index file (a file that points to a list of sitemaps) and submit that single index file to Google. You can submit multiple sitemaps and/or sitemap index files to Google.
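To stay under those limits, larger URL sets have to be chunked into multiple sitemap files, as the documentation describes. Here is a minimal sketch that writes sitemaps of at most 50,000 URLs each and flags any uncompressed file over 50MB; the URL list and output file names are hypothetical.

```python
# Hedged sketch: split a URL list into sitemap files of at most 50,000 URLs
# each and flag any file whose uncompressed size exceeds 50MB. The URL list
# and output file names are hypothetical.
import os
from xml.sax.saxutils import escape

MAX_URLS = 50_000
MAX_BYTES = 50 * 1024 * 1024  # 50MB uncompressed (assumed interpretation)

urls = [f"https://www.example.com/page-{i}" for i in range(1, 120_001)]

for n, start in enumerate(range(0, len(urls), MAX_URLS), start=1):
    chunk = urls[start:start + MAX_URLS]
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>\n"
    )
    filename = f"sitemap-{n}.xml"
    with open(filename, "w", encoding="utf-8") as f:
        f.write(xml)
    if os.path.getsize(filename) > MAX_BYTES:
        print(f"{filename} exceeds 50MB uncompressed; split it further.")
```

A sitemap index file pointing at the generated files could then be submitted as the single entry point, as the quoted documentation describes.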
