
Google can't fetch large sitemap with 50k URLs, nor will browsers render it


My sitemap contains 50K URLs (7.8 MB) and uses the following URL syntax:



<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
<url>
<loc> https://www.ninjogos.com.br/resultados?pesquisa=vestido, maquiagem, </loc> <lastmod> 2019-10-03T17:12:01-03:00 </lastmod>
<priority> 1.00 </priority>
</url>
</urlset>


The problems are:



• Search Console says "Sitemap could not be read";



• The sitemap takes an hour to load in Chrome, and then Chrome stops responding;






• In Firefox, the sitemap downloaded in 1,483 ms but only fully rendered after 5 minutes;



Things I've done without success:



• Disable GZip compression;



• Delete my .htaccess file;



• Created a test sitemap with 1K URLs and the same syntax and submitted it to Search Console; it worked, but the 50K-URL sitemap still shows "unable to fetch Sitemap";






• Tried to inspect the URL directly, but it gave an error and asked me to try again later, while the 1K-URL sitemap worked;



• Validated the sitemap on five different validators (Yandex, etc.) and all passed without any errors or warnings.



Any ideas?










      seo google-search-console xml-sitemap






edited 9 hours ago by Stephen Ostermiller






asked 12 hours ago by Eder Leandro




1 Answer




































You should test your sitemap with a download tool such as curl or wget instead of a browser like Chrome or Firefox. You should be able to download the file within 3 minutes that way; if it takes longer for you, Googlebot will probably also have problems with it. You can:



• Upgrade your hosting so your entire site is faster.

• Pre-compress your sitemap with gzip so that the URL is sitemap.xml.gz. That way it will be much smaller and you won't need to disable gzip on your server.

• Remove lastmod and priority from your sitemap, since Google doesn't use them anyway.

• Break up your sitemap into smaller pieces and use a sitemap index file.

• Remove white space from your sitemap. It looks like all your fields are surrounded by spaces. That isn't correct; it adds to the size and can confuse search engines. <priority> 1.00 </priority> should be <priority>1.00</priority>. Same for <loc> and <lastmod>.

• Remove unnecessary URLs from your sitemap.

On the last point, your example URL is problematic. I believe "resultados pesquisa" translates to "search results". Google doesn't want your search results pages indexed. You should block Googlebot from crawling them and remove them from your sitemap. See Search results in search results. Having your site's search results indexed is a bad experience for visitors coming from Google, and it can cause Google to penalize your entire site.
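Blocking those internal search pages from crawling could look like this in robots.txt — a minimal sketch assuming they all live under /resultados, as in the example URL; adjust the path for your own site:

```
User-agent: *
Disallow: /resultados
```

Note that robots.txt only stops crawling. URLs that are already indexed may also need a noindex robots meta tag, and Googlebot can only see that tag on pages it is still allowed to fetch.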



You tagged your question seo, but your XML sitemap probably won't help your SEO at all. Google doesn't rank pages better because they are in a sitemap, nor will Google usually choose to index a page just because it is in the sitemap. See The Sitemap Paradox. The benefits of having a sitemap are mostly in getting better stats out of Google Search Console. You could also use it as one way of telling Google about your canonical URLs (but there are better ways, such as canonical tags). Because sitemaps aren't much use, if yours are giving you headaches you can simply delete them and not worry about it. That won't hurt your site or its SEO.
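The splitting and pre-compression suggestions above can be sketched as a small script. This is illustrative only, not the asker's actual setup: the file names (sitemap-1.xml.gz, etc.), the chunk size, and the base URL are assumptions, and lastmod/priority are deliberately omitted per the advice above.

```python
import gzip

# Namespace from the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return one <urlset> document with no stray whitespace inside tags."""
    entries = "".join(f"<url><loc>{u}</loc></url>" for u in urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>'
            f'<urlset xmlns="{SITEMAP_NS}">{entries}</urlset>')

def build_index(sitemap_urls):
    """Return a <sitemapindex> document pointing at the gzipped parts."""
    entries = "".join(f"<sitemap><loc>{u}</loc></sitemap>" for u in sitemap_urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>'
            f'<sitemapindex xmlns="{SITEMAP_NS}">{entries}</sitemapindex>')

def write_sitemaps(urls, base_url, chunk_size=10000):
    """Write gzipped sitemap parts to disk and return the index document."""
    part_urls = []
    for i in range(0, len(urls), chunk_size):
        name = f"sitemap-{i // chunk_size + 1}.xml.gz"
        with gzip.open(name, "wt", encoding="utf-8") as f:
            f.write(build_sitemap(urls[i:i + chunk_size]))
        part_urls.append(f"{base_url}/{name}")
    return build_index(part_urls)
```

The index document returned by write_sitemaps is what you would save as sitemap_index.xml and submit to Search Console; each gzipped part stays well under the size that chokes a browser or Googlebot.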







• My site is about games. If a user searches for "horse", it returns "best horse games" via a PHP GET parameter. Is having this indexed bad for SEO, given that I don't have a "horse games" category, plus millions of other possibilities taken from my games API?

  – Eder Leandro
  5 hours ago











• I mean, these pages ranking for thousands of low-competition keywords would eventually make my site rank better, especially if the conversions generate any backlinks... or at least that's what I think. Is it right to think this way?

  – Eder Leandro
  5 hours ago











• I will compress my sitemap with gzip. The sitemap has not been validated in Search Console so far; I'll post any updates here.

  – Eder Leandro
  5 hours ago












• My Apache server is already configured to compress XML. Is the compression you mention not the server-side gzip compression, but a manual gzip compression? If I do that, do I need to disable the server's gzip? Otherwise it would be compressed twice.

  – Eder Leandro
  5 hours ago






• If your pages look like search results, Google doesn't want them indexed.

  – Stephen Ostermiller
  2 hours ago












          Your Answer








          StackExchange.ready(function()
          var channelOptions =
          tags: "".split(" "),
          id: "45"
          ;
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function()
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled)
          StackExchange.using("snippets", function()
          createEditor();
          );

          else
          createEditor();

          );

          function createEditor()
          StackExchange.prepareEditor(
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: false,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: null,
          bindNavPrevention: true,
          postfix: "",
          imageUploader:
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/4.0/"u003ecc by-sa 4.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          ,
          onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          );



          );







          Eder Leandro is a new contributor. Be nice, and check out our Code of Conduct.









          draft saved

          draft discarded
















          StackExchange.ready(
          function ()
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fwebmasters.stackexchange.com%2fquestions%2f125525%2fgoogle-cant-fetch-large-sitemap-with-50k-urls-nor-will-browsers-render-it%23new-answer', 'question_page');

          );

          Post as a guest















          Required, but never shown


























          1 Answer
          1






          active

          oldest

          votes








          1 Answer
          1






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          4


















          You should test your sitemap with a downloading program such as curl or wget instead of using a browser like Chrome or Firefox. You should be able to download the file within 3 minutes with a download program. If the file takes longer to download for you, then Googlebot will probably also have problems with it. You can:



          • Upgrade your hosting so your entire site is faster

          • Pre-compress your sitemap with gzip so that the URL is sitemap.xml.gz. That way it will be much smaller and you won't need to disable gzip on your server.

          • Remove lastmod and priority from your sitemap since Google doesn't use them anyway.

          • Break up your sitemap into smaller pieces and use a sitemap index file

          • Remove white space from your sitemap. It looks like all your fields are surrounded by spaces. That isn't correct. It is adding to the size and possibly confusing search engines. <priority> 1.00 </priority> should be <priority>1.00</priority>. Same for <loc> and <lastmod>.

          • Remove unnecessary URLs from your site map.

          On the last point, your example is problematic. I believe "resultados pesquisa" translates to "search results". Google doesn't want to have your search results pages indexed. You should be blocking Googlebot from crawling them and you should remove them from your sitemap. See Search results in search results. Having your site search results indexed is bad user experience for users from Google and it can cause Google to penalize your entire site.



          You tagged your question seo but your XML sitemap probably won't help your SEO at all. Google doesn't rank pages better because they are in a sitemap, nor will Google usually choose to index a page just because it is in the sitemap. See The Sitemap Paradox. The benefits from having a sitemap are mostly in getting better stats out of Google Search Console. You could also use it as one method of telling Google about your canonical URLs (but they are better ways such as canonical tags.) Because they aren't much use, if your sitemaps are giving you headaches, you can just delete them and not worry about having them. It won't hurt your site or its SEO.






          share|improve this answer


























          • My site is about games if a user searches for "horse" it will return "best horse games" with PHP GET, having this indexed is not good for SEO? Given that I don't have horse games category and other millions of possibilities taken from my games API

            – Eder Leandro
            5 hours ago











          • I Mean.. these pages ranked in these thousands of low competition keywords will eventually make my site rank better... especially if the conversion generates any backlinks... or at least I think ... Is it right to think this way?

            – Eder Leandro
            5 hours ago











          • I will compress my sitemap in gz... the Sitemap has not been validated in Search Console so far ... anything i'll update here

            – Eder Leandro
            5 hours ago












          • My apashe server already is configured for compress xml... the compression you say is not the gzip server compression? I'ts a manual gzip compression? If I do this is need disable gzip? because it will do a twice compress

            – Eder Leandro
            5 hours ago






          • 1





            If your pages look like search results Google doesn't want them indexed.

            – Stephen Ostermiller
            2 hours ago















          4


















          You should test your sitemap with a downloading program such as curl or wget instead of using a browser like Chrome or Firefox. You should be able to download the file within 3 minutes with a download program. If the file takes longer to download for you, then Googlebot will probably also have problems with it. You can:



          • Upgrade your hosting so your entire site is faster

          • Pre-compress your sitemap with gzip so that the URL is sitemap.xml.gz. That way it will be much smaller and you won't need to disable gzip on your server.

          • Remove lastmod and priority from your sitemap since Google doesn't use them anyway.

          • Break up your sitemap into smaller pieces and use a sitemap index file

          • Remove white space from your sitemap. It looks like all your fields are surrounded by spaces. That isn't correct. It is adding to the size and possibly confusing search engines. <priority> 1.00 </priority> should be <priority>1.00</priority>. Same for <loc> and <lastmod>.

          • Remove unnecessary URLs from your site map.

          On the last point, your example is problematic. I believe "resultados pesquisa" translates to "search results". Google doesn't want to have your search results pages indexed. You should be blocking Googlebot from crawling them and you should remove them from your sitemap. See Search results in search results. Having your site search results indexed is bad user experience for users from Google and it can cause Google to penalize your entire site.



          You tagged your question seo but your XML sitemap probably won't help your SEO at all. Google doesn't rank pages better because they are in a sitemap, nor will Google usually choose to index a page just because it is in the sitemap. See The Sitemap Paradox. The benefits from having a sitemap are mostly in getting better stats out of Google Search Console. You could also use it as one method of telling Google about your canonical URLs (but they are better ways such as canonical tags.) Because they aren't much use, if your sitemaps are giving you headaches, you can just delete them and not worry about having them. It won't hurt your site or its SEO.






          share|improve this answer


























          • My site is about games if a user searches for "horse" it will return "best horse games" with PHP GET, having this indexed is not good for SEO? Given that I don't have horse games category and other millions of possibilities taken from my games API

            – Eder Leandro
            5 hours ago











          • I Mean.. these pages ranked in these thousands of low competition keywords will eventually make my site rank better... especially if the conversion generates any backlinks... or at least I think ... Is it right to think this way?

            – Eder Leandro
            5 hours ago











          • I will compress my sitemap in gz... the Sitemap has not been validated in Search Console so far ... anything i'll update here

            – Eder Leandro
            5 hours ago












          • My apashe server already is configured for compress xml... the compression you say is not the gzip server compression? I'ts a manual gzip compression? If I do this is need disable gzip? because it will do a twice compress

            – Eder Leandro
            5 hours ago






          • 1





            If your pages look like search results Google doesn't want them indexed.

            – Stephen Ostermiller
            2 hours ago













          4














          4










          4









          You should test your sitemap with a downloading program such as curl or wget instead of using a browser like Chrome or Firefox. You should be able to download the file within 3 minutes with a download program. If the file takes longer to download for you, then Googlebot will probably also have problems with it. You can:



          • Upgrade your hosting so your entire site is faster

          • Pre-compress your sitemap with gzip so that the URL is sitemap.xml.gz. That way it will be much smaller and you won't need to disable gzip on your server.

          • Remove lastmod and priority from your sitemap since Google doesn't use them anyway.

          • Break up your sitemap into smaller pieces and use a sitemap index file

          • Remove white space from your sitemap. It looks like all your fields are surrounded by spaces. That isn't correct. It is adding to the size and possibly confusing search engines. <priority> 1.00 </priority> should be <priority>1.00</priority>. Same for <loc> and <lastmod>.

          • Remove unnecessary URLs from your site map.

          On the last point, your example is problematic. I believe "resultados pesquisa" translates to "search results". Google doesn't want to have your search results pages indexed. You should be blocking Googlebot from crawling them and you should remove them from your sitemap. See Search results in search results. Having your site search results indexed is bad user experience for users from Google and it can cause Google to penalize your entire site.



          You tagged your question seo but your XML sitemap probably won't help your SEO at all. Google doesn't rank pages better because they are in a sitemap, nor will Google usually choose to index a page just because it is in the sitemap. See The Sitemap Paradox. The benefits from having a sitemap are mostly in getting better stats out of Google Search Console. You could also use it as one method of telling Google about your canonical URLs (but they are better ways such as canonical tags.) Because they aren't much use, if your sitemaps are giving you headaches, you can just delete them and not worry about having them. It won't hurt your site or its SEO.






          share|improve this answer














          You should test your sitemap with a downloading program such as curl or wget instead of using a browser like Chrome or Firefox. You should be able to download the file within 3 minutes with a download program. If the file takes longer to download for you, then Googlebot will probably also have problems with it. You can:



          • Upgrade your hosting so your entire site is faster

          • Pre-compress your sitemap with gzip so that the URL is sitemap.xml.gz. That way it will be much smaller and you won't need to disable gzip on your server.

          • Remove lastmod and priority from your sitemap since Google doesn't use them anyway.

          • Break up your sitemap into smaller pieces and use a sitemap index file

          • Remove white space from your sitemap. It looks like all your fields are surrounded by spaces. That isn't correct. It is adding to the size and possibly confusing search engines. <priority> 1.00 </priority> should be <priority>1.00</priority>. Same for <loc> and <lastmod>.

          • Remove unnecessary URLs from your site map.

          On the last point, your example is problematic. I believe "resultados pesquisa" translates to "search results". Google doesn't want to have your search results pages indexed. You should be blocking Googlebot from crawling them and you should remove them from your sitemap. See Search results in search results. Having your site search results indexed is bad user experience for users from Google and it can cause Google to penalize your entire site.



          You tagged your question seo but your XML sitemap probably won't help your SEO at all. Google doesn't rank pages better because they are in a sitemap, nor will Google usually choose to index a page just because it is in the sitemap. See The Sitemap Paradox. The benefits from having a sitemap are mostly in getting better stats out of Google Search Console. You could also use it as one method of telling Google about your canonical URLs (but they are better ways such as canonical tags.) Because they aren't much use, if your sitemaps are giving you headaches, you can just delete them and not worry about having them. It won't hurt your site or its SEO.







          share|improve this answer













          share|improve this answer




          share|improve this answer



          share|improve this answer










          answered 9 hours ago









          Stephen OstermillerStephen Ostermiller

          72.2k14 gold badges102 silver badges263 bronze badges




          72.2k14 gold badges102 silver badges263 bronze badges















          • My site is about games if a user searches for "horse" it will return "best horse games" with PHP GET, having this indexed is not good for SEO? Given that I don't have horse games category and other millions of possibilities taken from my games API

            – Eder Leandro
            5 hours ago











          • I Mean.. these pages ranked in these thousands of low competition keywords will eventually make my site rank better... especially if the conversion generates any backlinks... or at least I think ... Is it right to think this way?

            – Eder Leandro
            5 hours ago











          • I will compress my sitemap in gz... the Sitemap has not been validated in Search Console so far ... anything i'll update here

            – Eder Leandro
            5 hours ago












          • My apashe server already is configured for compress xml... the compression you say is not the gzip server compression? I'ts a manual gzip compression? If I do this is need disable gzip? because it will do a twice compress

            – Eder Leandro
            5 hours ago






          • 1





            If your pages look like search results Google doesn't want them indexed.

            – Stephen Ostermiller
            2 hours ago

















          • My site is about games if a user searches for "horse" it will return "best horse games" with PHP GET, having this indexed is not good for SEO? Given that I don't have horse games category and other millions of possibilities taken from my games API

            – Eder Leandro
            5 hours ago











          • I Mean.. these pages ranked in these thousands of low competition keywords will eventually make my site rank better... especially if the conversion generates any backlinks... or at least I think ... Is it right to think this way?

            – Eder Leandro
            5 hours ago











          • I will compress my sitemap in gz... the Sitemap has not been validated in Search Console so far ... anything i'll update here

            – Eder Leandro
            5 hours ago












          Eder Leandro is a new contributor. Be nice, and check out our Code of Conduct.








