Hi everyone,
Google Webmaster Tools has been reporting a "Server error" for quite a while now, and I suspect that the occasional drops in my site's traffic from Google are caused by this...!
No matter how often I mark it as fixed and remove it from Google's cache, it doesn't get resolved; two or three days later it shows up again.
Please take a look at the screenshot:
http://upcity.ir/images/55190476287994900224.jpg
In fact, it complains about the site's template.
Google itself says the following about this error:
Server connectivity errors
Googlebot couldn’t access your site because the request timed out or because your site is blocking Google. As a result, Googlebot was forced to abandon the request.
Excessive page load times, leading to timeouts, can be due to the following:
Dynamic pages taking too long to respond. If the server is busy, it may have returned an overloaded status to ask Googlebot to crawl the site more slowly. In general, we recommend keeping parameters short and using them sparingly. If you’re confident about how parameters work for your site, you can tell Google how we should handle these parameters.
Your site's hosting server is down, overloaded, or misconfigured. If the problem persists, check with your web hoster, and consider increasing your site’s ability to handle traffic.
Your site may also be deliberately or inadvertently blocking Google. In general, this can be the result of a DNS configuration issue or, in some cases, a misconfigured firewall or DoS protection system (sometimes the site's content management system). Protection systems are an important part of good hosting and are often configured to automatically block unusually high levels of server requests. However, because Googlebot often makes more requests than a human user, it can trigger these protection systems, causing them to block Googlebot and prevent it from crawling your website.
To fix such issues, identify which part of your website’s infrastructure is blocking Googlebot and remove the block. The firewall may not be under your control, so you may need to discuss this with your hosting provider.
Some webmasters intentionally prevent Googlebot from reaching their websites, perhaps using a firewall as described above. In these cases, usually the intent is not to entirely block Googlebot, but to control how the site is crawled and indexed. In this case, check the following:
- To control Googlebot’s crawling of your content, use the robots exclusion protocol, including using a robots.txt file and configuring URL parameters.
- If you’re worried about rogue bots using the Googlebot user-agent, you can verify whether a crawler is actually Googlebot.
- If you would like to change how hard Googlebot crawls your site, you can request a change in Googlebot’s crawl rate. Hosting providers can verify ownership of their IP addresses too.
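The verification step mentioned above is a two-step DNS check that Google documents: do a reverse DNS lookup on the requesting IP, confirm the hostname belongs to googlebot.com or google.com, then do a forward lookup on that hostname and confirm it maps back to the same IP. A minimal sketch in Python:

```python
import socket

def is_googlebot(ip):
    """Check whether an IP claiming to be Googlebot really belongs to Google."""
    try:
        # Step 1: reverse DNS -- a genuine Googlebot IP resolves to
        # a hostname under googlebot.com or google.com
        host = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Step 2: forward DNS -- the hostname must resolve back to the
        # same IP, which defeats spoofed reverse-DNS (PTR) records
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

Running this against IPs from your server's access log (or your firewall's block list) tells you whether you are blocking the real Googlebot or just an impostor using its user-agent.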