How Baidu and Google reacted the first time I modified the robots file
The robots file should be set up early, when the site is first built, precisely to avoid having to change it later.
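Concretely, a minimal sketch of such a launch-time robots file (the /user/ path is the member area discussed in this article; the domain and the Sitemap line are assumptions for illustration):

```text
# Set once at launch and then left alone.
User-agent: *
Disallow: /user/

# Hypothetical sitemap location.
Sitemap: https://www.example.com/sitemap.xml
```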
The snapshot was updating on roughly a four-day cycle and daily traffic was large, but I had blocked the /user/ directory in robots, so spiders could not crawl it, even though the user center drew the most traffic of all. So I tried modifying the robots file to let spiders into that directory. Before the change, Google had indexed about 120,000 pages; I remember very clearly that from the second day on, Google's index grew at a rate of more than 1,000 pages per day, and Baidu looked fine too. But then the unfortunate thing happened. After the next snapshot update, a site: query on the domain showed Google's index had fallen from more than 120,000 to about 60,000, while Baidu had moved the other way, rising from 35,600 to 49,000 indexed pages. I then modified the robots file yet again to block the group-buy pages (that was just a gut feeling at the time). I had no idea why Google's index had shrunk so sharply while Baidu's kept growing, so I kept testing, and only then did I find the cause.
The fatal cause was my later modifications to the robots file.
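Before touching a live robots file like this, the effect of a rule can be checked offline. A minimal sketch using Python's standard urllib.robotparser (the example.com URLs are hypothetical; the /user/ path is the one described above):

```python
# Verify what a robots.txt rule actually blocks *before* deploying it,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /user/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The /user/ directory is blocked for every crawler...
print(parser.can_fetch("Googlebot", "https://example.com/user/profile"))  # False
# ...but the rest of the site stays crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/article/1"))     # True
```

Running a check like this makes the scope of a Disallow line explicit, instead of discovering it days later through a falling index count.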
The second robots-file modification: Googlebot K-ed (deindexed) the site outright
My blog, which used to get new posts indexed within seconds, was K-ed outright by Google because of the robots-file changes; modifying the robots file too frequently produces the opposite of the intended effect. Baidu likewise did not update its snapshot last week. Spiders are extremely sensitive to the robots file, which makes it a minefield that a new site must not touch during optimization.
Having run websites for a long time, I have met the things webmasters commonly meet, which are nothing more than the usual: the site gets demoted, the snapshot stops updating, the main keywords drop in the rankings, the number of backlinks shrinks, and so on. These problems often arise when the site's preparatory work was all done properly before launch, but the webmaster later swaps out the site template or frequently modifies the robots file that spiders crawl. Today this editor explores with you the impact of modifying the robots file on a site, and how the search engines respond to it.
My first station's robots file was not something I set up once and left alone; I kept modifying it as I moved things around, so it ended up "decorated" extremely richly. It happened at the end of the year: in the robots file I blocked Googlebot, Sogou, and other rogue crawlers, keeping only Baidu's spider, because the rogue crawling was eating up the site's bandwidth. On the third day after the modification, Google's index of my Guangdong Taobao station fell all the way from more than 60,000 to 1,200 pages, and my heart went completely cold. Google has hardly crawled the site since, and its index has hovered around 1,000.
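The blocking described above can be sketched as a robots file like the following (a hypothetical reconstruction, not the original file; the Sogou user-agent token is its documented "Sogou web spider"):

```text
# Let only Baidu's spider crawl everything.
User-agent: Baiduspider
Disallow:

# Block Google and Sogou entirely.
User-agent: Googlebot
Disallow: /

User-agent: Sogou web spider
Disallow: /

# Block every other crawler as well.
User-agent: *
Disallow: /
```

Note that a crawler obeys only the most specific User-agent group that matches it, so the empty Disallow under Baiduspider is what keeps Baidu unaffected by the catch-all block.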