Robots file literacy: how to use the robots file to improve website ranking

The robots file is a "gentleman's agreement" between a website and the spiders. A well-written robots file not only saves the site's crawl resources, it also helps the spiders crawl the site more effectively, and that in turn helps the ranking.

1: only allow Googlebot

If you want to block all crawlers except Googlebot:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

"User-agent" names the spider a group of rules applies to. An empty "Disallow:" line blocks nothing, so Googlebot may crawl the whole site, while "Disallow: /" under "User-agent: *" shuts every other spider out of the entire site.
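As a quick check, rules like these can be tested with Python's standard urllib.robotparser module. A minimal sketch, with example.com standing in for the real site:

import urllib.robotparser

# The rules from the example above.
robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is allowed everywhere; any other spider is blocked site-wide.
print(parser.can_fetch("Googlebot", "http://example.com/page.html"))     # True
print(parser.can_fetch("Baiduspider", "http://example.com/page.html"))   # False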

2: the difference between "/folder/" and "/folder"

For example:

User-agent: *
Disallow: /folder/

"Disallow: /folder/" blocks a directory: none of the files under /folder/ may be crawled, but /folder.html may still be crawled.

User-agent: *
Disallow: /folder

"Disallow: /folder" blocks both: the files under /folder/ and /folder.html cannot be crawled.

3: "*" matches any character

User-agent: *
Disallow: /

means blocking all spiders from the whole site (the "*" in the User-agent line stands for every spider). When a site is made pseudo-static, the same content often exists as both a dynamic page and a static page; the two pages are identical mirror pages, so the dynamic pages need to be blocked. "*" can also be used in the path to block the dynamic URLs (see the matching sketch after point 4):

User-agent: *
Disallow: /*?*

4: "$" matches the end of the URL

If you want to block URLs that end with a particular string, use "$". For example, to block addresses ending in .asp:

User-agent: *
Disallow: /*.asp$
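Spiders that support these extensions treat "*" as any run of characters and a trailing "$" as the end of the URL. A rough sketch of that matching logic in Python, purely to illustrate the pattern semantics (it is not a real robots.txt parser):

import re

def rule_matches(rule: str, path: str) -> bool:
    # "*" becomes "any characters"; a trailing "$" anchors the end of the URL.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

print(rule_matches("/*?*", "/product.php?id=3"))   # True: dynamic URL is caught
print(rule_matches("/*?*", "/product.html"))       # False: static URL is not
print(rule_matches("/*.asp$", "/old/page.asp"))    # True: ends with .asp
print(rule_matches("/*.asp$", "/old/page.aspx"))   # False: does not end with .asp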

You can also open well-optimized websites, look at how their robots files are written, and then adjust the rules to your own needs. A good robots file lets the spiders spend their crawl time on the content that should actually be grabbed, so optimizing the robots file is necessary work.
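Putting points 2 to 4 together, a robots file for a hypothetical pseudo-static site with a blocked /folder/ directory and leftover .asp pages might look like this; adjust the paths to the site's own URL structure:

User-agent: *
Disallow: /folder/
Disallow: /*?*
Disallow: /*.asp$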

This article is from Dongyang Gaofu (mygaofu); please indicate the link when reprinting.
