I recently had HiChina (万网) set up a website for me; since I know nothing about this myself, I had no choice but to hire someone. The site was built and the ICP filing completed, but for four or five days now none of the search engines have indexed it. So I looked up the robots file on Baidu's Webmaster Platform; the query results are below. Could someone knowledgeable please take a look, tell me whether this could be affecting indexing, and if there is a problem, how I should fix it?
Contents of the site's robots.txt:
# Created by Yifan @ 2008/11/3
# Sorry! You will see no welcome descriptions except this because we want to limit the total output bytes.
# Section A: Version 1.0 standard
# Please add disallows in this section!
# DO NOT add allows in this section, since "allow" is only allowed in robots.txt version 2.0.
User-agent: *
Disallow: /web/vcodeimg.aspx # verify code imaging
Disallow: /plugin/doc/docEdit.aspx # user comment edit page
Disallow: /~/plugin/doc/docEdit.aspx # user comment edit page
# Section B: Non standard
Crawl-delay: 30
Sitemap: http://www.krsin.com/web/sitemapsxml.aspx?sid=10042351 # this is automatically generated by robots.aspx
# Section C: Version 2.0 standard
# Please check the following url:
# http://www.conman.org/people/spc/robots2.html#format.directives.robot-version
# You can add allows here! eg:
# Allow: /
# Request-rate Hints:
# Request-rate for the specific time, h=hour, m=minute, omitted=second
Request-rate: 1/1h 0030-0329 # 0830-1129, Beijing, zone +8
Request-rate: 3/1h 0330-0529 # 1130-1329, Beijing, zone +8
Request-rate: 1/1h 0530-1029 # 1330-1829, Beijing, zone +8
Request-rate: 1/6m 1030-1329 # 1830-2129, Beijing, zone +8
Request-rate: 1/1m 1330-0029 # 2130-0829 +1 day, Beijing, zone +8
# End of file
robots.txt analysis results:
Position | Path | User-agent | Note
Line 8   | /web/vcodeimg.aspx         | * (all user agents) | Crawling disallowed
Line 9   | /plugin/doc/docEdit.aspx   | * (all user agents) | Crawling disallowed
Line 10  | /~/plugin/doc/docEdit.aspx | * (all user agents) | Crawling disallowed
robots.txt errors flagged:
Position | File content | Reason
Line 25  | Request-rate: 1/1h 0030-0329 | Syntax error
Line 26  | Request-rate: 3/1h 0330-0529 | Syntax error
Line 27  | Request-rate: 1/1h 0530-1029 | Syntax error
Line 28  | Request-rate: 1/6m 1030-1329 | Syntax error
Line 29  | Request-rate: 1/1m 1330-0029 | Syntax error
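As a sanity check on the Disallow rules themselves (the flagged Request-rate lines are from the never-adopted "version 2.0" draft, and standard parsers simply skip directives they don't understand), here is a minimal sketch using Python's standard-library urllib.robotparser against the paths quoted above; only the three version-1.0 rules are fed in, since that is all an ordinary crawler would honor:

```python
from urllib import robotparser

# The version-1.0 rules copied from the file above, with the
# inline comments and non-standard sections stripped out.
ROBOTS_TXT = """\
User-agent: *
Disallow: /web/vcodeimg.aspx
Disallow: /plugin/doc/docEdit.aspx
Disallow: /~/plugin/doc/docEdit.aspx
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Only the three listed pages are blocked; the rest of the site,
# including the homepage, stays crawlable.
print(rp.can_fetch("*", "http://www.krsin.com/web/vcodeimg.aspx"))  # False
print(rp.can_fetch("*", "http://www.krsin.com/"))                   # True
```

If the homepage check prints True, the robots rules are not what is keeping the site out of the index; a few days' delay for a freshly filed site is normal.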
Please weigh in with your assessment and advice. Many thanks, everyone!
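For reference, in case the Request-rate lines turn out to be the only problem: one change I am considering is trimming the file down to just the version-1.0 directives plus the Sitemap line (note that Crawl-delay is also non-standard, though it is widely tolerated):

```
User-agent: *
Disallow: /web/vcodeimg.aspx
Disallow: /plugin/doc/docEdit.aspx
Disallow: /~/plugin/doc/docEdit.aspx
Crawl-delay: 30

Sitemap: http://www.krsin.com/web/sitemapsxml.aspx?sid=10042351
```

Would this version pass the Baidu checker cleanly?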