As is well known, the robots.txt file plays an important role in how search engines crawl a site. A properly written robots.txt can effectively optimize a Magento store by keeping large numbers of junk URLs (duplicate and parameterized pages) out of the search engines' indexes, which would otherwise dilute the site's ranking weight. So how do you build the robots.txt file?
Here is an example file:
# Website Sitemap
Sitemap: http://www.mydomain.com/sitemap.xml
# Crawlers Setup
User-agent: *
Crawl-delay: 10
# Allowable Index
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /js/
Disallow: /lib/
Disallow: /magento/
Disallow: /media/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /skin/
Disallow: /stats/
Disallow: /var/
# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt
# Paths (no clean URLs)
Disallow: /*.js$
Disallow: /*.css$
Disallow: /*.php$
Disallow: /*?p=*&
Disallow: /*?SID=
The above is a standard robots.txt file written for a Magento store. Simply change the Sitemap line to point to your own Magento sitemap URL, save the file as robots.txt, and upload it to the Magento root directory.
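Before uploading, it can help to sanity-check the rules locally. A minimal sketch using Python's standard urllib.robotparser is shown below; note that robotparser only evaluates plain path-prefix rules, so the `*` and `$` wildcard patterns in the file above (which are search-engine extensions) are left out of this excerpt. The domain and paths are illustrative.

```python
# Sanity-check robots.txt rules locally with Python's standard library.
# Only simple prefix rules from the file above are included, since
# urllib.robotparser does not evaluate "*" / "$" wildcard patterns.
from urllib import robotparser

rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /checkout/
Disallow: /catalogsearch/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Checkout pages should be blocked for all crawlers...
print(rp.can_fetch("*", "http://www.mydomain.com/checkout/cart/"))    # False
# ...while ordinary catalog pages remain crawlable.
print(rp.can_fetch("*", "http://www.mydomain.com/some-product.html")) # True
# The declared crawl delay is also exposed (Python 3.6+).
print(rp.crawl_delay("*"))  # 10
```

If a rule you expect to block a URL does not, check it here first; a missing trailing slash or a wildcard pattern is the usual cause.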