It's reasonably priced at $2 per million pages crawled and 3 cents per CPU hour.
As the program crawled the various Web sites, it would build a database of the sites and pages crawled, the links each page contained, the results of analysis on each page, and so on.
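A minimal sketch of this idea, assuming a breadth-first crawl and using a hard-coded in-memory "site" in place of real HTTP fetching (the page texts, URLs, and the `word_count` analysis field are all illustrative assumptions, not details from the original text):

```python
from dataclasses import dataclass, field

@dataclass
class PageRecord:
    """One database entry: a crawled page, its links, and an analysis result."""
    url: str
    links: list = field(default_factory=list)  # links the page contains
    word_count: int = 0                        # one example per-page analysis

def crawl(seed, fetch):
    """Breadth-first crawl from `seed`; `fetch(url)` returns (text, links)."""
    db, queue = {}, [seed]
    while queue:
        url = queue.pop(0)
        if url in db:          # skip pages already crawled
            continue
        text, links = fetch(url)
        db[url] = PageRecord(url, links, word_count=len(text.split()))
        queue.extend(links)    # follow the links this page contains
    return db

# Stand-in for network fetching: a fixed three-page site.
SITE = {
    "/":  ("home page with two links", ["/a", "/b"]),
    "/a": ("page a links back home",   ["/"]),
    "/b": ("page b is a dead end",     []),
}

db = crawl("/", lambda url: SITE[url])
```

After the crawl, `db` maps each visited URL to its record, so the link graph and analysis results can be queried directly.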
Once the sitemap contains all relevant pages to be crawled by the respective robots, you can export the information to the file system as a Sitemaps 0.90-compliant XML file.
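A hedged sketch of such an export using only the Python standard library; the URL list and output filename are assumed examples, but the `urlset`/`url`/`loc` element names and the namespace are those defined by the Sitemaps 0.90 protocol:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the Sitemaps 0.90 protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def export_sitemap(urls, path):
    """Write `urls` to `path` as a minimal Sitemaps 0.90 XML file."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Illustrative page list; a real exporter would read these from the sitemap tool.
export_sitemap(["https://example.com/", "https://example.com/about"],
               "sitemap.xml")
```

Optional per-URL fields such as `lastmod`, `changefreq`, and `priority` can be added as further `SubElement` children of each `url` entry.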