2016-11-27 13:44:48 [scrapy] INFO: Scrapy 1.2.1 started (bot: rekrute)
2016-11-27 13:44:48 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'rekrute.spiders', 'ROBOTSTXT_OBEY': True, 'DUPEFILTER_CLASS': 'scrapy.dupefilters.BaseDupeFilter', 'SPIDER_MODULES': ['rekrute.spiders'], 'BOT_NAME': 'rekrute', 'LOGSTATS_INTERVAL': 0}
2016-11-27 13:44:48 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
2016-11-27 13:44:48 [scrapy] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2016-11-27 13:44:48 [scrapy] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2016-11-27 13:44:48 [scrapy] INFO: Enabled item pipelines:
[]
2016-11-27 13:44:48 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-11-27 13:44:48 [scrapy] INFO: Spider opened
2016-11-27 13:44:48 [scrapy] DEBUG: Crawled (200) <GET http://www.xxxxxxxxxxx.com/robots.txt> (referer: None)
2016-11-27 13:44:48 [scrapy] DEBUG: Forbidden by robots.txt: <GET http://www.xxxxxxxxxxxx.com/uuuuu.php?ppppppp_ID=58845>
[s] Available Scrapy objects:
[s]   scrapy     scrapy module (contains scrapy.Request, scrapy.Selector, etc)
[s]   crawler    <scrapy.crawler.Crawler object at 0x7f093dd54b90>
[s]   item       {}
[s]   request    <GET http://www.xxxxxxxxxxxx.com/uuuuu.php?ppppppp_ID=58845>
[s]   settings   <scrapy.settings.Settings object at 0x7f093dd54990>
[s] Useful shortcuts:
[s]   shelp()           Shell help (print this help)
[s]   fetch(req_or_url) Fetch request (or URL) and update local objects
[s]   view(response)    View response in a browser
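
The line that matters here is "Forbidden by robots.txt": ROBOTSTXT_OBEY is True in the overridden settings, so RobotsTxtMiddleware fetches robots.txt first (the 200 on that request) and then drops the target request before it is ever downloaded, which is why the shell opens with an empty item. A minimal sketch of one way to proceed, assuming the site's terms actually permit crawling; the spider class, start URL, and parse logic below are placeholders, not the original project's code (the real URL is redacted in the log):

import scrapy

class RekruteSpider(scrapy.Spider):
    # "rekrute" matches the BOT_NAME in the log above; the start URL
    # is a placeholder for the redacted one.
    name = "rekrute"
    start_urls = ["http://www.example.com/page.php?ID=58845"]

    # Per-spider override: with ROBOTSTXT_OBEY disabled,
    # RobotsTxtMiddleware no longer filters disallowed requests.
    # (Setting ROBOTSTXT_OBEY = False in settings.py has the same effect
    # project-wide.)
    custom_settings = {"ROBOTSTXT_OBEY": False}

    def parse(self, response):
        # Placeholder extraction; .extract_first() is the Scrapy 1.2 API.
        yield {"title": response.css("title::text").extract_first()}

The same override works for a quick interactive check: scrapy shell -s ROBOTSTXT_OBEY=False 'http://www.example.com/page.php?ID=58845'. Disabling robots.txt compliance is the assumption made here; the polite alternative is to restrict the crawl to URLs the site's robots.txt allows.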