
Scrapy shell error 0

a guest, Nov 27th, 2016
2016-11-27 13:44:48 [scrapy] INFO: Scrapy 1.2.1 started (bot: rekrute)
2016-11-27 13:44:48 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'rekrute.spiders', 'ROBOTSTXT_OBEY': True, 'DUPEFILTER_CLASS': 'scrapy.dupefilters.BaseDupeFilter', 'SPIDER_MODULES': ['rekrute.spiders'], 'BOT_NAME': 'rekrute', 'LOGSTATS_INTERVAL': 0}
2016-11-27 13:44:48 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
2016-11-27 13:44:48 [scrapy] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2016-11-27 13:44:48 [scrapy] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2016-11-27 13:44:48 [scrapy] INFO: Enabled item pipelines:
[]
2016-11-27 13:44:48 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-11-27 13:44:48 [scrapy] INFO: Spider opened
2016-11-27 13:44:48 [scrapy] DEBUG: Crawled (200) <GET http://www.xxxxxxxxxxx.com/robots.txt> (referer: None)
2016-11-27 13:44:48 [scrapy] DEBUG: Forbidden by robots.txt: <GET http://www.xxxxxxxxxxxx.com/uuuuu.php?ppppppp_ID=58845>
[s] Available Scrapy objects:
[s]   scrapy     scrapy module (contains scrapy.Request, scrapy.Selector, etc)
[s]   crawler    <scrapy.crawler.Crawler object at 0x7f093dd54b90>
[s]   item       {}
[s]   request    <GET http://www.xxxxxxxxxxxx.com/uuuuu.php?ppppppp_ID=58845>
[s]   settings   <scrapy.settings.Settings object at 0x7f093dd54990>
[s] Useful shortcuts:
[s]   shelp()           Shell help (print this help)
[s]   fetch(req_or_url) Fetch request (or URL) and update local objects
[s]   view(response)    View response in a browser
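The key line is `Forbidden by robots.txt`: the shell fetched `robots.txt` first (the `Crawled (200)` line) and dropped the actual request because the overridden settings include `'ROBOTSTXT_OBEY': True`, so `RobotsTxtMiddleware` honors the site's disallow rules and no `response` object is created. If, for debugging, you choose not to honor robots.txt (a policy decision, not something the log itself recommends), the standard Scrapy way is to disable the setting in the project's `settings.py`:

```python
# settings.py of the "rekrute" project
# Assumption: ignoring robots.txt is acceptable for this crawl.
ROBOTSTXT_OBEY = False
```

Equivalently, the setting can be overridden just for one shell session with `scrapy shell -s ROBOTSTXT_OBEY=False <url>`, after which `fetch()` will populate `response` as usual.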