Scrapy output

2016-12-06 19:39:16 [scrapy] INFO: Scrapy 1.2.1 started (bot: testscrapyproj)
2016-12-06 19:39:16 [scrapy] INFO: Overridden settings: {'BOT_NAME': 'testscrapyproj', 'SPIDER_MODULES': ['testscrapyproj.spiders'], 'ROBOTSTXT_OBEY': True, 'NEWSPIDER_MODULE': 'testscrapyproj.spiders'}
2016-12-06 19:39:16 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.logstats.LogStats']
2016-12-06 19:39:16 [scrapy] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2016-12-06 19:39:16 [scrapy] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2016-12-06 19:39:16 [scrapy] INFO: Enabled item pipelines:
[]
2016-12-06 19:39:16 [scrapy] INFO: Spider opened
2016-12-06 19:39:16 [scrapy] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2016-12-06 19:39:16 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024/robots.txt>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:16 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Discarding <GET http://localhost:6024>: max redirections reached
2016-12-06 19:39:17 [scrapy] DEBUG: Redirecting (301) to <GET http://localhost:6024> from <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] DEBUG: Filtered duplicate request: <GET http://localhost:6024>
2016-12-06 19:39:17 [scrapy] INFO: Closing spider (finished)
2016-12-06 19:39:17 [scrapy] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 4564,
 'downloader/request_count': 22,
 'downloader/request_method_count/GET': 22,
 'downloader/response_bytes': 3124,
 'downloader/response_count': 22,
 'downloader/response_status_count/301': 22,
 'dupefilter/filtered': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2016, 12, 7, 1, 39, 17, 25244),
 'log_count/DEBUG': 24,
 'log_count/INFO': 7,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2016, 12, 7, 1, 39, 16, 968859)}
2016-12-06 19:39:17 [scrapy] INFO: Spider closed (finished)
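What the log shows: every request to http://localhost:6024 answers with a 301 pointing back at the same URL, so RedirectMiddleware keeps re-issuing the request until the redirect limit (REDIRECT_MAX_TIMES, 20 by default) is hit and logs "max redirections reached"; the re-queued URL is then caught by the duplicate filter, and the crawl ends with 22 requests, 22 301 responses, and nothing scraped. Below is a minimal sketch, not the original spider (its code is not in the paste), of how one might surface the loop from inside a spider: it tells RedirectMiddleware not to follow the 301 for the start URL and logs the Location header instead. The spider name and the choice to skip robots.txt are assumptions for illustration; only the URL comes from the log above.

import scrapy


class RedirectDebugSpider(scrapy.Spider):
    # Hypothetical name; only the URL is taken from the log above.
    name = 'redirect_debug'
    start_urls = ['http://localhost:6024']

    # Assumption: skip robots.txt here so its own 301 does not get in the way.
    custom_settings = {'ROBOTSTXT_OBEY': False}

    def start_requests(self):
        for url in self.start_urls:
            yield scrapy.Request(
                url,
                callback=self.parse,
                # dont_redirect stops RedirectMiddleware from following the 301;
                # handle_httpstatus_list lets the 301 response reach parse().
                meta={'dont_redirect': True, 'handle_httpstatus_list': [301]},
            )

    def parse(self, response):
        # In a redirect loop the Location header points back at the request URL.
        self.logger.info('%s -> %s (Location: %s)',
                         response.request.url, response.status,
                         response.headers.get('Location'))

Raising REDIRECT_MAX_TIMES would only postpone the "Discarding ... max redirections reached" line; the redirect issued by whatever is serving port 6024 is what actually has to change before the spider can scrape anything.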