2019-05-22 15:54:15 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: scrapybot)
2019-05-22 15:54:15 [scrapy.utils.log] INFO: Versions: lxml 4.3.1.0, libxml2 2.9.5, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 18.9.0, Python 3.7.0 (v3.7.0:1bf9cc5093, Jun 27 2018, 04:06:47) [MSC v.1914 32 bit (Intel)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1a 20 Nov 2018), cryptography 2.5, Platform Windows-7-6.1.7601-SP1
2019-05-22 15:54:15 [scrapy.crawler] INFO: Overridden settings: {}
2019-05-22 15:54:15 [scrapy.extensions.telnet] INFO: Telnet Password: bdced3e6efe1309e
2019-05-22 15:54:15 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
Unhandled error in Deferred:
2019-05-22 15:54:15 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\crawler.py", line 172, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\crawler.py", line 176, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\twisted\internet\defer.py", line 1613, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\twisted\internet\defer.py", line 1529, in _cancellableInlineCallbacks
    _inlineCallbacks(None, g, status)
--- <exception caught here> ---
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\crawler.py", line 80, in crawl
    self.engine = self._create_engine()
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\crawler.py", line 105, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\core\engine.py", line 69, in __init__
    self.downloader = downloader_cls(crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\core\downloader\__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\middleware.py", line 35, in from_settings
    mw = create_instance(mwcls, settings, crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\utils\misc.py", line 140, in create_instance
    return objcls.from_crawler(crawler, *args, **kwargs)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy_selenium\middlewares.py", line 71, in from_crawler
    browser_executable_path=browser_executable_path
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy_selenium\middlewares.py", line 43, in __init__
    for argument in driver_arguments:
builtins.TypeError: 'NoneType' object is not iterable

2019-05-22 15:54:15 [twisted] CRITICAL:
Traceback (most recent call last):
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\twisted\internet\defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\crawler.py", line 80, in crawl
    self.engine = self._create_engine()
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\crawler.py", line 105, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\core\engine.py", line 69, in __init__
    self.downloader = downloader_cls(crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\core\downloader\__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\middleware.py", line 35, in from_settings
    mw = create_instance(mwcls, settings, crawler)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy\utils\misc.py", line 140, in create_instance
    return objcls.from_crawler(crawler, *args, **kwargs)
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy_selenium\middlewares.py", line 71, in from_crawler
    browser_executable_path=browser_executable_path
  File "C:\Users\WCS\AppData\Local\Programs\Python\Python37-32\lib\site-packages\scrapy_selenium\middlewares.py", line 43, in __init__
    for argument in driver_arguments:
TypeError: 'NoneType' object is not iterable
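
The traceback ends in scrapy_selenium's `SeleniumMiddleware.__init__` at `for argument in driver_arguments:`, so `driver_arguments` arrived as `None` — i.e. the `SELENIUM_DRIVER_ARGUMENTS` setting was never set. The earlier line `Overridden settings: {}` is consistent with this: no project settings were applied at all, which typically happens when the spider is run outside a Scrapy project. A minimal sketch of the settings scrapy-selenium expects (the driver path below is a placeholder, not taken from the log):

```python
# settings.py -- sketch of a scrapy-selenium configuration; the exact
# executable path is an assumption and must be adjusted for your machine.
SELENIUM_DRIVER_NAME = 'chrome'
SELENIUM_DRIVER_EXECUTABLE_PATH = r'C:\path\to\chromedriver.exe'  # placeholder
SELENIUM_DRIVER_ARGUMENTS = ['--headless']  # must be a list; None triggers the TypeError above

DOWNLOADER_MIDDLEWARES = {
    'scrapy_selenium.SeleniumMiddleware': 800,
}
```

Running the spider from inside the project directory (so `settings.py` is picked up, and `Overridden settings:` is no longer `{}`), or supplying these values via the spider's `custom_settings`, should replace the `NoneType` iteration error with a normal driver startup — assuming the chromedriver path is valid.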