2016-11-27 13:35:47 [scrapy] INFO: Scrapy 1.2.1 started (bot: rekrute)
2016-11-27 13:35:47 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'rekrute.spiders', 'ROBOTSTXT_OBEY': 'False', 'DUPEFILTER_CLASS': 'scrapy.dupefilters.BaseDupeFilter', 'SPIDER_MODULES': ['rekrute.spiders'], 'BOT_NAME': 'rekrute', 'LOGSTATS_INTERVAL': 0}
2016-11-27 13:35:47 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/shell.py", line 65, in run
    crawler.engine = crawler._create_engine()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 97, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 68, in __init__
    self.downloader = downloader_cls(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/downloader/__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 36, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/downloadermiddlewares/robotstxt.py", line 33, in from_crawler
    return cls(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/downloadermiddlewares/robotstxt.py", line 24, in __init__
    if not crawler.settings.getbool('ROBOTSTXT_OBEY'):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/settings/__init__.py", line 129, in getbool
    return bool(int(self.get(name, default)))
ValueError: invalid literal for int() with base 10: 'False'
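The cause is visible in the "Overridden settings" line: ROBOTSTXT_OBEY is the string 'False' rather than the Python boolean False. In Scrapy 1.2, getbool() converts the value with bool(int(value)), and int('False') raises exactly this ValueError. A minimal sketch of the failure and the fix (the getbool function below is a simplified stand-in for Settings.getbool, not the actual Scrapy source):

```python
# Simplified stand-in for Scrapy 1.2's Settings.getbool():
# the value is converted via int(), so the *string* 'False' cannot be parsed.
def getbool(value):
    return bool(int(value))

# Reproduces the error from the traceback:
try:
    getbool('False')
except ValueError as e:
    print(e)  # invalid literal for int() with base 10: 'False'

# The fix: in settings.py, assign a real boolean (or 0/1), not a quoted string.
ROBOTSTXT_OBEY = False
print(getbool(ROBOTSTXT_OBEY))  # int(False) == 0, so this is fine
```

The same applies on the command line: `-s ROBOTSTXT_OBEY=0` works because '0' parses as an int, while `-s ROBOTSTXT_OBEY=False` does not.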