C:\Users\Ari\Documents\xsscrapy>scrapy crawl xsscrapy -a url='http://www.google.com'
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    crawler = self.crawler_process.create_crawler()
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 87, in create_crawler
    self.crawlers[name] = Crawler(self.settings)
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 25, in __init__
    self.spiders = spman_cls.from_crawler(self)
  File "C:\Python27\lib\site-packages\scrapy\spidermanager.py", line 35, in from_crawler
    sm = cls.from_settings(crawler.settings)
  File "C:\Python27\lib\site-packages\scrapy\spidermanager.py", line 31, in from_settings
    return cls(settings.getlist('SPIDER_MODULES'))
  File "C:\Python27\lib\site-packages\scrapy\spidermanager.py", line 22, in __init__
    for module in walk_modules(name):
  File "C:\Python27\lib\site-packages\scrapy\utils\misc.py", line 68, in walk_modules
    submod = import_module(fullpath)
  File "C:\Python27\lib\importlib\__init__.py", line 37, in import_module
    __import__(name)
  File "C:\Users\Ari\Documents\xsscrapy\xsscrapy\spiders\xss_spider.py", line 21, in <module>
    import requests
ImportError: No module named requests
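The final ImportError means the xsscrapy spider imports the third-party requests library, but that package is not installed in the Python 2.7 environment that runs Scrapy (C:\Python27). A likely fix, assuming pip is available for that interpreter, is to install requests into the same Python that Scrapy uses and then confirm the import resolves:

    # Install the missing dependency into the interpreter that runs Scrapy
    # (on this Windows setup that would be C:\Python27\python.exe)
    python -m pip install requests

    # Confirm the module now imports cleanly
    python -c "import requests; print(requests.__version__)"

Using `python -m pip` rather than a bare `pip` command ties the install to a specific interpreter, which matters when several Python versions are present on the machine.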