How To Build Scrapy Environment
I have already installed Twisted, zope.interface, w3lib, libxml2, etc., but it
still cannot be built. Here is the error message:
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 167, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy\commands\crawl.py", line 47, in run
    crawler = self.crawler_process.create_crawler()
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 142, in create_crawler
    self.crawlers[name] = Crawler(self.settings)
  File "C:\Python27\lib\site-packages\scrapy\crawler.py", line 23, in __init__
    self.spiders = spman_cls.from_crawler(self)
  File "C:\Python27\lib\site-packages\scrapy\spidermanager.py", line 35, in from_crawler
    sm = cls.from_settings(crawler.settings)
  File "C:\Python27\lib\site-packages\scrapy\spidermanager.py", line 31, in from_settings
    return cls(settings.getlist('SPIDER_MODULES'))
  File "C:\Python27\lib\site-packages\scrapy\spidermanager.py", line 22, in __init__
    for module in walk_modules(name):
  File "C:\Python27\lib\site-packages\scrapy\utils\misc.py", line 66, in walk_modules
    submod = __import__(fullpath, {}, {}, [''])
  File "blog_crawl\spiders\dmoz_spider.py", line 1, in <module>
    class DmozSpider(BaseSpider):
NameError: name 'BaseSpider' is not defined
Can someone tell me why, please?
--
https://mail.python.org/mailman/listinfo/python-list
Re: How To Build Scrapy Environment
On Monday, 23 September 2013 at 16:37:22 UTC+8, Peter Otten wrote:
> YetToCome wrote:
>
> > [quoted traceback snipped]
>
> Assuming you are working your way through the tutorial at
>
> http://doc.scrapy.org/en/latest/intro/tutorial.html#our-first-spider
>
> you probably forgot to import the BaseSpider class with
>
>     from scrapy.spider import BaseSpider
>
> as shown in the code snippet in the "Our first Spider" section of the
> tutorial.
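
The quoted fix works because Python evaluates a class statement's base-class expression at the moment the statement executes, so dmoz_spider.py fails on its very first line. A minimal stdlib-only sketch of the failure and the fix (OrderedDict merely stands in for BaseSpider here, since Scrapy itself is not assumed to be installed):

```python
# Referencing an un-imported name as a base class raises NameError at once.
try:
    class DmozSpider(BaseSpider):  # BaseSpider was never imported
        name = "dmoz"
except NameError as exc:
    print(exc)  # → name 'BaseSpider' is not defined

# Once the name is bound by an import, the same class statement succeeds.
from collections import OrderedDict as BaseSpider

class DmozSpider(BaseSpider):
    name = "dmoz"

print(DmozSpider.__bases__[0].__name__)  # → OrderedDict
```

The same applies to any name used in a class body or decorator: the import must run before the class statement does.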
It had another error: No module named queuelib, but I have installed all the
libs mentioned in that passage...
2013-09-23 16:44:17+0800 [scrapy] INFO: Scrapy 0.18.2 started (bot: tutorial)
2013-09-23 16:44:17+0800 [scrapy] DEBUG: Optional features available: ssl, http11, libxml2
2013-09-23 16:44:17+0800 [scrapy] DEBUG: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2013-09-23 16:44:17+0800 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
Traceback (most recent call last):
  File "C:\Python27\lib\runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "C:\Python27\lib\runpy.py", line 72, in _run_code
    exec code in run_globals
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 167, in <module>
    execute()
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "C:\Python27\lib\site-packages\scrapy\cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "C:\Python27\lib\site-packages\scrapy\commands\crawl.py", line 50, in run
    self.crawler_process.start()
  File "
Re: How To Build Scrapy Environment
On Monday, 23 September 2013 at 17:25:25 UTC+8, Peter Otten wrote:
> YetToCome wrote:
>
> > [snip]
>
> No need to quote the whole turd -- just confirm that it worked...
>
> > It had another error: No module named queuelib, but I have installed
> > all the libs mentioned in that passage...
> >
> > ImportError: Error loading object 'scrapy.core.scheduler.Scheduler': No
> > module named queuelib
>
> Googling "queuelib site:scrapy.org" leads to
>
> http://doc.scrapy.org/en/latest/news.html
>
> containing
>
> """
> 0.18.0 (released 2013-08-09)
> [...]
> Moved persistent (on disk) queues to a separate project (queuelib) which
> scrapy now depends on
> """
>
> which in turn leads to
>
> https://github.com/scrapy/queuelib

It works, and I also added some modules. Thank you very much!
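
Missing dependencies like queuelib surface one at a time, on each re-run. It can save a few round trips to check them all in one pass; a small stdlib-only sketch (the module list below is taken from this thread, not from an authoritative Scrapy requirements file):

```python
import importlib.util

def check_missing(deps):
    """Return the subset of module names that cannot be imported."""
    missing = []
    for name in deps:
        try:
            if importlib.util.find_spec(name) is None:
                missing.append(name)
        except ModuleNotFoundError:  # parent package itself is absent
            missing.append(name)
    return missing

# Modules this thread tripped over; adjust the list for your Scrapy version.
print(check_missing(["twisted", "zope.interface", "w3lib", "queuelib", "lxml"]))
```

Anything printed by the last line still needs to be installed before `scrapy crawl` will get past its imports.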
I have a problem when creating a django project
I can't create a Django project, and I think the usage of the command is
correct... Here is the error information:

h:\jcode>django-admin.py startproject mysite
Usage: django-admin.py subcommand [options] [args]

Options:
  -v VERBOSITY, --verbosity=VERBOSITY
                        Verbosity level; 0=minimal output, 1=normal output,
                        2=verbose output, 3=very verbose output
  --settings=SETTINGS   The Python path to a settings module, e.g.
                        "myproject.settings.main". If this isn't provided, the
                        DJANGO_SETTINGS_MODULE environment variable will be
                        used.
  --pythonpath=PYTHONPATH
                        A directory to add to the Python path, e.g.
                        "/home/djangoprojects/myproject".
  --traceback           Print traceback on exception
  --version             show program's version number and exit
  -h, --help            show this help message and exit

Type 'django-admin.py help <subcommand>' for help on a specific subcommand.
Re: I have a problem when creating a django project
On Tuesday, 1 October 2013 at 13:47:05 UTC+8, YetToCome wrote:
> I can't create a Django project, and I think the usage of the command is
> correct... Here is the error information:
>
> [quoted usage output snipped]

And what happened now...

h:\jcode>django-admin.py help
Type 'django-admin.py help' for usage.
Re: I have a problem when creating a django project
On Tuesday, 1 October 2013 at 14:54:53 UTC+8, YetToCome wrote:
> On Tuesday, 1 October 2013 at 13:47:05 UTC+8, YetToCome wrote:
> > [snip]
>
> And what happened now...
>
> h:\jcode>django-admin.py help
> Type 'django-admin.py help' for usage.

OK... I solved the problem: clean the registry and reinstall Python.
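
The registry fix fits the symptom: on Windows, a damaged .py file association can launch the interpreter without forwarding the script's arguments, so django-admin.py sees no subcommand and falls back to printing its usage text. A sketch of a sanity check (run it on the affected machine; invoking the interpreter explicitly always forwards arguments, while `script.py args` goes through the association):

```python
import subprocess
import sys

# Ask a child interpreter to echo the arguments it received, the way
# django-admin.py would receive "startproject mysite".
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1:])",
     "startproject", "mysite"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # → ['startproject', 'mysite']
```

If invoking the script by bare name shows an empty list where the explicit `python script.py` form shows both arguments, the file association is the culprit, and repairing it (or reinstalling Python, as above) restores them.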
