
Cannot Import My Own Module In Scrapy Crawler

I'm writing a crawler using Scrapy. I've built the crawler and it works very well. Now I want to create my own modules for it, but I always receive this error: File 'D:\Projects\bitbucket\
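For reference, the answers below assume a standard Scrapy project layout roughly like the following (crawl1, spiders, moduletest, and samplecrawler.py come from the question; the contents of moduletest.py are a hypothetical stand-in just to make the import examples concrete):

# Assumed layout (a sketch, not the exact tree from the question):
#
#   crawl1/
#       scrapy.cfg
#       crawl1/
#           __init__.py
#           spiders/
#               __init__.py
#               moduletest.py       # the custom module
#               samplecrawler.py    # the spider that needs it

# crawl1/spiders/moduletest.py -- hypothetical contents, only so the
# examples below have something named "mythings" to import
def mythings():
    return "something useful from moduletest"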

Solution 1:

You can do several things:

First

from crawl1.spiders.moduletest import mythings

As suggested by @elRuLL
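A minimal sketch of how that absolute import can be used inside the spider file (the spider name, URL, and callback are placeholders, not taken from the question):

# crawl1/spiders/samplecrawler.py -- sketch using the absolute import
import scrapy
from crawl1.spiders.moduletest import mythings

class SamplecrawlerSpider(scrapy.Spider):
    name = "samplecrawler"
    start_urls = ["http://example.com"]  # placeholder URL

    def parse(self, response):
        # mythings is whatever moduletest actually defines
        self.logger.info(mythings())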

Second

from .moduletest import mythings

This is generally a bad and brittle solution, but it is possible.
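The reason it is brittle: the relative form only resolves when the file is loaded as part of the crawl1.spiders package (which is what the scrapy command does); running the file directly as a script breaks it. A short sketch:

# crawl1/spiders/samplecrawler.py -- relative-import variant (sketch)
from .moduletest import mythings   # works under "scrapy crawl ...",
                                   # fails with "attempted relative import"
                                   # if the file is run directly with python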

Third

You can set it up as a proper package and do the following.

__init__.py:

from spiders.moduletest import *
__all__ = [<put your classes, methods, etc. here>]

samplecrawler.py

import moduletest
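A hedged sketch of what that package-style setup could look like; mythings is assumed to be the name defined in moduletest.py, and whether a bare import moduletest resolves depends on what ends up on sys.path, so the safer variant is to import the re-exported name from the package itself:

# crawl1/spiders/__init__.py -- re-export the module's public names (sketch)
from crawl1.spiders.moduletest import *

__all__ = ["mythings"]   # list whatever moduletest actually defines

# crawl1/spiders/samplecrawler.py -- the re-exported name is now
# available directly from the package
from crawl1.spiders import mythings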

Solution 2:

You need to include the full module path:

from crawl1.spiders.moduletest import mythings

Solution 3:

You have to include the name of the folder as part of the module path:

import crawl1.spiders.moduletest
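With that style of import there is no bare name to use; anything defined in moduletest is reached through the full dotted path, for example (mythings being the assumed helper from above):

import crawl1.spiders.moduletest

result = crawl1.spiders.moduletest.mythings()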

Solution 4:

Found it after some hours:

from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor
import crawl1.spiders.moduletest

class SamplecrawlerSpider(CrawlSpider):
    ...

The line that fixes the problem is importing the module by its full path:

import crawl1.spiders.moduletest
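For context, a minimal runnable sketch of what the finished spider file might look like with that import in place (the name, domain, and rule are placeholder assumptions, not from the original post):

# crawl1/spiders/samplecrawler.py -- sketch built around the full-path import
from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor

import crawl1.spiders.moduletest

class SamplecrawlerSpider(CrawlSpider):
    name = "samplecrawler"
    allowed_domains = ["example.com"]    # placeholder
    start_urls = ["http://example.com"]  # placeholder

    rules = (
        Rule(LinkExtractor(), callback="parse_item", follow=True),
    )

    def parse_item(self, response):
        # names from moduletest are reached via the full dotted path
        self.logger.info(crawl1.spiders.moduletest.mythings())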
