Question
Asked By – L Lawliet
I am trying to pass a user-defined argument to a Scrapy spider. Can anyone suggest how to do that?
I read about a parameter -a somewhere but have no idea how to use it.
Now we will see the solution for the issue: how to pass a user-defined argument to a Scrapy spider.
Answer
Spider arguments are passed in the crawl command using the -a option. For example:
scrapy crawl myspider -a category=electronics -a domain=system
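If you start the crawl from a script instead of the command line, the same keyword arguments can be forwarded through CrawlerProcess.crawl, which passes them on to the spider's constructor. A minimal sketch, assuming the MySpider class shown below is importable from a hypothetical myproject.spiders module:
from scrapy.crawler import CrawlerProcess

# Sketch only: assumes MySpider (shown below) lives in myproject.spiders
from myproject.spiders import MySpider

process = CrawlerProcess()
# Keyword arguments to crawl() reach the spider the same way -a options do
process.crawl(MySpider, category='electronics', domain='system')
process.start()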
Spiders can access arguments as attributes:
import scrapy

class MySpider(scrapy.Spider):
    name = 'myspider'

    def __init__(self, category='', **kwargs):
        self.start_urls = [f'http://www.example.com/{category}']  # py36
        super().__init__(**kwargs)  # python3

    def parse(self, response):
        self.log(self.domain)  # system
Taken from the Scrapy doc: http://doc.scrapy.org/en/latest/topics/spiders.html#spider-arguments
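One detail worth knowing: values passed with -a reach the spider as strings, so convert them yourself when you need another type. A minimal sketch, with the spider name and the limit argument chosen purely for illustration:
import scrapy

class LimitSpider(scrapy.Spider):
    name = 'limitspider'
    start_urls = ['http://www.example.com']

    def __init__(self, limit='10', **kwargs):
        super().__init__(**kwargs)
        self.limit = int(limit)  # -a limit=50 arrives as the string '50'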
Update 2013: Add second argument
Update 2015: Adjust wording
Update 2016: Use newer base class and add super, thanks @Birla
Update 2017: Use Python3 super
# previously
super(MySpider, self).__init__(**kwargs) # python2
Update 2018: As @eLRuLL points out, spiders can access arguments as attributes
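Concretely, if you do not override __init__ at all, the base Spider class stores every -a argument as an attribute on the instance, so you can read it wherever you need it. A minimal sketch with illustrative names:
import scrapy

class CategorySpider(scrapy.Spider):
    name = 'categoryspider'
    start_urls = ['http://www.example.com']

    def parse(self, response):
        # Run with: scrapy crawl categoryspider -a category=electronics
        # The base class already set self.category from the -a option.
        self.log(getattr(self, 'category', 'no category passed'))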
This question is Answered By – Steven Almeroth
This answer is collected from Stack Overflow, reviewed by FixPython community admins, and is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.