python - Pass extra values along with URLs to a Scrapy spider
I have a list of tuples in the form (id, url). I need to crawl the product list from each of the URLs, and when the products are crawled I need to store them in the database under that id.
The problem is that I can't understand how to pass the id to the parse function so that I can store the crawled item under it.
Initialize the start URLs in start_requests() and pass the id along in meta:
    class MySpider(Spider):
        mapping = [(1, 'my_url1'), (2, 'my_url2')]

        ...

        def start_requests(self):
            for id, url in self.mapping:
                yield Request(url, callback=self.parse_page, meta={'id': id})

        def parse_page(self, response):
            id = response.meta['id']
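To see how the pattern works end to end, here is a minimal, self-contained sketch of the same idea without Scrapy: the `Request` and `Response` stand-ins below are hypothetical simplifications, but the flow mirrors Scrapy's, where the `meta` dict attached at request creation is readable from the response inside the callback.

```python
class Request:
    """Stand-in for scrapy.Request: stores url, callback, and meta."""
    def __init__(self, url, callback, meta=None):
        self.url = url
        self.callback = callback
        self.meta = meta or {}


class Response:
    """Stand-in for Scrapy's Response: exposes the request's meta."""
    def __init__(self, request):
        self.url = request.url
        self.meta = request.meta


def crawl(requests):
    """Fake engine: 'downloads' each request and invokes its callback."""
    return [req.callback(Response(req)) for req in requests]


mapping = [(1, 'my_url1'), (2, 'my_url2')]


def start_requests():
    for id_, url in mapping:
        # Attach the id to the request so it travels with it.
        yield Request(url, callback=parse_page, meta={'id': id_})


def parse_page(response):
    # Recover the id attached at request time, so the crawled
    # item can be stored in the database under it.
    return {'id': response.meta['id'], 'url': response.url}


items = crawl(start_requests())
# Each crawled item is paired with its original id:
# [{'id': 1, 'url': 'my_url1'}, {'id': 2, 'url': 'my_url2'}]
```

In real Scrapy code, `parse_page` would yield an item (or call your database code) instead of returning a dict, but the `meta={'id': id}` / `response.meta['id']` round trip is the same.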