Great point about separation of concerns. As you stated, the scraper should only be concerned with getting data and saving data. I am curious what other use cases would be compatible with scrapy’s pipelines. Would pipelines be a good place for things like “save to this OTHER database”, or “upload to S3”, or “ping this api”? Will be diving into this myself soon but curious about your thoughts here.
@JohnWatsonRooney 3 months ago
Yes, absolutely. You could use an item field to decide whether to upload to X DB or Y DB, and uploading to S3 would certainly fit here too. Pinging an API, you mean like notifying another system? I think that would be a great use case for pipelines (I hadn't thought of that before).
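The routing idea above can be sketched as a minimal, Scrapy-style item pipeline. The `destination` field name and the destination keys are hypothetical, and the real upload/insert calls (boto3 for S3, a DB driver for the databases) are stubbed out as in-memory lists:

```python
# Sketch of a Scrapy-style item pipeline that routes items to different
# destinations based on an item field. The "destination" field and the
# destination names are made up; a real pipeline would open DB
# connections or an S3 client in open_spider() and upload in process_item().

class RoutingPipeline:
    def __init__(self):
        # Stand-ins for real destinations (DB connections, boto3 client).
        self.routed = {"db_x": [], "db_y": [], "s3": []}

    def process_item(self, item, spider=None):
        # Scrapy calls process_item(self, item, spider) for every yielded item.
        dest = item.get("destination", "db_x")
        if dest not in self.routed:
            raise ValueError(f"unknown destination: {dest}")
        self.routed[dest].append(item)
        return item  # pass the item on to any later pipelines


pipeline = RoutingPipeline()
pipeline.process_item({"title": "a", "destination": "s3"})
pipeline.process_item({"title": "b"})  # no field set, falls back to db_x
```

Returning the item at the end matters: Scrapy chains pipelines, so a routing pipeline can sit in front of others (e.g. one that pings a notification API) without swallowing items.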
@alexdin1565 3 months ago
Hi John, I have a question: can we use Scrapy with Django? I mean, make the web scraper an online tool?
@RicardoPorteladaSilva 3 months ago
I think you could create a script that scrapes separately and loads the results into the Django database, so the two processes happen at different moments. I hope you understand my English, I'm from Brazil, learning English. If you need anything more specific, please feel free to get in touch. It's a great pleasure to help you.
@JohnWatsonRooney 3 months ago
this is pretty much it!
@HitAndMissLab 3 months ago
@@RicardoPorteladaSilva what is the advantage of using Django DB?
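The "scrape separately, load into the Django database" suggestion can be sketched with the standard library alone, since Django's default database is a SQLite file. The table and column names below are hypothetical; in a real project you would more likely call the Django ORM from a custom management command rather than writing raw SQL:

```python
import sqlite3

# Minimal sketch: the scraper writes rows into the same SQLite file a
# default Django project points at (usually "db.sqlite3"). Table and
# column names are made up for illustration; Django normally manages the
# schema itself via migrations.

def save_items(db_path, items):
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS scraped_item (url TEXT, title TEXT)"
    )
    conn.executemany(
        "INSERT INTO scraped_item (url, title) VALUES (?, ?)",
        [(i["url"], i["title"]) for i in items],
    )
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM scraped_item").fetchone()[0]
    conn.close()
    return count

# ":memory:" keeps the demo self-contained; a real run would pass the
# path to the Django project's database file.
total = save_items(":memory:", [
    {"url": "https://example.com/1", "title": "one"},
    {"url": "https://example.com/2", "title": "two"},
])
```

The advantage of targeting the Django DB (over a separate store) is simply that the scraped rows become visible to the Django site immediately, with no sync step; the Django app queries them like any other model data.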
@piercenorton1544 3 months ago
What if we want to take a full page so we can give it to an LLM to parse? For example, what if we were parsing financial filings or contracts. We want chunks or pages to pass to an LLM to structure outputs. I think splitting the text on a tag and then joining the items together would be best, but maybe there is a better way.
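The "split on a tag, then join the items back together" idea can be sketched as a greedy chunker: break the page on paragraph boundaries, then pack paragraphs into chunks under a size budget for the LLM prompt. The character limit here is tiny for demonstration; real budgets are usually token-based, and real pipelines use an HTML parser rather than a regex:

```python
import re

# Sketch of chunking a scraped page for an LLM: split the HTML on </p>
# boundaries, strip remaining tags, then greedily re-join paragraphs into
# chunks that stay under a character budget.

def chunk_html(html, max_chars=200):
    # Split on closing </p> tags, then crudely strip any remaining markup.
    parts = [re.sub(r"<[^>]+>", " ", p).strip() for p in html.split("</p>")]
    parts = [p for p in parts if p]
    chunks, current = [], ""
    for part in parts:
        if current and len(current) + len(part) + 1 > max_chars:
            chunks.append(current)
            current = part
        else:
            current = f"{current} {part}".strip()
    if current:
        chunks.append(current)
    return chunks

# Six 60-character "clauses", as a stand-in for a filing or contract page.
page = "<p>" + "</p><p>".join(f"Clause {i}: " + "x" * 50 for i in range(6)) + "</p>"
chunks = chunk_html(page, max_chars=130)
```

Splitting on structural tags (paragraphs, sections, headings) tends to work better than fixed-size windows for contracts and filings, because each chunk stays a self-contained clause the LLM can parse on its own.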
@HitAndMissLab 3 months ago
Do you have any videos on how to use proxies in Python?
@JohnWatsonRooney 3 months ago
I don't have one specifically, but that's a good idea. I will create a video on proxies, including how to use them.
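In the meantime, here is a minimal standard-library sketch of using a proxy in Python. The proxy host, port, and credentials are placeholders, not a real endpoint; with the `requests` library you would instead pass a `proxies={"http": ..., "https": ...}` dict, and in Scrapy a proxy is typically set via `request.meta["proxy"]` or a downloader middleware:

```python
import urllib.request

# Sketch of routing HTTP requests through a proxy with only the standard
# library. The address and credentials below are placeholders.

def make_proxy_opener(host, port, user=None, password=None):
    auth = f"{user}:{password}@" if user else ""
    proxy_url = f"http://{auth}{host}:{port}"
    # ProxyHandler maps URL schemes to the proxy that should carry them.
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

opener = make_proxy_opener("proxy.example.com", 8080, "scraper", "secret")
# opener.open("https://example.com")  # would now route through the proxy
```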
@jjeffery129 3 months ago
What's wrong with scraping them as strings and converting them at the end in your output file?
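That approach works; the sketch below shows what the end-of-run conversion pass might look like, with made-up field names and formats. The usual argument against it is that bad data only surfaces at the very end of the run, whereas typing values in the pipeline catches it as each item is scraped:

```python
from datetime import datetime

# Sketch of "scrape everything as strings, convert at the end": raw values
# stay as text during the crawl, and a post-processing pass parses them
# into typed values just before writing the output file. Field names and
# formats are illustrative only.

def clean_row(raw):
    return {
        "name": raw["name"].strip(),
        # "£1,299.00" -> 1299.0
        "price": float(raw["price"].lstrip("£$").replace(",", "")),
        # "12 Mar 2024" -> "2024-03-12"
        "date": datetime.strptime(raw["date"], "%d %b %Y").date().isoformat(),
    }

row = clean_row({"name": "  Widget ", "price": "£1,299.00", "date": "12 Mar 2024"})
```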