It would be useful to have a dynamic object that is available only for the duration of the Crawl call e.g.
IWebCrawler.Crawl(Uri uri, dynamic localConfig)
I am currently using the CrawlBag, but it's a little messy: I want to pass a business object to the crawler that should only be valid for that single call to Crawl, and subsequent calls will pass different localConfig objects. These objects handle the building and processing of the DOM according to my business logic, and the construction of the extracted hierarchical data.
I can see that _crawlContext persists for the lifetime of the IWebCrawler, which is great, as I need some configuration valid for the crawler's entire existence, i.e. across multiple subsequent calls to Crawl. However, I also need configuration scoped to an individual call to Crawl, which I'd imagine is best passed as a method parameter. Let me know if there is a better way of accomplishing this, or if you need more information.
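To illustrate the request, here is a minimal sketch of the two approaches side by side. The Crawl overload shown first is the proposed API, not part of Abot today; BusinessConfig is a placeholder name for the caller's own per-call type:

```csharp
// Proposed overload (hypothetical -- does not exist in Abot's current API):
// crawler.Crawl(new Uri("http://example.com/"), localConfig);

// Current workaround: stash the per-call object in the dynamic CrawlBag
// immediately before each Crawl call, and overwrite it on the next call.
crawler.CrawlBag.LocalConfig = new BusinessConfig { /* per-call settings */ };
crawler.Crawl(new Uri("http://example.com/"));
```

The CrawlBag approach works because CrawlBag is a dynamic object, but nothing scopes the value to a single Crawl call, which is the messiness described above.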
Abot is designed to crawl once per instance, so I would recommend only calling Crawl once. I may add a check that only allows the Crawl() method to be called once. Given that design focus, the CrawlBag should suit your needs.
OK, I understand. It's definitely worth adding the check, and something to the docs.
My app is a data scraper/aggregator. It hits a single web server (per crawler instance) for fewer than 1000 pages once per day, and then hits the same server multiple times for tens of pages throughout the day. It would therefore be nice to have the crawler instance persist for the day and maintain the politeness state.
The Abot architecture takes care of the crawling (scheduling, threading, requesting, etc.) much better than my current app, so I am keen to integrate Abot. I will try to persist the politeness state between crawler instances in my own code via the CrawlBag.
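Given the one-Crawl-per-instance design, one way to persist state between runs is to keep it in the caller and re-attach it to each fresh crawler via the CrawlBag. This is a sketch only: PolitenessState and lastRequestTimes are hypothetical names, and how the politeness data is actually consumed would depend on the caller's own crawl-decision logic:

```csharp
// State that must survive between runs (e.g. last-request timestamps per host)
// lives outside the crawler and is re-attached before each run.
var lastRequestTimes = new Dictionary<string, DateTime>();

foreach (var uri in dailyTargets)
{
    var crawler = new PoliteWebCrawler();            // fresh instance per Crawl
    crawler.CrawlBag.PolitenessState = lastRequestTimes;
    crawler.Crawl(uri);
    // Update lastRequestTimes from this run before the next iteration.
}
```

Creating a new crawler per run keeps to the library's intended lifecycle while still letting politeness data carry across the day.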