Added optional source url restrictions to Rule #1
There is no easy way (that I can find) to restrict `Rule`s to specific response URLs [1]. For example, I may want to limit a `Rule` to only apply to a response from a `category.php` page [2]. It appears the current way to do this is to be overly specific with your link extractor's xpath so it only matches links on `category.php` pages. This has the downside of being fragile and unprovable.

This PR:

- Gives `Rule` the optional arguments `allow_sources` and `deny_sources` (mimicking the implementation of LinkExtractor's `allow` and `deny`), as sketched in the usage example below
- Adds `Rule.source_allowed(url)` to check if a url is allowed/denied
- `CrawlSpider._requests_to_follow` checks `Rule.source_allowed` for each rule

[1] Example of someone else trying to solve this issue: http://stackoverflow.com/questions/22653656/how-to-make-rules-of-crawlspider-context-sensitive
[2] In my use case, I need to scrape a mix of ecommerce product pages, category pages & "super-category" pages.
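
For illustration, here is a minimal sketch of how a spider might use the proposed arguments once this PR is applied. The spider name, domain, URL patterns, and callback are made up for this example, and the import paths assume a recent Scrapy layout:

```python
from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor


class ShopSpider(CrawlSpider):
    # Hypothetical spider; only the allow_sources argument below comes from this PR.
    name = "shop"
    start_urls = ["http://www.example.com/"]

    rules = (
        # Follow category links found on any page (no source restriction).
        Rule(LinkExtractor(allow=r"category\.php"), follow=True),
        # Extract product links only when the response being parsed came
        # from a category.php page, via the proposed allow_sources argument.
        Rule(
            LinkExtractor(allow=r"product\.php"),
            callback="parse_product",
            allow_sources=(r"category\.php",),
        ),
    )

    def parse_product(self, response):
        yield {"url": response.url}
```

With this in place, `_requests_to_follow` skips the second rule whenever `rule.source_allowed(response.url)` is false, so product links found on non-category pages are not extracted by that rule.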