
10: Added ability to parse a given robots.txt file into exclusion rules #11

Merged 1 commit into main on Feb 24, 2024

Conversation

philipnorton42
Contributor

#10
Changes:

  • Added the ability to supply a robots.txt file, which is then downloaded and parsed to extract additional exclusion rules for the sitemap checker.
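The PR description only summarises the feature; the project's own implementation is not shown here. As a rough illustration, the idea (download a site's robots.txt and turn its Disallow directives into exclusion rules) can be sketched in Python. The function names below are hypothetical and are not taken from the repository.

```python
from urllib.parse import urljoin
import urllib.request


def parse_robots_exclusions(robots_txt):
    """Extract Disallow paths from robots.txt text.

    The returned paths could be merged into a sitemap checker's
    existing list of exclusion rules.
    """
    exclusions = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow directive excludes nothing
                exclusions.append(path)
    return exclusions


def fetch_robots_exclusions(base_url):
    """Download a site's robots.txt and parse it into exclusion paths."""
    robots_url = urljoin(base_url, "/robots.txt")
    with urllib.request.urlopen(robots_url) as response:
        return parse_robots_exclusions(
            response.read().decode("utf-8", errors="replace")
        )
```

Note this sketch ignores User-agent sections and wildcard patterns, which a fuller robots.txt parser would need to handle.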

@philipnorton42 philipnorton42 merged commit 4df169c into main Feb 24, 2024
1 check passed
@philipnorton42 philipnorton42 deleted the 10_robots_parse branch February 24, 2024 22:02
1 participant