My Web Crawler "FourBot"

A concurrent and performant web crawler using goroutines and channels.

Equipped with:

  • Proper Network Error Handling
  • Robots.txt Compliance
  • Concurrency Control
  • Rate Limiting
  • Enhanced Logging
  • Graceful Shutdown
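
The concurrency control, rate limiting, and graceful shutdown listed above can be pictured with the minimal sketch below. It is an illustration of the general goroutine-and-channel patterns, not FourBot's actual source: the seed URLs, the 5-worker cap, and the 200ms request interval are placeholder assumptions.

    package main

    import (
    	"context"
    	"fmt"
    	"net/http"
    	"os/signal"
    	"sync"
    	"syscall"
    	"time"
    )

    func main() {
    	// Graceful shutdown: cancel the context on Ctrl-C (SIGINT) or SIGTERM.
    	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGINT, syscall.SIGTERM)
    	defer stop()

    	// Placeholder seed URLs; the real crawler reads comma-separated URLs from stdin.
    	urls := []string{"https://example.com", "https://example.org"}

    	const maxWorkers = 5                             // concurrency control: at most 5 in-flight requests
    	sem := make(chan struct{}, maxWorkers)           // counting semaphore built from a buffered channel
    	ticker := time.NewTicker(200 * time.Millisecond) // rate limiting: start at most one request every 200ms
    	defer ticker.Stop()

    	var wg sync.WaitGroup
    loop:
    	for _, u := range urls {
    		select {
    		case <-ctx.Done(): // stop scheduling new work once shutdown begins
    			break loop
    		case <-ticker.C: // wait for the next rate-limit slot
    		}

    		sem <- struct{}{} // acquire a worker slot
    		wg.Add(1)
    		go func(url string) {
    			defer wg.Done()
    			defer func() { <-sem }() // release the slot

    			req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
    			if err != nil {
    				fmt.Println("build request:", err)
    				return
    			}
    			resp, err := http.DefaultClient.Do(req) // network errors are handled here
    			if err != nil {
    				fmt.Println("fetch:", err)
    				return
    			}
    			defer resp.Body.Close()
    			fmt.Println(url, resp.Status)
    		}(u)
    	}
    	wg.Wait()
    }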

To run the web crawler, run the command below and enter comma-separated URLs when prompted.

go run cmd/goCrawl/main.go
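
For example, at the prompt you might enter two seed URLs separated by a comma (placeholder addresses, not defaults shipped with FourBot):

https://example.com,https://example.org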
