A Golang-based distributed web crawler management platform, supporting multiple languages including Python, NodeJS, Go, Java, and PHP, as well as various web crawler frameworks including Scrapy, Puppeteer, and Selenium. You can use docker-compose to start it up with a single command; you don't even have to configure the MongoDB database yourself. The frontend app interacts with the master node, which communicates with other components such as MongoDB, SeaweedFS, and the worker nodes. The master node and worker nodes communicate with each other via gRPC (an RPC framework). Tasks are scheduled by the task scheduler module in the master node and received by the task handler module in worker nodes, which execute them in task runners. Task runners are processes running spider or crawler programs, and they can also send data through gRPC (integrated in the SDK) to data sources such as MongoDB.
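The one-command startup described above can be sketched as a docker-compose file. This is a minimal illustration, not the project's official compose file: the image name, port numbers, and `CRAWLAB_*` environment variable names follow the pattern described above but should be checked against the release you deploy.

```yaml
# Hypothetical docker-compose.yml sketch for the architecture described above.
version: '3.3'
services:
  master:
    image: crawlabteam/crawlab          # assumed image name; verify per release
    environment:
      CRAWLAB_NODE_MASTER: "Y"          # run this container as the master node
      CRAWLAB_MONGO_HOST: "mongo"       # master node talks to MongoDB
    ports:
      - "8080:8080"                     # frontend and API services
    depends_on:
      - mongo
  worker:
    image: crawlabteam/crawlab
    environment:
      CRAWLAB_NODE_MASTER: "N"          # worker node, receives tasks from master
      CRAWLAB_GRPC_ADDRESS: "master:9666"  # assumed gRPC endpoint of the master
    depends_on:
      - master
  mongo:
    image: mongo:4.2                    # no manual MongoDB setup required
```

With a file like this in place, `docker-compose up -d` brings up the master node, a worker node, and MongoDB together; additional `worker` entries can be added to scale out task execution.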

Features

  • Task Scheduling
  • Worker Node Management and Communication
  • Spider Deployment
  • Frontend and API Services
  • Task Execution (you can regard the Master Node as a Worker Node)
  • Integration with Other Frameworks

Categories

Web Scrapers

License

BSD License


Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

Python, PHP, Java, Go

Related Categories

Python Web Scrapers, PHP Web Scrapers, Java Web Scrapers, Go Web Scrapers

Registered

2023-01-05