A Python project that sniffs internet traffic and stores it in a MongoDB database.
tmux attach
./make-run.sh &
./kill.sh &
tmux detach
- Use tmux to create a session
- Run the make-run.sh script to keep the Python script running
- Run the kill.sh script to kill the Python script every 30 minutes
- sniff.py: uses the scapy library to sniff packets and inserts the sniffed data into MongoDB.
- db_rolling.py: aggregates the data from the last two minutes.
- db_rolling2.py: deletes aggregated data older than one week.
- config.py: holds the MongoDB address. This file should not be pushed to GitHub; use config-example.py as a template. MONGO_DB_ADDRESS = '<MONGO_DB_ADDRESS>'.
- addDevices.py: reads the device MAC and name information from a file on the router and stores the device information in MongoDB.
* * * * * python3 /home/ubuntu/pypcap-monitor/db_rolling.py # every minute
*/5 * * * * python3 /home/ubuntu/pypcap-monitor/db_rolling2.py # every 5 minutes
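The two cron-driven maintenance scripts can be sketched together, assuming the hypothetical `traffic` database and `packets`/`rollups` collection names from above; the grouping key and field names are likewise illustrative assumptions, not the project's actual schema.

```python
from datetime import datetime, timedelta, timezone


def rolling_window(now=None, minutes=2):
    """Return (start, end) of the aggregation window ending at `now`."""
    end = now or datetime.now(timezone.utc)
    return end - timedelta(minutes=minutes), end


def purge_cutoff(now=None, days=7):
    """Aggregated documents older than this timestamp may be deleted."""
    return (now or datetime.now(timezone.utc)) - timedelta(days=days)


if __name__ == "__main__":
    from pymongo import MongoClient
    from config import MONGO_DB_ADDRESS

    db = MongoClient(MONGO_DB_ADDRESS)["traffic"]  # hypothetical db name
    start, end = rolling_window()

    # db_rolling.py (every minute): aggregate the last two minutes,
    # here grouped into per-source byte counts as an example
    pipeline = [
        {"$match": {"ts": {"$gte": start, "$lt": end}}},
        {"$group": {"_id": "$src", "bytes": {"$sum": "$length"}}},
    ]
    rows = list(db["packets"].aggregate(pipeline))
    db["rollups"].insert_one({"ts": end, "rows": rows})

    # db_rolling2.py (every 5 minutes): drop aggregates older than one week
    db["rollups"].delete_many({"ts": {"$lt": purge_cutoff()}})
```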
Ask Daniel which iface to listen on in the router.
from scapy.all import sniff  # http_header is the packet callback defined in sniff.py
# sniff all TCP and UDP packets on iface en0
sniff(iface='en0', prn=http_header, filter="tcp or udp")
# sniff TCP port 80 and 443 packets on iface en0
sniff(iface='en0', prn=http_header, filter="tcp port 80 or tcp port 443")
# sniff TCP port 80 and 443 packets on iface eth1 (store=0: don't keep packets in memory)
sniff(iface='eth1', prn=http_header, filter="tcp port 80 or tcp port 443", store=0)
# sniff all TCP and UDP packets on iface eth1
sniff(iface='eth1', prn=http_header, filter="tcp or udp")