Ping to websites periodically with cronjob
In this series:
- Introduction to Cron Jobs
- How to setup Cron Jobs with a Conda Environment
- Ping to websites periodically with cronjob
Welcome to the final part of our series on Automating Periodic Website Pinging. In this post, we'll write a Python script to ping websites and set up a cron job to run this script periodically in our Conda environment.
Prerequisites
Before we begin, ensure you have:
- Basic command line familiarity
- Access to a Unix or Unix-like operating system (such as Linux or macOS)
- Miniconda or Anaconda installed
- Completed the setup from the previous parts of this series
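If you want to confirm that the env_ping_website environment created in the earlier parts is still available, you can list your Conda environments before continuing:
conda env list | grep env_ping_website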
Installing Required Python Packages
First, let's install the necessary Python packages. Activate your Conda environment and install the following packages:
conda activate env_ping_website
pip install requests openpyxl
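As an optional sanity check, you can confirm that both packages import correctly inside the activated environment:
python -c "import requests, openpyxl; print(requests.__version__, openpyxl.__version__)"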
Python Script for Pinging Websites
Now, let's create the Python script that pings the websites and records their status. Name it ping_websites.py and save it in our ping_website directory:
import requests
from openpyxl import Workbook, load_workbook
from datetime import datetime
import os

# Websites to monitor and the Excel file that stores the results
websites = ["https://karpathy.ai/", "https://github.com/", "https://www.google.com/"]
excel_file = "website_status.xlsx"

def ping_website(url):
    """Return True if the website responds with HTTP 200 within 5 seconds."""
    try:
        response = requests.get(url, timeout=5)
        return response.status_code == 200
    except requests.RequestException:
        return False

def write_to_excel(file_path, timestamp, status_data):
    """Append one row of results, creating the file with a header row if needed."""
    if os.path.exists(file_path):
        workbook = load_workbook(file_path)
        sheet = workbook.active
    else:
        workbook = Workbook()
        sheet = workbook.active
        sheet.append(["Timestamp"] + websites)
    row_data = [timestamp] + status_data
    sheet.append(row_data)
    workbook.save(file_path)

def main():
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    status_data = ["Up" if ping_website(website) else "Down" for website in websites]
    write_to_excel(excel_file, timestamp, status_data)

if __name__ == "__main__":
    main()
This script does the following:
- Pings a list of websites
- Records their status (Up or Down) along with a timestamp
- Writes the results to an Excel file
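Before wiring the script up to cron, it's worth running it once by hand to confirm it creates the Excel file (assuming the directory lives in your home folder as in the earlier parts):
cd ~/ping_website
conda activate env_ping_website
python ping_websites.py
ls website_status.xlsx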
Updating the Bash Script
Let's update our bash script to run the Python script. Modify ping_website.sh (created in part 2 of this series) as follows:
#!/bin/bash
# Make the conda command available in cron's non-interactive shell
source ~/miniconda3/etc/profile.d/conda.sh
conda activate env_ping_website
# Print the interpreter and working directory (useful when debugging via the cron log)
which python
cd /home/username/ping_website
echo $PWD
python ping_websites.py
Make sure to replace "username" with your actual username.
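Since cron will call this script directly by its path, make sure it is executable, and give it one manual run to check that everything works outside of cron:
chmod +x /home/username/ping_website/ping_website.sh
/home/username/ping_website/ping_website.sh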
Setting Up the Cron Job
Now, let's set up the cron job to run our script every 5 minutes. Open the crontab editor:
crontab -e
Add the following line:
*/5 * * * * /home/username/ping_website/ping_website.sh >> /home/username/ping_website/ping_website.log 2>&1
Again, replace "username" with your actual username.
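The five fields before the command are minute, hour, day of month, month, and day of week, so */5 in the minute field means "every 5 minutes". You can confirm the job was registered with:
crontab -l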
Checking the Output
After the cron job has run for a while, we should see an Excel file named website_status.xlsx in our ping_website directory. Its contents will look something like this:
| Timestamp | https://karpathy.ai/ | https://github.com/ | https://www.google.com/ |
|---|---|---|---|
| 2024-08-07 19:31:28 | Up | Up | Up |
| 2024-08-07 19:36:28 | Up | Up | Up |
| ... | ... | ... | ... |
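If you'd rather check the results from the terminal than open the spreadsheet, a minimal sketch like the following (using the same openpyxl package we installed earlier) prints the recorded rows:
from openpyxl import load_workbook

workbook = load_workbook("website_status.xlsx")
sheet = workbook.active
for row in sheet.iter_rows(values_only=True):
    # Each row is a tuple: (timestamp, status for each website)
    print(row)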
Conclusion
In this series, we've accomplished the following:
- Set up a Conda environment for our website pinging project
- Created a bash script to activate our Conda environment
- Written a Python script to ping websites and record their status
- Set up a cron job to run our script periodically
This setup allows us to automatically monitor the status of multiple websites and keep a record of their uptime.