Monitor Your Websites with Notifications

If you’ve ever been surprised to find that one of your websites is down, you’ve no doubt tried to figure out a good way to monitor its health. There are many services on the internet that will check in on your sites periodically and let you know when there’s a problem, but often these aren’t real-time enough. After a recent DDoS attack on one of the sites I run, I put together a simple Python script that checks in on my sites and lets me know if any of them returns anything besides “HTTP 200” as the response code.
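The script below was written against Python 2’s urllib2. If you’re on Python 3, the same status check could be sketched with urllib.request — a minimal sketch, with check_site being my name for the helper, not something from the original script:

```python
import urllib.request
import urllib.error

def check_site(url, timeout=30):
    """Return None if the URL loads successfully, or a short
    description of the problem otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return None
    except urllib.error.HTTPError as err:
        # The server answered, but with something besides a success code.
        return 'HTTP %s' % err.code
    except urllib.error.URLError as err:
        # DNS failure, connection refused, timeout, and so on.
        return str(err.reason)
```

A non-None return value is what you’d hand off to whatever notification mechanism you prefer.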


I’ve built in support for Growl notifications and iOS push notifications via Boxcar. You could also modify the script to send emails, etc.


If you’re running Growl on your Mac (maybe Windows as well, I haven’t checked), you can receive Growl notifications by setting the USE_GROWL variable to True in the script. You can optionally then use Prowl or Howl to pass the same notices on to your iOS device, following either app’s provided instructions.


Unlike Prowl and Howl, Boxcar doesn’t require Growl (meaning it could work on a Linux box) and doesn’t cost anything. To use Boxcar, you need to become a Boxcar provider. Once you’ve registered your provider, fill in the script’s BOXCAR_API_KEY and BOXCAR_API_SECRET variables with the ones Boxcar gives you. Finally, set BOXCAR_EMAIL to the email address you registered with Boxcar and USE_BOXCAR to True.
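One detail worth noting from the script: Boxcar doesn’t identify the subscriber by the raw email address but by its MD5 hex digest. That step in isolation (the function name here is mine, not Boxcar’s):

```python
import hashlib

def boxcar_email_hash(email):
    # Boxcar matches notifications to subscribers via the MD5 hex
    # digest of the registered email address, not the address itself.
    return hashlib.md5(email.encode('utf-8')).hexdigest()

print(boxcar_email_hash('user@example.com'))
```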


#!/usr/bin/env python
import sys, os, urllib, urllib2, hashlib, datetime

# Sites to check, as bare hostnames (no "http://").
sites = [
    'example.com',
]

# Notification settings.
USE_GROWL = False
USE_BOXCAR = False
BOXCAR_API_KEY = ''
BOXCAR_API_SECRET = ''
BOXCAR_EMAIL = ''

now = datetime.datetime.now()
errors = []

def send_notice(site, code):
    errors.append(site)
    if USE_BOXCAR:
        api_url = 'http://boxcar.io/devices/providers/%s/notifications' % BOXCAR_API_KEY
        data = {
            'email': hashlib.md5(BOXCAR_EMAIL).hexdigest(),
            'notification[from_screen_name]': site,
            'notification[message]': code,
            'notification[from_remote_service_id]': u'%s-%s' % (now.strftime('%c'), site)
        }
        urllib2.urlopen(api_url, urllib.urlencode(data))
    if USE_GROWL:
        os.system('growlnotify -n "Website Monitor" -p 2 "%s" -m "%s" -s' % (site, code))

for site in sites:
    try:
        urllib2.urlopen('http://%s' % site, None, 30)
    except urllib2.HTTPError, code:
        send_notice(site, code)
    except urllib2.URLError, error:
        error = str(error)
        if error.find('Errno 8') > -1:
            send_notice(site, u'Does not appear to exist.')
        else:
            send_notice(site, u'%s' % error)

if len(errors):
    print u'%s encountered errors.' % ', '.join(errors)
else:
    print u'All sites checked out.'

Save the contents of the script to a file called “check_sites” somewhere in your path and make the file executable (chmod +x check_sites). Edit the sites variable, creating an entry for each site you want to check. You can test the script by running check_sites on the command line. Finally, set up a cron job to run the script periodically. For example, 0-59/5 * * * * /path/to/check_sites > /dev/null 2>&1 will run the script every five minutes. (You’ll need to edit “/path/to/check_sites” to be the actual path of the script.)
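For reference, the crontab entry might look like this (run crontab -e to edit yours). Cron runs jobs under sh, where the portable way to silence all output is > /dev/null 2>&1:

```
# min          hour day month weekday  command
0-59/5         *    *   *     *        /path/to/check_sites > /dev/null 2>&1
```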

That’s it. Now you’ll be alerted whenever the script finds an error with one of the sites in the list.

Cross-posted at



  1. Maybe my opinion is wrong, but I think an online service is much better (multiple servers in different parts of the world, detailed reports), especially if you have several sites to monitor.

    P.S. The link to the Wikipedia article (Path_variable) is broken.

  2. Thanks for your thoughts, Victor. I’ve fixed the link.

    • Aaron Gustafson