If you’ve ever been surprised to find out that one of your websites is down, you’ve no doubt tried to figure out a good way to monitor its health. There are many services on the internet that will check on your sites periodically and let you know when there’s been a problem, but often these aren’t real-time enough. After a recent DDoS attack on one of the sites I run, I put together a simple Python script that checks in on my sites and lets me know if any of them returns anything besides “HTTP 200” as the response code.
Notifications
I’ve built in support for Growl notifications and iOS push notifications via Boxcar. You could also modify the script to send emails, etc.
Growl
If you’re running Growl on your Mac (possibly Windows as well; I haven’t checked), you can receive Growl notifications by setting the USE_GROWL variable to True in the script. You can optionally then use Prowl or Howl to pass the same notices on to your iOS device, following either app’s provided instructions.
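The script shells out to growlnotify with os.system, interpolating the site name and error message straight into the command string. If you adapt it, building an argument list (for subprocess.call, say) sidesteps shell-quoting problems when a message contains quotes. A minimal sketch using the same flags the script passes (growlnotify itself must be installed for the commented-out call to work):

```python
import subprocess

def growl_args(site, message):
    # Same flags the script uses: app name, priority 2, title (the site),
    # message body, and -s to make the notification sticky.
    return ['growlnotify', '-n', 'Website Monitor', '-p', '2',
            site, '-m', str(message), '-s']

# subprocess.call(growl_args('example.com', 'HTTP Error 500'))  # needs growlnotify
```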
Boxcar
Unlike Prowl and Howl, Boxcar doesn’t require Growl (meaning it can work on a Linux box) and doesn’t cost anything. To use Boxcar, you need to become a Boxcar provider. Once you’ve registered your provider, fill in the script’s BOXCAR_API_KEY and BOXCAR_API_SECRET variables with the values Boxcar gives you. Finally, set BOXCAR_EMAIL to the email address you registered with Boxcar and USE_BOXCAR to True.
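For reference, here is the shape of the request the script sends to Boxcar’s provider endpoint: the subscriber is identified by the MD5 hash of their registered email address, and the notification fields are form-encoded. A sketch of the payload construction (field names taken from the script; the email, secret, and message values are placeholders):

```python
import hashlib
from urllib.parse import urlencode  # urllib.urlencode on Python 2

def boxcar_payload(email, secret, site, message):
    # Boxcar matches subscribers by the MD5 hex digest of their email address
    return urlencode({
        'email': hashlib.md5(email.encode('utf-8')).hexdigest(),
        'secret': secret,
        'notification[from_screen_name]': site,
        'notification[message]': message,
    })
```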
Installation
#!/usr/bin/env python

# Sites to check; each entry is a bare hostname (the script prepends http://).
sites = [
    'google.com',
]

USE_BOXCAR = True
USE_GROWL = True
BOXCAR_EMAIL = ''
BOXCAR_API_KEY = ''
BOXCAR_API_SECRET = ''

import sys, os, urllib, urllib2, hashlib, datetime

now = datetime.datetime.now()
errors = []

def send_notice(site, code):
    if USE_BOXCAR:
        api_url = 'http://boxcar.io/devices/providers/%s/notifications' % BOXCAR_API_KEY
        data = {
            # Boxcar identifies the subscriber by the MD5 hash of their email
            'email': hashlib.md5(BOXCAR_EMAIL).hexdigest(),
            'secret': BOXCAR_API_SECRET,
            'notification[from_screen_name]': site,
            'notification[message]': code,
            'notification[from_remote_service_id]': u'%s-%s' % (now.strftime('%c'), site)
        }
        urllib2.urlopen(api_url, urllib.urlencode(data))
    if USE_GROWL:
        os.system('growlnotify -n "Website Monitor" -p 2 "%s" -m "%s" -s' % (site, code))

for site in sites:
    try:
        # 30-second timeout; any non-success response raises below
        urllib2.urlopen('http://%s' % site, None, 30)
    except urllib2.HTTPError, code:
        send_notice(site, code)
        errors.append(site)
    except urllib2.URLError, error:
        error = str(error)
        if error.find('Errno 8') > -1:  # Errno 8: hostname could not be resolved
            send_notice(site, u'Does not appear to exist.')
        else:
            send_notice(site, u'%s' % error)
        errors.append(site)
    except:
        errors.append(site)

if len(errors):
    print u'%s encountered errors.' % ','.join(errors)
else:
    print u'All sites checked out.'

sys.exit()
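The script above targets Python 2 (urllib2 and print statements). If you’re on Python 3, urllib2 was split into urllib.request and urllib.error; a rough sketch of the equivalent check, returning a description rather than sending a notification:

```python
import urllib.request
import urllib.error

def check(url, timeout=30):
    """Return None if the URL opens cleanly, otherwise a short error string."""
    try:
        urllib.request.urlopen(url, None, timeout)
        return None
    except urllib.error.HTTPError as e:   # server answered with an error code
        return 'HTTP %s' % e.code
    except urllib.error.URLError as e:    # DNS failure, refused connection, etc.
        return str(e.reason)
```

Note that the HTTPError clause must come first, since HTTPError is a subclass of URLError.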
Save the contents of the script to a file called “check_sites” somewhere in your path and make the file executable (chmod +x check_sites). Edit the sites variable, creating an entry for each site you want to check. You can test the script by running check_sites on the command line. Finally, you need to set up a cron job to periodically run the script. For example, 0-59/5 * * * * /path/to/check_sites > /dev/null 2>&1 will run the script every five minutes. (You need to edit “/path/to/check_sites” to be the actual path of the script; the redirection is written for cron’s default /bin/sh, which doesn’t understand the csh-style “>&”.)
That’s it. Now you’ll be alerted whenever the script finds an error with one of the sites in the list.
Cross-posted at http://dryan.com/articles/monitor-your-websites/