I wanted to be able to check my DIY security camera from my phone and archive the captures remotely. I had a spare webcam and a set of speakers, so I cobbled together a second system for the interior.
The concept for this camera (and retroactively the exterior camera) is to take a photo every five minutes, upload it to Twitter and Dropbox, and clear the file off the SD card. This requires no special hardware and only a little bit of scripting.
My exterior camera uses the Raspberry Pi Camera module, which has some non-traditional software interfaces. As that website outlines, you enable the camera via raspi-config. From there, you can use the raspistill command to take JPEGs.
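For reference, the exterior capture step can be sketched like this. This is my own minimal version, not the exact exterior script; the 1024x768 resolution and the "outside-" filename prefix are illustrative assumptions, and the timestamp scheme mirrors the tweetpic.py script below.

```python
from datetime import datetime

def raspistill_cmd(when, dest='/home/pi/snapshots'):
    """Build a raspistill command for a timestamped exterior capture.
    (Sketch only; resolution and prefix are assumptions.)"""
    name = when.strftime('%Y%m%d-%H%M%S') + '.jpg'   # same scheme as tweetpic.py
    return ['raspistill', '-w', '1024', '-h', '768',
            '-o', dest + '/outside-' + name]

# On the Pi itself you would actually shoot the photo with:
#   from subprocess import call
#   call(raspistill_cmd(datetime.now()))
print(raspistill_cmd(datetime(2020, 1, 2, 3, 4, 5)))
```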
The interior camera uses an older Microsoft LifeCam, which is only somewhat supported on Linux. The application fswebcam can pull 640x480 JPEGs, however, so that is what the interior camera uses. In the future I may transition it to the Pi camera, which has much better quality and support.
After successfully installing NOOBS, I had to install Tweepy, the Python library used to interface with Twitter. The instructions for installing it and driving it from Python are better written up on Raspi.TV. Below is my version of the tweetpic.py script.
#!/usr/bin/env python2.7
# tweetpic.py - take a photo with the Pi camera and tweet it
# by Alex Eames http://raspi.tv/?p=5918
import tweepy
from subprocess import call
from datetime import datetime

i = datetime.now()  # take time and date for filename
now = i.strftime('%Y%m%d-%H%M%S')
photo_name = now + '.jpg'
cmd = 'fswebcam --resolution 2048x1024 --title "Living Room" --save /home/pi/snapshots/inside-' + photo_name
call([cmd], shell=True)  # shoot the photo

# Consumer keys and access tokens, used for OAuth
consumer_key = 'REDACTED'
consumer_secret = 'REDACTED'
access_token = 'REDACTED'
access_token_secret = 'REDACTED'

# OAuth process, using the keys and tokens
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)

# Creation of the actual interface, using authentication
api = tweepy.API(auth)

# Send the tweet with photo
photo_path = '/home/pi/snapshots/inside-' + photo_name
status = 'Photo auto-tweet from Interior: ' + i.strftime('%Y/%m/%d %H:%M:%S')
api.update_with_media(photo_path, status=status)
The only other trick is that my exterior camera runs a UV4L streaming server, which prevents raspistill from accessing the camera. It's easy to solve: simply replace the single call to cmd with the following three lines.
call('sudo service uv4l_raspicam stop', shell=True)   # stop live streaming service
call([cmd], shell=True)                               # shoot the photo
call('sudo service uv4l_raspicam start', shell=True)  # start live streaming service
Once the script was in place, I had to set my Pi to tweet the photo automatically. In Linux, this is achieved by adding a command and a schedule to the cron file; the command crontab -e lets you modify it. Below is the entry I use to tweet every five minutes. This website lets you generate a different schedule.
*/5 * * * * /home/pi/snapshots/tweetpic.py
I configured the Twitter account to be private so that only my own normal account could see it. The last step was to set up TweetDelete to auto-purge old photos, which lets me occasionally blow away everything without having to recreate the account.
I use Dropbox Uploader, written by Andrea Fabrizi, to handle uploading the photos to a private Dropbox account. Most of the work is installing Dropbox Uploader and getting your API keys set up. This is all documented on the project's GitHub page.
The work of taking the photo is already done by the Twitter script, so there should be a convenient JPEG waiting for you. I put the upload logic in a file I called sync.py; the code is below.
#!/usr/bin/python
# -*- coding: utf-8 -*-
import os

path = '/home/pi/snapshots/'
dest = '/snapshots'

def upload_files():
    if not os.path.exists(path):
        return
    dir_list = os.listdir(path)
    for file_name in dir_list:
        if 'jpg' in file_name:
            print 'Upload Pending...'
            cmd = '/home/pi/Dropbox-Uploader/dropbox_uploader.sh upload ' + path + file_name + ' ' + dest + '/' + file_name
            os.system(cmd)

if __name__ == '__main__':
    upload_files()
    os.system('rm /home/pi/snapshots/*.jpg')
You'll notice the last line deletes the JPEG files. This is important because the script (lazily) syncs every JPEG in the directory; were they not purged, the time to sync would climb linearly. A more sophisticated coder would run the upload only on new files, but as I said, this was cobbled together. With this in place, I modified the cron file to execute sync.py after tweetpic.py.
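For what it's worth, here is a sketch of that more careful approach: delete each file only after its upload reports success, so a failed upload gets retried on the next run. The sync_and_purge name and the stand-in uploader are hypothetical, not part of my actual scripts.

```python
import os
import tempfile

def sync_and_purge(path, upload):
    """Upload each JPEG via the supplied callable; delete a file only
    after its upload reports success (0), so failures are retried."""
    sent = []
    for file_name in sorted(os.listdir(path)):
        if file_name.lower().endswith('.jpg'):
            full = os.path.join(path, file_name)
            if upload(full) == 0:          # shell convention: 0 = success
                os.remove(full)
                sent.append(file_name)
    return sent

# Quick demo with a stand-in uploader instead of dropbox_uploader.sh.
tmp = tempfile.mkdtemp()
for name in ('a.jpg', 'b.jpg', 'notes.txt'):
    open(os.path.join(tmp, name), 'w').close()
print(sync_and_purge(tmp, lambda f: 0))   # ['a.jpg', 'b.jpg']
print(sorted(os.listdir(tmp)))            # ['notes.txt']
```

In the real script, upload would be something like lambda f: os.system(uploader_cmd + ' ' + f), so the exit code of dropbox_uploader.sh decides whether the file is purged.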
Since I was going on vacation, I wanted the home to seem occupied. I retrieved several episodes of old-time radio from Archive.org and stored them in ~/audio/. Then I installed VLC (type "sudo apt-get install vlc") and modified my crontab to play them on a schedule. Below is an example of one entry that plays at 9 PM.
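One small refinement I considered (a sketch only; pick_episode is my own hypothetical helper, not part of the setup above) is choosing a random episode each night, so the "occupied" soundtrack doesn't repeat on an obvious schedule.

```python
import os
import random

def pick_episode(files):
    """Choose a random MP3 from a directory listing, or None if there are none."""
    episodes = [f for f in files if f.endswith('.mp3')]
    return random.choice(episodes) if episodes else None

# On the Pi, a cron-launched wrapper could do something like:
#   choice = pick_episode(os.listdir('/home/pi/audio'))
#   if choice:
#       from subprocess import call
#       call(['cvlc', '--volume', '350', '--play-and-exit',
#             '/home/pi/audio/' + choice])
print(pick_episode(['1.mp3', '2.mp3', 'readme.txt']))
```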
0 21 * * * cvlc --volume 350 /home/pi/audio/1.mp3 >/dev/null 2>&1
I tested this setup over a two-week vacation. From my phone I could see the interior and exterior of my house from my regular Twitter account. Dropbox continued to archive photos as planned, and I periodically cleared it to make room. So far Twitter has not complained about rate or cumulative size, but the service does limit your ability to access older tweets. I rely primarily on Dropbox to preserve evidence and on Twitter for checking in; having both gives some good redundancy.
Overall I'm happy with how it worked out. Perhaps it's excessive, but it's also a fun project that requires minimal hardware.