One of my first Python projects was automating a daily desktop background. I think it's a good project for beginners since it's fun (new desktop delight!), it's a good exercise in breaking down a vague idea (desktop delight?) into manageable chunks (one of the biggest wisdoms in programming!), and you can learn about a bunch of different useful things: interacting with the web and APIs, interacting with files, and scheduling jobs on your computer.

Tutorial assumptions

  • You have some version of Python installed on your computer.
  • You know how to run a Python script from the command line.
  • You have a text editor (e.g. Sublime) that you like.

My setup: I'm using a pyenv virtual environment on OSX with Python 3.6.0 installed. I'd also recommend installing ipython, since it provides such a nice shell.

pip install ipython

Break it down

One of the biggest wisdoms of programming is learning to break large tasks down into manageable parts. As I've gotten more experienced, I've noticed that this process happens faster: I can break things down more quickly, and I can hold more complex pieces in my head. In the beginning, though, it was more of a challenge - yet always very useful - to break a project down into a bunch of sub-tasks, and tackle each sub-task bird by bird. Don't be afraid to make those sub-tasks very, very sub (e.g. "Figure out how to turn on Python").

Here, our main objective is to write a Python script that automatically changes our desktop background to something cool every morning. We can break this down into the following steps:

  1. Find a place that has lots of cool pictures.
  2. Grab pics.
  3. Set pic as desktop background.
  4. Have this happen on a regular schedule.

Here's what you learn at each step:

  1. Well, not much, but browsing pics is fun.
  2. How to interact with APIs.
  3. How to interact with your OS.
  4. How to schedule stuff on your computer.
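Before diving in, the plan above can be sketched as a skeleton script - each step becomes a stub function we'll fill in as we go (the function names here are just my own placeholders):

```python
def fetch_image_url():
    """Step 2: ask an API where today's picture lives."""
    raise NotImplementedError

def download_image(image_url):
    """Step 2: save the picture to disk; return its local path."""
    raise NotImplementedError

def set_desktop_background(path):
    """Step 3: tell the OS to use the downloaded file."""
    raise NotImplementedError

# Step 4 (scheduling) happens outside the script, via cron.
# Eventually: set_desktop_background(download_image(fetch_image_url()))
```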

Step 0: Setup

For a small script like this, I would probably have my Favorite Text Editor open, in addition to a shell window with ipython running. In practice, this means I'll use the ipython shell to test stuff out, and build the script up in Sublime.

me@machine $ ipython
Python 3.6.1 (default, Jul 10 2017, 14:13:01)
Type 'copyright', 'credits' or 'license' for more information
IPython 6.1.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]:

Step 1: Find a place that has cool pics

NASA's Astronomy Picture of the Day (APOD), I choose you!

Other good options:

  • Reddit's image-heavy subreddits
  • The Atlantic's photography section

One thing that might inform where you grab your daily picture from: does this place have an API, or will you be scraping?

When grabbing stuff from the web, you usually have two choices: an API, or scraping. An API - or application programming interface - is generally preferable, in that the web developers have already prepared an endpoint for programmatic interactions: e.g. for Python scripts pinging it every day for a picture! Web scraping, where you ping the front end of the website, is not ideal - since you're trawling through HTML and hoping that the HTML tags don't change.

Of the above options, Reddit and NASA's APOD have APIs, but I can't seem to find one for The Atlantic. This might tip the balance in favor of Reddit or NASA - interacting with APIs is usually way easier.

Step 2: Grab pics

EASIER SAID THAN DONE, you may say. Never fear. Let's break this down even further:

  • 2.1. Find the API endpoint for grabbing pictures.
  • 2.2. Use Python to ping said endpoint.
  • 2.3. Save the returned image file somewhere.

NASA's API documentation lives at The endpoint we'll be hitting is, and you'll need to register with NASA and have them give you an API key. (This lets API developers track the number of pings you hit their API with - they want to ensure people don't completely kill their servers by pinging a lot. Usually these "rate limits" are pretty high - thousands of requests per day or so. Not something we have to worry about when writing a simple personal script, but definitely something that comes up in industrial-strength API work.)

Anyway, here's what gets returned by the NASA APOD API when you hit their endpoint with the example key:

  "date": "2017-12-12",
  "explanation": "What's up in the sky this winter?  The featured graphic gives a few highlights for Earth's northern hemisphere.  Viewed as a clock face centered at the bottom, early winter sky events fan out toward the left, while late winter events are projected toward the right.  Objects relatively close to Earth are illustrated, in general, as nearer to the cartoon figure with the telescope at the bottom center -- although almost everything pictured can be seen without a telescope.  Highlights of this winter's sky include the Geminids meteor shower peaking this week, the constellation of Orion becoming notable in the evening sky, and many planets being visible before sunrise in February.  As true in every season, the International Space Station (ISS) can be sometimes be found drifting across your sky if you know just when and where to look.",
  "hdurl": "",
  "media_type": "image",
  "service_version": "v1",
  "title": "Highlights of the Winter Sky",
  "url": ""

Ah, beloved JSON, lingua franca of the programmatic internet! Hello!

What's nice about the JSON data format is that, since it's structured in key: value pairs, you can access the value you want if you know the key. Here, we want to download a cool space picture - looks like we can find one at the url key.

To the Python! I like the requests library for its wonderful, readable documentation, and its wonderful, readable levels of abstraction. Interacting with APIs is molto painless with requests.

import requests

url = ''
r = requests.get(url).json()
print(r['url'])

This will print out the URL of our desired .jpg. To actually download that .jpg, we can use the Python standard library's urllib.request module.

import urllib.request
from datetime import date

# Create a string variable that prettifies today's date
today ="%Y%B%d")
todays_file = "{}_nasa_apod.jpg".format(today)

# Grab the image URL from the earlier API request
image_url = r['url']

# Download said file into current working directory
urllib.request.urlretrieve(image_url, todays_file)

I'm doing a few things above. Beyond just downloading the file (urllib.request.urlretrieve), I'm also formatting the filename so that it includes today's date (for a handy reference on string formatting, I recommend

Step 3: Set pic as desktop background

This is the bit that depends on your OS - and, honestly, the bit I still don't fully understand. I just found this code somewhere on the intertoobs and plugged it in:

from AppKit import NSWorkspace, NSScreen
from Foundation import NSURL

ws = NSWorkspace.sharedWorkspace()
file_url = NSURL.fileURLWithPath_(todays_file)
for screen in NSScreen.screens():
    (result, error) = ws.setDesktopImageURL_forScreen_options_error_(file_url, screen, {}, None)

This uses Apple's AppKit framework (via the pyobjc Python bindings) to set the image on the currently active desktop. I've tried fiddling with the arguments in setDesktopImageURL_forScreen_options_error_ to have it, for example, scale and resize images appropriately, and apply the image to all desktops, not just the active one. But no luck - and, okay, not much willpower either. It works 90% of the time, and I'm happy with that!

Step 4: Have this happen on a regular schedule

You can schedule things to run on your machine using cron, a job scheduler for Unix. One key thing about cron is that scheduled jobs won't run if your computer is asleep - so schedule it to happen when your computer is always likely to be awake (e.g. during work hours). Otherwise, on a Mac, you can use launchd to have a background daemon run your script - even when your computer is sleeping. I haven't bothered with this.

To have cron run your script, I found this tutorial super helpful. You basically need to edit your crontab by adding a line like the following:

30  7   *   *   *   /path/to/cool/script/

The above says: run every morning at 7:30AM (your computer's time zone). You can check to see if any cron jobs are currently scheduled on your machine by running crontab -l in Terminal.


I don't actually use NASA's APOD for my own script. Instead, I have a list of image-heavy subreddits that I randomly select from. I've had this script running for ~3 years (!) now, and it has given me about 70% delight, 25% meh, and 5% embarrassment (be careful what you automatically download!). Here are some recent desktops that were really top-notch:

Kasai, by Joseph Biwald, found via r/ImaginaryLandscapes

Aurora Borealis over Norway, by Max Rive, from NASA's APOD, found via r/spaceporn

Emerald Lake, Yoho National Park, British Columbia, found via r/waterporn

Star Trek: USS Enterprise NCC-1701-D cutaway, by Rusted Gear Art, found via r/StarShipPorn