[Update 2019-11-08: sources moved to GitHub.]
[Update 2019-03-27: The Bing People (the Crosbys?) changed the format of their JSON. That's their perfect right, but it required some slight changes to the pic-getting script.]
For a few years now, I've made the Important Life Choice about my computer's desktop backgrounds (aka "wallpaper"): downloaded photos of spectacular vistas, amazing animals, breathtaking architecture, … I'm not particular. Rotate them every so often to avoid boredom. This is often called a "slideshow".
This, even though my open windows usually obscure the background. I know it's there though, and it makes me happy. (And the Start-D key combo works to minimize all windows if I really want to peruse it.)
The OS environments I use (Windows 10, Fedora Linux/Cinnamon) make it easy to configure a slideshow: just find the configuration page, point it to a directory containing the pictures you like, choose a switching interval, and that's it. (If your environment doesn't let you do this easily, maybe you should find a better environment.)
That leaves only one issue: setting up the picture directory. My personal choice is to have my Windows "Pictures" directory shared via VirtualBox's shared-folders feature to the Linux guest. (Detail: to allow me to write to this directory from Linux, my account must be added to the vboxsf group. It's on my "things to do" list when creating a new Linux guest.) I keep 400 pictures in this directory; when new pictures are added, the same number of the oldest ones are removed.
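That group tweak is a one-liner worth writing down. Assuming a VirtualBox guest (which the vboxsf group implies), the setup looks like this:

```
# Run once on a new Linux guest (needs root);
# takes effect at the next login.
sudo usermod -aG vboxsf "$USER"
```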
I used to download daily pictures from the National Geographic site, but they made that difficult a while back; I don't remember the details, and I haven't checked recently to see if they relented. Instead I grab Bing's home page picture; there's a new one every day, and downloading, while not exactly a breeze, is not too difficult.
The Perl script I use to download is on GitHub. There's a page at Bing that can be queried (with proper parameters) to divulge the recent Bing pictures and their names. Specifically, the page will contain (at most) the eight most recent. The query I use asks for all eight.
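As a sketch of that query (not the script's exact code): the HPImageArchive endpoint and its parameters here, format=js for JSON, idx=0 for "start with today", n=8 for the last eight pictures, and mkt for the market, are my assumptions about how the feed is usually queried.

```perl
use strict;
use warnings;

# Build the query URL for Bing's picture archive feed.
my $base  = 'https://www.bing.com/HPImageArchive.aspx';
my %param = (format => 'js', idx => 0, n => 8, mkt => 'en-US');
my $query = join '&', map { "$_=$param{$_}" } sort keys %param;
my $url   = "$base?$query";
print "$url\n";
```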
For some reason, I request the JSON version of the picture data. This is decoded (naturally enough) into a Perl data structure with the decode_json function from the JSON module.
For the available images, the script checks each to see if it has already been downloaded. For each image not previously downloaded, it uses the getstore function from LWP::Simple to download it to the shared directory.
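A minimal sketch of that decode-then-download step, not the author's actual script: the hard-coded JSON imitates the shape of Bing's feed (an "images" array whose "url" values are site-relative), so treat the field names as assumptions. The real getstore call is left commented out so the example runs offline.

```perl
use strict;
use warnings;
use JSON::PP qw(decode_json);   # core module since Perl 5.14
# use LWP::Simple qw(getstore); # uncomment to actually download
use File::Spec;

my $dir  = '.';   # stand-in for the shared Pictures directory
my $json = '{"images":[{"url":"/th?id=OHR.SampleVista_1920x1080.jpg"}]}';
my $data = decode_json($json);

my @to_fetch;
for my $img (@{ $data->{images} }) {
    # Derive a local file name from the URL's "id" parameter.
    my ($name) = $img->{url} =~ /id=([^&]+)/;
    next unless defined $name;
    my $path = File::Spec->catfile($dir, $name);
    next if -e $path;   # skip pictures we already have
    # getstore returns the HTTP status (200 on success):
    # getstore("https://www.bing.com$img->{url}", $path);
    push @to_fetch, $name;
}
print "would fetch $_\n" for @to_fetch;
```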
Although I typically run this script daily, this design allows me to skip up to eight days without missing any pictures. (For example, if I'm on vacation.)
I run this script daily out of anacron; details are left as an exercise for the reader.
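For reference, a daily anacron job is a single line in /etc/anacrontab (the job identifier and script path below are made up for illustration):

```
# period(days)  delay(min)  job-id     command
1               10          bing-pics  /home/me/bin/get-bing-pics
```

anacron's appeal over cron here is that a missed day is made up shortly after the machine next boots, which suits a computer that isn't always on.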
The other part of this equation is getting rid of older pictures. That's accomplished by a second Perl script, also on GitHub. It's pretty simple.
Its claim to geekery is using the Schwartzian Transform to obtain a list of JPEG files in the picture directory in order by modification time. Sweet!
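That sort-by-mtime trick is the classic Schwartzian Transform: decorate each name with its mtime, sort on the decoration, then strip it off, costing one stat() per file instead of one per comparison. A self-contained sketch (the temp-dir scaffolding exists only so the example runs anywhere):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Create two JPEG files with different modification times.
my $dir = tempdir(CLEANUP => 1);
for my $spec (['old.jpg', time() - 3600], ['new.jpg', time()]) {
    my ($name, $mtime) = @$spec;
    open my $fh, '>', "$dir/$name" or die $!;
    close $fh;
    utime $mtime, $mtime, "$dir/$name" or die "utime: $!";
}

my @newest_first =
    map  { $_->[0] }                  # 3. undecorate
    sort { $b->[1] <=> $a->[1] }      # 2. newest first
    map  { [ $_, (stat $_)[9] ] }     # 1. decorate with mtime
    glob "$dir/*.jpg";

print "$_\n" for @newest_first;
```

With such a list in hand, pruning to the newest 400 reduces to unlinking everything past index 399 (when the list is that long).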
The code can be easily tweaked to change directories, the types of files
examined, and how many "new" ones to keep.
This too is run daily via anacron.
OK, so how many of you out there are shaking your heads at this and saying: "Doesn't this boy realize he needs professional help?" Let's see a show of hands…