New mini project — Downloading Wallpaper

I wrote a script that queries Reddit’s API for image submissions in the “r/earthporn” subreddit. It downloads the pictures to my local machine, and I then serve them randomly as wallpapers for my laptop.

One hurdle I came across involved submissions from the “” domain. The submitter was not linking to the image directly, but to the album the picture was in, so the actual picture was never downloaded. I added a check to ensure that “jpg” or “jpeg” is included in the URL; otherwise the submission isn’t saved to file.
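The URL check described above can be sketched as a small predicate on its own (the sample URLs here are hypothetical, just for illustration):

```ruby
# Returns true only when the URL looks like a direct image link,
# i.e. it contains "jpg" or "jpeg" somewhere in the string.
def direct_image?(url)
  url.include?("jpg") || url.include?("jpeg")
end

puts direct_image?("https://example.com/photo.jpg")   # true
puts direct_image?("https://example.com/gallery/abc") # false
```

Note that `include?("jpg")` also matches "jpeg", so the second clause is technically redundant, but keeping both makes the intent explicit.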

[ruby]
require 'rubygems'
require 'httparty'
require 'net/http'
require 'open-uri'

# raised when the API responds with a non-200 status code
class WebserverError < StandardError; end

class GetEarthPorn
  include HTTParty

  attr_reader :picture_array

  def initialize
    @picture_array = [] # this will hold the URL of each submission
  end

  def get_pics
    # navigate thru API to get URL from each submission;
    # if the URL doesn't contain 'jpg' or 'jpeg' it won't be saved to the array
    server_response = get("")['data']['children']
    server_response.each do |image|
      if image["data"]["url"].include?("jpg") || image["data"]["url"].include?("jpeg")
        @picture_array << image["data"]["url"]
      end
    end
    p @picture_array
    p "*********************"
  end

  def save_to_disk
    @picture_array.each do |image|
      file_name_to_save = image.split('/').last
      open(file_name_to_save, 'w') do |file|
        open(image) { |picture| file.write(picture.read) }
      end
    end
  end

  private

  def get(*args, &block)
    response = self.class.get(*args, &block)
    raise WebserverError, response.code unless response.code == 200
    response
  end
end

e = GetEarthPorn.new
e.get_pics
e.save_to_disk
[/ruby]
