After, what, two weeks of Ruby/Rails study, I finally broke away from the books to actually build something of my own.
Saturday, I redid one of my Python/Scrapy web scrapers in Ruby using mechanize and nokogiri. This web scraper goes out to my local library's site, retrieves all of my checkouts, and prints them out in a far more user-friendly format (IMHO).
On the plus side, this script wasn't just a total rehash of what I had done with Python/Scrapy, because that script had stopped working a month or two ago when the library updated its site. And, for an analogy: throwing Scrapy at this problem was like dropping a house on a spider instead of just using a shoe.
Next was a rewrite of something I had done with Python/Django/Google App Engine: a local library search engine. Here's the use case: if I'm at a bookstore and see something I'm interested in, I can do a quick search of my local library to see if they own it. The purpose behind the Django app was to make it easier to search several of the local libraries at once. It's not even a web scraper; all it does is set up the appropriate iframes, but it saves me a bunch of time compared to entering the same search query on a bunch of different websites.
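The iframe trick is simple enough to sketch with nothing but Ruby's standard library. The catalog names and URL formats below are placeholders, not my libraries' real query strings:

```ruby
require 'erb'
require 'cgi'

# Hypothetical catalog search URLs; a real version would use each
# local library's actual query-string format.
CATALOGS = {
  'City Library'   => 'http://catalog.citylib.example/search?q=%s',
  'County Library' => 'http://opac.countylib.example/find?term=%s'
}

# Render one <iframe> per catalog, all pointed at the same query,
# so a single page shows every library's results side by side.
def search_frames(query)
  q = CGI.escape(query)
  template = ERB.new(<<~HTML)
    <% CATALOGS.each do |name, url| %>
    <h2><%= name %></h2>
    <iframe src="<%= url % q %>" width="100%" height="400"></iframe>
    <% end %>
  HTML
  template.result(binding)
end

puts search_frames('eloquent ruby')
```

One template, one escaped query, and the browser does the rest of the work of hitting each catalog.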
And now you know my dirty little secret: I go to places like Borders to find out what interesting books my library owns.