Since I also follow financials quite a bit, I started wondering if there is a correlation between the inclination of programmers to experiment with alternatives to mainstream languages/frameworks and the economy. Anecdotally, it seems like there may be some correlation -- I'd love to see actual data. Maybe O'Reilly or some of these code archiving sites can trend the information against economic indicators.
For the moment, let's just consider the hypothetical of there being some correlation.
Consider Java. Java's rise in popularity seems directly tied to the time of Bubble 1.0. Using Java was, at one time, as speculative as buying Pets.Com stock. It gained enough popularity during that 1995-2000 era to become "mainstream", but pretty much only within one demographic: the net itself. It didn't really gain much momentum after that, and today very few desktop applications are written in Java, even though it runs a massive number of sites on the net (including Blogger).
So here's where the curiosity starts. I think Java became mainstream because it was applied so much to Bubble 1.0. Did Python, Ruby and these others gain enough steam during Bubble 2.0 (end of 2004-2007) to persist as "mainstream"?
Furthermore, it was announced today that California's unemployment rate has hit 12.2%. Fortune 1000 companies have been laying off people in droves for 24 months with no end in sight. Do we expect these large companies to start taking chances on alternative technologies when their entire businesses are at risk? Did Python gather enough steam for an intranet site at a Windows shop to be written in Django instead of ASP.NET?
tl;dr: Java, C#, C++, and C are really good skills to have right about now. And if you think a recruiter at a large company will get excited about Erlang skills on your resume, you've been in startups too long. [Note: not a rip on Erlang]