Sunday, June 21, 2009

Computer industry ~= fashion industry

I’m sure this has been pointed out before, but recent trends seem to back it up: the computer business has become the fashion business, all the way from the computers and products being sold down to the development level.

It used to be that fashion in computer science meant a kick-ass piece of machinery.  Take this little number, for example:

[Image: a Cray Y-MP supercomputer]

I think most of us can agree that computer was awesome when it came out in 1988.

Yes, it had an atypical look and was not to be confused with a PDP-11, but it wasn’t the physical design that made it awesome, it was the sheer power.  Vector units!  Gigabytes of RAM!  People use it to run weather simulations!  The NSA uses it to crack encryption!  As one of my friends told me when she was still working at SGI, long past its prime: “I just have to keep working on the big iron.”  Yes, big iron is addictive.

Point is, I don’t think people would have thought a Cray Y-MP was awesome if it had been a 486 PC inside the box.

Contrast that to today, where many people think a Mac is awesome even though that’s exactly what’s in the box: a PC.

That’s at the consumer level, so what does it mean when the super geeks are going ga-ga over the Mac when all that’s in the box is a PC?  What’s happening is that the attitude of depeche mode is trickling down into everything we do with computers.  The best example would be programming languages.

Like fashion, programming languages go in and out of style.  In the 1990s, you probably couldn’t find a programmer other than Paul Graham who would advocate using Lisp for a serious production project.  Today we’re seeing new dialects of languages long thought dead.  Lisp has become mainstream again with Clojure and Arc.  ML is seeing a resurgence with backing from Microsoft via F#.  Erlang is suddenly everyone’s best friend because it can do some nice things with concurrency (even though it’s pretty slow on a single thread).  Ruby came back from the dead when a Web 2.0 framework was built on it.  Then there are the “new” languages, like Scala (IMO quite similar to OCaml).

What’s funny is that all of these languages were around when I was in college, years ago, and I thought they were mostly as pointless then as I do today.  They’re fashionable.  They’ll come into style, get screwed around with for a while, programmers will write mission-critical code in them, and then those programmers will abandon them (and usually your company) as fast as they came to them.

Look at Twitter.  They must have a bunch of people working there who are pathological language adopters.  First, they chose Ruby on Rails to develop their website, a framework and language with known scaling problems.  They ran into scaling problems.  Then, instead of taking their Ruby on Rails problems and fixing them with C, or C++, or just plain old Java, they began using Scala.  So now they’re still on the JVM, but can’t hire people who know the primary language that runs on it.

I’m not saying I don’t get caught up in this too.  I’m always interested in and reading up on languages… but that doesn’t mean I’m going to use them in production.  Writing production code in these languages is like writing a collective book in Esperanto.  Someone who is a linguist can work with you, and maybe the 3 or 5 of you who start the project are linguists.  Good for you.  But then some of your linguists quit, and the next person who has to add to it is not a linguist, they’re just a writer (or in our case, a programmer).  You must be able to work with these people if you have any intention of long term success and collaboration.  At some point, you’re just going to need someone who is an expert in that particular language, not a generalist who can pick up anything.  And if you think you can always hire generalist expert programmers: they are very, very difficult to find (unless you cruise by the Google campus) and to hire (unless you have Google money).

The additional problem is that these languages never interface well, or at least easily, with anything else.  Just about every general-purpose library in the world exposes a C or C++ interface.  It’s rare to see one that doesn’t, and the ones that don’t are generally not successful (WPF, anyone?).  Java has a lot of libraries, and so does C#.  But I’m just not sure I’m ready to say those ecosystems are completely workable without C/C++ (for example, you still have to use Microsoft.Win32 if you want a lot of functionality in C#, or SWT for a decent UI toolkit in Java).

One of the things that’s immensely annoying about using languages like Python and C# on a daily basis is this need for C libraries.  Python tries to make this easier, but it still requires matching up Python API versions, compiling plugins on Windows that were designed for Linux, and so on.  For example, one of the guys on my team was trying to figure out how to do a full-screen eyedropper tool using PyQt.  It took him days to figure it out, and he’s a smart guy.

This kind of incompatibility between the language and the platform is just painful, and if our code were in C instead of Python, we’d spend a lot less time dealing with compatibility issues and more time on our actual problem domain.
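
To make that concrete, here’s a minimal sketch of the kind of glue you end up writing with Python’s ctypes.  The library name “imagegrab” and the function grab_screen() are made-up placeholders (they aren’t from our project or from PyQt); the point is the per-platform bookkeeping and the hand-declared C signatures, not the specific API.

    import ctypes
    import ctypes.util
    import sys

    def load_native_lib():
        # The "same" library has a different file name on every platform,
        # and may simply not have been built for the one you're on.
        if sys.platform.startswith("win"):
            candidates = ["imagegrab.dll"]
        elif sys.platform == "darwin":
            candidates = ["libimagegrab.dylib"]
        else:
            candidates = ["libimagegrab.so", "libimagegrab.so.1"]

        for name in candidates:
            try:
                return ctypes.CDLL(name)
            except OSError:
                pass

        # Fall back to the system search path, which may or may not help.
        found = ctypes.util.find_library("imagegrab")
        if found:
            return ctypes.CDLL(found)
        raise OSError("no native imagegrab library found for this platform")

    lib = load_native_lib()

    # Declare the C signature by hand; get this wrong and you find out at
    # runtime with a crash, not at compile time.
    lib.grab_screen.restype = ctypes.c_int
    lib.grab_screen.argtypes = [ctypes.c_char_p]

Multiply that by every native dependency, every Python version, and every platform you ship on, and days lost to a full-screen eyedropper stops being surprising.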

That said, I’m announcing here that I will be porting Dylan to the JVM and CLR.  Then I’m going to write some books on it and make lots of money.  Since languages from the 70s and 80s have come back into style lately, we need some languages from the 90s next.

2 comments:

michjo said...

You must be able to work with these people if you have any intention of long term success and collaboration.

Very good point, one that I think programmers tend to forget. And I'm one to talk - working in localization, I use Perl daily and almost exclusively, but I can do so much with it so easily on so much data at a time that I can't imagine giving it up :-). But has Perl ever really gone out of style only to come back into style?

Esperanto is perhaps not the best example of this totally valid point, however. Although relatively few in number, the world's 2,000,000 or so Esperanto speakers represent a surprisingly diverse taxonomic and geographic swath of the world's population that includes few linguists. That's largely due to Esperanto being easy enough to learn that anyone can master it, making it attractive to the everyday Joe. I work in localization, and am interested in languages, but am by no means a linguist. None of the other Esperantists I know and interact with personally are linguists, either - all are programmers or writers. Extrapolating from my experience, you would be most likely to start your book in Esperanto with all quite ordinary folk and no linguists on your team. Just some food for thought.

Trimbo said...

Interesting! Thanks for the feedback, michjo. I was ignorant of Esperanto's popularity; all I knew was that it's used for translation.

Also, I found out that Larry Ellison called the computer industry the fashion industry when talking about cloud computing. So my thought was definitely not original, but it's oh soooo true.