Thursday, September 19, 2013

The most productive UIs have steep learning curves

A friend posted on Facebook this video of a baby using an iPhone with the tag "If there was ever any doubt that the iPhone interface is so intuitive that a baby could use it, here's proof."

I replied "The problem is the UI doesn't mature beyond that age group", to which she replied "Why should it?"

The answer is: because a baby being able to use an iPhone to play music and look at pictures doesn't imply that the same UX can produce super productive tools for the more demanding needs that grownups have.

Look at it this way: we have computers in our pocket that are more powerful than probably half of the computers I've used in my lifetime. Hampering their potential in the name of the initial learning curve is a waste. Apple won't allow things like widgets, or replacement launchers, or any of the UI hacks I've come to rely on for Android. And that's just scratching the surface of what you should be able to do in order to make productive UIs for people.

Furthermore, on either platform, we spend WAY too much time on design and not enough on functionality. The scroll bounces when it gets to the bottom? The little thing flies into the icon? OMGWTFBBQ. Somehow, we've become a development culture obsessed with "delighting" users instead of providing them with awesome things they couldn't do before. [Sub-topic: have we run out of things we couldn't do before?]

When I worked in 3D, the most productive tools I ever used -- before or since -- had really daunting learning curves.

The first I used was Softimage Creative Environment, circa 1993. Here's a screenshot:

This was a ridiculously productive 3D environment -- best I ever used -- and yet had a horrific learning curve. And this is the 3.0 version (1995-ish). The 2.x version I learned on was even less polished.

And if you look at what Softimage did with XSI, where it started to get 3DSMax-like, I felt like I ended up hunting around more than being productive. The change was kind of like when Microsoft moved to the Ribbon.

Another was Discreet Logic's Flame software. Doesn't this look intuitive?!

When I started learning Flame in 1999 or 2000, I would come home every day just drained. And yet, over time, it became one of the fastest, most fluid pieces of software I've ever used and I feel like I did great work with it.

Back to iPhone Baby. Trading functionality for a gentle learning curve might make sense for consumer-level products. I mean... there are a lot of those out there, right? Google Docs, for example.

But can you imagine Photoshop shipping with the iPhone mentality towards UIs? That's absurd. Even Lightroom already pisses me off because it tries so hard to be intuitive rather than useful.

What's my point? 

 - I think that in recent times we've put too much emphasis on ease and not enough on productivity in UIs for complicated tools. Maya is a good example of this (Lightroom too, as mentioned). And these fancy UIs take a lot of time to develop. Development time that could be spent on something else.

 - Babies playing with phones is no evidence that the device is at all better / more useful / more productive than devices with UIs that babies can't use. So I don't get why it's important, other than as marketing for Apple / Google / Samsung.

Sunday, September 15, 2013

How The RBS Incident Relates to Choosing Clojure for Your Startup

A couple of Hacker News/proggit articles are worth combining today: "where my mouth is" and "RBS Mainframe Meltdown: A Year On".

The first article is about a long-time LISP guy who has bet his company on using Clojure. The second is about how the Royal Bank of Scotland has had a complete computing meltdown with their ancient mainframe running COBOL and is going to be spending half a billion pounds trying to fix it.

The RBS article has drawn comments to the effect of "well, it's hard to find people who know COBOL these days" and "the people who built it are retired". It's true, right? Those COBOL guys get paid a lot of money to come in and fix these old systems.

But let's put this in the context of the we're-using-Clojure article.

The blog post mentions some Silicon Valley hive-mind discussion points like "with good programmers, language doesn't matter" and "it's hard to find people, but it's hard anyway".

But it does matter, doesn't it?

Arguably, the only reason you're getting people to do Clojure now is that either:

a) The problem you're working on is interesting, in which case the language doesn't matter at all, as the OP said.

b) The problem is not interesting, but the chance to develop in Clojure is handed to people and they want to do that.

Let's assume it's not (b). Because if it's (b), your company is screwed. The only people you've attracted are those who only ever want to use the new-and-shiny and have no interest in the problem. They'll jump to the next new-and-shiny technology as soon as it comes along, leaving you with the pile of disaster they made while figuring out (or not figuring out) best practices. EA inherited one of these while I was there: an Erlang service that came in through an acquisition and took years to remove.

So let's say it's (a) in this case. Great, you've got an interesting problem that attracts people... for now. What happens when your problem is no longer interesting?

If you put that in the context of the RBS problem, this is similar. Banking used to be interesting for programmers (I think, since there was no Snapchat back then). What happens if your problem is not interesting next year? Or in two years? Or even in 15 years? Who are you going to hire then?

One more thing: the RBS story involves a language that was actually mainstream. With Clojure, the chances of it ever enjoying mainstream success are extremely poor -- Lisps have had 50 years and still haven't.

Has the OP thought about that long term prognosis? At what point would the company port away from Clojure? Or, if the company is wildly successful, are they prepared to be the only major example using it 10 years from now, like Jane Street is for OCaml?

Had RBS ported that COBOL to C at any time after 1972, they would have no problem finding folks right now. Had it been ported to C++ at any time after 1980-something... or Java after 1995... you get the idea. But apparently it never made financial sense for them to do it -- and that's a bank with big-time resources.

When we're talking about a startup, at what point does it make financial sense to port your stuff off of whatever you chose initially? Sometimes, never. Whatever you choose may have to live with your company forever, as RBS and others have shown us. And that can be a huge problem if you choose to go with something that's not mainstream, maybe sooner than you think.

Thursday, September 12, 2013

Pretend Grace Hopper is in the audience

There's been a ton of flap about some presentations made at TechCrunch "Disrupt", deservedly so. The "Titstare" presentation has generated the most backlash. There's also the "Circle Shake" presentation. ValleyWag has the details on them both, in case you haven't heard already.

Enter these into the annals of the now dozens of "tech" conference presentations that were inappropriate or, even worse, misogynistic, homophobic, or otherwise prejudiced. There are even serial inappropriate presenters, such as Matt Van Horn (google search).

I have a tip I just thought of for anyone who is making a presentation and wants to figure out if what they're going to present is inappropriate: pretend Grace Hopper is in the audience. So when you're up there joking around about your app for staring at breasts, you'd look out into the audience and see this:

I never knew Grace Hopper and I doubt you did either, but here's a rough guideline. She was:

a) A great mind in computing,
b) A professional presenter, and
c) A woman in computing.

Would she approve of your presentation? Would she think what you've produced is worthy? Would she be happy with your professionalism? Would she laugh at your jokes?

Ask yourself these questions, and if the answer to any of them is "no", then don't give the presentation you had in mind.