Thursday, April 03, 2014

Why Node is the Future of the Web Tier

Everyone I know hates Javascript. Including people who do it professionally.

I hate Javascript. I long for the day when it's been completely destroyed in favor of something else; I don't even care what. Typescript and Dart both look really promising, though I question whether either will ultimately make a dent in the dominance of Javascript.

Node is a gigantic hack. A browser Javascript engine pulled into the server layer? Single threaded?

Node is slower than most alternatives. Even the most rudimentary JVM-based framework will blow it away.

And Node is the future? Yeah, it is. I told my coworkers this the other day in our internal IRC and they couldn't believe it. I thought I should explain my position a little bit more clearly in a blog post.

The reason is a people reason

The server-side web tier is quickly becoming the place where no one specializes. At our company, we have "Front End" and "Back End" positions to hire for. What does that mean?

  • Front End: Javascripty, CSSy and HTMLly stuff.
  • Back End: Servery, Databasey stuff.
As the front end becomes more sophisticated and contains more logic, the Back End folks are no longer interested in writing a simple DAL/CRUD web-tier for Front End people to call into. That kind of work is a solved problem, and if the really interesting application logic lives in Javascript, it's no fun. Instead, they'd rather work on the scalable internal services and systems that a lightweight web tier can call back into.

This was our problem when I was back at Groupon. No back-end systems people wanted to take on new work in our Rails stack. The API layer, and the related search and database services, sure. But not the web tier. So when it came time to do a rewrite, who would own this and what tech would they use?

The answer is that the front end people need to own this web tier. It cuts down on iteration time and makes for clear ownership. The back-end people can then focus their efforts on the scalable systems in Java that the web tier calls out to.

Small companies have been cutting out this loop with Node for a long time, but now you see major companies making the same transition. Walmart. Groupon. Front end tooling relies on Node: Twitter built Bower on it. Microsoft supports Node on Windows and has several projects that use it. Everyone everywhere uses it for unit testing their JS.

And it's trivial to get started -- something that seems to drive adoption these days. npm is really easy to use, getting a basic Node site running takes minutes, and there are plenty of hosting options.
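
Here's roughly what I mean -- a minimal sketch, not anyone's production stack (the file name, port and response are just placeholders):

    // server.js -- a bare-bones Node web tier (file name and port are my own choices).
    // Nothing but the built-in http module; no framework required.
    var http = require('http');

    http.createServer(function (req, res) {
      // In a real setup this handler would call back into the internal
      // services the back-end folks own and render the result.
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ hello: 'world', path: req.url }));
    }).listen(3000);

    console.log('Listening on http://localhost:3000');

Run it with node server.js and you have a working web tier; npm install gets you anything you want to layer on top.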

Anyway, I hope this shows why Node and Javascript will continue to eat the future, even though everyone hates it and it's a gigantic hack. Don't ever forget the wisdom of Stroustrup: "There are only two kinds of languages: the ones people complain about and the ones nobody uses".

Wednesday, February 05, 2014

Learn to math, not to code

The "learn to code" meme has gotten so strong that even our President was weighing in on it late last year:


I love his message, I do. I think kids should be learning to program games. I love what code.org is doing and the idea of the "hour of code". We absolutely need more programming classes in K-12.

But the resources being made available are almost like a trade school for "code". If someone goes to Codecademy, learns some Javascript, CSS and HTML, and makes the next Instagram, that's not learning to be a computer scientist.

The great software engineers I've met all excelled at math and science in high school and college. Those subjects are the foundation for learning anything that comes along next, including Computer Science.

Long story short, we've started confusing "code" with all of the core education that we really need to emphasize. And that core tool is MATH.

Let's take a look at an introductory computer science game from when I was a kid: Rocky's Boots. Rocky's Boots is a fantastic logic game that, if you purchased it in the App Store today, would cost you $70,000 in microtransactions to complete (/snark)


This kind of logic gate problem solving is absolutely core to "coding". Rocky's Boots teaches kids to think in the terms that computers work in: 1s and 0s. Binary. Logic gates.

This is math.
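
To make that concrete, here's a toy sketch in Javascript -- my own example, not anything from the game -- of the kind of wiring Rocky's Boots has kids doing:

    // Logic gates as plain functions (using booleans as the 1s and 0s).
    function AND(a, b) { return a && b; }
    function OR(a, b)  { return a || b; }
    function NOT(a)    { return !a; }
    function XOR(a, b) { return OR(AND(a, NOT(b)), AND(NOT(a), b)); }

    // Wire an XOR and an AND together and you get a half adder: it adds
    // two bits, the first step toward all the arithmetic a computer does.
    function halfAdder(a, b) {
      return { sum: XOR(a, b), carry: AND(a, b) };
    }

    console.log(halfAdder(true, true)); // { sum: false, carry: true } -- 1 + 1 = 10 in binary

That's Boolean algebra, pure and simple.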

Do you want to program a game? Awesome! Let's talk about 3D animation... wait, that requires trig, linear algebra, calculus...

This is math.

I've got it! You want to be a big data expert, coming up with recommendation engines for...

Hmmmmm.... I'm seeing a pattern here.

It's really simple. If you want to do any one of these things, and not just throw together a website with your limited knowledge of node.js, become an expert at math. I have a degree in math, and even today I wish I were much better at it.

The message I want to put out to kids is this: if you become an expert in math, it opens the whole world of possibilities up to you. So do that... and also mess around with code.

Sunday, December 15, 2013

The Case for Cutler


All of my friends on Facebook are calling for the Bears to sit Cutler and play McCown instead. McCown has had a great five-game starting streak -- in fact, he has the best single-season stats of any QB who has ever started more than four games for the Bears.

That said, the law of averages is not on his side for being the Bears' miracle QB of the future. His career QB rating is around 70 and his career INT% is around 3.5. His stats as a starter in Arizona and Oakland were far below the league average. He is also four years older than Cutler.

And I don't recall anyone asking for McCown to replace Cutler for good when, just two years ago, he played in three games and threw four picks. How do you know his recent hot streak is not a fluke?

Cutler was, is and should continue to be the Bears QB.

Some stats for you about Cutler:
  • Of the Bears' top 10 QB single seasons by rating since 1980, Cutler has four. McMahon and Kramer have two each; Harbaugh and McCown have one each.
  • Cutler has thrown more TDs than anyone else in franchise history except Sid Luckman.
  • Sid Luckman, however, had a 1.06 TD/INT ratio. Cutler and Erik Kramer have the best TD/INT ratios -- 1.34 and 1.4 -- other than McCown's recent streak. If you throw out Cutler's horribad first season, his ratio is actually 1.51 -- the best of any Bear.
  • Josh McCown has thrown 15 TDs in 275 attempts; Cutler has thrown 95 TDs in 2000 attempts. That makes McCown's overall TD percentage 5.45% to Cutler's 4.75% -- on more than seven times as many passes, of course.
  • For comparison, Sid Luckman's TD % is 7.86. 
  • Gale Sayers has a 5.56% TD percentage. True, he only threw the ball 18 times and had one TD -- but that's still higher than McCown's.
  • Henry Burris has a higher TD percentage than McCown. Remember him?
The point of these stats is to show you that McCown's hot streak does NOT prove he's a consistent starting quarterback. The only head-to-head info we have is that McCown played two teams that Cutler also played this year, and Cutler won one that McCown did not (against Minnesota).

Furthermore, McCown's games came against the worst defenses in the league: Dallas, Minnesota, Detroit, Green Bay and Washington. The Rams are a passable defense, and the only opponent with a tough defense was Baltimore. None of this does much to prove his awesome streak was not a fluke.

I hope that this time the front office is not swayed by Chicago fans. Bears fans (myself included) have a notorious history of thinking the backup is a better QB than the starter, and history tells us that's almost never worked out.


Thursday, September 19, 2013

The most productive UIs have steep learning curves

A friend posted this video on Facebook of a baby using an iPhone, with the tag "If there was ever any doubt that the iPhone interface is so intuitive that a baby could use it, here's proof."


I replied "The problem is the UI doesn't mature beyond that age group", to which she replied "Why should it?"

The answer is: because a baby using an iPhone to play music and look at pictures doesn't imply that the same UX can lead to super productive tools for the more demanding needs that grownups have.

Look at it this way: we have computers in our pocket that are more powerful than probably half of the computers I've used in my lifetime. Hampering their potential in the name of the initial learning curve is a waste. Apple won't allow things like widgets, or replacement launchers, or any of the UI hacks I've come to rely on for Android. And that's just scratching the surface of what you should be able to do in order to make productive UIs for people.

Furthermore, on either platform, we spend WAY too much time on design and not enough on functionality. The scroll bounces when it gets to the bottom? The little thing flies into the icon? OMGWTFBBQ. Somehow, we've become a development culture obsessed with "delighting" users instead of providing them with awesome things they couldn't do before. [Sub-topic: have we run out of things we couldn't do before?]

When I worked in 3D, the most productive tools I ever used -- before or since -- had really daunting learning curves.

The first I used was Softimage Creative Environment, circa 1993. Here's a screenshot:


This was a ridiculously productive 3D environment -- the best I ever used -- and yet it had a horrific learning curve. And this is the 3.0 version (1995-ish); the 2.x version I learned on was even less polished.

And when Softimage moved to XSI and the UI started to get 3DSMax-like, I felt like I spent more time hunting around than being productive. The change was kind of like when Microsoft moved to the Ribbon.

Another was Discreet Logic's Flame software. Doesn't this look intuitive?!


When I started learning Flame in 1999 or 2000, I would come home every day just drained. And yet, over time, it became one of the fastest, most fluid pieces of software I've ever used and I feel like I did great work with it.


Back to iPhone Baby. Trading functionality for a gentle learning curve might make sense for consumer-level products. I mean... there are a lot of those out there, right? Google Docs, for example.

But can you imagine Photoshop shipping with the iPhone mentality towards UIs? That's absurd. Even Lightroom already pisses me off because it tries harder to be intuitive than to be useful.

What's my point? 

 - I think that, in recent times, we put too much emphasis on ease and not enough on productivity when it comes to UIs for complicated tools. Maya is a good example of this (Lightroom too, as mentioned). And these fancy UIs take a lot of time to develop -- development time that could be spent on something else.

 - A baby playing with a phone is no evidence that its UI is at all better / more useful / more productive than UIs that babies can't use. So I don't get why it matters, other than as marketing for Apple / Google / Samsung.

Sunday, September 15, 2013

How The RBS Incident Relates to Choosing Clojure for Your Startup

Couple of Hacker News/proggit articles worth combining today: "where my mouth is" and "RBS Mainframe Meltdown: A Year On".

The first article is about a long-time LISP guy who has bet his company on using Clojure. The second is about how the Royal Bank of Scotland has had a complete computing meltdown with their ancient mainframe running COBOL and is going to be spending half a billion pounds trying to fix it.

The comments on the RBS article are full of things to the effect of "well, it's hard to find people who know COBOL these days" and "the people who built it are retired". It's true, right? Those COBOL guys get paid a lot of money to come in and fix these old systems.

But let's put this in the context of the we're-using-Clojure article.

The blog post mentions some Silicon Valley hive-mind discussion points like "with good programmers, language doesn't matter" and "it's hard to find people, but it's hard anyway".

But it does matter, doesn't it?

Arguably, the only reason you're getting people to do Clojure now is because either:

a) The problem you're working on is interesting, in which case the language doesn't matter at all, as the OP said.

b) The problem is not interesting, but the chance to develop in Clojure is handed to people and they want to do that.

Let's assume it's not (b), because if it's (b), your company is screwed. The only people you've attracted are those who only ever want to use the new and shiny and have no interest in the problem. They'll jump to the next new and shiny technology as soon as it comes along, leaving you with the pile of disaster they made while figuring out (or failing to figure out) best practices. EA inherited one of these while I was there: an Erlang service that came in through an acquisition and took years to remove.

So let's say it's (a) in this case. Great, you've got an interesting problem that attracts people... for now. What happens when your problem is no longer interesting?

Put that in the context of the RBS problem and it's a similar story. Banking used to be interesting for programmers (I assume, since there was no Snapchat back then). What happens if your problem is not interesting next year? Or in two years? Or even in 15 years? Who are you going to hire then?

One more thing: the RBS story is about a language that was mainstream. The chances that Clojure will ever enjoy mainstream success are extremely poor -- Lisps have had 50 years and still haven't managed it.

Has the OP thought about that long term prognosis? At what point would the company port away from Clojure? Or, if the company is wildly successful, are they prepared to be the only major example using it 10 years from now, like Jane Street is for OCaml?

Had RBS ported that COBOL to C at any time after 1972, they would have no problem finding folks right now. Had it been ported to C++ at any time after 1980-something... or Java after 1995... you get the idea. But apparently it never made financial sense for them to do it -- and that's a bank with big-time resources.

When we're talking about a startup, at what point does it make financial sense to port your stuff off of whatever you chose initially? Sometimes, never. Whatever you choose may have to live with your company forever, as RBS and others have shown us. And if you go with something that's not mainstream, that can turn into a huge problem, maybe sooner than you think.

Thursday, September 12, 2013

Pretend Grace Hopper is in the audience

There's been a ton of flap, deservedly so, about some presentations made at TechCrunch "Disrupt". The "Titstare" presentation has generated the most backlash; there's also the "Circle Shake" presentation. ValleyWag has the details on both, in case you haven't heard already.

Enter these into the annals of now dozens of "tech" conference presentations that were inappropriate or, even worse, misogynistic, homophobic and otherwise prejudiced. There are even serial inappropriate presenters, such as Matt Van Horn (google search).

I have a tip I just thought of for anyone who is making a presentation and wants to figure out whether what they're going to present is inappropriate: pretend Grace Hopper is in the audience. So when you're up there joking around about your app for staring at breasts, you'd look out into the audience and see this:


I never knew Grace Hopper and I doubt you did either, but here's a rough guideline. She was:

a) A great mind in computing
b) A professional presenter
c) A woman in computing.

Would she approve of your presentation? Would she think what you've produced is worthy? Would she be happy with your professionalism? Would she laugh at your jokes?

Ask yourself these questions, and if the answers are "no", then don't do the presentation you had in mind.

Friday, August 30, 2013

On the ubiquity of chicken rotisseries

Once upon a time, long long ago -- i.e. 1995 -- rotisserie chicken was a huge business.

The trend started with Boston Chicken (now Boston Market). The idea was a healthier, classier, more expensive alternative to fast food. The guy who built Boston Chicken into a franchise bought the concept from its original owners (not unlike Ray Kroc) because they were ringing up an average of $13.75 per bill.

Many clones popped up in the early '90s. The rotisserie chicken craze was big enough that Seinfeld once featured Kenny Rogers Roasters on the show:



In 1993, Boston Chicken had a massive IPO. By 1998, the company was bankrupt. Kenny Rogers Roasters had gone bankrupt slightly earlier.

What happened?

The key issue with rotisserie chicken is that it takes almost no space and no overhead to make. A 10- or 20-chicken rotisserie will cost you somewhere in the neighborhood of $5K. So every Safeway, Dominick's, Costco and Walmart in America soon had its own rotisserie installed and was undercutting the crap out of the chains whose sole business was cooking rotisserie chicken.

Is this at all sounding like a lesson that might be applicable to the tech world in Silly Valley? You betcha.

I happened to watch Morning Joe yesterday, and they were broadcasting from Detroit. In talking about the history of the city, Packard cars and various other aspects, someone mentioned "Detroit was the Silicon Valley of its time." And holy crap, they nailed it. Detroit had attracted the brightest minds. They were innovating on cars, and dominating the landscape from the early 1900s to the 1960s.

Then... what happened? They got rotisseried. The technology for making cars, especially automated manufacturing (i.e. robotics), became ubiquitous. And margins dropped. And they got crushed. And high-end makers like BMW built better brands in the States and crushed the high end.

A lot of people blame the union pensions at these companies, and I'm sure that has something to do with the cost structure problems. But the overall cause of the issue was that the technology to provide the product consumers wanted (cars, or rotisserie chickens) became easy to replicate.

And now it's time to evaluate for yourself: what's a business's rotisserie machine? What's the component that makes it valuable, and how easily could that component be replicated? Apply this test especially to companies that are building tech -- what happens when their custom tech becomes a commodity?