Sunday, December 15, 2013

The Case for Cutler


All of my friends on Facebook are calling for the Bears to sit Cutler and play McCown instead. McCown has had a great five-game run as a starter -- in fact, he has put up the best single-season stats of any QB ever to start more than four games for the Bears.

That said, the law of averages is not on his side for being the Bears' miracle QB of the future. His career QB rating is around 70 and his career INT% is around 3.5. His stats as the starter in Arizona and Oakland were far below the league average. He is also four years older than Cutler.

And I don't recall anyone asking for McCown to replace Cutler permanently two years ago, when he played in three games and threw four picks. How do you know his recent hot streak isn't a fluke?

Cutler was, is and should continue to be the Bears QB.

Some stats for you about Cutler:
  • Of the Bears' top 10 single-season QB ratings since 1980, Cutler owns four. McMahon and Kramer have two each; Harbaugh and McCown have one apiece.
  • Cutler has thrown more TDs than anyone else in franchise history except Sid Luckman.
  • Sid Luckman, however, had only a 1.06 TD/INT ratio. Cutler (1.34) and Erik Kramer (1.4) have the best TD/INT ratios other than McCown's recent streak. Throw out Cutler's horribad first season and his ratio is actually 1.51 -- the best of any Bear.
  • Josh McCown has thrown 15 TDs in 275 attempts; Cutler has thrown 95 TDs in 2000 attempts. That makes McCown's overall TD percentage 5.45% versus Cutler's 4.75% -- though, of course, Cutler's comes over more than seven times as many passes. (The arithmetic is sketched out right after this list.)
  • For comparison, Sid Luckman's TD % is 7.86. 
  • Gale Sayers has a 5.56% TD percentage. It's true: he threw the ball 18 times in his career and one went for a touchdown. Still higher than McCown.
  • Henry Burris has a higher TD percentage than McCown. Remember him?
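
Since these are all rate stats, here's a quick sanity check of the arithmetic -- a minimal sketch in Python, using only the numbers quoted above:

```python
# TD percentage = touchdowns per pass attempt (numbers from the post).
qbs = {
    "McCown": {"td": 15, "att": 275},
    "Cutler": {"td": 95, "att": 2000},
    "Sayers": {"td": 1,  "att": 18},
}

for name, s in qbs.items():
    td_pct = 100.0 * s["td"] / s["att"]
    print(f"{name}: {td_pct:.2f}% TD rate over {s['att']} attempts")

# McCown: 5.45% TD rate over 275 attempts
# Cutler: 4.75% TD rate over 2000 attempts
# Sayers: 5.56% TD rate over 18 attempts
```

The sample sizes are the whole point: 275 attempts is small enough that a hot stretch can dominate the rate.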
The point of these stats is to show that McCown's hot streak does NOT prove he's a consistent starting quarterback. The only head-to-head info we have: McCown faced two teams this year that Cutler also faced, and Cutler won the one game (against Minnesota) that McCown did not.

Furthermore, McCown's games came against some of the worst defenses in the league: Dallas, Minnesota, Detroit, Green Bay and Washington. The Rams are a passable defense, and the only tough defense he faced was Baltimore's. That does not bode well for proving his awesome streak was not a fluke.

I hope the front office is not going to be swayed by Chicago fans this time. Bears fans (myself included) have a notorious history of thinking the backup is a better QB than the starter. History tells us that's almost never worked out.


Thursday, September 19, 2013

The most productive UIs have steep learning curves

A friend posted on Facebook this video of a baby using an iPhone, with the tag "If there was ever any doubt that the iPhone interface is so intuitive that a baby could use it, here's proof."


I replied "The problem is the UI doesn't mature beyond that age group", to which she replied "Why should it?"

The answer is: because babies using iPhones to play music and look at pictures doesn't imply that this UX can lead to super-productive tools for the more demanding needs grownups have.

Look at it this way: we have computers in our pocket that are more powerful than probably half of the computers I've used in my lifetime. Hampering their potential in the name of the initial learning curve is a waste. Apple won't allow things like widgets, or replacement launchers, or any of the UI hacks I've come to rely on for Android. And that's just scratching the surface of what you should be able to do in order to make productive UIs for people.

Furthermore, on either platform, we spend WAY too much time on design and not enough on functionality. The scroll bounces when it gets to the bottom? The little thing flies into the icon? OMGWTFBBQ. Somehow, we've become a development culture obsessed with "delighting" users instead of providing them with awesome things they couldn't do before. [Sub-topic: have we run out of things we couldn't do before?]

When I worked in 3D, the most productive tools I ever used -- before or since -- had really daunting learning curves.

The first I used was Softimage Creative Environment, circa 1993. Here's a screenshot:


This was a ridiculously productive 3D environment -- best I ever used -- and yet had a horrific learning curve. And this is the 3.0 version (1995-ish). The 2.x version I learned on was even less polished.

And when Softimage moved to XSI and it started to get 3DSMax-like, I felt like I ended up hunting around more than being productive. The change was kind of like when Microsoft moved to the Ribbon.

Another was Discreet Logic's Flame software. Doesn't this look intuitive?!


When I started learning Flame in 1999 or 2000, I would come home every day just drained. And yet, over time, it became one of the fastest, most fluid pieces of software I've ever used and I feel like I did great work with it.


Back to iPhone Baby. Trading functionality for a gentle learning curve might make sense for consumer-level products. I mean... there are a lot of those out there, right? Google Docs, for example.

But can you imagine Photoshop shipping with the iPhone mentality towards UIs? That's absurd. Even Lightroom already pisses me off because it tries so much harder to be intuitive than to be useful.

What's my point? 

 - I think that in recent times we've put too much emphasis on ease and not enough on productivity in UIs for complicated tools. Maya is a good example of this (Lightroom too, as mentioned). And these fancy UIs take a lot of time to develop -- development time that could be spent on something else.

 - Babies playing with phones are no evidence that the device is at all better / more useful / more productive than devices with UIs babies can't use. So I don't get why it matters, other than as marketing for Apple / Google / Samsung.

Sunday, September 15, 2013

How The RBS Incident Relates to Choosing Clojure for Your Startup

A couple of Hacker News/proggit articles worth combining today: "where my mouth is" and "RBS Mainframe Meltdown: A Year On".

The first article is about a long-time LISP guy who has bet his company on using Clojure. The second is about how the Royal Bank of Scotland has had a complete computing meltdown with their ancient mainframe running COBOL and is going to be spending half a billion pounds trying to fix it.

The RBS article has comments attached to it to the effect of "well, it's hard to find people who know COBOL these days" and "the people who built it are retired". It's true, right? Those COBOL guys get paid a lot of money to come in to fix these old systems.

But let's put this in the context of the we're-using-Clojure article.

The blog post mentions some Silicon Valley hive-mind discussion points like "with good programmers, language doesn't matter" and "it's hard to find people, but it's hard anyway".

But it does matter, doesn't it?

Arguably, the only reason you're getting people to do Clojure now is because either:

a) The problem you're working on is interesting, in which case the language doesn't matter at all, as the OP said.

b) The problem is not interesting, but the chance to develop in Clojure is handed to people and they want to do that.

Let's assume it's not (b). Because if it's (b), your company is screwed. The only people you've attracted are those who only ever want to use the new-and-shiny and have no interest in the problem. They'll jump to the next new-and-shiny technology as soon as it comes along, leaving you with the pile of disaster they made while figuring out (or not figuring out) best practices. EA inherited one of these while I was there: an acquired Erlang service that took years to remove.

So let's say it's (a) in this case. Great, you've got an interesting problem that attracts people... for now. What happens when your problem is no longer interesting?

Put that in the context of the RBS problem and it's similar. Banking used to be interesting for programmers (I think, since there was no Snapchat back then). What happens if your problem is not interesting next year? Or in two years? Or in 15?

One more thing: the RBS story is about a language that was mainstream. Clojure's chances of mainstream success are extremely poor -- Lisps have had 50 years and still haven't managed it.

Has the OP thought about that long term prognosis? At what point would the company port away from Clojure? Or, if the company is wildly successful, are they prepared to be the only major example using it 10 years from now, like Jane Street is for OCaml?

Had RBS ported that COBOL to C at any time after 1972, they would have no problem finding folks right now. Had it been ported to C++ at any time after the 1980s... or Java after 1995... you get the idea. But apparently it never made financial sense for them to do it -- and that's a bank with big-time resources.

When we're talking about a startup, at what point does it make financial sense to port your stuff off of whatever you chose initially? Sometimes, never. Whatever you choose may have to live with your company forever, as RBS and others have shown us. And that can be a huge problem if you choose to go with something that's not mainstream, maybe sooner than you think.

Thursday, September 12, 2013

Pretend Grace Hopper is in the audience

There's been a ton of flap about some presentations made at TechCrunch "Disrupt", deservedly so. The "Titstare" presentation has generated the most backlash. There's also the "Circle Shake" presentation. ValleyWag has the details on them both, in case you haven't heard already.

Enter these into the annals of the now-dozens of "tech" conference presentations that were inappropriate or, even worse, misogynistic, homophobic and otherwise prejudiced. There are even serial inappropriate presenters, such as Matt Van Horn (google search).

I have a tip, which I just thought of, for anyone making a presentation who wants to figure out whether what they're going to present is inappropriate: pretend Grace Hopper is in the audience. So when you're up there joking around about your app for staring at breasts, you'd look out into the audience and see this:


I never knew Grace Hopper and I doubt you did either, but here's a rough guideline. She was:

a) A great mind in computing
b) A professional presenter
c) A woman in computing.

Would she approve of your presentation? Would she think what you've produced is worthy? Would she be happy with your professionalism? Would she laugh at your jokes?

Ask yourself these questions, and if the answers are "no", then don't do the presentation you had in mind.

Friday, August 30, 2013

On the ubiquity of chicken rotisseries

Once upon a time, long long ago -- i.e. 1995 -- rotisserie chicken was a huge business.

The trend was started by Boston Chicken (now Boston Market). The idea was a healthier, classier, more expensive alternative to fast food. The man who franchised Boston Chicken bought the idea from its original owners (not unlike Ray Kroc) because they were ringing up an average of $13.75 per bill.

Many clones popped up in the early '90s. The rotisserie chicken craze was big enough that Seinfeld once featured Kenny Rogers Roasters on the show:



In 1993, Boston Chicken had a massive IPO. By 1998, the company was bankrupt. Kenny Rogers Roasters went bankrupt slightly before that.

What happened?

The key issue with rotisserie chicken is that it takes almost no space and almost no overhead to do it. A 10- or 20-chicken rotisserie will cost you somewhere in the neighborhood of $5K. So every Safeway, Dominick's, Costco and Walmart in America soon had their own rotisserie installed and was undercutting the crap out of these businesses that made cooking rotisserie chicken their sole business.

Is this at all sounding like a lesson that might be applicable to the tech world in Silly Valley? You betcha.

I happened to watch Morning Joe yesterday, and they were broadcasting from Detroit. In talking about the history of the city, Packard cars and various other aspects, someone mentioned "Detroit was the Silicon Valley of its time." And holy crap, they nailed it. Detroit had attracted the brightest minds. They were innovating on cars, and dominating the landscape from the early 1900s to the 1960s.

Then... what happened? They got rotisseried. The technology for making cars, especially autonomously (i.e., robotics), became ubiquitous. Margins dropped. They got crushed. And high-end brands like BMW built a better name in the States and crushed the high end.

A lot of people blame the union pensions at these companies, and I'm sure that has something to do with the cost structure problems. But the overall cause of the issue was that the technology to provide the product consumers wanted (cars, or rotisserie chickens) became easy to replicate.

And now it's time to evaluate for yourself: what's a business's rotisserie machine? What's the component that makes it valuable, and how might that component become easily replicable? Apply this test especially to companies that are building tech -- what happens when their custom tech becomes a commodity?

Saturday, August 10, 2013

Returned Chromebook.... again

Earlier in the year, I bought my wife a Chromebook that we then returned because Citrix did not work for her on it. I hadn't used it much myself, so... I decided to give one a try when I found myself without a laptop last week.

But... I returned it just now. Here's why:

  • Dreadfully slow.

    The $250 Samsung ARM-based Chromebook is very, very slow for Google's own web properties. I would be unable to type this blog post on that machine -- it cannot keep up with my typing in the Blogger window, or Gmail, or Google Docs. Yes, I type faster than most folks, but so do Google's developers.

    Probably the most damning thing is how badly this machine performs on G+. It's unusably slow -- even worse than Facebook is on the same machine.

    Google: shipping Chromebooks that can't keep up with G+ seems like a lost cause given your goals on both fronts.
  • Chromecast support (lack of)

    Netflix, because it uses a specialized player on Chromebook, does not support Chromecast on the platform. This is a must-have given our newfound Chromecast dependence in the house.

    I have yet to understand Google's Chromecast strategy when it comes to Netflix. They shipped a version of the Netflix app that would freeze your phone, and even after that it only kinda-sorta works. They don't support Chromecasting Netflix from their own hardware (a machine that is still the #1 laptop on Amazon, BTW).

    If the idea is "iterate on it", that doesn't work with consumer products. This is something that Steve Jobs understood well. Consumers want something that works as it's supposed to out of the box.
I think the problem here for Google is one of dogfooding. I was around the Google campus a couple times recently and saw people using Chromebook Pixels and Macs. I didn't see people using the Samsung Chromebook.

A really good rule of thumb when shipping a product is: don't ship something you wouldn't use yourself. I think that's exactly the rule the Samsung Chromebook breaks in Google's strategy. The hardware is nice, though -- with an Intel chip in it, it might have worked.

By the way, can IBM or Intel please save us from ARM already? C'mon guys. Get it together!

Thursday, August 08, 2013

Hubris: the biggest security risk

A couple of days ago on Hacker News, the thread about Chrome's security for stored passwords erupted, culminating in one of Chrome's security mavens posting responses. Within that chain, he posted this quip:

I appreciate how this appears to a novice, but we've literally spent years evaluating it and have quite a bit of data to inform our position. And while you're certainly well intentioned, what you're proposing is that that we make users less safe than they are today by providing them a false sense of security and encouraging dangerous behavior. That's just not how we approach security on Chrome.

This response rubbed me the wrong way, and at first I couldn't put my finger on exactly why. The obvious irritant was the patronizing "novice" bit, but it was more than that. Patronizing responses are par for the course on the internet. Why did this particular one stick in my craw?

Then it dawned on me: prior experience should have nothing to do with evaluating security. It's essential to evaluate any and all opinions about security, over and over and over. Security is the ultimate area requiring a meritocracy to succeed. No subject can ever be put to rest, having been decided forever and ever.

Case in point for Google: my brand new Chromebook comes configured out of the box to allow someone to view all of my saved Chrome passwords. It doesn't require a password when I open it up by default (edit: when I close it and re-open it later after having logged in). Google itself is shipping Chrome, on their own box, with no security for saved passwords. If they believed in these best practices for securing web passwords by securing the machine, maybe they should re-evaluate their own products.

The funny thing is, all you need to do to see the same claim made about "putting users in control of their own machine security" is go back to 1996. Check out this thread about ActiveX. It's up to the user to secure their machine by clicking no, of course! How'd that work out?

My advice to the Chrome team is to take a long, hard look at how you form your opinions about users, and be open to changing those opinions. And if you have security practices you believe in, you have to make sure your entire org is following them. Because right now, any user with a Chromebook is, by default, absolutely unsecured when they close it and walk away.

Tuesday, July 30, 2013

Brain Drain and Microsoft

I've been known to say silly things like "everyone must learn to code" and "STEM education trumps all other education". BUT, hear me out on this one.

The future is pretty clear: the knowledge-worker economy. In that future, people are never cogs -- everything cog-like will have been automated long before. The people who do that automation -- who facilitate it, create it -- those people are gold. Their knowledge and ability to create is what empowers the companies that employ them.

And this is why companies like Google and Facebook put so much emphasis on hiring smart people who can create things, even if it means they sit around doing very little (or non-bottom-line-affecting "research"). It's better to have the knowledge on your side, at all times, forever. Give them money, free food, free drinks, gym, whatever it takes to keep them there and keep building the future.

The counter to this is the brain drain. In the modern world of technology and the future world of... well, everything... brain drain is going to be the #1 indicator of a company's demise. Once the net knowledge at a company begins to decrease, their long term prospects are over. It also begins to cascade. Smart people leave and other smart people leave because of it. It's like dominoes. A company may survive well after that brain drain -- it may even grow -- but when it can no longer innovate and respond to market change, then it's dead.

The thing to consider is that you cannot codify what we're talking about here. Codification does not mean innovation. Innovation requires people, always.

In any case, that brings us to Microsoft.

Microsoft is in deep, deep trouble with all of their recent product screwups, like the XBone, Surface RT sales and Windows 8 itself (as I described in my last blog post). BUT, the good news for them is that Microsoft has not significantly lost its great people. It continues to employ many of the smartest people on the planet. And they continue to do great things, one of the most interesting of which (to me) is TypeScript.

So no matter what's wrong with Microsoft, they are not seriously suffering from brain drain, and you can't count them out.

For other companies, though, it's worth taking a look at what the influx/outflux of talent looks like when evaluating their long-term viability.

Tuesday, July 23, 2013

The Unmitigated Disaster That Is Windows 8 -- Part Deux

Previously: On Windows 8 (January 1, 2013)

I upgraded to the Windows 8.1 preview on my home machine. It's an improvement over 8.0, for sure. The ability to actually scale Metro apps dynamically makes me feel like Metro has finally reached the equivalent capabilities of System 7. Maybe even GEOS.

The inspiration for this post is that I wanted to point out just how far Microsoft has gone to preserve their broken vision for this product.

1) The "start" menu



The return of the "start menu" in Windows 8.1 is no less than a pro-troll by Microsoft. 

When you click this idiot button, it gives you the same full-screen crap you got in Windows 8. Just to give you an example of how ridiculous this is: let's say I've used Windows 8.1's split screen to set up Netflix in full-screen IE on one half of the screen and the desktop on the other. When I click this "start menu" button to launch a desktop app I might not have pinned, it stops playing my movie and takes me out of everything.

Long story short, the "start" behavior is just so, so broken. Metro has no place in the Windows desktop experience unless a user wants it there. But Microsoft doesn't care what you think. They're so convinced that this experience is right that they added back "the start menu" to troll you.


2) IE's new sandboxing breaks tons of stuff

Good news, everyone: IE now has some sandboxing features that make it more secure. Bad news: people won't be able to use lots of legacy stuff. DirecTV, Cinemax... anyone with a custom player and DRM-y validation won't work. It also defaults to 64-bit and is difficult (impossible?) to force into 32-bit mode. So I have not yet been able to play a video on any of my subscription TV sites through IE. Only Netflix works. 'Cos it's Silverlight, yo.


Thing is, you might think that Windows 8 is important. It's not. It's not to anyone but Microsoft. No one else cares about this thing.

We're now close to the one-year anniversary of Windows 8's release -- the RTM was published on August 1, 2012. And yet if you look in the Microsoft "store", you'll see there are no official apps for tons of stuff. One that sticks out is Facebook. Uh, isn't Facebook a partner of Microsoft's? Didn't, like, Microsoft invest in them? And yet there is no official Metro app?

So the more likely outcome of something like the above IE sandboxing is that DirecTV and Cinemax will just tell people to install Chrome. Customers who ask Facebook for an official app would probably just be told to invest in an iPad or just use the web browser. Customers who are fed up with the Start menu crap will just start buying Macs. Microsoft's vision is so poor that they're making choices that actively push people and developers towards the competition.

Truth be told, it's too late for Microsoft on this one. They needed to throw Windows 8 and its vision under the bus immediately, like they did Vista. Instead, they keep betting everything on it. Windows 8.1, with that pro-troll Start menu, actively alienates the very customers who hate Windows 8. They've already lost $1bn on Surface, but they're probably on the cusp of releasing another one. Their commitment to the "vision" of Windows 8 borders on zealotry.

Enterprises actively rejected Windows 8. My thought is that if Microsoft wants to hold onto any enterprise market at all through this, they do have some choices. One is that they could open-source the CLR/BCL under the Apache license. At least then .NET and all of the tools written on .NET would survive this whole thing. But the longer Windows and .NET remain intrinsically tied to each other and Microsoft continues to shove Metro down people's throats, the more likely it is that even their enterprise customers will start jumping ship to something more reasonable. I would never recommend any company use Windows again for as long as this is their philosophy. Macs and web tools for the win right now.

Monday, July 22, 2013

Future Liability Problems of Big Data for Merchants

In the wake of the PRISM scandal, many have wondered about the trust we give to corporations that mine data. Do we trust AT&T with those same phone records? Do we trust Google with the knowledge of everything that's in our minds (by way of evaluating our searches)?

I feel fairly confident that corporations will try to act in a responsible way with that information, at least within their own corridors (maybe not when forced to give it to the government). I expect it more than I would expect unelected, unaccountable people in a black cube in Maryland to act responsibly with the same information. At least with corporations, abuse is likely easier to prove--since jailtime is not a threat for someone who exposes it--and will be countered with harsh consumer ramifications. In government, the information could be much more surreptitiously used to, for example, keep Bobby Jindal or Hillary Clinton out of the next Presidential election.

So I'm not that worried about corporations abusing our data compared to government.

However, I had a thought experiment today that started when I was buying groceries at Safeway.

Safeway has tracked every purchase I've made for probably 10 years now, thanks to their rewards card. Today I got a printed offer for those yummy Mio drink thingies with my receipt, since I bought a bunch of Mio about two months ago and Safeway's Skynet figures it must be high time to replenish. (Turns out it's right! That delicious Black Cherry Mio needs replenishing.)

Guess what? I also buy wine at Safeway. Occasionally, those little print-out things offer me discounts on wine. (Godawful brands I wouldn't buy but, nonetheless, it does.)

Now, most states have what's called dram shop liability. This is that whole thing where a bartender can't serve people who are visibly intoxicated, or else risks being sued when they run over someone with their car or do something else harmful under the influence.

Connecting these thoughts together: if I'm a person clearly having an alcohol abuse problem -- that is, if I'm a customer buying two bottles of vodka per day or something -- won't Safeway be able to tell from the data? I bet they can.
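
In fact, the flagging query is almost trivial. Here's a minimal sketch -- the table, columns and threshold are all hypothetical, just to show how little "big data" magic it would take:

```python
# Hypothetical sketch: flag loyalty-card members whose alcohol purchases
# over the last 90 days suggest a problem. Schema and threshold are
# invented for illustration.
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect("loyalty.db")
since = (date.today() - timedelta(days=90)).isoformat()

rows = conn.execute(
    """
    SELECT member_id, COUNT(*) AS trips, SUM(quantity) AS units
    FROM purchases
    WHERE category = 'spirits' AND purchase_date >= ?
    GROUP BY member_id
    HAVING SUM(quantity) >= 90   -- averaging a bottle or more per day
    """,
    (since,),
)

for member_id, trips, units in rows:
    print(f"member {member_id}: {units} units across {trips} trips in 90 days")
```

If a discount offer can be keyed to that data, so can a risk flag -- which is exactly what a plaintiff's lawyer would argue.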

So could we be headed towards a future where merchants like Safeway -- having used "big data" to create personalized rewards shopping -- can be held liable for the effects of those rewards? In essence, if I become an alcoholic and develop cirrhosis, or become obese and diabetic, can I then turn around and sue Safeway for incentivizing those purchases?

I'd say the answer is, of course, yes: their liability will someday be challenged this way if they keep it up. I'm not at all for tort law, but it does always push the boundaries of blame. And when merchants use this kind of data to key into people's habits around something harmful or addictive, the computer may be opening them up to lawsuits down the road.

We'll see. Food for thought (pun intended).


Wednesday, July 17, 2013

The easiest ways in the world to screen engineers

There's been a lot of talk about technical interview questions and such on the interwebs. Google has concluded that brainteasers don't work. There are libraries of interview questions out there, Glassdoor keeps a record of interview questions, etc. etc.

You know what? Technical prowess matters, but there are much, much easier signs of whether someone will be a great engineer or not. I wish I had the hard numbers to back this up but for now you'll have to take my word for it.


  1. They show up on time / take your call when scheduled

     So far, this is the number one indicator I've found. People cancel interviews at the last minute with no explanation or apology. They don't answer their phone when a phone screen is scheduled. They don't respond to email for odd periods of time.

     A really good example is a candidate who did not respond at all to his offer letter for about a week. The offer exploded during that time, and when this person finally got back to me, I had to rewrite it with a new date. In all, this behavior was absolutely indicative of work ethic.

     In any one of these cases, it's better to err on the side of believing it's an issue and not proceed further.

  2. Can have a conversation about code

     Example: "Explain to me why you chose this stack to solve this problem." The person talks and talks around the answer, explaining what the off-the-shelf components do. Ask again: "I'm familiar with the stack. But what are the requirements that led to this being the best solution?" Again, they talk around and around without ever giving the requirements.

     I understand people feel like they need to prove how smart they are. But show me that you can actually hold a 15-minute conversation about some cool work you did, wherein both people can participate equally. I ask a lot of "why" questions just to see if I can get people to show an opinion about something and have a conversation.

  3. Can follow directions

     Follow directions when it comes to solving a whiteboard problem, or whatever. There are lots of examples of this, but here's a really, really basic one: I once had someone show up two hours late to the interview because they had gone to an address in South San Francisco instead of San Francisco. Which brings me to my next point:

  4. Has done their homework

     I'm more than happy to pitch people on our company. One thing I've never understood, though, is why I'm pitching people on stuff they could easily have looked up before stepping in the door. People ask me about VC funding in the Q&A portion of an interview. That stuff is on TechCrunch. Candidates should be reading it and making decisions before walking in. If you want to ask about it, say something like "I saw So-and-so is an investor; who else is in that round that I haven't heard about, and how are they helping you?" Something that at least shows you've done the homework.

     Second part: if this stuff really influences you, please use public info to decide before walking in. Don't ask me to convince you.

     For example, when I walked into Groupon to interview, I knew up front that the founders had taken most of a $1bn round off the table for themselves. I had judged that and decided to go forward with interviewing anyway. Never asked about it. But while working there, I had to talk about this SO MANY TIMES with candidates: "Well, what about the founders taking money off the table?" My response was always "I knew about it. Didn't care." Then I actually questioned how much real consideration this person had given to Groupon. Are they serious at all, or just using us for a practice interview?

  5. Shows signs of life

     Assuming you've shown up on time, in the right place, can hold a conversation and have done some research, the final bad thing you can do in an interview is show me you don't care: about the company, about solving problems, about what you're doing right now, about being there at all.

     A couple of times, I've had candidates just shrug at the whiteboard and say "I don't know." They don't want hints, they don't want to talk, they just seem uninterested.

     People get stressed, or just hit a mental block at the whiteboard. I get that. I've had it! I've had it in interviews where I nailed seven questions and then had a complete mental block on the eighth.

     But IF you run into that, you'll need to show us something else. Show enthusiasm for the problem. WANT to solve it. Show enthusiasm for the company. Anything! Because if you cap the marker, put it down, say "I don't know" and leave it at that, then you might as well just walk out. Which is fine -- maybe you've concluded it's not the right fit. But there's no reason to stay if you really don't want to be there.

Sunday, April 14, 2013

The understated future of Pandora

Pandora is becoming a huge player in radio, fast. Within a few years, it is shaping up to be one of the top radio companies in the US.

Most recently, they announced that they captured 8.05% of US radio listening, with 1.490 billion hours listened in March -- up 40% in listener hours YoY. I decided to chart the past year of Pandora's listener hours vs. the total market. It's... revealing.


Why, look at that: Pandora is climbing while the overall market is stagnant. And all they had to do was add servers and bandwidth, not buy up radio stations or new content.

I could have gone further back to get a better idea of trends, but if you extrapolate these numbers out over the next four years, Pandora will capture somewhere between 19-35% of the market (the arithmetic is sketched below). Pandora is now shipping in new cars as often as XM radio.
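
For what it's worth, here's the arithmetic behind that range -- a minimal sketch assuming the total market stays flat and Pandora's hours compound at a constant YoY rate (the bracketing growth rates are my assumptions, not data from above):

```python
# Share extrapolation: flat total market, constant YoY growth in hours.
share_now = 8.05  # percent of US radio listening (March figure above)

for growth in (0.25, 0.40, 0.45):
    share_in_4y = share_now * (1 + growth) ** 4
    print(f"{growth:.0%} YoY -> {share_in_4y:.1f}% share in four years")

# 25% YoY -> 19.7% share in four years
# 40% YoY -> 30.9% share in four years
# 45% YoY -> 35.6% share in four years
```

So the low end of the range assumes growth cools to roughly 25% a year; the high end assumes it ticks up a bit from today's 40%.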

But wait: then why is Pandora's margin so low?

The problem for them is digital radio licensing. Unlike terrestrial radio stations, Pandora and XM get ripped off by the music labels on licensing. A license is very, very cheap for a traditional radio station, but not so for digital.

One theory is that, at the end of the day, the labels are going to have to come to Pandora and XM with hat in hand and offer better rates. The labels cannot grow their overall base unless these two media survive and do well. Pandora and XM don't interfere with individual marketing and sales efforts because they are radio -- and radio is great advertising for music that people want to buy.

The other lever Pandora can pull is original programming.

Sirius/XM has made some massive content bets over the past several years: Howard Stern, MLB, the NFL, etc. I'm guessing they paid through the nose for these rights, and I imagine they got more listeners. They also almost went bankrupt in 2009, so I'm not sure it worked as planned. XM is really a service that only makes sense in the car anymore. Everyone else is going to have internet, and Pandora is making a lot of inroads into the car.

I guess one way to look at it is that Sirius/XM took the Groupon model: spend a lot of money to do a land grab. Pandora is taking the slow, steady route: get land without spending as much, then start pulling levers later. I would say Netflix followed a similar tack. Only in the past 12 months, four years after starting streaming, have they pulled the original-content lever. Over the four years prior, they grew huge while paying through the nose for content. Pandora is in a similar situation.

I own Pandora stock. I bought it after I heard this discussion between baristas in a coffee shop:


    "Hey, is this Pandora?"
    "Yeah"
    "Why do we have ads, you didn't pay for it?"
    "No"
    "We should"

    The end.

To me, $36 a year is an amazing bargain -- less than a third the price of Spotify (which I also subscribe to), and they have more music. I realized: why isn't every small business subscribing to Pandora? Combine that with advertising and that's a huge market.

So anyway, we shall see what happens. I thought they had a bright future, so it's nice to see the numbers sort of back that up.

Monday, January 28, 2013

I challenge all recruiters' claim that programmers are scarce

Since we announced our funding at Radius on Wednesday, just about every recruiter and recruiting website in the Bay Area has spammed me because we're hiring engineers (yes, we are hiring).

When negotiating terms with them, though, I'm being fairly aggressive: I'm offering about half of what they're getting from other firms. Why is that? Because I challenge the notion that programmers are a scarce resource that recruiters somehow know how to tap into, or that some website has cornered the market on.

Exhibit A: the Windows 8 app store.


There are DOZENS of password managers and anti-virus programs in here. Dozens. And every one of them required one or more programmers -- in an app store no one cares about, on Windows 8. Now consider the hundreds of thousands of shovelware apps on iOS and Android as well. Or crappy websites. We're talking millions of people out there in the world who can write serviceable crapware.

The reality is, there's no scarcity of people who can bang out computer programs. Not even remotely. The "scarcity" problem businesses actually face is two-fold.

1) Public relations.

It's not that programmers are hard to find. It's that it's hard to find ones who will work for you -- or who even know about your company.

Google, Dropbox or Twitter have no problem with this. All are huge brand names and great places to work; they pay well and have compelling problems to work on. They're viewed as technology companies.

In contrast, it was hard to get decent programmers to work with us at Groupon after we went public, for several reasons. One, of course, was that candidates believed the IPO took away most of the big upside on future stock grants (turns out, that was true). But the main one was that Groupon "wasn't a technology company". The company had a lot of bad press and bad merchant experiences; the founders took a lot of money off the table in a late round. Things like that.

All of that is PR and marketing, not recruiting. There's no secret sauce that some recruiting firm can put out there to change these things. Either people know about you and want to work for you, or they don't. I'm confident that we're establishing a brand at my current company that will sell us as an awesome tech company better than any recruiter will. I'm paying them to feed me resumes, that's all.

2) Core competencies

Note that until now, I've been using the term "programmers". You could use this interchangeably with "hackers", "script kiddies", "coders", "developers" or whatever your preferred term is for someone who is competent enough to open Xcode and write shovelware for the iPhone, or bang out pages with N+1 select statements in Rails (illustrated below).
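
For anyone who hasn't hit it: the N+1 pattern is one query for a list, then one more query per row. It isn't Rails-specific, so here's a minimal, self-contained sketch of the anti-pattern and its fix (Rails' `includes` solves the same problem via eager loading):

```python
# The N+1 select anti-pattern, shown with sqlite3 (not Rails-specific).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO posts VALUES (1, 1, 'first'), (2, 2, 'second');
""")

# N+1: one query for the authors, then one more query per author.
for author_id, name in conn.execute("SELECT id, name FROM authors"):
    titles = conn.execute(
        "SELECT title FROM posts WHERE author_id = ?", (author_id,)
    ).fetchall()
    print(name, [t for (t,) in titles])

# The fix: a single JOIN instead of N+1 round trips.
rows = conn.execute("""
    SELECT authors.name, posts.title
    FROM authors JOIN posts ON posts.author_id = authors.id
""").fetchall()
```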

Another term I have for this is "Goo-coders": people who can write code only because Google exists and they can search for the answers on Stack Overflow or whatever.

There is a level above this, and the name I apply to it is "software engineer". I understand that people debate this term vs. "software developer" or "software craftsman" (I'm sorry, but barf). Whatever you want to call it, here is what I mean by it: someone who has a deeper understanding of how computers work, can have a conversation about code, doesn't have an agenda for certain tools ("I must use Clojure at my next job"), can figure out best practices where there are none, and can implement a robust, well-designed, scalable software solution from requirements without hand-holding.

Those are the core competencies I want. Yours may be different, but those are mine.

I have to look through a lot of resumes to find these people, and recruiters aren't very good at filtering for them. A lot of the time, someone looks good on paper, but after 10 seconds of talking to them you know they don't fit the description above. So for any given person, I end up spending more time on all of this than the recruiter does. In addition to filtering through the resumes I get, I spend the time to phone screen, to set up interviews, to negotiate the offer, blah blah blah.

All of this adds up to: most recruiters are sourcing me solicited resumes, nothing more. In fact, I know one recruiter (whom we've since fired) was simply doing Craigslist arbitrage: they took our job description, posted it to Craigslist, then did almost no filtering on the results before forwarding them. I know this because I ask candidates where they heard about the job. If you're using a recruiter, you should do that too. This work was not even remotely worth the amount we were paying, and I refuse to pay for it again. You should too. Even if you know nothing about software engineering, you should instead post to Craigslist yourself and find someone to help you filter the results.

So my recommendation, when you're looking for people, is to:

a) Hire a PR company instead of paying recruiters a ridiculous percentage.
b) Sign up for something like JobScore, which lets you figure out which online job-posting funnels are working best for you.
c) If you truly know nothing about engineering, go make a friend at a local hackathon who can help you do phone interviews.

Tuesday, January 01, 2013

On Windows 8

Oh crikey, now @kunikos has asked me to explain what's wrong with Windows 8. No way would that fit into 140 characters, so here goes...

I've given this a lot of thought, and I have already outlined how to profit off of the Windows 8 disaster. After all that thought, I've come up with one word to describe Windows 8:

Pointless.

Pointless for users. Pointless for OEMs. Pointless for ISVs.

The only company it's not pointless for is Microsoft. Microsoft is desperate to find a way to get people to develop for its fledgling mobile platform and heretofore non-existent tablet platform. By merging those fledgling platforms into the mainstream platform, they hope they can get developers for the former.

I mean, I get it. I get what they're trying to do. The goal is admirable; the execution is not. It's also about eight years too late. They had their chance with Windows Mobile for a decade before this. They had .NET running on it and everything. And they failed to make it work.

So now they're sacrificing their existing userbase in order to make it work. I have not heard a single story of a business user upgrading to Windows 8. Not one, even on the internet. And who would? Any IT department worth their salt is going to recognize immediately that it's just not worth it. Why bother with all of the additional help desk calls when the non-computer-literate can't find Solitaire anymore, or WebEx doesn't work, or whatever? It's a whole new set of headaches for an OS upgrade that, if you look at the feature list, has nothing on the back of the box that's interesting for enterprise. The features listed are nearly all self-referential.

And not only that: Microsoft threw OEMs under the bus with Surface. They've thrown ISVs under the bus by making RT incompatible with all existing software. And no one -- not even Microsoft -- sees any way to build additional value out of anything Windows 8 offers. Microsoft is charging less for this upgrade than for any other upgrade, including Vista! Can OEMs charge more for their Windows 8 devices? No. Can ISVs charge more for their Metro-ized apps? Hell no. If anything, these sandboxed app stores drive prices down.

But let's go into more detail about Windows 8 philosophically and what went wrong.

Windows 8's core premise is that the desktop OS UI is in decline -- that the keyboard and mouse are outmoded and the touchscreen is the ideal. This is not what I believe; this is what Microsoft believes, as evidenced by Windows 8. Why do I say that? Because they took 18 years of perfecting the Windows 95 workflow, sold it to billions of people, then threw it all out. Windows 8 is Frankenstein's monster: new UI motifs gobbed onto existing ones as an act of corporate desperation, with no way to turn them off.

The fundamental view is a full-screen Metro view, yet all existing apps are desktop apps. This makes the usage pattern on Windows 8 so bizarre and jarring for desktop users that it's intolerable. Hit the Windows key and suddenly everything gets replaced by a monstrous full-screen Metro panel. And then the funniest part is that visual discoverability is worse in Metro than in Windows 7. Need to find a way to add a printer? You need to know to swipe over to the right-hand side of the screen, bring up the charms bar, hit Search, type "printer" AND THEN click "Control Panel" to search. There is no way to click around the screen until you find it, as there was from Windows 95 through Windows 7.

But here's the kicker: no way, no how can Metro supplant Windows desktop apps as they exist today. You can only see two Metro apps at a time. Even if you have a 30" monitor, as I do, you get only two. No longer do you have the cool window snapping of Windows 7 -- a massive boon to usability that Win7 introduced -- now you get only a few predetermined division sizes. It's as if no one at Microsoft envisioned what it might be like to develop actual productivity applications for Metro before converting the whole operating system to it. Need to see five spreadsheets at a time to cut and paste between them? Tough shit.

The most telling part? The truly shocking thing? The "Metroized" Office that ships with Surface is a desktop app. It pops you out of the full-screen Metro experience to operate. Its widgets are flattened out like Metro's, but even Microsoft realizes that nobody can use the full-screen WinRT experience for anything serious.

So there you have it. That's my 2c on why Windows 8 is an unmitigated disaster. WinRT is not in itself a bad idea. Metro is not a bad idea. The problem is that Microsoft tried to approach the problem of convergence by smashing the two onto desktop Windows instead of finding a reasonable middle ground. Not even Apple has been brazen enough to think that's a good idea.

Microsoft should undo the damage and revert all of this Metro crap -- maybe then people would pay for the upgrade. Then, for their future strategy: simply fork Android and build Office, plus their C#, C++ and .NET tooling, for it. Give up on the OS market -- they're screwed either way -- and start working towards a stronger services market.

FWIW, I didn't bother downgrading to Win 7 as mentioned in my tweet. I loaded Mint 14 instead. It works better on this PC (note: there is no ethernet driver support for Win 8 on a one-year-old XPS 8300!!). It has a windowing UI that works well, and I can just boot into Windows 8 when I want to play games.