Friday, January 30, 2009

Stimulus package is a fraud of pork and spending

Business as usual in Washington: our government knows how to spend money, and that’s exactly what it’s doing in the form of this phony-baloney “stimulus” package.

I don’t even need to tell you how ridiculous some of the things in it are, so I’ll just demonstrate via pie chart.



So, not counting tax cuts, a small piece of this pie actually buys things and builds infrastructure.  WTF do Pell Grants and preventative care have to do with stimulus?  Or food stamps?  Or Medicaid?  Or local law enforcement?

Oh, and how does fiscal relief of the states stimulate the economy?  Let me get this straight… bailing out failed governments is stimulus?

I’m not saying these things aren’t important.  I’m saying they’re not economic stimulus.  It’s a massive amount of typical spending, boosting the kind of Washington cronyism that no one would ever approve of if they had any idea.  Federal spending did not hit a trillion dollars until 1987.  Now we’re spending that much in one bill.

You cannot spend your way out of a depression.  FDR proved it.  The New Deal was a failure, a huge power grab by the Federal government.  It was actually the Second World War, and the money owed to the United States after the war, that brought the US out of the depression.  Politicos don’t seem to remember any of this.

But that’s ok, we all now trust in The Messiah to make things right </sarcasm>.  If this guy were legitimate, he would veto the bogus “stimulus” package he helped create.

Thursday, January 29, 2009

*Chirp: Good Stuff

*Chirp is a WPF desktop widget for reading Twitter. It's very slick. Not as feature-filled as Twittastic or Twhirl, but the UI is much faster and more aesthetically pleasing.

I do wish they'd make the logo in the upper left corner a bit smaller.

Be sure to install the .NET 3.5 SP1 version; it runs a lot faster than the 3.0 version:

Get it here.

Tuesday, January 27, 2009

Uh... why is offline Gmail interesting?

The number one item on Twitter earlier was offline Gmail... I just have to ask, who cares?

I have offline Gmail; it's called IMAP. Windows Live Mail or Outlook downloads all of the messages locally, and it doesn't require some random custom web server running in the background on my machine. This is how I back up my Gmail to a local disk in the event they lose all of the email (account deletion, etc. It happens).
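For the curious, that IMAP-based backup is only a few lines of Python. The imap.gmail.com host and the "[Gmail]/All Mail" mailbox are Gmail's actual IMAP settings, but save_messages and backup_gmail are helper names of my own invention, a minimal sketch rather than a hardened backup tool:

```python
import imaplib
import os


def save_messages(fetch, uids, out_dir):
    """Write each fetched message to out_dir as a raw .eml file."""
    os.makedirs(out_dir, exist_ok=True)
    saved = []
    for uid in uids:
        raw = fetch(uid)  # raw RFC 822 bytes for one message
        path = os.path.join(out_dir, "%s.eml" % uid)
        with open(path, "wb") as f:
            f.write(raw)
        saved.append(path)
    return saved


def backup_gmail(user, password, out_dir):
    """Pull every message in All Mail down to local disk over IMAP."""
    conn = imaplib.IMAP4_SSL("imap.gmail.com")
    conn.login(user, password)
    conn.select('"[Gmail]/All Mail"', readonly=True)
    _, data = conn.search(None, "ALL")
    ids = [i.decode() for i in data[0].split()]

    def fetch(msg_id):
        _, parts = conn.fetch(msg_id, "(RFC822)")
        return parts[0][1]

    try:
        return save_messages(fetch, ids, out_dir)
    finally:
        conn.logout()
```

Point a scheduled task at backup_gmail and you get the same local copy an IMAP client keeps, minus the client.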

Here are the upsides and downsides for both parties:

Upside for you: you can use the Gmail web interface if you prefer it and still read msgs offline.
Downside for you: you have a random service running on your machine that hosts a web server.

Upside for Google: you continue to use the web interface all the time, therefore placing their ads in front of you.
Downside for Google: none.

Sunday, January 25, 2009

ur Premise is Wrong.


Here’s one of the things I’m not sure how to reconcile in my head.  When someone is making a point, does an invalid premise at the beginning of the argument always invalidate everything else they have to say?


For example, for Christmas I was given this book.  In it, the fundamental premise of his argument (which is to show the rest of the world is growing at a break-neck pace and will outgrow America) is that oil demand has risen dramatically, hence the rest of the world is catching up.  How does he know that?  Well, prices of course!

Today, maybe 6-12 months after he wrote that, it’s been well shown that oil prices jumping 100% YOY were not, in fact, driven by demand.  At the time, when compared to international demand going up maybe 5% YOY, this should have been obvious on its face, but apparently not.  Only now are articles coming out that show speculation’s role in driving up oil prices over the last two years.  At one point last year, Vitol, a Swiss company, owned more than 11% of the contracts on the NYMEX.

In any case, I find it difficult to read the rest of the book when this premise is so flawed.

Here’s another example… the one that spawned this blog post, in fact.  Andrew Sullivan linked to this article, which tells us there is no crisis!  We all make $2400 more than we did in 2004, adjusted for inflation.

Except, what does “adjusted for inflation” mean?  Is he using the bogus government numbers, or estimates that haven’t been tinkered with?  Because I guarantee you, the 9-14% inflation rate we had between 2004 and 2008 certainly does not leave people better off.  Back of the envelope, between my investments and a token salary bump, I lost well over six figures over that time to inflation alone.

The final example is one everyone has an opinion on… how about Iraq?  Was it wrong to topple Saddam Hussein because the WMD intelligence was wrong?

History will tell in all of these cases, but I tend to believe that an argument founded on a faulty premise becomes entirely invalid.  Even if something in the midst of that argument makes sense, how can it be proven on a rocky foundation?

Thursday, January 22, 2009

Thought Experiment: Social Security

I was playing with Budget Hero today, and while this little Flash game is incredibly biased such that the only viable solution is to end the war immediately to save our country, I ended up doing some thinking about Social Security.  What is it, and what should it be?

What it is:  Social Security takes money directly from working people and directs an essentially unlivable wage to those who are disabled and unable to work or have turned 65.

That’s all it is.  I think when people classify it in other ways, it’s false or dishonest for the purposes of swaying your opinion.  Let me show you.

  • “It’s a retirement fund, therefore I should get some when I retire.”  No, because Social Security does not save your money (though it does have enough surplus for the Federal Government to have borrowed funds against SS – another story for another time).  Social Security is not a government-run IRA.  If it were, it wouldn’t pay out a substantially lower amount of money than was put in by each person.
  • “It’s wealth redistribution.”  It’s not.  Social Security is paid out to the disabled and to people over 65 regardless of whether they need the money or are still working.  65-year-old multi-millionaires who continue working after “retirement” can still get Social Security checks.
  • “It helps the elderly pay their bills.”  I think this is misleading as well.  It picks an arbitrary age to start paying benefits to people, whether or not they need it or could continue working, and regardless of their employment (after you “retire”, you can go back to work and still collect SS).

Continuing the thought experiment, what should Social Security be?

I think social security should be defined as this:  Social Security takes money directly from working people and directs a more livable amount to those who are disabled and unable to work.

Notice the difference?  There’s no age.

A SS recipient should be required to prove that he cannot work any longer in order to get paid Social Security, otherwise we are paying able people to leave and stay out of the workforce.  Raising the age doesn’t solve the underlying problem, which is workers are giving money to people who don’t need it.  For many, the SS check is a bonus check for staying home.  That’s absurd.

Unlike the cold-hearted capitalist many of you probably think I am, I believe the handicapped and elderly who cannot function should be helped out when they don’t have the means themselves.  As people continue to live longer and the retirement age doesn’t change, the problem of paying those who don’t need it will become more substantial.

And what about those who don’t save for retirement?  My take is, those who don’t save for retirement would have to prove they aren’t able to work to collect.  That’s why there’s no age.

On the topic of fairness: many will probably cry foul over this type of plan, calling it unfair to pay some people and not others.  I ask, is it fair for the workforce to keep paying trillions to millions of people who don’t actually need the money, and provide less for those who do?

Tuesday, January 20, 2009

I love the internet


This was on a Slashdot thread about ASP.NET being used for the new website.

Inauguration Choosing Silverlight

The internet is abuzz because Microsoft’s Silverlight was chosen to stream the Presidential Inauguration today.  Some open source people are complaining that “the popular and/or open source alternative” should have been chosen.

I don’t know of any OSS alternative, so I’m going to assume they mean Flash.

Flash is crap.  I spent a day learning it and it’s horrible.  I felt like I was getting stupider while working in it during a class I took from an Adobe representative.  If I had a site with a lot of Flash, I’d start porting everything to Silverlight as soon as possible.

Silverlight is technically superior to Flash.  The primary language is C#, not ActionScript.  The development environment is Visual Studio – basically the best out there.  You can even write client-side code in IronPython or IronRuby.  The only thing I’d say Silverlight is missing is the animation side that Videoworks Macromind Director Shockwave Flash is good at.  The Expression Blend tool is just not as fleshed out as Flash is for doing animation yet.  And Silverlight is missing H.264.

Anyway, don’t you think that our government should be deciding on its suppliers based on merit?  Do we no longer live in a society that believes in encouraging creation of and using the best tool for the job, or did they outlaw that with the Anti-Dog-Eat-Dog Rule?

Monday, January 19, 2009

Why I won't run Windows 7 (for now)

I've been pegged by many as a chronic early adopter.  And, well, I am.  I created a partition to try Windows 7, and I've used it for several hours at a time over the past couple of weeks.
After trying it, I've decided I'm not going to run it.  For now.

  • Visual Studio has some issues on it.  I can’t run it if an app I use every day has problems.
  • It's a pain to re-install everything just the way I want it.  I think my personal limit is about two Windows reinstalls a year.  I've already done one at home and one at work in the last six months.  While I'm pretty confident that Windows 7 is only going to see this one public beta before release in the summer, I don't really want to deal with any potential screwups that require a reinstall before release.  On this note… I'd like to see a more sandboxed system, as outlined on UnsolicitedDave.
  • There really aren't any features I'm dying to use.  Vista has a life-changing feature for me:  hit the Windows key and start typing to search.  I now can’t stand sitting in front of XP because I can’t do that.  So far, Windows 7 hasn’t revealed any features like that yet.  I’m not yet sold on the new taskbar.  I kind of like the new widget (replacement of sidebar) stuff, but wouldn’t really run Windows 7 until I wanted to develop for it.  Some of the window-snapping stuff is nice.
  • It’s Vista, and I like Vista.  Vista was a failure and Windows 7 is looking like a success already.  That’s all marketing.  The thing people don’t realize about Windows 7 is that it’s just a rehashed Vista.  Vista needs help – mostly UAC and some of the more arcane management and control panel UIs they decided to add – but I’ve run it every day for 3 years and think it’s fine.  I insist on it now because I run 64-bit and Vista x64 has better driver support than XP x64, plus that Windows-to-search key I insist on.

Anyway, Windows 7 is looking promising, but it’s really just Windows 6.1 (Vista is 6.0, for those not aware).  In fact, 6.1.7000 is the actual version number of the beta build. 

Sunday, January 11, 2009

Programmers' bill of rights

I thought this was insightful...

Especially this part:

There's a messed up philosophy -- a kind of offshoot of the 'eat your own dogfood' school of thought -- that insists developers should have the same desktop rig as end users. This is nonsense.

End users aren't running three virtual machines, a local database server, three instances of Visual Studio, and two instances of Blend. Devs need a hot rig with massive RAM. Anything less is ridiculous.

One of my very first managers insisted on this philosophy for our company's R&D department and I've always found it to be absurd[1]. The people with the most specialized knowledge should be given the best machines in the house. They should be able to work as efficiently as possible. If said programmers are given great machines and can't write efficient code, they shouldn't be doing that work anyway, whether their machines suck or not.

[1] - One of his other philosophies is a great one I still use today: "1 hour of human work is worth 10 hours of machine work." Meaning: if you can either dump something onto a machine farm for 10 hours and call it a day, or stay for another hour to optimize it to take less time, you should do the former. He came up with this by looking at people's salaries and the leasing rates of machines (our entire farm used to be leased -- back when rackable HP-PA machines were like $40K apiece).

The equation's a little messed up in this day and age because hardware is so much cheaper and faster than it used to be, but the philosophy is still a great one. I like to throw hardware at problems as much as possible. Granted, this can't always work -- the philosophy has been tried by many Ruby on Rails developers and has often failed -- but since the work I do is not public-facing, it works out pretty well. $50K can buy you an enormous amount of computing power... if applied correctly, it could pay for itself in shaved-off engineer time in, what, two months? A month? Less?
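That 1-to-10 rule falls straight out of the salary and lease numbers, so it's easy to recompute with your own. A quick sketch with made-up placeholder figures (not the ones my old manager actually used), assuming a flat ~2000 working hours a year:

```python
def human_machine_ratio(annual_salary, annual_machine_lease):
    """How many hours of machine time one hour of human time is worth,
    assuming ~2000 working hours per year for both."""
    hours_per_year = 2000.0
    human_hourly = annual_salary / hours_per_year
    machine_hourly = annual_machine_lease / hours_per_year
    return human_hourly / machine_hourly


# Placeholder numbers: a $100K/year engineer vs. a machine leased at
# $10K/year reproduces the 1:10 rule of thumb.
ratio = human_machine_ratio(100_000, 10_000)  # -> 10.0
```

Plug in today's cheaper hardware and the ratio climbs well past 10, which is exactly why the rule reads as conservative now.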

Sunday, January 04, 2009

Cross-Section of 2008's Trimbo Blog Visitors

I always like to dig down into data and see what the true nature of things is like. So here's a sampling of what people ended up on Trimbo's blog searching for this year:

  • A user from Livingston, Montana (population 7000): "att more bars?" Yes, they lie. And given that you live out in the middle of nowhere (albeit a beautiful nowhere), your phone probably doesn't work.

  • A user from Whistler, BC: "visual studio professional vs standard". Ugh, I regret posting that even though it gets a lot of traffic for this blog. The answer is... standard will probably do everything you need. It's missing a few higher end features but my initial post was wrong. I corrected it, and people seem to not read the correction and still post inane comments about my being wrong.

  • A user from Kenya (city unknown): "iphone critics". Oh, you came to the right place pal!

  • A user from Colombo, Sri Lanka: "a call to bind was not well formatted. please refer to documentation". Yup, that one is still a pain in the ass.

And now some favorite keywords:

  • "trimbo torrent". Why do people google for this? What the hell are people trying to torrent that's a "trimbo"?
  • "why does detroit always play on thanksgiving". I'm so glad people are asking and ending up here.
  • "kelsey road house". Nice! Except, how come no one from Barrington googled it?
  • "crispin porter sucks". This is the ad agency I wrote about that did the ill-fated Bill Gates/Jerry Seinfeld campaign.
  • "flipside wauconda". No wai, someone was searching for that? And from Berkeley no less! Hey, drop a comment if you end up here again.
  • "aliens that are stopid". I'll stop there.

Saturday, January 03, 2009

Digital Cinematography coming along

As someone who has had to pull mattes off of the CRAP film called Vision 500T (5279) that cinematographers loved to shoot blue and greenscreens with, I'm glad to see the number of films shooting with digital is on the rise.

Aside: I've previously predicted the end of mainstream (i.e. Walgreens) C-41 film developing in the United States will come sooner than most think. E-6 went from being developed widely to being developed at maybe one or two labs per major city practically overnight. K-14 was reduced from several labs in the country to just one in about the span of two years (2001-2003). So I'd say we're on our way to seeing no mainstream consumer film processing within the next 2-3 years.

In any case, the sooner filmmakers ditch film, the better off they'll be (and we'll be -- if anyone cares, I can explain why in detail sometime). So I decided to check on IMDB how many filmmakers have used the Red One. Turns out, it's a lot. And they've been using it as early as 1927! (OK... I think that's some kind of error in IMDB)

Thursday, January 01, 2009

Stop capturing my mouse

As I type this in a textfield on a webpage, I can tell you that consistent user interaction on the web sucks.

Here's an example... ever try to use the scroll wheel on a page with embedded videos? Try this experiment. Go on over to MakeZine, then scroll down with your cursor over the articles. Eventually you'll get your cursor over some embedded video that won't let you use the scroll wheel anymore.

Another one. Go to any YouTube video in Firefox. Put your cursor over the video and hit "Control-T" for a new tab.

Had these apps shipped on Windows or Mac, people would be outraged if they couldn't use an OS-wide keystroke (e.g. Exposé, or the Windows key to open the Start menu) in some application. The sooner browser makers or Adobe come up with some standards for this stuff, the better the web will be to use. For example, why doesn't Flash itself pass the scroll wheel through when focus is on the browser pane, or reserve and respond to Control-T?