Friday, December 28, 2007

Maggiano's Sucks

There's a relatively new chain of Italian restaurants out there called Maggiano's Little Italy. I went there last night with the impression that it's supposed to be an upscale chain.

Wrong. The restaurant charges upscale prices (~$16 for gnocchi), but it's anything but upscale.

For one thing, don't bother getting a reservation. They take unlimited reservations. So even with a reservation, we waited 45 minutes to get seated. Restaurants like this seem to forget that reservations are supposed to be for you as much as they are for the restaurant. Yet, you know this is probably a daily event there. They don't care.

They also give you a massive amount of food. The guy next to me ordered spaghetti and meatballs, which was about 2 pounds of pasta and two meatballs. The food is good like McDonald's is good: it tastes good because it's loaded with calories. The reality is that it's not hard to make good-tasting food in this world. You can go to Safeway or Dominick's and buy a boxed spaghetti dinner that tastes good because they pile in a bunch of MSG. At Maggiano's, the food is good because they add a stick of butter and a quarter pound of Kraft parmesan to their "Little Italy" beef dish. Not exactly a culinary experience worth waiting for when you have a "reservation".

Maggiano's is typical Rich Melman fare... a nifty concept restaurant with huge tasty portions for the mindless Chicago suburban eater. I'm not sure what's more shameful, that Melman keeps opening worthless restaurants like this, or that people are falling all over themselves to go back there.

Wednesday, December 26, 2007

What Vista Needs

The reality of Vista is this: even though it sucks, does it really matter? All operating systems suck. Vista sucks more than XP and MacOS X, and less than, say, Windows Me and MacOS 9. But since all OSes suck, the reality is it doesn't make that big of a difference in your day-to-day life whether you use Vista or XP.

That doesn't mean that Microsoft doesn't have a big problem on their hands. The hatred of Vista leaves an opening for competition to move in and sell a lot of units.

What Microsoft has to do is look at what Intel did a few years ago.

The Pentium 4 was a hugely flawed chip. Intel shipped millions of them and made a lot of money, sure. But they knew very early in its life that the Pentium 4 was flawed. So while they were selling Pentium 4s, they scrapped plans to build on that architecture and instead went back to the previous one (Pentium 3/Pentium M) and started from that.

Microsoft should do the exact same thing with Vista. They'll ship millions of copies. Don't be fooled into thinking that this is because the public really likes your product. The only reason people bought Vista is that it's what their $599 Dell came with (usually). So take a long hard look in the mirror: all operating systems suck, and Vista happens to suck more than the last Windows version.

What's the fastest way out of this problem?

Basically, go back to XP. Rework the UI a little bit, and throw out most of the crap you've messed up with Vista. Drop the insane shutdown menu that took months to develop and is still more complicated than XP's. In fact, drop everything except stuff that can easily be installed on top of XP, e.g. WPF, .NET 3.5, etc. If you want to add features to Windows Explorer, keep the ideas that worked (the Windows-key search box) and drop the things that didn't (Aero glass, the new networking menus, the new Save As box, the new path browser at the top of windows).

And bring back hardware accelerated GDI+ drawing... oh wait, XP has that.

Somebody already did this really well: a tongue-in-cheek review of Microsoft's "new" OS, Windows XP, and how great it is to upgrade to it from Vista.

Saturday, December 15, 2007

Why your fancy new programming language choice is probably going nowhere

This week I was CCed on a bunch of email threads discussing new/old programming languages. The general idea thrown around in these threads is switching to a completely different language because it has an inkling of some high-level concept that might help you. One example would be using a concurrency-oriented functional language like Erlang in a new space -- say, desktop Windows applications that could use multithreaded computation.

Ok, here's the problem: once a language is established in a space, it's really difficult to dislodge it. So if one application developer switches to Erlang for Windows programming just because it deals with concurrency better, that won't start a trend worldwide. That person will write a ton of extra code just to get it to work -- hooking Erlang up to the Windows SDK (C++/VB) -- and then who supports it? They do! The problem is that it's not up to you, the application developer, whether or not the primary language gets dislodged. That's up to the platform provider.

The only new language lately that's making inroads on C++ for Windows application development is C#. Why? Because it's Microsoft, of course. They're the only ones who can push on that sort of thing and make a difference because they'll be the ones to support it, not leaving it to the application developer to either buy third party support or support it themselves.

Java is a really great example of failing to move in on established spaces. In 1996 or 1997, Java was hugely hyped and being adopted by many. Yet the original "write once, run anywhere" promise has been destroyed by the same old C and C derivatives (C++, Objective-C) on all platforms (Windows, Mac, Linux, Unixes). Because really... why would you bother with Java when your platform API is in a C language? All you'd get is the chance to shoehorn Java into environments constructed entirely around C.

However, Java did make some good inroads into unexplored spaces, namely the web and mobile devices. If a website isn't running PHP in this world, it's probably running Java. And most mobile devices (except iPhone, Qualcomm's BREW and Windows Mobile devices) have Java runtimes.

Speaking of the web: it's probably the only current area where the language issue won't be settled anytime soon -- at least on the back end. A web application's interface to the client is just text layout instructions (HTML) sent to the user's browser. You can produce that in any language you want. Open a socket, send HTML through it, done. This is why it's really pretty simple for a language like Ruby to come out of nowhere and pick up a lot of steam in the web space.
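
To make that concrete, here's a toy sketch in Python (purely illustrative, not production code) of what "open a socket, send HTML through it" means. No framework, no platform SDK -- any language that can do this much can serve a web page:

```python
# Toy illustration of the point above: a "web app" is just bytes of
# HTML pushed down a socket. Any language with sockets can do this.
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("localhost", 8080))
server.listen(1)

while True:
    client, _ = server.accept()
    client.recv(4096)  # read (and ignore) the browser's request
    body = "<html><body><h1>Hello from a bare socket</h1></body></html>"
    response = ("HTTP/1.1 200 OK\r\n"
                "Content-Type: text/html\r\n"
                "Content-Length: %d\r\n\r\n" % len(body)) + body
    client.sendall(response.encode("ascii"))
    client.close()
```

Point a browser at http://localhost:8080 and you get a page. That's the entire "interface" a back-end language has to satisfy, which is why the web is such an easy space for new languages to sneak into.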

One thing is settled, I think, and that is Javascript is the language we'll be using on the client side. Yes, Flash and ActionScript will still be used for widgets and stuff. But Javascript is what you use for applications on the web, like Gmail, because all clients interface with it. Plus, ActionScript and Javascript are supposed to converge on ECMAScript 4.

By the way.... the threads included a bonus of the age-old LISP angst that is well known to Slashdot readers. Get over it already. This isn't because of some worldwide anti-LISP conspiracy. The world won't write new code in LISP for the same reason it won't write new code in COBOL: there's no momentum there anymore. So drop the LISP persecution complex.

Sunday, December 09, 2007

Wii vs. Xbox 360



On my walk today, I was wondering why people are so crazy about the Wii that they'd pass up buying an Xbox 360 for almost the same price. It makes no sense to me, since I feel like I've gotten a ton of value out of my Xbox 360 and I paid $400 when I bought it.
Excel was still open from the last post, so I decided to make yet another chart outlining some of the differences between the Xbox 360 and the Wii. Included are online multiplayer, the number of Metacritic games rated over 90 and over 80, and some fun items. Enjoy.

| Topic | Wii | Xbox 360 | Advantage |
| --- | --- | --- | --- |
| Cost | $250 | $280/$350 | Wii |
| 90+ Metacritic Games | 4 | 11 | Xbox |
| 90+ Non-Nintendo Titles | 1 | 11 | Xbox |
| 80+ Metacritic Titles | 12 | 84 | Xbox |
| Oversized Personality | Miyamoto | Ballmer | Push |
| Internet Gripe | "Two Gamecubes Duct-Taped together" | "Red Ring of Death" | Wii |
| Major Pro | Motion-based controller | No motion-based controller | Xbox |
| Major Con | Can't find one | Supports evil empire | Xbox |
| Supports HD | No | Yes | Xbox |
| Online Multiplayer | Some games | Most games | Xbox |
| Downloadable Games | Yes | Yes | Push |
| Plays MP3s | No | Yes | Xbox |
| Plays DVDs | No | Yes | Xbox |
| Plays Videos from PC | No | Yes | Xbox |
| Media Extender | No | Yes | Xbox |
| Cute Mii Characters | Yes | No | Xbox |
| Lame Name | Yes | No | Xbox |
| Weather | Yes | No | Wii |
| Requires Exercise | Yes | No | Xbox |
| Thankfully Not Made By | Apple | Apple | Push |

Why you should use H.264 to encode video

I've been doing my homework on the byzantine world of video codecs, and I've come to the conclusion that if you want to compress your video for the widest range of software and hardware, you should do it with the H.264 codec.

Here's a table that demonstrates why.


The orange "Yes" boxes means it requires an additional plugin, but can play it.

H.264 is compatible with pretty much every device and program I use (and most people use). For media playback and sharing, I've switched from WMP to Zune (software) since it can play H.264 out of the box.

Even if you're on Windows, this shows what a losing proposition it is to encode with DivX and Xvid. You might be able to play them back on your Windows machine, but the only stock device that plays them without an additional plugin is your Xbox 360! I'm not really sure why you'd encode with either of these anymore, except that they're faster to encode than H.264. Maybe someone can enlighten me in the comments.

WMV is a slightly better choice if you expect your audience to always be on Microsoft devices, with Silverlight to view the content on the web. It's easy to encode with Windows Movie Maker. However, Adobe just published an update the other day that adds H.264 video to the Flash plugin. I was using Silverlight video for a website, but I think I'm going over to Flash now that H.264 is supported and all of my devices handle it so well.
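
If you want to try encoding H.264 yourself, ffmpeg with the x264 encoder is one free route. Here's a rough sketch of how I'd shell out to it from Python (this assumes ffmpeg is installed and was built with libx264 and an AAC encoder like FAAC; exact flag names vary a bit between ffmpeg versions, so treat it as a starting point):

```python
# Hedged sketch: encode a clip to H.264 in an MP4 container via ffmpeg.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.avi",        # source clip (hypothetical filename)
    "-vcodec", "libx264",     # H.264 video via the x264 encoder
    "-crf", "22",             # constant-quality mode; lower = better/bigger
    "-acodec", "libfaac",     # AAC audio (needs an ffmpeg build with FAAC)
    "-ab", "128k",            # audio bitrate
    "output.mp4",             # MP4 plays on most of the devices in the table
]
subprocess.check_call(cmd)
```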

Friday, November 30, 2007

SQL Server Compact 3.5... still no x64 support

It has LINQ. Who cares.

What I want to know is why Microsoft released yet another version of SQL Server Compact Edition whose server tools don't install on 64-bit IIS. Do people actually run x64 servers and then run their IIS in WOW64 mode?

Microsoft's x64 support has been horrendous so far. WOW64... with a whole new registry and Program Files directory? And how about the decision to require Vista x64 drivers be level 3 signed? We've been ready to deploy x64 for like 5 years.... stop jacking around and fix these things already.

Fixing Outlook

If you get this error in Outlook 2007 -- "Could not install the custom actions. The object could not be found." -- it's because you somehow have a custom form installed on your machine that has gone south. I think the custom form might have been distributed via Exchange, since I've never installed any custom forms myself.

Anyway, first close Outlook. If you're on Vista, go to "C:\Users\[UserName]\AppData\Local\Microsoft\FORMS"; on XP it's "C:\Documents and Settings\[UserName]\Local Settings\Application Data\Microsoft\FORMS". You may have to make hidden files and folders visible to see this directory. Once you find the directory, blow everything in it away. Then relaunch Outlook.
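
If you'd rather script it than click through Explorer, here's a rough Python sketch of the same cleanup (my own convenience hack, nothing official from Microsoft; close Outlook first):

```python
# Clear the Outlook forms cache; Outlook rebuilds it on the next launch.
import os, shutil

candidates = [
    os.path.expandvars(r"%LOCALAPPDATA%\Microsoft\FORMS"),  # Vista
    os.path.expandvars(r"%USERPROFILE%\Local Settings\Application Data\Microsoft\FORMS"),  # XP
]
for path in candidates:
    if os.path.isdir(path):
        shutil.rmtree(path)   # delete the cache directory and everything in it
        os.makedirs(path)     # leave an empty FORMS folder behind
        print("Cleared " + path)
```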

Now enjoy your time in Outlook. It's so aw3some!!11!

Monday, November 19, 2007

How Come Print Isn't Dead Yet?

The consumer market rewards inefficient behavior if it's easier than change. That's the summary of why print isn't dead.

More on that in a minute, but let's first look at where print actually is dead: businesses. How many businesses still own typewriters? How many use paper adding machines? None, at least on this side of the world. Most people now carry Treos instead of Filofaxes.

Every day, we read and write enormous amounts of text on screens when we go to work. We do this because businesses (well... some businesses) cannot tolerate inefficiencies that are simple to fix. Migrating from typewriters to computers, from Filofaxes to Treos, and from slide shows to PowerPoint are good examples of easy efficiencies.

But when we get home, we have racks of books sitting on shelves, collecting dust and taking up space. How did those books get there?

  1. Trees cut down using oil-powered machinery
  2. Trees then transported using oil-powered trucks
  3. Paper mills process trees using coal power
  4. Paper transported to printer via trucks
  5. Printed books transported to warehouse via trucks
  6. Warehouse has an HVAC using coal power
  7. Books then distributed via trucks and planes to stores or direct to consumer
  8. (Optional) Consumer drives car to store to find book and buy it in an air conditioned store
That's got to be one of the most inefficient delivery systems for intellectual property besides chemically processed film. The text itself can be transmitted anywhere in the world in seconds. With my broadband connection, I could download whole books in a few seconds.

Print distribution is inefficient because it's easier to keep it that way than to fix it. Publishers want to keep the locked-in business of publishing the same content as an expensive hardcover followed by a cheaper paperback. Convincing readers to do something different while keeping the margins just as high sounds too risky. Plus, book store owners don't want to go out of business like Tower Records, so they probably pressure publishers not to change anything.

Today Amazon launched their Kindle device -- a wireless device for reading and buying books. You would think that people interested in the environment would be very keen on Kindle and digital distribution of books. But instead, what I have read a lot is:

a) "I could buy 40 books for the price of Kindle"
b) "When I buy a book, I want my right to give it to someone else" (the age-old DRM complaint)
c) "Why should I buy something so ugly?" (this is my favorite)

The trick to marketing to consumers is you have to give them something they don't have now and convince them they need it. Of course, it doesn't matter if they actually need it. Kindle doesn't do a very good job of this, hence the complaints you see above.

Let's look at a marketing success that changed distribution: the iPod. For music, Apple got people past the first two complaints by nailing the third one. Buying an iPod -- which some people have done a few times over -- costs about as much as 20-30 CDs. Why buy the iPod when you can get a CD player for like $10? The iTunes Music Store gives you DRMed music and no right to transfer -- and they've sold a billion songs and videos or so.

All of the downside was initially overcome by the iPod being a slick looking, hip device. And now that people have used it and ripped their CDs -- I actually had to go find my CDs the other day -- most people realize the benefit of having a digital music collection and a portable device to play it. Even if Apple stumbles, digital music isn't going away.

People forget this when thinking about Kindle, an ugly duckling. No one considers the space they dedicate to hundreds of books in their house? They don't consider carrying hardcover books on airplanes or on the train? They don't consider the amazing waste of space called "public libraries"? No. Actually it seems more important that the device be attractive.

But, you know...I could also be the only one who, when buying a book, thinks that I'll have to find space for that book and inevitably lug it around in the future. Sometimes I have to remind myself that it's almost 2008 and weren't books supposed to be a thing of the past like 30 years ago? I've been trying to get rid of books forever. I usually give away books I would have once kept, simply because I don't want to dedicate the space to all of these books.

Well, I commend Amazon for trying to push forward with Kindle. I think their recent movie (Amazon Unbox) and music (Amazon MP3) offerings have been great. A friend ordered a Kindle and I can't wait to see what he thinks of it. I agree it's pricey, but I also think if one believes in something, the most effective vote for an idea is with cash.

Friday, November 16, 2007

This Cracked Me Up

As spotted over on Don Gerard's blog.

[Image: a customized awareness ribbon, via ImageChef.com]

I found this blog because Don used to be the bassist in The Moon Seven Times, a band I often went to see when I lived in Champaign. I still like the first album a lot. 7=49 was ok too. It would be nice to see some of this older, obscure music get digitized and put up on Amazon MP3, Zune, Rhapsody, iTunes, whatever. The cost of doing it is nearly zero, so shouldn't the long tail apply to old, obscure music as well?

Zune 2: Another Paper Launch

Hey, Zune 2 "came out" on Tuesday. Or... did it?

Basically, there were maybe seventeen 80 GB Zunes available in the US, and all were snatched up within minutes. Many major stores, even Best Buy, didn't get any at all.

Trust me, I believe in the idea of artificial scarcity as a marketing tool. Nintendo has been doing exactly that with the Wii for over a year. The "scarce" Wii has sold about 13 million units. But I'm starting to wonder, are paper launches near Christmas effective? What's the point of putting a hot item out there and not having enough available for Christmas? The Wii still had at least a few million under trees by December 25 last year.

Apparently this was a snafu on Microsoft's part, not some intentional marketing ploy. But let's have a reality check here. When's the last time Apple had a shortage for Christmas? How about, like, never. They plan to release their stuff earlier in the year to avoid that kind of thing.

Tuesday, November 13, 2007

How to bail from Hotmail to Gmail using IMAP

Tired of Hotmail? Want to transfer from Hotmail to Gmail?

Here's how, in just 5 easy steps!!


  1. Install Windows Live Mail. This is the Windows application that supplants Outlook Express and Windows Mail (in Vista).

  2. Set up your Hotmail account in Windows Live Mail

    * Go to Tools->Accounts
    * Click "Add..."
    * Select "Email Account"
    * Type in your Hotmail email address and password and hit next

    In the right panel of Windows Live Mail, you should now have your hotmail folders and be able to read emails there.

  3. Create a Gmail account at the Google website, if you don't have one already.
  4. Set up your Gmail account for IMAP (you will need to enable IMAP in your Gmail settings first). The instructions are very similar to Google's setup instructions for Outlook Express.

    * Go to Tools->Accounts
    * Click "Add..."
    * Select "Email Account"
    * Type in your Gmail address and password
    * Check "Manually set account settings for this account" and hit "Next"
    * Set your incoming mail server to IMAP
    * Your login ID should be your gmail address
    * Incoming server is "imap.gmail.com", then check "this server requires a secure connection (SSL)"
    * Outgoing server is "smtp.gmail.com" and set the port number to 465
    * Lastly, check "My outgoing server requires authentication" and "This server requires an SSL connection"

    In the right panel of Windows Live Mail, you can now see your Gmail folders. If you do not see the gmail folders, right click on the envelope icon that says "Gmail", go to "Imap Folders..." and double click on the folders you need to show them.

  5. Now comes the fun part. Click on your Hotmail Inbox, or any folders on Hotmail you want to transfer. Select all the messages you want to keep -- or just hit "Control-A" to select all of them in that folder, then right-click and select "Copy to Folder...". Select the folder you wish to copy them to from the Gmail folder list. If you don't wish to sort them, instead relying on Gmail's search to find them again, just copy them to [Gmail]/All Mail. You can also move the messages, thereby deleting them from Hotmail, by dragging and dropping them to the Gmail folder of your choice.

One of the most annoying things about Hotmail and Live Mail is that there's no way to view messages by thread, like in Gmail or even Outlook. But even if you drag over separate messages from Hotmail, Gmail will re-thread them into a conversation. Very slick!

Gmail IMAP has come in really handy. Using this technique, I've started uploading old messages to Gmail to make them faster to search (not to mention, from anywhere on the planet). I have 2001-2007 covered. I'm not sure if it's worth uploading 1990-something-2001. Is there any email I want to search to find from, say 1995? Doubtful.
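
For the scripting-inclined, the upload half of this trick works without Windows Live Mail at all. Here's a rough Python sketch (hypothetical account name and folder; it assumes IMAP is already enabled on the Gmail account) that appends saved .eml files straight into All Mail using the same server settings listed above:

```python
# Upload exported .eml messages into Gmail over IMAP.
import imaplib, glob, time

imap = imaplib.IMAP4_SSL("imap.gmail.com", 993)   # same server/SSL settings as above
imap.login("you@gmail.com", "your-password")

for filename in glob.glob("old_mail/*.eml"):      # hypothetical folder of exported messages
    with open(filename, "rb") as f:
        raw = f.read()
    # Quote the mailbox name because it contains a space.
    imap.append('"[Gmail]/All Mail"', None,
                imaplib.Time2Internaldate(time.time()), raw)

imap.logout()
```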

Friday, November 09, 2007

Joy Division: Officially Hot Again

Ok, Joy Division is awesome and their songs haunt my mind, but isn't it a little bizarre that the entire world suddenly has resurrected Joy Division this year?

There's a new documentary about them, as well as Anton Corbijn's movie "Control". Peter Hook even came out in NME to rip the latter.

What's even weirder is that I didn't know any of this was going on. I was just searching for stuff about Joy Division today because I randomly thought of the movie "24 Hour Party People" (which I reviewed a couple years ago on my movie blog). I thought of the scene in there where they get up on stage and sing "Digital". Here it is, thanks to YouTube.






I like this version better than some of the actual footage of Joy Division playing this song. It makes me feel like I was there, back when they were still "Warsaw" (their name before JD), playing small clubs as an unknown band. The real footage of this song being played live is pretty crappy in quality (at least on YouTube). Plus, this guy does a really good Ian Curtis imitation. However, there are some quality versions of JD playing "Transmission" and "Love Will Tear Us Apart", as well as "Shadowplay", on YouTube if you're interested.

Maybe JD is hot again because it's the 30th anniversary of them starting out?


Hey, here's something... in just 3 more years, both Ian Curtis and John Lennon will have been dead for 30 years. Who do you think will have more influence on young musicians in another 30? (My hunch is Ian Curtis, which, IMO, is the case today as well.)

Monday, November 05, 2007

People aren't getting Android

Google's open platform for mobile computing is widely misunderstood.  Joe Wilcox pretty much summed up the ignorance of Android's importance by calling it a "FUD announcement".   The iPhone mentality has sunk in too much.  Android is not just about being able to develop 3rd party applications.  Apple's the only vendor who doesn't allow that.

Android is an open source operating system and API for mobile computing.  It is free for companies to take, extend, and use commercially.  It's not tied to Google and is not going to have forced Google advertising or whatever.

Why would Google make this?

Let's make an analogy to the yesteryear of broadcasting.

In the 1940s, RCA used NBC, the network it had created, to sell televisions.  The belief was that RCA needed some kind of content to sell these new-fangled TVs.  Good thinking.  RCA used its muscle to force a broadcast standard down the FCC's throat, and made insane money selling the TVs.

Then what happened?  Everyone started making TVs compatible with the standard.  RCA was the loser, and NBC was the winner.

Google is currently trying to "broadcast" content to a market where every "television" (OS vendor or web browser maker) has a different "broadcast standard" (CSS/Javascript/HTML compliance).  Since there is no FCC, the television vendors (Apple, Microsoft) are willing to change formats anytime they like, and in a closed source fashion.  In the case of the iPhone's locked browser, Apple has complete control over the user's web experience.

You really think that Google wants to play this game for the next 30 years when cell phones become the predominant internet platform instead of the PC?  (See next section)

So Google will give away the TVs, essentially, because it makes their core content business that much more efficient.  Google is playing the NBC game.  Apple is playing the RCA game in every market they're in.  Microsoft and Symbian are closer to the RCA game, since they're closed, licensed OSes, though they don't sell hardware.

What I really think we've seen today is a failure by the Google PR department.  Not only have they done a poor job at representing the reasons for making Android, but they let months of rumors grow that a magical gPhone would compete against the iPhone.  So of course there was a lot of disappointment by gadget freaks.  What would be better than a $600 glitzy gadget they could show off to their friends in time for Christmas?

I'm glad that someone out there is at least trying to fight against the consumer restrictions that we have in the cell market.  Now is the time, while the idea of browsing the web from your phone is still relatively young.  I wish them luck and I'm looking forward to seeing the SDK.

By the way, how much more do you think Linux would be adopted today if it had been developed and backed by a corporation like Google originally?  What if OS/2 had been open-sourced in the early 90s?  Companies usually do open source as a desperation move when their closed-source model isn't working.  If IBM had realized that their revenue growth was entirely in services, they could have open sourced OS/2 very early in its life and never lost a penny.  Google at least has that insight thanks to the success of their core business (AdWords).

The Death of the PC

This is the section I mentioned above.  Apologies for the long post.

Today a co-worker and I were discussing the imminent release of Visual Studio 2008.  He said something along the lines of "Can you imagine ever trying to write a large application in WPF?"

Wow.. so true.  WPF, if you haven't used it, is Microsoft's really big and really slow .NET-based UI framework.  It's kind of neat for small applications -- and I like the markup language nature of it -- but like my co-worker said, utterly useless for large ones.  Try even using the Orcas UI editor for it sometime.  It's very, very slow.

WPF will be useful for large apps someday, on a machine with 16 x86 cores, 16 GB of RAM and a quad-SLI 8800.  But until then, what are we paying for when we buy Orcas?  What are we paying for when we buy Vista, or Leopard?

This all comes back to my "Leopard v. Vista: Both Are a Scam" post.  We already didn't need Vista or Leopard, and at some point soon, the essential features are not going to require a PC anymore.  Cell phones can already do email and web, and that's 90% of what people want.  Eventually they'll be able to do our photos and videos too.  Just put a USB port on a cell phone and maybe you can edit videos on your MyBook.  Better yet, why can't I just set a powerful cell phone down on my desk here and start working?  It wirelessly connects to my 24" monitor, my keyboard, my mouse, my extra HDD.   PCs will be like workstations were 10 years ago:  niche devices used by people who need to do 3D graphics.

Don't worry, I have doubts about this too.  But I'm probably wrong when I think "No, it's not possible, I'll always need a PC."  Soon, those of us who believe that are going to be the old generation falling out of favor.  We'll be those old people who talk about how it used to be.  Meanwhile, the MySpacers who SMS instead of speak will be doing everything on their phone.

Wednesday, October 31, 2007

Leopard vs. Vista... both are a scam

This week spawned yet more articles about MacOS X "Leopard" vs. Vista. Here's the bottom line: it doesn't matter, both are a scam.

Neither of these upgrades provides "must have" features to the end user that couldn't be easily written by a third party. Time Machine? Free backup programs come with extra hard drives you buy. Vista Search? We've got Google Desktop. This trend started a long time ago. XP didn't do much over Windows 2000. Neither did Tiger over Panther, or whatever feline genus came before that.

The problem is, OS upgrades generally don't matter to users anymore -- only developers. Remember the heady days of upgrading from Windows 95 to Windows NT4 or 2000 and having protected memory (or MacOS 9 to MacOS X)? Or how about upgrading from System 6 to System 7 and getting multitasking?

Those days are gone. Today, what are we paying for? More transparency? A search index?

I think we're mostly paying for the right of the OS developer to correct past mistakes that are more than skin deep. But what does that benefit the user? If an OS works, and we're not a developer, and we like the apps we've bought, what do we care if the OS changes things for developers?

In the past, changes for developers made more of an impact than they do now. Has .NET 3.0 given us so many great Windows apps that weren't there before that it was worth the cost of Vista? Has Objective-C 2.0 for Leopard? If the APIs had stayed the same, we probably would have better apps, because people wouldn't be fooling around with new APIs all the time. Each time Microsoft releases a new OS, they push out a bunch of new APIs that take years to catch on (if ever). I can't speak for the Mac side, but I suspect it's the same. How many apps that users fire up every day actually use CoreImage, for example?

But here's the trick... we all must upgrade eventually if we want to keep buying new stuff for our computers. Apple sunsets older versions of MacOS faster than you can say "obsolescence." Hey, with a fanbase that will snap up 2 million copies of Leopard at retail in the first weekend, why would you keep legacy OSes on support?

Microsoft puts in a bit more time -- 7 years, I think -- but you're still on the road to an upgrade much earlier than that. For example, are you going to buy a camera, a printer, a hard drive, a DVD writer? Sometime soon you won't be able to get XP drivers for those. Or are you going to buy a game? Drivers aren't the only culprit. Most software vendors won't support XP at some point -- probably before it gets officially sunset by Microsoft.

This is the part of the scam that gets me confused when people say things like "I'm not going to upgrade to Vista." Really. Then what will you do? Will you never buy a PC again and never install or buy new software/hardware? Or will you switch to Mac, which provides the exact same hamster wheel of paid upgrades except more often?

If you really don't want to pay upgrade costs, maybe the only way out is Linux. But if you buy your camera, hard drive, DVD writer, printer, etc. for Linux, be prepared to write the driver yourself.

Sunday, October 28, 2007

Apple back to the good old days... of crashing?

A friend just emailed to say he was one of the people who ran into the disk corruption problem in Leopard. That sucks.

I never even think of something like that happening with an OS upgrade. Complete disk corruption!? How does that happen? OS upgrades are just supposed to swap out files in your "System" or "Windows" folders, right?

In my estimation, an OS upgrade should be a slam dunk for Apple. They have complete control over the hardware they're shipping to, so they don't have to worry about crazy driver conflicts. Plus there just aren't that many apps in use for their OS -- especially the customized business apps that Windows-centric businesses develop -- so they can QA a lot more easily against those apps. Given the Mac demographic, I would guess that the top 50 or 100 apps installed on MacOS represent some massive percentage of the apps installed anywhere.

If Apple is back to the old days, where MacOS was a crashy, aesthetically pleasing OS that everyone tolerated just because it was better than Windows 3.1, that would really suck for them. MacOS X -- based on my once-beloved NeXTSTEP -- is supposed to be the OS that Windows aspires to be. If it can't do the basics, like, for example, not corrupt people's hard drives, then I think they're headed back to that old reputation.

By the way, this is the third major PR blunder by Apple this year (by my count). How many times are Mac fans going to give these guys a free pass until customers start taking their money elsewhere?

Saturday, October 27, 2007

BootCamp bait and switch

Last night, around 11pm, I was about ready to leave a flaming bag of dogshit on Steve Jobs' doorstep.

See, Apple programmed BootCamp to remove itself from your Mac on the day that Leopard shipped. That's right, it just disappeared from the "Startup Disk" menu in MacOS X. I noticed this when I was getting tired of trying to print an airplane boarding pass in MacOS for the third time and having it screw up because Macs apparently don't ship with decent HP deskjet drivers and I guess I need a real operating system for such an arduous printing task.

Thanks for the heads-up there, Apple. Way to warn someone that their software is going to delete itself. Hey, at least Microsoft's stealth Windows update didn't delete functionality.

I searched around on the web and it turns out that you can still boot into your BootCamp partition, but you have to hold the Option key to do it. You can't make any kind of adjustments to BootCamp with the BootCamp manager. Apparently I have to drop $130 on Leopard if I ever want to change my BootCamp partition, even if I don't intend to ever boot into MacOS.

I'm not going to argue that Apple can never stop giving away BootCamp and start charging for it. That's fine. The problem here is that everything just disappeared on the day their new OS launched, without any warning or "hey, this is going away, do you want to set it to boot into Windows so you can get what you need?".

This is just one of the many ways in which MacOS X is so non-user friendly that I'm astonished when people say it's more user friendly than Windows. Can you imagine if this had happened to someone who was not technically savvy?

Anyway, Microsoft doesn't get completely off the hook in this post. After seeing that BootCamp was going to require an upgrade to MacOS X, I wanted to wipe the entire MacOS partition, install Vista on it, move the existing Vista data over, then reformat the other partition of the drive for data. Sounds simple, right?

Bzzzt.

Enter Microsoft's draconian licensing. I can't install the same license of Vista twice on the same machine. It actually tells me "You already have a partition with this license, boot into that to upgrade." Yeah, thanks for the tip, idiots -- I'd do that if it weren't for freakin' BOOTCAMP!

Maybe I should buy Leopard after all and dump Vista, not the other way around. I want some of that CoverFlow coolness if I need to drop $130 to keep BootCamp going.

Monday, October 22, 2007

Buyer's Remorse

Most buyer's remorse I've had has come with the purchase of expensive gadgetry. But I've figured out something about buyer's remorse lately: I only experience it when the thing I buy isn't directly used in something I do creatively -- and isn't necessary to achieve that level of creativity.

For example, my new camera. It was expensive. Buyer's remorse? Zero. I knew I needed to take better pictures of my family and this was what I needed to buy to do it.

Another example: Xbox 360. Buyer's remorse? Maybe a little. I barely use the thing. I like it a lot, but when I play it I feel like I'm wasting time with no upside for doing it.

Probably the items I've felt the most buyer's remorse about are the Apple products I've bought over the years. My Centris 650, which I spent $4500 on back in 1992. My Powerbook G3 -- $3100 in 1999. An iPod I bought and returned a few years ago. I have always ended up feeling ripped off when I've bought Apple equipment, because I know I can get the same job done with a cheaper PC solution.

Maybe I'm a weirdo because I don't think MacOS X helps creativity on the computer. That's why I end up having buyer's remorse about it. But after this thought experiment about remorse, I can see why some people wouldn't have it even when they buy a really expensive Mac. If you're a Final Cut Pro guy, it's the only way you can get your creative tool. So that makes sense.

BTW, when Bubble 2.0 pops, Apple is going down too: all of these dot-coms that have been buying their people Apple equipment will stop. I know that Apple needs to be shorted, especially since their stock had a $12 pop after hours today. The problem is the "when". Usually the best time to short a stock is the moment when I, having thought it was overhyped all along, finally conclude it's time to buy. I'm not there yet, so hold on a while, shorties.

Sunday, October 21, 2007

Why Would You Shoot Film Anymore? (Also: Always Shoot Your Digital Pictures in RAW.)

I just got a Canon 40D and downloaded a free trial of Adobe Lightroom.  The combination is amazing.  I snap a photo and it has flaws -- then I bring it into Lightroom and make it beautiful.  My camera shoots in 14-bit color (16,384 tonal levels per channel, versus 256 in an 8-bit JPEG), so a ton of detail is recoverable from under- and over-exposed photos.

Meanwhile, I have massive binders and cases full of slides that I shot from 1999 to 2001, before I bought my D30, when I was really getting into photography.  Many have exposure problems, composition problems, etc.  What the hell do I do with all of these?  I'd love to get them into the computer and work on them there, but scanning slides is expensive and I don't care to buy a scanner to do it myself.

So, I have to ask, why would anyone shoot film anymore?  What possible gain would you find in doing it?

By shooting film you:

  • Can't take more than a few dozen pictures without changing rolls.
  • Can't check your work/exposure/focus on the spot.
  • Have less ability to modify the image after the fact (unless scanned).
  • Have no record of your film settings unless you buy an extra data recorder.
  • End up having to store photos physically, as well as rely on analog processes to distribute your photos.
  • Oh yeah, you process dozens of rolls of film using a bunch of chemicals.  Like THAT's good for the environment.

Here's the kicker: it's not even cheaper!  Seven years after Canon released the first reasonably priced digital SLR body (the Canon D30), Canon is still charging ~$1700 for an EOS 1-V!  Newlab, here in San Francisco, still charges $10 to develop and mount a roll of E-6 slide film.  Usually, the old and busted technology is supposed to get cheaper.

I snapped off 250 photos this weekend -- most end up getting deleted.  If I had done that on my film camera it would have cost me $70.  Wow.. that much cash for photos that mostly aren't keepers.  I wish I had gotten into photography AFTER the digital revolution.

By the way, I include motion picture companies in this question about shooting on film.  Why on earth shoot film instead of using one of the several digital cinematography solutions that are out there?   I can understand distributing on film, since the infrastructure is there.  But shooting on it?  Unless you have special needs like high frame rates, why?!

Getting to the meaningful part of this post:  I just want to advise all digital photographers out there --  DSLR and point and shoot alike -- if you take this stuff seriously, shoot RAW.  Reason being, in the future you can take old RAW images and put them through improved software for even better results than you see now. 

Years ago, I almost always shot RAW on my Canon D30.  I'm so glad I did.  Pulling these into Lightroom gives me the ability to adjust imagery more easily than I've ever been able to before.  At the time I shot these pictures, Photoshop barely supported 16-bit.  Every image had to be laboriously converted from RAW to linear TIFF, then run through Photoshop Curves, and so on.  Just getting basic color work done was a slow, painful process reserved only for the best photos of the bunch.

Lightroom also has filters that weren't straightforward in Photoshop, like removal of chromatic aberration.  Why buy an expensive lens if you can remove artifacts like that digitally?!  (This question is a bit sarcastic, but for some people, that solution may make a lot of sense instead of buying really expensive lenses.)

In any case:  shoot RAW and get software that makes it easy to manipulate RAW data.  There are a few point and shoots that can do it.  If you plan on keeping your digital photos forever, it's worth it.

Thursday, October 18, 2007

How to Get Foreclosed On

The last couple of days have seen a lot of articles about the plight of the homeowner in trouble and the irresponsible lenders who got us there. Yesterday's Wall Street Journal had a pretty good article about the paper trail of one guy's mortgage as it was sold from company to company and then repackaged into trusts and whatnot. The article was called "Behind Subprime Woes, A Cascade of Bad Bets". Today's WSJ had an article about real estate speculators walking away from mortgages and losing their primary home in the process.

So let me give a bit of advice to homeowners who think they might be foreclosed on sometime soon: never have any equity.

You see, the bank just wants its money back, and it has the right to get paid first in a foreclosure. So if you owe $200K on an $800K home, foreclosing on you is more likely to get back all the loaned money than foreclosing on someone who owes $700K on an $800K home. Now if these two homeowners live in an area where that $800K home has dropped to $600K, you can bet your ass the bank is going after the guy who only owes $200K first. Otherwise they'd be selling the home of the guy who owes $700K at a loss, and that's $100K they can never recover after the house is sold.

So if you have an interest only ARM right now with zero equity and you can't make your payments, congratulations! You will lose nothing except a home you don't actually own. You're smarter than most people who were silly enough to get any equity through a down payment.

Maybe I'm the only idiot who doesn't see why borrowing to buy into an inflated market is a good thing, because basically every financial crisis goes something like this:
  1. Something (call it a Widget) appears to be valuable
  2. Widgets become a hotly traded item, raising the price
  3. People start borrowing additional money to buy Widgets
  4. Widgets keep going up
  5. Lenders then allow people to borrow money in structures that assume the price of Widgets will never drop
  6. Surprise, the Widget price drops a little
  7. Borrowed money gets called in, economy goes south, or people just stop buying Widgets
  8. Panic selling begins. Widget price plummets and becomes illiquid.

That's every panic in a nutshell:

  • Stock Market Crash of 1929: extreme buying on margin, small drop triggers margin calls.
  • 1987: "Portfolio insurance" combined with computer trading triggered step #7.
  • Bubble 1.0
  • Beanie Babies
  • Tulip Mania

The collapse of Enron is very similar to what's happening with mortgages. Enron's quarterly earnings were shuffled into structured investments called "Raptors". The Raptors depended on Enron stock continuing to go up in order to function. When Enron stock dropped, the whole thing came crashing down.

So the next time you wonder, "How can I avoid ever losing money?" The answer is to never have any. The government, the banks ... they'll just take it from you if you ever falter, and borrowing gives you leverage to have the cool things you've always wanted. It's best to never really own anything.

If you have equity, borrow against it to buy more gadgets*.

You'd think I'm being sarcastic, but unfortunately that seems to be the way it works.

* - Actually, did you know that something like 30% of new US debt the past few years was money loaned by China? So we manufacture our MacBooks and gadgets in their country, then they loan the money back to us so we can buy those toys. What a racket.

Sunday, October 14, 2007

I stand corrected: Kubrick did NOT compose his later films for 1.33

Contrary to what I've been led to believe, Stanley Kubrick did not compose his post-1960 (post-Spartacus) movies for the 1.33 Academy aspect ratio. In fact, it turns out the screenshots I took from Eyes Wide Shut in my earlier post on the topic of HD aspect ratios had been recomposed so a standard 1.66 crop worked in 1.33 for full frame.

So, to give this discussion some technical background, and in an effort to make this blog informative rather than just opinionated, let's diagram what "Academy" looks like on a frame of film.

Academy is a 4:3 film format specified when sound entered the picture, so to speak. Prior to the Talkies, most of the area of a 35mm film frame was used for the motion picture image (threaded vertically, of course, as opposed to horizontally like in your home still camera). Each frame is as tall as 4 of the perforations you see on either side of the film -- the holes the sprockets fit into to move the film along.

[Diagram: a 35mm frame showing the optical soundtrack (yellow), the Academy 1.33 area (gray), the full 'Scope area (blue tint), and the 1.85 crop (pink).]
When Talkies appeared, they needed some way to get the sound matched to the picture in theaters. The obvious way to do this was to put the soundtrack onto the film optically. Until the advent of digital sound in theaters, this was done by printing the sound onto the film to the left of the movie's image (the yellow area in the diagram to the right). The centered 4:3 area to the right of that soundtrack was called "Academy". Most films done before the Widescreen gimmick took off in the 60s were filmed and projected in Academy. Academy 1.33 is the gray area in the diagram you see here. If I recall correctly, Academy lenses are slightly offset to account for the change in perspective.


Most early mainstream widescreen films like Lawrence of Arabia and Spartacus were filmed on 70mm film, which gave a 2.2 aspect ratio. These films could only be distributed on 35mm prints by squeezing the wider aspect into the additional frame area above and below Academy, represented by the bluish tinted rectangle. This distribution format is and was called Anamorphic, Cinemascope, or just 'Scope. It requires using an anamorphic lens on the projector as well to get the full widescreen.

However, shooting on 70mm (or 65mm) is expensive and inconvenient. The last mainstream feature film I know of to do principal photography on 65mm was Far and Away, 15 years ago. So directors wanted a way to get that cool 2.2 widescreen while still using 35mm film. Enter Cinemascope (again) -- now Panavision: anamorphic lenses that give you the same squeeze-into-Academy-width you'd get from a 70mm transfer, except during principal photography instead of during a transfer.

The problem? Well... the lenses are kinda wonky. For example, the bokeh on an anamorphic lens makes out of focus circles look like ovals, and a rack focus will horizontally distort subjects. Lens reflections have a distinct "Panavision" look -- fire up Knoll Lens Flare or a copy of Close Encounters on DVD for examples of that.
[Side note: personally, I prefer principal photography with anamorphic lenses because the end result has less film grain and no intermediate printing tricks while still being 2.2 aspect. I've had a much easier time working on films shot anamorphically. Oh, and I grew up on the Panavision look so I guess it's nostalgic.]

That brings us to the pink part of the diagram above: 1.85 academy. With cinematographers out there who don't like anamorphic lenses, you need a way to do widescreen with spherical lenses. So here's what you do: shoot your movie in Academy and have the film house put a 1.85 mask on their projector.

Which brings us, finally, to Mr. Kubrick.

The debate is whether his films were supposed to have that 1.85 mask or not for the "correct" composition of the frame. My understanding was they were not. I believed that actually the theatrical release of films like Eyes Wide Shut was manipulated the other way: to make the 1.33 composition work on a 1.85 or 1.66 screen. And the DVD backs that up.

But today I was watching The Shining on my Xbox 360 and complaining about how the Xbox scales full frame DVDs to the full 16:9, and how that can't be turned off without changing the whole Xbox config. This made me hyper-aware of the composition, and after the movie was done, I flipped my Xbox into a mode that forced 4:3, expecting to feel vindicated. Instead, I wondered if maybe the movie was supposed to be 1.85 after all.

The shot that convinced me was this shot of Scatman Crothers. Here it is in full frame.


Note that there is way too much space above his head. I pondered this for a while, then took a look at some other scenes from the movie.



Again, very suspicious composition for a film that's supposed to be 4:3.

I searched the internet and found a very interesting storyboard from the movie, and a thread related to this. Click here and scroll down to "Kubrick Archives".

I just went to the trouble to crop these shots for 1.85, and guess what, they look great:
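If you want to try the same crop on your own frame grabs, the arithmetic is trivial: keep the full width and chop the height down to the width divided by 1.85, roughly centered. Here's a quick sketch with the Python Imaging Library (filenames hypothetical, and the vertical placement of the crop is a judgment call, not gospel):

```python
# Crop a 4:3 frame grab to 1.85:1, keeping the full width.
from PIL import Image

im = Image.open("shining_fullframe.png")      # hypothetical frame grab
width, height = im.size
new_height = int(width / 1.85)                # 1.85:1 from the full width
top = (height - new_height) // 2              # centered crop; nudge to taste
im.crop((0, top, width, top + new_height)).save("shining_185.png")
```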

So there you have it, the master filmmaker did not compose The Shining for 1.33. As others have said if you Google around for this, he actually only requested his films be transferred Full Academy to video. Now that video is becoming a 16:9 format, I guess that means we can start seeing Kubrick's films as they were actually intended to be seen. And I'll have to stop going around saying that 4:3 is the ratio of the gods and that Kubrick used it, yada yada yada.
Only two things confuse me. First off, when I watched Full Metal Jacket in HD, the composition did feel off and I thought that it was because of Kubrick's 1.33 thing. I guess I'll have to watch it again.
Second thing is, did this guy's video ever get to Stanley Kubrick? What did Kubrick think?

Saturday, October 13, 2007

The Nannification of America and How It Relates To College Football

We've become a nanny state, but not because of our government.

Ever notice how everyone around college athletics calls college players "kids?" The Oklahoma State coach's rant from a couple weeks ago is the best case in point:






He uses the words "kid" and "child" repeatedly to describe the player on his team who was written about in the newspaper article. Calling them "kids" is partly because of the aforementioned nannification; the other part is more insidious, and I'll get to that last, since it's the main point of this post.

Let's start with the obvious fact that not all college football players are 18 years old. Chris Weinke was 27 when Florida State won their last national title. He had skipped college to try to play baseball; when that went nowhere, he went back and played college ball. Yet if you google for "Chris Weinke kid", you'll get about 4,000 hits.

Even if they're 18, what's up with this "kid" label? 18 year olds can go to war. They can vote. They are tried as adults in court and have passed the age of consent for sex.

I assert that a large part of this is nannification. Our society's parents have started coddling their children until they are long past the age where they should be considered adults. We let them live at home until 30. We buy them houses while they're in college, or after that. Or, in the case of this player at Oklahoma State, his mother was hand-feeding him chicken (which is what the newspaper article mentioned as the incident that made other players think that guy was a bit childish).

What drives the nannification? Is it because college has become the new high school? Almost everyone goes to college now, and because everyone goes, fewer people leave with real job skills or a higher education than before. That might be part of it.

What really pushes the nannification of America, I suspect, is commercialism. Advertisers push the idea that adults are supposed to live carefree like they're 8 year olds. Watch any commercial, ever, and this is almost always the angle that makes the product appealing to adults. Eventually this is in the public consciousness and parents begin believing their adult children should live this way.

Marketers are doing this because their objective is for adults to have disposable income for as long as possible. They have created a generation gap of disposable income that wasn't there when people used to go off and have children at 18 or 20 years old. Makes sense; after all, what's sexier economically: a 25-year-old who rents a shoebox apartment, spends all his cash on Xbox, football and beer, and accrues lots of debt, or a sober 25-year-old who is paying a low-rate mortgage in the suburbs and raising a family with little debt? I don't blame marketers for targeting the former; now it's up to the public not to fall into that trap.

Which brings us to the insidious part of college sports pundits always using the word "kid" to describe a college player. This is again a marketing ploy. The reason they do it is to drive home the concept that collegiate athletics are amateur sports. They don't want you to think of college sports as a professional endeavor when, in fact, the two are almost indistinguishable. College sports are no different than the pros in all respects except one... guess which one:


  • Team owners get paid millions from TV rights and merchandising

  • Coaches get paid millions

  • Fans spend millions on tickets and merchandise

  • Millions gets spent on the stadium, parking, etc.
  • Players get paid a salary.


If you guessed #5 is the difference, you're technically correct. Players don't get paid in college sports. We all know that boosters take care of these guys while they're in college. And even without those extra ... ahem ... perks, they get free room and board, a free education, and get to be BMOC.

But they don't get paid competitive salaries, they don't get to be free agents, and they don't get to negotiate their pay with their bosses. If a college player plays for the best team, he gets no better treatment than he would on the worst team, except for the promise of a pro career later. That is the smokescreen subversively driven home by the term "kids". The perception you're supposed to get is exactly that: they're just kids, going to school for an education, playing for the love of the game.

In truth, the entire pipeline of college football players is artificially created. The NFL bans players until they are roughly 21 (three years out of high school), so they have no choice but to be part of this charade. This didn't use to be the case with pro basketball, but the NBA has now instituted a rule that players have to be a year out of high school before joining the pros. Baseball and hockey are the only true market-based major sports, where players can join when they're 18.

And while college players begrudgingly have to be part of this, the guys in the front office are making insane money. Charlie Weis, coach of Notre Dame, makes between $3m and $4m a year. You know how much Notre Dame's TV contract with NBC to air 6 ND home games is worth? At least $9m a year. And that's one team! There are 118 other teams in Division I-A football, and most of the TV money is shared at the conference level. Then there are bowl games. The Rose Bowl pays about $15m to the winning school.

One player tried to buck the trend and was quickly shut down: Maurice Clarett. He tried to go pro after just one year in college and spent another year unsuccessfully challenging the NFL in court. Now he's in jail. I'm not saying he's a good guy, but his story will deter many generations to come from challenging the NFL's rule, thus keeping the "kids" in the midst of the scam of unpaid college athletics. Don't forget, it's all really about the academics.

Anyway, just a few thoughts for you the next time you hear someone call college players "kids." Years of listening to it have made it so ingrained in my brain that I find myself slipping sometimes.... hopefully the annoyance I've felt while writing this will set me straight.

Monday, October 08, 2007

Once you buy non-DRMed music, you don't go back...

I've bought a bunch of songs on Amazon's new MP3 site in the past couple weeks. They load right up into Windows Media Player. They play on my Xbox. I even loaded them onto an iPod I had lying around so I could listen to them in the car.

So today I went to Starbucks, and there were stacks of free iTunes cards there. You can get a free track from iTunes by entering the code on the card. I looked at that and wondered... why? Why bother with free DRMed tracks that I have to use iTunes to listen to? I'd rather pay for these tracks to get them without DRM than get them for free with DRM.

Freedom to use the content you buy... what a concept! It's like we're back in the 1900s again. The music companies would have never gotten in this situation if they had just opened things up much earlier. Most people are honest (in the US, at least). I would have been buying MP3s online as early as 1996 if they had moved into this area earlier.

But let's bring up a major idiocy of the music business at the moment. Forget the RIAA tactics, suing little old ladies whose kids download some tracks and all of that... hasn't anyone in the music business ... anyone... heard of the long tail? Cripes. There are a ton of singles that I want to buy that are not available for downloading. And no I don't want vinyl. And no I don't want a collection of 12 lame songs on a CD to get the one good song. I just want the one song for $1.

Guys, just digitize that stuff! If it's a good song, the only way to make money is availability. Plus, you never know when someone will use it in a movie or something and make it into a huge hit again.

And by the way, how long do people predict it will take until movies are in the same boat as music is today? 5 years? 10 years? 20 years? Why are we still paying $15 for downloads of movies with DRM, for example? The biggest puzzler is, why do we pay to rent movies online? There's no rental; we're not borrowing anything. There's no video shop that paid $200 for a VHS tape to loan to people in the neighborhood. We are getting the exact same compressed audio and video that we would buy, except that a mechanism is in place that artificially allows us to watch it only once.

Yeah, that business won't last. Riddle me this: what if booksellers today tried to start renting books, or musicians tried renting their songs? You'd laugh in their faces. The only reason it does well at all is that the movie business has brainwashed you: for 80 years, movie theaters were the only place you could see these things. Then VHS rentals and DVDs continued the trend of "pay per view". How can the rental idea continue when there's no longer any difference in what you're actually getting?

One more thing... isn't it obvious that Apple has no intention of dropping DRM, given that their entire video library is still DRMed? If Jobs were at all serious about dropping DRM, he could sway at least Disney and ABC to do it on iTunes. He's Disney's largest shareholder and a member of the board, not to mention messiah of the iPod and iTunes. You don't think they'd listen to him?