Wednesday, October 31, 2007

Leopard vs. Vista... both are a scam

This week spawned yet more articles about MacOS X "Leopard" vs. Vista. Here's the bottom line: it doesn't matter, both are a scam.

Neither of these upgrades provides "must have" features to the end user that couldn't be easily written by a third party. Time Machine? Free backup programs come with extra hard drives you buy. Vista Search? We've got Google Desktop. This trend started a long time ago. XP didn't do much over Windows 2000. Neither did Tiger over Panther, or whatever feline genus came before that.

The problem is, OS upgrades generally don't matter to users anymore -- only developers. Remember the heady days of upgrading from Windows 95 to Windows NT4 or 2000 and having protected memory (or MacOS 9 to MacOS X)? Or how about upgrading from System 6 to System 7 and getting multitasking?

Those days are gone. Today, what are we paying for? More transparency? A search index?

I think we're mostly paying for the right of the OS developer to correct past mistakes that are more than skin deep. But how does that benefit the user? If an OS works, and we're not developers, and we like the apps we've bought, why should we care if the OS changes things for developers?

In the past, changes for developers made more of an impact than they do now. Has .NET 3.0 given us so many great Windows apps that weren't there before that it was worth the cost of Vista? Has Objective-C 2.0 for Leopard? If the APIs had stayed the same, we would probably have better apps, because people wouldn't be fooling around with new APIs all the time. Each time Microsoft releases a new OS, they push out a bunch of new APIs that take years to catch on (if ever). I can't speak for the Mac side, but I suspect it's the same. How many apps that users fire up every day actually use CoreImage, for example?

But here's the trick... we all must upgrade eventually if we want to keep buying new stuff for our computers. Apple sunsets older versions of MacOS faster than you can say "obsolescence." Hey, with a fanbase that will snap up 2 million copies of Leopard at retail in the first weekend, why would you keep legacy OSes on support?

Microsoft puts in a bit more time -- 7 years, I think -- but you're still on the road to an upgrade much earlier than that. For example, are you going to buy a camera, a printer, a hard drive, a DVD writer? Sometime soon you won't be able to get XP drivers for those. Or are you going to buy a game? Drivers aren't the only culprit. At some point most software vendors will stop supporting XP -- probably before it gets officially sunset by Microsoft.

This is the part of the scam that confuses me when people say things like "I'm not going to upgrade to Vista." Really? Then what will you do? Will you never buy a PC again and never install or buy new software/hardware? Or will you switch to Mac, which provides the exact same hamster wheel of paid upgrades, except more often?

If you really don't want to pay upgrade costs, maybe the only way out is Linux. But if you buy your camera, hard drive, DVD writer, printer, etc. for Linux, be prepared to write the driver yourself.

Sunday, October 28, 2007

Apple back to the good old days... of crashing?

A friend just mailed me to say he was one of the people who ran into the disk corruption problem in Leopard. That sucks.

I never even thought of something like that happening with an OS upgrade. Complete disk corruption!? How does that happen? OS upgrades are just supposed to swap out files in your "System" or "Windows" folders, right?

In my estimation, an OS upgrade should be a slam dunk for Apple. They have complete control over the hardware they're shipping to, so they don't have to worry about crazy driver conflicts. Plus there just aren't that many apps in use for their OS -- especially the customized business apps that Windows-centric businesses develop -- so they can QA a lot more easily against those apps. Given the Mac demographic, I would guess that the top 50 or 100 apps installed on MacOS represent some massive percentage of the apps installed anywhere.

If Apple is back to the old days, when MacOS was a crashy, aesthetically pleasing OS that everyone tolerated just because it was better than Windows 3.1, that would really suck for them. MacOS X -- based on my once-beloved NeXTSTEP -- is supposed to be the OS that Windows aspires to be. If it can't do the basics, like, for example, not corrupting people's hard drives, then I think they're headed back to that old reputation.

By the way, this is the third major PR blunder by Apple this year (by my count). How many times are Mac fans going to give these guys a free pass until customers start taking their money elsewhere?

Saturday, October 27, 2007

BootCamp bait and switch

Last night, around 11pm, I was about ready to leave a flaming bag of dogshit on Steve Jobs' doorstep.

See, Apple programmed BootCamp to remove itself from your Mac on the day that Leopard shipped. That's right, it just disappeared from the "Startup Disk" menu in MacOS X. I noticed this when I was getting tired of trying to print an airplane boarding pass in MacOS for the third time and having it screw up because Macs apparently don't ship with decent HP deskjet drivers and I guess I need a real operating system for such an arduous printing task.

Thanks for the heads-up there, Apple. Way to warn someone that their software is going to delete itself. Hey, at least Microsoft's stealth Windows update didn't delete functionality.

I searched around on the web, and it turns out that you can still boot into your BootCamp partition, but you have to hold the option key to do it. You can no longer make any adjustments to BootCamp with the BootCamp manager. Apparently I have to drop $130 on Leopard if I ever want to change my BootCamp partition, even if I don't intend to ever boot into MacOS.

I'm not going to argue that Apple can never stop giving away BootCamp and start charging for it. That's fine. The problem here is that everything just disappeared on the day their new OS launched, without any warning or "hey, this is going away, do you want to set it to boot into Windows so you can get what you need?"

This is just one of the many ways in which MacOS X is so non-user friendly that I'm astonished when people say it's more user friendly than Windows. Can you imagine if this had happened to someone who was not technically savvy?

Anyway, Microsoft doesn't get completely off the hook in this post. After seeing that BootCamp was going to require an upgrade to MacOS X, I wanted to wipe the entire MacOS partition, install Vista on it, move my existing Vista data over, then reformat the other partition of the drive for data. Sounds simple, right?

Bzzzt.

Enter Microsoft's draconian licensing. I can't install the same license of Vista twice on the same machine. It actually tells me, "You already have a partition with this license, boot into that to upgrade." Yeah, thanks for the tip, idiots, I'd do that if it wasn't for freakin' BOOTCAMP!

Maybe I should buy Leopard after all and dump Vista, not the other way around. I want some of that CoverFlow coolness if I need to drop $130 to keep BootCamp going.

Monday, October 22, 2007

Buyer's Remorse

Most of the buyer's remorse I've had has come with the purchase of expensive gadgetry. But I've figured out something about buyer's remorse lately: I only experience it when the thing I buy isn't directly used in something I do creatively -- and isn't necessary to achieve that level of creativity.

For example, my new camera. It was expensive. Buyer's remorse? Zero. I knew I needed to take better pictures of my family and this was what I needed to buy to do it.

Another example: Xbox 360. Buyer's remorse? Maybe a little. I barely use the thing. I like it a lot, but when I play it I feel like I'm wasting time with no upside for doing it.

Probably the items I've felt the most buyer's remorse about are all the Apple products I've ever bought. My Centris 650, which I spent $4500 on back in 1992. My Powerbook G3 -- $3100 in 1999. An iPod I bought and returned a few years ago. I have always ended up with that feeling of being ripped off when I've bought Apple equipment, because I know I can get the same job done with a cheaper PC solution.

Maybe I'm a weirdo because I don't think MacOS X helps creativity on the computer. That's why I end up having buyer's remorse about it. But after this thought experiment about remorse, I can see why some people would not have it even when they buy a really expensive Mac. If you're a Final Cut Pro guy, it's the only way you can get your creative tool. So that makes sense.

BTW, when Bubble 2.0 pops, Apple is going down too: all of these dot-coms that have been buying their people Apple equipment will stop. I know that Apple needs to be shorted, especially since their stock had a $12 pop after hours today. The problem is "when." Usually the best time to short a stock I've thought was overhyped all along is the moment I finally conclude it's time to buy. I'm not there yet, so hold on a while, shorties.

Sunday, October 21, 2007

Why Would You Shoot Film Anymore? (Also: Always Shoot Your Digital Pictures in RAW.)

I just got a Canon 40D and downloaded a free trial of Adobe Lightroom.   The combination is amazing.  I snap a photo and it has flaws -- then I bring it into Lightroom and make it beautiful.  My camera shoots in 14-bit color, so a ton of detail is recoverable from under- and over- exposed photos.
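
To put a rough number on that recoverability claim, here's a back-of-the-envelope sketch -- my own illustration, assuming an idealized linear sensor and ignoring gamma encoding, noise, and demosaicing -- of how many distinct shadow values an 8-bit file and a 14-bit raw file give you to push around:

    # Back-of-the-envelope sketch: why 14-bit raw leaves room to push exposure.
    # Assumes a linear response and ignores gamma encoding and noise -- it is
    # only meant to illustrate quantization headroom, not a real raw pipeline.
    import numpy as np

    def distinct_levels_after_push(bit_depth, stops, shadow_fraction=0.05):
        """Count distinct code values left in the darkest `shadow_fraction`
        of the tonal range after pushing exposure by `stops` stops."""
        max_code = 2 ** bit_depth - 1
        scene = np.linspace(0, shadow_fraction, 100_000)   # smooth shadow gradient
        quantized = np.round(scene * max_code)              # what the file stores
        pushed = np.clip(quantized * (2 ** stops), 0, max_code)  # +N stop push
        return len(np.unique(pushed))

    for bits in (8, 14):
        print(f"{bits}-bit file, +2 stop push: "
              f"{distinct_levels_after_push(bits, 2)} distinct shadow levels")
    # -> the 8-bit file has only a handful of codes to stretch; the 14-bit raw
    #    has hundreds, which is why pushed shadows don't posterize.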

Meanwhile, I have massive binders and many cases of slides that I shot from 1999 to 2001, before I bought my D30 and when I was really getting into photography. Many have exposure problems, composition problems, etc. What the hell do I do with all of these? I'd love to get them into the computer and work on them there, but scanning slides is expensive and I don't care to buy a scanner to do it myself.

So, I have to ask, why would anyone shoot film anymore?  What possible gain would you find in doing it?

By shooting film you:

  • Can't take more than a few dozen pictures without changing rolls.
  • Can't check your work/exposure/focus on the spot.
  • Have less ability to modify the image after the fact (unless scanned).
  • Have no record of your film settings unless you buy an extra data recorder.
  • End up having to store photos physically, as well as rely on analog processes to distribute your photos.
  • Oh yeah, you process dozens of rolls of film using a bunch of chemicals.  Like THAT's good for the environment.

Here's the kicker: it's not even cheaper! Seven years after Canon released the first reasonably priced digital SLR body (the Canon D30), Canon is still charging ~$1700 for an EOS 1-V! Newlab, here in San Francisco, still charges $10 to develop and mount a roll of E-6 slide film. Usually, the old and busted technology should get cheaper.

I snapped off 250 photos this weekend -- most end up getting deleted. If I had done that on my film camera, it would have cost me about $70. Wow... that much cash for photos that mostly aren't keepers. I wish I had gotten into photography AFTER the digital revolution.
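
Here's the quick math behind that $70, using the numbers from this post ($10/roll E-6 processing at Newlab, 36 exposures per roll). Film purchase price is left out, so this is really a lower bound:

    # Rough sketch of what this weekend would have cost on slide film.
    import math

    shots = 250
    exposures_per_roll = 36
    processing_per_roll = 10  # USD, E-6 develop + mount

    rolls = math.ceil(shots / exposures_per_roll)
    print(f"{rolls} rolls, ~${rolls * processing_per_roll} just for processing")
    # -> 7 rolls, ~$70 just for processing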

By the way, I include motion picture companies in this question about shooting on film.  Why on earth shoot film instead of using one of the several digital cinematography solutions that are out there?   I can understand distributing on film, since the infrastructure is there.  But shooting on it?  Unless you have special needs like high frame rates, why?!

Getting to the meaningful part of this post: I just want to advise all digital photographers out there -- DSLR and point-and-shoot alike -- if you take this stuff seriously, shoot RAW. The reason: in the future, you can take old RAW images and put them through improved software for even better results than you see now.

Years ago, I almost always shot RAW on my Canon D30. I'm so glad I did. Pulling these into Lightroom gives me the ability to adjust imagery more easily than I've ever been able to before. At the time I shot these pictures, Photoshop barely had 16-bit support. So every image had to be laboriously converted from RAW to linear TIFF, then adjusted with Photoshop Curves, and so on. Just to get some basic color work done was a slow, painful process reserved only for the best photos of the bunch.

Lightroom also has filters that weren't straightforward in Photoshop, like removal of chromatic aberration.  Why buy an expensive lens if you can remove artifacts like that digitally?!  (This question is a bit sarcastic, but for some people, that solution may make a lot of sense instead of buying really expensive lenses.)

In any case:  shoot RAW and get software that makes it easy to manipulate RAW data.  There are a few point and shoots that can do it.  If you plan on keeping your digital photos forever, it's worth it.

Thursday, October 18, 2007

How to Get Foreclosed On

The last couple of days have seen a lot of articles about the plight of the homeowner in trouble and the irresponsible lenders who got us there. Yesterday's Wall Street Journal had a pretty good article about the paper trail of one guy's mortgage as it was sold from company to company and then repackaged into trusts and whatnot. The article was called "Behind Subprime Woes, A Cascade of Bad Bets". Today's WSJ had an article about real estate speculators walking away from mortgages and losing their primary home in the process.

So let me give a bit of advice to homeowners who think they might be foreclosed on sometime soon: never have any equity.

You see, the bank just wants their money back, and they have the right to get paid first in a foreclosure. So if you owe $200K on an $800K home, foreclosing on you is more likely to get back all the loaned money than foreclosing on someone who owes $700K on an $800K home. Now, if these two homeowners live in an area where that $800K home has dropped to $600K, you can bet your ass the bank is going after the guy who only owes $200K first. Otherwise they'd be selling the home of the guy who owes $700K at a loss, and that's $100K they can never recover after the house is sold.
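
To make the bank's incentive concrete, here's a toy calculation of the two scenarios above -- my own illustration, ignoring foreclosure costs, legal fees, and second liens:

    # Toy sketch of the bank's recovery in each scenario from the paragraph above.
    def bank_recovery(loan_balance, sale_price):
        recovered = min(loan_balance, sale_price)        # bank gets paid first
        shortfall = loan_balance - recovered             # loss the bank eats
        owner_equity = max(sale_price - loan_balance, 0) # surplus goes to owner
        return recovered, shortfall, owner_equity

    for balance in (200_000, 700_000):
        recovered, shortfall, equity = bank_recovery(balance, sale_price=600_000)
        print(f"owes ${balance:,}: bank recovers ${recovered:,}, "
              f"eats ${shortfall:,}, owner keeps ${equity:,}")
    # owes $200,000: bank recovers $200,000, eats $0, owner keeps $400,000
    # owes $700,000: bank recovers $600,000, eats $100,000, owner keeps $0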

So if you have an interest only ARM right now with zero equity and you can't make your payments, congratulations! You will lose nothing except a home you don't actually own. You're smarter than most people who were silly enough to get any equity through a down payment.

Maybe I'm the only idiot who doesn't see why borrowing to buy into an inflated market is a good thing, because basically every financial crisis goes something like this:
  1. Something (call it a Widget) appears to be valuable
  2. Widgets become a hotly traded item, raising the price
  3. People start borrowing additional money to buy Widgets
  4. Widgets keep going up
  5. Lenders then allow people to borrow money in structures that assume the price of Widgets will never drop
  6. Surprise, the Widget price drops a little
  7. Borrowed money gets called in, economy goes south, or people just stop buying Widgets
  8. Panic selling begins. Widget price plummets and becomes illiquid.

That's every panic in a nutshell:

  • Stock Market Crash of 1929: extreme buying on margin, small drop triggers margin calls.
  • 1987: "Portfolio insurance" combined with computer trading triggers step #7.
  • Bubble 1.0
  • Beanie Babies
  • Tulip Mania

The collapse of Enron is very similar to what's happening with mortgages. Enron's quarterly earnings were pushed around into structured investments called "Raptors," which depended on Enron stock continuing to go up in order to function. When Enron stock dropped, the whole thing came crashing down.

So the next time you wonder, "How can I avoid ever losing money?", the answer is to never have any. The government, the banks... they'll just take it from you if you ever falter, and borrowing gives you leverage to have the cool things you've always wanted. It's best to never really own anything.

If you have equity, borrow against it to buy more gadgets*.

You'd think I'm being sarcastic, but unfortunately that seems to be the way it works.

* - Actually, did you know that something like 30% of new US debt the past few years was money loaned by China? So we manufacture our MacBooks and gadgets in their country, then they loan the money back to us so we can buy those toys. What a racket.

Sunday, October 14, 2007

I stand corrected: Kubrick did NOT compose his later films for 1.33

Contrary to what I've been led to believe, Stanley Kubrick did not compose his post-1960 (post-Spartacus) movies for the 1.33 Academy aspect ratio. In fact, it turns out the screenshots I took from Eyes Wide Shut in my earlier post on HD aspect ratios had been recomposed so that a standard 1.66 crop would also work at 1.33 for full frame.

So, to give this discussion some technical background, and in an effort to make this blog informative rather than just opinionated, let's diagram what "Academy" looks like on a frame of film.

Academy is a 4:3 film format specified when sound entered the picture, so to speak. Prior to the Talkies, most of the area of a 35mm film frame was used for the motion picture image (with the film threaded vertically, of course, as opposed to horizontally like in your home still camera). Each frame is four perforations tall -- those are the holes you see on either side of the film that the sprockets engage to move the film along.


When Talkies appeared, they needed some way to get the sound matched to the picture in theaters. The obvious way to do this was to put the soundtrack onto the film optically. Until the advent of digital sound in theaters, this was done by printing the sound onto the film to the left of the movie's image (the yellow area in the diagram to the right). The centered 4:3 area to the right of that soundtrack was called "Academy". Most films done before the Widescreen gimmick took off in the 60s were filmed and projected in Academy. Academy 1.33 is the gray area in the diagram you see here. If I recall correctly, Academy lenses are slightly offset to account for the change in perspective.


Most early mainstream widescreen films like Lawrence of Arabia and Spartacus were filmed on 70mm film, which gave a 2.2 aspect ratio. These films could only be distributed on 35mm prints by squeezing the wider aspect into the additional frame area above and below Academy, represented by the bluish tinted rectangle. This distribution format is and was called Anamorphic, Cinemascope, or just 'Scope. It requires using an anamorphic lens on the projector as well to get the full widescreen.

However, shooting on 70mm (or 65mm) is expensive and inconvenient. The last mainstream feature film I know of to do principal photography on 65mm was Far and Away, which was 15 years ago. So directors wanted a way to get that cool 2.2 widescreen while still using 35mm film. Enter Cinemascope (again) -- now Panavision. Anamorphic lenses give you the same squeeze-into-Academy-width that you get from a 70mm transfer, except during principal photography instead of during a transfer.

The problem? Well... the lenses are kinda wonky. For example, the bokeh on an anamorphic lens makes out of focus circles look like ovals, and a rack focus will horizontally distort subjects. Lens reflections have a distinct "Panavision" look -- fire up Knoll Lens Flare or a copy of Close Encounters on DVD for examples of that.
[Side note: personally, I prefer principal photography with anamorphic lenses because the end result has less film grain and no intermediate printing tricks while still being 2.2 aspect. I've had a much easier time working on films shot anamorphically. Oh, and I grew up on the Panavision look so I guess it's nostalgic.]

That brings us to the pink part of the diagram above: 1.85 Academy. With cinematographers out there who don't like anamorphic lenses, you need a way to do widescreen with spherical lenses. So here's what you do: shoot your movie in Academy and have the theater put a 1.85 mask on its projector.
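
For the numerically inclined, here's a quick sketch of how much of a 1.33 frame the wider masks throw away -- pure aspect-ratio arithmetic on my part, not exact SMPTE aperture dimensions:

    # Quick sketch: how much of a 1.33 (4:3) frame survives a wider mask.
    def crop_height_fraction(source_ar=4/3, target_ar=1.85):
        """Fraction of frame height kept when masking (width unchanged)
        from a source aspect ratio down to a wider target ratio."""
        return source_ar / target_ar

    for target in (1.66, 1.85, 2.2):
        kept = crop_height_fraction(4/3, target)
        print(f"1.33 -> {target}: keep {kept:.0%} of the height, "
              f"lose {1 - kept:.0%} to the mask")
    # 1.33 -> 1.85: keep 72% of the height, lose 28% to the mask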

Which brings us, finally, to Mr. Kubrick.

The debate is whether his films were supposed to have that 1.85 mask or not for the "correct" composition of the frame. My understanding was that they were not. I believed the theatrical releases of films like Eyes Wide Shut were actually manipulated the other way: to make the 1.33 composition work on a 1.85 or 1.66 screen. And the DVD backs that up.

But today I was watching The Shining on my Xbox 360 and complaining about how the Xbox scales full frame DVDs to the full 16:9, which can't be turned off without changing the whole Xbox config. This made me hyper-aware of the composition, and after the movie was done, I flipped my Xbox into a mode that forced 4:3, expecting to feel vindicated. Instead, I started to wonder whether the movie was supposed to be 1.85 after all.

The shot that convinced me was this shot of Scatman Crothers. Here it is in full frame.


Note that there is way too much space above his head. I pondered this for a while, then took a look at some other scenes from the movie.



Again, very suspicious composition for a film that's supposed to be 4:3.

I searched on the internet and found a very interesting storyboard from the movie and a thread related to this. Click here and scroll down to "Kubrick Archives".

I just went to the trouble to crop these shots for 1.85, and guess what, they look great:

So there you have it: the master filmmaker did not compose The Shining for 1.33. As others have said, if you Google around for this, he actually only requested that his films be transferred Full Academy to video. Now that video is becoming a 16:9 format, I guess that means we can start seeing Kubrick's films as they were actually intended to be seen. And I'll have to stop going around saying that 4:3 is the ratio of the gods and that Kubrick used it, yada yada yada.

Only two things confuse me. First off, when I watched Full Metal Jacket in HD, the composition did feel off, and I thought that was because of Kubrick's 1.33 thing. I guess I'll have to watch it again.

Second, did this guy's video ever get to Stanley Kubrick? What did Kubrick think?

Saturday, October 13, 2007

The Nannification of America and How It Relates To College Football

We've become a nanny state, but not because of our government.

Ever notice how everyone around college athletics calls college players "kids?" The Oklahoma State coach's rant from a couple weeks ago is the best case in point:






He uses the words "kid" and "child" repeatedly to describe the player on his team who was in the newspaper article. Calling them "kids" is partly because of the aforementioned nannification; the other part is more insidious, and I'll get to that last, since it's the main point of this post.

Let's start with the obvious fact that not all college football players are 18 years old. Chris Weinke was 27 when Florida State won their last national title. He had skipped college to try to play baseball; when that went nowhere, he went back and played college ball. Yet if you google for "Chris Weinke kid", you'll get about 4,000 hits.

Even if they're 18, what's up with this "kid" label? 18 year olds can go to war. They can vote. They are tried as adults in court and have passed the age of consent for sex.

I assert that a large part of this is nannification. Our society's parents have started coddling their children until they are long past the age where they should be considered adults. We let them live at home until 30. We buy them houses while they're in college, or after that. Or, in the case of this player at Oklahoma State, his mother was hand-feeding him chicken (which is what the newspaper article mentioned as the incident that made other players think that guy was a bit childish).

What drives the nannification? Is it because college has become the new high school? Almost everyone goes to college now, and now that everyone goes, fewer people leave with real job skills or a real higher education than before. That might be part of it.

What really pushes the nannification of America, I suspect, is commercialism. Advertisers push the idea that adults are supposed to live carefree, like they're 8 year olds. Watch any commercial, ever, and this is almost always the angle that makes the product appealing to adults. Eventually this seeps into the public consciousness, and parents begin believing their adult children should live this way.

Marketers are doing this because their objective is for adults to have disposable income for as long as possible. They have now created a generation gap of disposable income that wasn't there when people used to go off and have children at 18 or 20 years old. Makes sense; after all, what's sexier economically: a 25 year old who rents a shoebox apartment, spends all his cash on Xbox, football, and beer, and accrues lots of debt, or a sober 25 year old who is paying a low-rate mortgage in the suburbs and raising a family with little debt? I don't blame marketers for targeting the former; now it's up to the public not to fall into that trap.

Which brings us to the insidious part of college sports pundits always using the word "kid" to describe a college player. This is again a marketing ploy. The reason they do this is that they want to drive home the concept that collegiate athletics are amateur sports. They don't want you to think of college sports as a professional endeavor when, in fact, the two are almost indistinguishable. College sports are no different than the pros in all respects except one... guess which one:


  1. Team owners get paid millions from TV rights and merchandising.
  2. Coaches get paid millions.
  3. Fans spend millions on tickets and merchandise.
  4. Millions get spent on the stadium, parking, etc.
  5. Players get paid a salary.


If you guessed #5 is the difference, you're technically correct. Players don't get paid in college sports. We all know that boosters take care of these guys while they're in college. And even without those extra ... ahem ... perks, they get free room and board, a free education, and get to be BMOC.

But they don't get paid competitive salaries, they don't get to be free agents, and they can't negotiate their pay with their bosses. If a college player plays for the best team, he gets no better treatment than he would on the worst team, except for the promise of a pro career later. That is the smokescreen subversively driven home by the term "kids." The perception you're supposed to get is exactly that: they're just kids, going to school for an education, playing for the love of the game.

In truth, the entire pipeline of college football players is artificially created. The NFL bans players until they are three years out of high school, so they have no choice but to be part of this charade. This didn't use to be the case with pro basketball, but the NBA has now instituted a rule that players have to be a year out of high school before joining the pros. Baseball and hockey are the only true market-based major sports, where players can join when they're 18.

And while college players begrudgingly have to be part of this, the guys in the front office are making insane money. Charlie Weis, coach of Notre Dame, makes $3m-$4m a year. You know how much Notre Dame's TV contract with NBC is worth to air 6 ND home games? At least $9m a year. And that's one team! There are 118 other teams in Division I-A football, and most of the TV money is shared at the conference level. Then there are bowl games. The Rose Bowl pays about $15m to the winning school.

One player tried to buck the trend and was quickly shut down: Maurice Clarett. He tried to go pro after just one year in college and spent another year unsuccessfully challenging the NFL in court. Now he's in jail. I'm not saying he's a good guy, but his story will deter many generations to come from challenging the NFL's rule, thus keeping the "kids" in the midst of the scam of unpaid college athletics. Don't forget, it's all really about the academics.

Anyway, just a few thoughts for you the next time you hear someone call college players "kids." Years of listening to it has made it so ingrained in my brain that I find myself slipping sometimes... hopefully the annoyance I've felt while writing this will set me right.

Monday, October 08, 2007

Once you buy non-DRMed music, you don't go back...

I've bought a bunch of songs on Amazon's new MP3 site in the past couple of weeks. They load right up into Windows Media Player. They play on my Xbox. I even loaded them onto an iPod I had lying around so I could listen to them in the car.

So today I went to Starbucks, and there are stacks of free iTunes cards there. You can get a free track from iTunes by entering the code on the card. I looked at that and wondered... why? Why bother with free DRMed tracks that I have to use iTunes to listen to? I'd rather pay for these tracks to get them without DRM than get them for free with DRM.

Freedom to use the content you buy... what a concept! It's like we're back in the 1900s again. The music companies would have never gotten in this situation if they had just opened things up much earlier. Most people are honest (in the US, at least). I would have been buying MP3s online as early as 1996 if they had moved into this area earlier.

But let's bring up a major idiocy of the music business at the moment. Forget the RIAA tactics, suing little old ladies whose kids download some tracks and all of that... hasn't anyone in the music business ... anyone... heard of the long tail? Cripes. There are a ton of singles that I want to buy that are not available for downloading. And no I don't want vinyl. And no I don't want a collection of 12 lame songs on a CD to get the one good song. I just want the one song for $1.

Guys, just digitize that stuff! If it's a good song, the only way to make money is availability. Plus, you never know when someone will use it in a movie or something and make it into a huge hit again.

And by the way, how long do people predict it will take until movies are in the same boat as music is today? 5 years? 10 years? 20 years? Why are we still paying $15 for downloads of movies with DRM, for example? The biggest puzzler is: why do we pay to rent movies online? There's no rental; we're not borrowing anything. There's no video shop that paid $200 for a VHS tape to loan to people in the neighborhood. We are getting the exact same compressed audio and video that we would buy, except that a mechanism is in place that artificially allows us to watch it only once.

Yeah, that business won't last. Riddle me this: what if booksellers today tried to start renting books, or musicians tried renting their songs? You'd laugh in their faces. The only reason it does well at all is because the movie business has brainwashed you, by having movie theaters for 80 years be the only place you could see these things. Then VHS rentals and DVDs continued the trend of "pay per view." How can the rental idea continue when there's no longer any difference between what you rent and what you would own?

One more thing... isn't it obvious that Apple has no intention of dropping DRM, given that their entire video library is still DRMed? If Jobs were at all serious about dropping DRM, he could sway at least Disney and ABC to do it on iTunes. He's Disney's largest shareholder and a member of the board, not to mention messiah of the iPod and iTunes. You don't think they'd listen to him?

Sunday, October 07, 2007

This week's best blog post about Apple...

...is on Mark Pilgrim's blog (found through de Icaza's blog).

His point basically is, "If Apple's so much about 'it just works', why do you people keep buying their crippled stuff and hacking it? If you want Apple to open up their stuff, don't buy an iPhone." Nicely put.

Of course, this is irrelevant because as I've mentioned before, Apple is the new Real Networks.

Thursday, October 04, 2007

The futures of housing

Thanks to my new favorite blog, Burbed, I was pointed at this Wall Street Journal article outlining a new Chicago Mercantile Exchange futures vehicle for housing prices 60 months out.

If you're not familiar with futures trading, the idea is for commodity sellers and buyers to hedge against massive changes in the price of whatever they need to sell or buy. Say you own an orange juice company and need a supply of 5,000 oranges a day. You don't want your business to be completely exposed to the market price of oranges. If there were a short-term orange shortage, you'd pay through the nose but probably couldn't pass that price change on in your own product. Likewise, the orange grower wants to lock in his prices in case there is an orange glut. So you might buy a contract to buy 50,000 oranges in April 2008 for $1 an orange, and pay $50 for the contract. The contract itself is worth something, because if oranges were suddenly worth $2 an orange and you have a contract for $1 an orange, you could turn around and sell your oranges for a cool $50,000 profit. If everyone thought oranges would be worth $2 at the time the futures contract delivered (April 2008), that $50K would be priced into the current exchange price of the contract.
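
To put the orange example in concrete terms, here's a tiny sketch using the numbers above. It's a simplification on my part -- real futures are marked to market daily through a clearinghouse rather than settled as one lump payoff:

    # Sketch of the orange-futures example, using the post's numbers.
    def long_futures_payoff(contract_qty, agreed_price, spot_at_delivery, cost_of_contract):
        """Profit for the buyer who locked in `agreed_price` per unit."""
        return contract_qty * (spot_at_delivery - agreed_price) - cost_of_contract

    profit = long_futures_payoff(
        contract_qty=50_000,      # oranges
        agreed_price=1.00,        # $/orange locked in
        spot_at_delivery=2.00,    # $/orange in April 2008
        cost_of_contract=50,      # what you paid for the contract
    )
    print(f"Juice company's gain: ${profit:,.0f}")  # -> $49,950, call it $50K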

That was originally why futures were created. Today, many have become purely financial instruments unrelated to anything material like orange delivery. You can trade futures on stock prices, interest rates, currencies, and so on. But market watchers really, really like these futures markets because they have proven to be remarkably accurate predictors. The hive mind seems to be at work within them.

One of my favorite websites that uses this is the Hollywood Stock Exchange. I'm not sure HSX has released numbers as to how accurate it has been as a predictor of box office, but my own experience is that it has been pretty darn good. The games industry has started getting in on this with a site called The Sim Exchange (unrelated to "The Sims").

The US government once controversially tried to introduce a futures market for Terrorism. The hope was that it would help predict terrorism attacks. Putting aside the ethical debate, few disagreed that this had a good chance of being a decent terrorism predictor.

Which brings us to today's topic: housing prices.

The Merc -- by the way, it's the exchange featured in Ferris Bueller's Day Off, and I visited it myself when I was in high school -- has started trading futures on housing prices that are tied to house price indices. Again, there's no material good you get from this future, but you can profit or lose money from it in essentially the same way as if you had bought or sold a house in these areas.

According to the futures market, the San Francisco area is poised to drop 28% by 2011. Considering what we've seen for sale lately, this should be no surprise. In any case, you can hedge your house using these futures. Maybe it's worth looking into for borrowers with very large mortgages who carry real risk in their situation.
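
As a rough illustration of what that hedge could look like -- my own toy numbers, assuming the house tracks the index exactly and ignoring contract sizing, margin, and fees:

    # Toy sketch of hedging a house with a short position in an index future,
    # using the 28% drop the futures market implies for the SF area.
    def hedged_outcome(home_value, index_change, hedge_fraction):
        home_pnl = home_value * index_change
        futures_pnl = -home_value * hedge_fraction * index_change  # short the index
        return home_pnl, futures_pnl, home_pnl + futures_pnl

    home, futures, net = hedged_outcome(800_000, -0.28, hedge_fraction=1.0)
    print(f"House: {home:+,.0f}  Futures short: {futures:+,.0f}  Net: {net:+,.0f}")
    # House: -224,000  Futures short: +224,000  Net: +0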

That is, of course, if anyone could find out what the current price is...


Hey CME, you guys might want to stop printing exceptions for the whole world to see. Just a thought.

Tuesday, October 02, 2007

Put Silverlight into Windows Update

I'm convinced now that if Microsoft wants Silverlight to really take off, it is going to take more than paying websites like MLB to use it. An individual developer running a small-scale business is unlikely to force visitors to do an extra download of Silverlight just to use their site. Users won't do it; they'll often just go to another website.

In the early years of the intertubes, I refused to do the extra download of Flash or Java. Flash was only used for advertising. Then it was decided that Flash was going to be used for things like menus and video players, so you couldn't avoid installing it. Either way, we have a world where 80% or more of web users have Flash installed and 0.001% have Silverlight installed, if that many.

So why not put Silverlight into Windows Update and boost that number to 90% of users having Silverlight installed? Why not ask Apple to put it into Mac OS X's Software Update and have it distributed to all Mac users? Apple and Adobe aren't exactly a love-fest anyway.

I think Silverlight and Flash should be made to compete on technical merit. All that is required to do that is get Silverlight out there to users so they won't have to do the extra install step.