Tim's going to rip me for this post.
One of my co-workers recently bought a Mac laptop that he brings to work. Once in a while we discuss a few things about it, and generally we all give him a little bit of crap for owning a Mac. We're all a bunch of Windows geeks, and Linux or Mac is just unthinkable for most of us.
Well, yesterday's WTF/E post got me pretty down about Microsoft's direction right now. I think WPF is actually pretty cool. I think ultimately something like it is going to take over for UIs (be it WPF, Flex, OpenLaszlo or whatever).
But let's look back in history again. In 2002, I bought my copy of Visual Studio .NET the day it came out. It's almost 2007 and we still do not have .NET out there as a realistic platform for developing consumer Windows apps. Vista will fix that, but how long is it going to take for Vista to have a meaningful amount of uptake? In 2004, 50% of Windows users were still not running XP. That's right: 3 years after release, half of the people out there were still running W2K, 98, ME, 95, NT 3.51, NT 4.0, whatever.
Granted, XP was not released during the go-go years of PC buying, so maybe that slowed its uptake. Or is it still going to take 3 years for Vista to reach a majority of the installed base? Or longer, since Vista's hardware requirements are much steeper than XP's were at the time (XP was no more demanding on hardware than Windows 2000)?
The thing that went through my mind today after talking to my newly anointed Cult of Mac™ coworker is "Would it be better doing my hobby coding on a Mac? Would I be pulling my hair out less?"
Let's look at the facts:
- The API is changing less... more on this in a second
- Apple at least isn't doing one thing with one hand and something totally different with the other (well, except for that Java thing they dumped). Microsoft can't seem to decide if they want C++ or C# to be their main language for app development moving forward... and that debate's been going on for 5 years.
- The Mac has a hobbyist following, so software I develop might get more attention than if it's for Windows, where basically no one gives a damn.
I had done some Mac OS X development while the OS was in beta and just after release (pre-.NET; after .NET came out, it was all over for that phase). I had also done some NeXT development back in the day and enjoyed it a lot. Nothing serious, just messing around for the most part.
The first thing I Googled was "Cocoa vs. Carbon." I was wondering which API most Mac developers are using these days. It turns out (after checking with another coworker as well) that there is no good answer to this. It's mostly religious right now. You either code in Objective-C with Cocoa or you write against Carbon in C. There are no C++ bindings to the Mac API, and Java is deprecated. OK, that all kinda sucks.
Then I thought about this a little more. Actually, if any platform out there has more legacy baggage right now than Windows, it could be Mac OS X. Cocoa uses Objective-C because that language was chosen in 1986 or whatever by the NeXT team. To this day, they haven't been able to shake it in favor of a mainstream API using C++. Ditto for Carbon: it's mostly a legacy API from the Macintosh Toolbox days, and it only has C bindings.
I guess the point is, even though Microsoft seems really confused right now, they have some good ideas. Apple doesn't seem confused, but I just don't see where they're going in the future. They deprecated the one language (Java) I'd probably want to actually use for coding on a Mac, given there's no pure C++ API. Are they planning on having their own language (Objective-C) for programming their platform forever and ever? It seems absurd.
I guess the end result of my thought experiment is that I'll just stick with Windows and develop for WPF in the future. I don't really see the upside of going to a Mac. Seems too legacy. Even Linux has fewer legacy issues in a lot of ways. I'd like to see what Apple has up their sleeve in the future, but that's about it.