Monday, October 30, 2006
Oh, and the Xbox 360? 160 watts. Maybe up near 180 at load. I've actually measured it.
Never in the history of playing videogames on CD -- which is about 17 years for me, since I first fired up The Manhole to see what the CD-ROM hype was about -- have I ever damaged a CD- or DVD-based game so badly that I couldn't play it anymore. The only time I've had disc errors, it's been the fault of the console or the drive, not the disc itself.
So don't fool yourself if this is your excuse for modding your Xbox. You're doing it for piracy of some sort. Either you're downloading copyrighted stuff over BitTorrent or you're hacking games on the console. Admit it.
On a side note, today HardOCP has an article about the Xbox Live Operations Center.
Saturday, October 28, 2006
Just noticed that Tom Skerritt is the host of See Windows Vista. Compare and contrast:
Yeah, that's what I thought too. Tom Skerritt can kick that guy's ass! Skerritt has trained Top Gun candidates, he's had to deal with Jodie Foster in Contact. He even dared to be in a Poltergeist movie with that weird blonde girl even after the Poltergeist curse was well known.
Oh, and he faced an Alien coming out of John Hurt's chest.
Tom Skerritt is a badass.
BTW, if you want to see some of the cool apps being written with Windows Presentation Foundation, check out that See Windows Vista site. I just downloaded the New York Times Reader and it's a really great app that shows what WPF can do.
Friday, October 27, 2006
Tuesday, October 24, 2006
No offense to the folks at Microsoft who support Visual J#, but who on earth uses this software? If you use this, and somehow come across this blog, please write a comment! I'd really like to know why Microsoft keeps supporting this, when it seems virtually useless.
By the way, is TMZ AOL's most popular site right now? Haven't we come full circle when the media companies who hire the actors then sell gossip about them on distribution channels they own (like AOL... or, more ominously, Time Warner Cable)? I think the entertainment loop is almost complete. Soon we'll have media barons starting wars again like Hearst.
Monday, October 23, 2006
I bought my wife a new laptop last week (Dell has amazing deals on Core Duo Inspirons, if you're in the market). In any case, I decided to take her (aka my) old laptop and install Vista RC 1 on it.
Now, this laptop sucks. It's a Compaq Presario 1700: 1 GHz Pentium III, 512 MB of RAM, 20 GB disk. No built-in Wi-Fi or Bluetooth. Purchased in January 2002, it was a low-to-mid-range laptop even at that time. I went to Fry's and found one that had a reasonable price/performance ratio and bought that.
But here I am running Vista without issue on this machine. It took a long time to install, but then it took a long time on my desktop, too. It also doesn't run Aero (no hardware-accelerated graphics). However, it runs the OS well with no additional driver help from me. I think I will have to install an ATI driver, though, because scrolling with the mouse wheel in IE7 can feel a little hitchy (I use a mouse on all laptops). Even without a real video driver, it keeps up with my typing in Windows Live Writer. Not bad.
I'm getting more excited about Vista thanks to playing around with Windows Presentation Foundation and Expression. I really like WPF and am porting my perennial basketball software project to it. I should have that posted in a week or two.
In any case, for all the hype about Vista's minspec, this truly ancient, crappy laptop that was barely holding on with XP runs Vista really well. So I have no idea what the anti-MS crowd has been talking about with respect to Vista's minspec.
Sunday, October 22, 2006
Demo video that was posted to YouTube -- skip forward to 3:45 for gameplay.
As much as Gears of War has been overhyped, I'm pretty impressed with the battle mechanics. I like the speed with which the player moves from cover to cover and slams in there. I agree with one of the commenters on YouTube that the AI looks pretty lame. Did any of the bad guys get shot by the AIs in this demo? I hope the real game has smarter cooperative AI than this. One of the major failings of GRAW (Ghost Recon: Advanced Warfighter) is that the AIs in cooperative multiplayer online are horrible. My buddy and I just picked them off as they came up a hill one after another -- that was the one and only time I played GRAW in co-op.
As far as AIs go, I have to imagine that Army of Two has to have a superior AI if it is to be any fun. Here's an Army of Two trailer that's all prerendered and tells us nothing about the game except that there are two soldiers.
To get to the front page, all it requires is having a bunch of your buddies vote for your blog. I've noticed the same blogs come up on Digg's front page over and over and over. And if it's not from one of the same blogs, it seems like everything that gets dug these days is a "Top 10 list". Top 10 reasons Mac is better than Windows, top 10 reasons Vista is better than Mac. Whatever. It's all really boring.
No matter what the Web 2.0 crowd thinks, there's something to be said for manually editing news. The most meaningful news still comes to me that way. This is why I keep my Wall Street Journal subscription.
Thursday, October 19, 2006
Here's the thing: the idea of splitting a game into separate purchasable pieces is a good idea for both the consumer and the game publisher. On the consumer side, if I'm a casual gamer and don't want to pay $50 for all levels of the game, I shouldn't have to. On the publisher side, the game maker can make $10 from the base version from people who wouldn't pay for the full version. That $10 is money they wouldn't have seen before. Makes sense and everyone can be happy. And maybe someday the person will pay more to upgrade a little, or more levels can be released. Great.
The problem with Lumines taking this tack is that it's... hello... Lumines?! It was overpriced when we all paid $40 for it on PSP and, in my opinion, it's overpriced at $12 on Xbox Live Arcade. The game has been overhyped since day one. It's Tetris Advanced with a downtempo soundtrack. Why would I pay $35 to get all of that game on XBLA? At most, I'd pay $10 for the whole thing, which is what games like Zuma and Geometry Wars cost. Lumines is not a disc-shippable game on a console. No one would pay full price for it... in retrospect, I'm surprised people did on the PSP (including myself, but I bought into the hype from my neighbor at work).
Let's see a game like Splinter Cell DA on XBLA, with microtransactions that end up costing $60 for the full thing. Then casual gamers can buy a part of the game and, like I said, that's $10 or $20 the game publisher wouldn't have seen before since they wouldn't have spent $60 on the game up front.
Wednesday, October 18, 2006
Anyone who might be reading this, can you tell me where it was posted that it got so much attention? It's nice to see so many visitors from Apple, Danger, Nvidia, Microsoft, Sega and others. Mailing list perhaps?
Monday, October 16, 2006
The True Cost of Standby Power
A Closer Look at Folding @ Home on the GPU
Now, what are we supposed to make of things when the same people who are concerned about 2 watts of standby power are running their machines around the clock on some protein-folding simulator? Furthermore, what should we make of it when these folks run said simulator on a GPU -- easily the most power-hungry component in the computer after the motherboard/CPU?
Going flat out, a system with an Nvidia GX2 consumes upwards of 240 watts. 240 watts! Yet in the blogosphere, people tell us we're supposed to be worried about 2. People need a reality check.
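To put actual numbers on that reality check, here's a back-of-the-envelope sketch. (The $0.10/kWh electricity rate is an assumption I picked for round numbers; plug in your own utility's rate.)

```csharp
// Back-of-the-envelope: what 2 W of standby power vs. a 240 W folding rig
// costs over a year of running 24/7. The $0.10/kWh rate is an assumption.
using System;

class PowerMath
{
    // Watts drawn continuously for a year, converted to kilowatt-hours.
    static double AnnualKWh(double watts)
    {
        return watts * 24 * 365 / 1000.0;
    }

    static void Main()
    {
        const double RatePerKWh = 0.10; // assumed electricity rate, $/kWh

        double standby = AnnualKWh(2);   // the 2 W everyone is fretting about
        double folding = AnnualKWh(240); // a GX2 box folding around the clock

        Console.WriteLine("Standby: {0:F1} kWh/yr = ${1:F2}", standby, standby * RatePerKWh);
        Console.WriteLine("Folding: {0:F1} kWh/yr = ${1:F2}", folding, folding * RatePerKWh);
        // Standby: 17.5 kWh/yr = $1.75
        // Folding: 2102.4 kWh/yr = $210.24
    }
}
```

In other words, the folding rig costs about 120 times as much per year as the standby load people are wringing their hands over.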
So have video game consoles become a fashion statement like a pair of jeans? You know, like the Levi's ads where there's basically a story completely unrelated to jeans and the benefits of those jeans?
Here are the absurdities for you to look at:
Sunday, October 15, 2006
I've long said I was going to update my machine to a Core 2 Duo as soon as they arrived. Well, they're here, but I still haven't upgraded. My current desktop machine is 2.5 years old: a Pentium 4 at 2.8 GHz with AGP graphics. It's starting to show its age with games like Battlefield 2, which it can barely play before freezing anyway.
So in pricing out a new Core 2 Duo machine, I came up with a number around $1200. Doesn't sound bad. But I started thinking, should I wait for the quad chips that Intel will be pushing out the door soon? Furthermore, should I wait until next year when they roll out a 45nm process?
Right now the quad-core Kentsfields are 65nm. From a power and heat standpoint it would make sense to wait for the shrink of those bad boys to 45nm. Better yet, it might pay to wait until the 45nm quads are no longer just two Core 2 Duos slapped into the same package (Kentsfields draw about 130 watts at idle).
Now you may ask, "Why, Trimbo? Why do you care about such things like process?" The answer is twofold. I want a computer that can handle upcoming games for several years -- hence the desire for a quad core processor. The other half is that I don't want a hot and noisy computer. I don't want to buy another Prescott (aka PressHOT) and have to deal with thermal issues while playing games.
Saturday, October 14, 2006
XBL is arguably the best thing about my Xbox 360. I love it. I don't mind paying $50 a year for it. It's the price of one game per year, not unimaginable, yet I get tons of demos, trailers, and of course online multiplayer.
So why do fanboys out there like to say that now that Sony is going online with the PS3, XBL should be free? I don't get it. Microsoft would be silly to give that away -- it's awesome! Millions of people are currently paying for XBL in its current state. I don't really want jokers on there who aren't willing to pay to be on there. I certainly don't want the cheater of the month on there -- and Microsoft having a credit card in hand helps prevent that. (This isn't directly related to credit cards, but Bungie's classic "Call the Waaahmbulance" article is a must read.)
XBL could probably be cheaper. I cancelled it once for my original Xbox because I wasn't using it enough. On my 360, I can't imagine cancelling my Gold subscription. $50? For what they give me, it's no biggie.
I just spent the last 38 minutes and 29 seconds helping my mother, who is 2000 miles away, hook her network printer into a new laptop her work gave her. Helping people with computers over the phone is amazingly frustrating -- and it's no fault of their own. It's the fault of UI choices programmers made years ago and we can't seem to lose today. Here's my list of things we need to somehow overcome.
#1. The double click vs. the single click
This is probably my biggest pet peeve. It's never clear whether you should double click or single click to activate something. I blame three groups of people for this confusion:
- The folks at Apple who decided to ship a one-button mouse. If we had had a two- or three-button mouse from the start, there would be no discrepancy here. But in trying to make things mechanically simple, they made the UI more complicated. (Let's not forget one of the original Macintosh's massive UI failures: dragging a floppy to the trash to eject it.)
- Either Marc Andreessen and/or Eric Bina. Whichever of them decided that hyperlinks in Mosaic should activate with a single click instead of a double click immediately confused the entire world. Today, you still see people double click on hyperlinks in web pages.
- Microsoft, for making it possible to put Windows Explorer into a "single click" hyperlink-esque mode. Why did Microsoft try to make Windows Explorer web-like? Ugh!
A huge concern I have for the future is all of the web UI designers who are making hovering an active event. Egads. Now people are going to be afraid to even move the mouse, much less know whether to click once or twice.
#2. The Forward Slash vs. Backward Slash
First of all, why are both of these on the keyboard in the first place? Can't we all agree that slashes can generally go in a single direction? You hardly ever see anyone use the backslash in print. It's almost exclusively used by programmers, since it's such an obscure character.
Yet some genius at Microsoft decided to use it to separate files and folders circa 1980 or whenever, and now we have to deal with it daily in network paths. I agree that it's kinda handy to have a separator character in paths, but did it have to be the backslash? Why not use the colon like Apple used to?
Someday, we can only hope, Microsoft might make it possible to stop using the backslash. I'm not even going to start ranting about drive letters in this post, but that's yet another huge failing that continues even in Vista.
#3. Windows XP mode vs. Windows 95 mode of Windows Explorer
What was someone at Microsoft smoking when they decided to let the old UIs live on in Windows Explorer? It took at least 10 minutes longer to help my mother with this problem because the IT guy at her work had configured her machine in Windows 2K Explorer mode while my machine is in Windows XP Explorer mode. At my work, we actually have some software that has never been fixed to work in Windows XP Explorer mode. The vendor hasn't fixed the bug and just tells you to run in Windows 2K mode!
Did Microsoft get rid of this backwards UI option for Vista? One can only hope, but somehow I doubt it.
Those are my UI rants for today. Maybe one day we can hope UI designers will think ahead about the decisions they make. God forbid some hover interaction a Flash guy designs becomes a precedent, and 20 years from now we're still walking people through workarounds over the phone.
Rumors are all over that Lou Piniella is going to be the Cubs' next manager. Dear Tribune Corp.: please, no!
The Cubs just tried hiring an old-time manager in Dusty Baker. His reign in Chicago was nothing less than a travesty.
How is Piniella a "proven winner"? His career record is barely above .500. He has only won anything without A-Rod, the Big Unit, or Griffey Jr. once -- the 1990 Reds, who won the World Series. That great 2001 Seattle team that won 116 games needed 5 games to get past the Tribe in the ALDS and took only 1 game from the Yankees in the ALCS.
Cubs fans want Joe Girardi. He's a leader in the town and we're willing to give him a break for as long as he needs to make a good team.
"Joe G" (as I like to call him) grew up in central Illinois and went to Northwestern. Central Illinois being a small world, he even has a connection to my family: according to my mother, my grandmother was friends with his mother, who lived in Metamora, IL, when my grandmother taught at Metamora High.
Joe G was the one who told the Cubs crowd that Darryl Kile had died before the game on June 22, 2002. In retrospect, I think only Joe could deliver such sad news to an ordinarily rowdy Chicago crowd and have them pay attention to it.
Girardi is the perfect Cubs manager. Please hire him over Piniella. Lou Piniella is one of those old-timey baseball guys that GMs just like to hire. Take a chance and hire a Chicago/Illinois guy who was in the Cubs organization before. Piniella is not a Cubs guy. He's not a Chicago guy. He's not even a National League guy.
Friday, October 13, 2006
I don't really think there has been a mad rush on Macs outside the blogosphere, but I agree that Macs are increasing in popularity. My answer to this question is very simple: people use their own computers for less. Many people spend so much of their time inside a web browser that it's hard to notice whether it's running on a Mac or a PC. A lot of these same people got burned by Kazaa or other spyware on Windows, and a Mac sounds safer. They end up not really noticing the difference, for the first reason above.
For people like myself, it's almost impossible to consider moving to a Mac. I'd have to use Mono to use .NET. Mono's not bad, but it's also not VS 2005. I also have games that I like to play on my PC, which of course run on DirectX. Finally, one of my favorite programs is Rhapsody, which also doesn't run on a Mac (except in a web browser!)
However, when I asked for advice this week about buying a laptop as a desktop replacement, a bunch of friends who are Mac fans recommended I buy a Mac. Here's why I wouldn't, in a nutshell:
- Boot Camp or no Boot Camp, Apple does not support Windows natively. Until Macs ship from Apple with Windows pre-installed and a "Designed for Vista" sticker on them, Boot Camp doesn't make up for this. I don't want to buy a Mac to run Windows and then find out Vista SP1 won't work on it (for example).
- I really have no desire to run MacOS. It has some fancy features, but I just don't see the point when I have everything I need on Windows. There's also no killer app on Mac that I'm dying to have and can't have on my Windows box.
- The hardware is somewhat cost-competitive these days, but it offers no benefit in light of the first two points. Again, there's no killer feature of their hardware that isn't available from a PC vendor that supports Windows from the factory.
ps - I used to be a die-hard Cult of Mac™ kind of guy in the 80s and 90s. I owned only Apple stuff from 1983 to 1996. Except for my brief foray into owning a Mac in 1999 (the aforementioned PowerBook), mostly what I do on a Mac is doodle around and wish I had a PC running Windows so I could get stuff done. But to each their own.
Wednesday, October 11, 2006
For those not in the know about this: when using an ObjectDataSource with a GridView, I should be able to write <%# Bind("SomeObject.Name") %> in ASP.NET. In straight C#, this would translate to RowObject.SomeObject.Name. As it stands, you can only bind to top-level properties of the RowObject. So I can do <%# Bind("Name") %> to get RowObject.Name, but I can't get at any properties of SomeObject. If I try, I get the error "A call to Bind was not well formatted. Please refer to documentation for the correct parameters to Bind" when building the project.
Folks, this makes ObjectDataSource worthless for any reasonable object model except the most basic of the basic. It means you have to tear your objects apart and hand them to the GridView, just as you'd have to do if you weren't using ObjectDataSource.
If someone has any other ideas about this besides subclassing ObjectDataSource, feel free to step forward. Otherwise, allow me to say "WTF?"
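For the record, the least-bad workaround I've found short of subclassing is to flatten the object model by hand with pass-through properties. A sketch, with hypothetical class names standing in for your real business objects:

```csharp
// Sketch of the flattening workaround (RowObject/SomeObject are hypothetical
// names). Bind("SomeObject.Name") fails at build time, but Bind("SomeObjectName")
// works because the GridView only ever sees a plain top-level property.
public class SomeObject
{
    private string name;
    public string Name
    {
        get { return name; }
        set { name = value; }
    }
}

public class RowObject
{
    private SomeObject someObject = new SomeObject();
    public SomeObject SomeObject
    {
        get { return someObject; }
        set { someObject = value; }
    }

    // Pass-through property: exactly the kind of object-model tearing
    // that ObjectDataSource should be doing for us.
    public string SomeObjectName
    {
        get { return someObject == null ? null : someObject.Name; }
        set { if (someObject != null) someObject.Name = value; }
    }
}
```

In the markup, <%# Bind("SomeObjectName") %> then builds and round-trips on update, at the cost of polluting RowObject with one shim property per nested field you want to show.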
Tuesday, October 10, 2006
Folks, paying this much defeats the point of a console. Stop the madness, or next time around nobody gets cheap consoles. I wonder if Sony should have just offered the first 500,000 PS3s off the line at $2,000 a pop. How about $5,000? How high could they go? Whatever the sustainable price for those "special edition" PS3s, Sony would get all of the profit up front instead of letting idiots sell their consoles on eBay and pocket the difference. Those people aren't even fans, whereas the people paying $1,300 are.
Anyway, let's just hope some of the few PS3s out there actually get into the hands of people who actually want one for Christmas, rather than jerks who will buy them just to flip them on eBay, or GameStop employees who are (reportedly) hoarding them for themselves.
Friday, October 06, 2006
I had no idea how true my last post on the GSM/HSDPA topic would turn out. See this article.
Margins are higher on 2G handsets. Margins are higher on 2G service. Vodafone needs to make money right about now, so 3G is not their focus. No wonder I didn't see any 3G handsets out in the Vodafone store (see picture) when I popped in, nor did they have the Motorola V3x out to look at. You'd think the RAZR design plus 3G capabilities would be flying off the shelves, right? Nope. It was listed on a sheet of paper somewhere on a wall, and they didn't have one out to see.
Meanwhile, in a different universe, T-Mobile USA just announced they're spending $6B deploying an incompatible version of HSDPA that uses 1700 MHz/2100 MHz. The 2100 MHz band, BTW, is not the same one used in Europe. It just happens to be in the 21xx MHz range.
Now, why on earth a company would spend $6B on an incompatible flavor of HSDPA is beyond me, but that's what T-Mobile is doing. I wish them luck. Even the CDMA carriers in the US have a better roaming strategy: they deploy on 850/1900 MHz, which is used pretty much everywhere CDMA is used, except Japan and parts of Eastern Europe (450 MHz).
Wednesday, October 04, 2006
When I was visiting the UK last week, a woman we met in Scotland told me that the rest of Europe refers to the UK as something like the "UK Ripoff". I can't remember the exact term she used, but it was a lot more catchy than "UK Ripoff". "United Chargedom" or something.
The bottom line is that England and Scotland are amazingly expensive. An 8-day trip there rivaled the cost of my honeymoon because of the exchange rate. My honeymoon was twice as long and in a far more exotic locale in the Southern Hemisphere.
I know it's a dignified thing to have the world's most expensive currency, but maybe it's time they start thinking about devaluing the pound a little. Better yet, maybe they should move to the euro. A friend who is returning home to England next week for a vacation was complaining that he'll barely be able to afford the time off because of the exchange rate. It seems absurd that a guy who lives in Silicon Valley finds himself stretching his budget by going home.
Something to think about. I loved my vacation, but I'd think twice about doing business there, unless I was of course exporting their money for goods made here or in China.
Most radio is successful because it's free (i.e., ad-supported). If it weren't, people wouldn't listen. The more listeners you have, the more advertisers will pay. When you charge for the content up front, the balance doesn't work out as well, because you have no hope of getting an audience that size. Think of YouTube. Would we pay to watch stupid videos on YouTube? No. But we go there in droves because it's free.
But actually, what these articles made me think about is Stern's free-speech battles. For all the conflict Stern has had with the FCC, and his audience up in arms over the FCC's fines, few people care enough about what he says on the radio to pay $132/yr to listen to him. I guess that's the price of free speech in Stern's case, and most of us just don't think he's worth it.
Tuesday, October 03, 2006
The only problem is that the Zune supposedly won't support Rhapsody. This sucks, because I've had a Rhapsody subscription for a number of years. But I am getting a little tired of the shell game Rhapsody plays with songs. I saved a playlist 6 months ago, and half of the songs are now only 30-second samples (it's happening to me as I write this) because they've been removed from the subscription catalog. Very annoying.
To the Diggers out there who want to make Zune-iPod comparisons every day: please stop mentioning the iPod in EVERY Zune article, you're not changing my mind. I'm definitely going to buy a Zune. I want an end-to-end subscription system that will work with my Xbox 360, which I'm guessing they will do with Zune music (since J Allard is in charge of Zune). Microsoft has the ability to offer that and Apple does not.
So far this is the only iPod-Zune comparison that's been any good. Especially "If Detroit Tigers pitcher Joel Zumaya threw an MP3 player at my head at a blazing 104 MPH, would it hurt?" LAWLLERS
Monday, October 02, 2006
There you go -- there's the easy out. This is pretty much what everyone does these days when they get busted for anything: "I did something wrong, therefore I have a terrible disease called alcoholism that's to blame." I have an idea: how about, even if you're an alcoholic, you're still responsible for your actions?
By the way, if someone is in Congress, shouldn't we hold them to a higher standard than everyone else? Shouldn't he at least have thought, "I've had a lot to drink tonight, I'm going to hit on a child. Wait, wait -- I'm in Congress, that would be really, really bad"? I can understand Cletus not holding himself to a higher standard, but someone in our Congress? Just another one of the 2.5 trillion reasons we hate our Congress.
Sunday, October 01, 2006
This is not the sexiest 3D graphics topic but it's one that affects me every day.
I remember the first time I tried to get a model from one package into another -- I think it was around 1993. I had a DXF file that came from AutoCAD and was trying to bring it into Softimage. It ended up being a nightmare of trial and error because some things weren't supported, so I had to get the file re-exported by those guys about 15 times, hack the file by hand, etc.
Fast forward 13 years and we're in the same boat. Trying to convert files between Maya and Softimage, I've tried 3 different interchange formats over the last year (Collada, FBX, and dotXSI), and all have their own idiosyncrasies, failings, etc. Plus, no matter which package you're using, the importers and exporters generally have very poor error reporting, so it's very hard to debug what the converter is barfing on.
I have three things I want to port from 3D package to 3D package:
- Points in space and their related metadata: a vert with polygon connectivity and bone-weighting information, a NURBS patch, a camera with camera details, or whatever.
- Animation, constraints and expressions.
- Materials and textures.
Let's start at the bottom... surprisingly, #3 is the most reliable if you're just using a Phong or Lambert shader. It's when you get into custom shaders that this falls apart. Collada is, so far, the only format that has any kind of solution for custom shaders. It requires you to write CGFX, but it's better than nothing.
Animation generally works out -- except for the constraints and expressions part. Maybe we could all agree on a common bytecode/CodeDom type solution for being able to port expressions from package to package. I know, that's asking a lot, but given that the technology exists out there in the world already, maybe this isn't that hard.
It's the first bullet point that breaks all the time, and you'd think it would be the easiest one. Hard to believe that in 2006 we still haven't perfected converting this from package to package.
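For the curious, that first bullet really isn't much data. Here's roughly what a single triangle's worth of "points in space" looks like in Collada 1.4 -- trimmed by hand to the bare geometry, so treat it as a sketch rather than a complete, schema-valid document:

```xml
<!-- Sketch: one triangle's positions in COLLADA 1.4. Everything is plain XML:
     a raw float array, an accessor describing its layout, and index data. -->
<COLLADA xmlns="http://www.collada.org/2005/11/COLLADASchema" version="1.4.1">
  <library_geometries>
    <geometry id="tri" name="triangle">
      <mesh>
        <source id="tri-positions">
          <float_array id="tri-positions-array" count="9">
            0 0 0  1 0 0  0 1 0
          </float_array>
          <technique_common>
            <accessor source="#tri-positions-array" count="3" stride="3">
              <param name="X" type="float"/>
              <param name="Y" type="float"/>
              <param name="Z" type="float"/>
            </accessor>
          </technique_common>
        </source>
        <vertices id="tri-vertices">
          <input semantic="POSITION" source="#tri-positions"/>
        </vertices>
        <triangles count="1">
          <input semantic="VERTEX" source="#tri-vertices" offset="0"/>
          <p>0 1 2</p>
        </triangles>
      </mesh>
    </geometry>
  </library_geometries>
</COLLADA>
```

Being XML, it's at least human-readable when a converter barfs, which is more than I can say for the binary formats.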
So here's what I'd like to see: all vendors pushing Collada harder than any other format. Sorry, I just don't want any more custom formats like DXF, FBX, or dotXSI. Right now, FBX is the best format for conversion between Maya and XSI, which is bad for the industry because it's proprietary and because it doesn't support all platforms (XSI on Linux is not supported by FBX).
So I was kind of sad to see that Microsoft is pushing FBX with their XNA Framework instead of Collada. Microsoft is the XML king; you'd think they'd go with an XML format, so why are they doing this? Is it because Collada started out as a Sony thing? It would really help all of us if Microsoft would just join Sony on this one; then we could finally point 3D vendors to a single format and require that they support it as their #1 conversion format.
That format should be Collada, for the good of the community. We've already wasted decades on this problem. It's time to stop screwing around.