There's a messed-up philosophy -- a kind of offshoot of the 'eat your own
dogfood' school of thought -- that insists developers should have the same
desktop rig as end users. This is nonsense.
End users aren't running three virtual machines, a local database server, three instances of Visual Studio, and two instances of Blend. Devs need a hot rig with massive RAM; anything less is ridiculous.
One of my very first managers insisted on this philosophy for our company's R&D department, and I've always found it absurd. The people with the most specialized knowledge should be given the best machines in the house, so they can work as efficiently as possible. If said programmers are given great machines and still can't write efficient code, they shouldn't be doing that work anyway, whether their machines suck or not.
 - One of his other philosophies is a great one I still use today: "1 hour of human work is worth 10 hours of machine work." Meaning: if you can dump a job onto a machine farm for 10 hours and call it a day, or stay for another hour to optimize it to take less time, you should do the former. He came up with this ratio by comparing people's salaries with the leasing rates of machines (our entire farm used to be leased, back when rackable HP-PA machines were around $40K apiece).
The equation's a little skewed in this day and age because hardware is so much cheaper and faster than it used to be, but the philosophy is still a great one. I like to throw hardware at problems as much as possible. Granted, this can't always work -- plenty of Ruby on Rails developers have tried the approach and it has often failed -- but since the work I do is not public-facing, it works out pretty well. $50K can buy you an enormous amount of computing power; applied correctly, it could pay for itself in saved engineer time in two months? A month? Less?
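The pay-back speculation above is easy to sanity-check with back-of-the-envelope arithmetic. Here's a minimal sketch; every number in it (hardware cost, fully loaded hourly rate, hours saved) is an illustrative assumption of mine, not a figure from the post:

```python
# Back-of-the-envelope break-even for buying hardware to save engineer time.
# All inputs are illustrative assumptions, not figures from the post.

def breakeven_months(hardware_cost, hours_saved_per_month, hourly_cost):
    """Months until the hardware pays for itself in saved engineer time."""
    return hardware_cost / (hours_saved_per_month * hourly_cost)

# Assumed: $50K of hardware, engineers at a fully loaded $75/hour,
# and 350 engineer-hours saved per month across the team.
print(f"{breakeven_months(50_000, 350, 75):.1f} months")  # 1.9 months
```

Under those assumptions the $50K earns itself back in under two months; halve the hours saved and it's still well under half a year, which is why the "throw hardware at it" instinct so often wins for non-public-facing work.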