Ummmm... guys... I think you may have lost your audience. You sure lost me.
Sandeep's post with the links "This one good, this one better" was really helpful. But I have to question the actual utility of some of the other suggestions.
For example: we have a Gigabit switch. My client machine can go hardwired or wireless. Obviously, hardwired is oodles faster. But you know what? I can't really tell the difference in day-to-day operations. Most of my time goes into thinking about what I want to say, what that lab test means, whether those numb feet are from diabetes or B12 deficiency, and all the zillions of medical decisions we make all day. The network speed, or the speed of the hard drive, is so far from being the limiting factor in what time I go home each day that it isn't even a consideration.
I remember a computer conference where the speed of different database coding techniques was being discussed, specifically the speed gained by compiling a database. The example used was some function with something like 100,000 looping math operations. And, yes, it was LOTS faster running compiled than interpreted. So I jumped through a bunch of hoops and compiled my application. And it made almost no noticeable difference in practice: an operation that might take 300 msec interpreted might take 5 msec compiled, but to the user there was no observable difference.
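Just to make that concrete for anyone following along, here's a rough sketch in Python of the kind of benchmark I mean (purely illustrative; my application isn't written in Python, and the loop body is made up). It runs the same ~100,000-iteration math loop once in the interpreted language and once pushed down into compiled (NumPy) routines:

```python
import time
import numpy as np

N = 100_000  # roughly the size of the looping benchmark described above

# Interpreted: a plain Python loop doing one math operation per iteration
start = time.perf_counter()
total = 0.0
for i in range(N):
    total += (i * 0.5) ** 2
interpreted_ms = (time.perf_counter() - start) * 1000

# "Compiled": the same arithmetic handed to NumPy's compiled C routines
start = time.perf_counter()
total_np = float(np.sum((np.arange(N) * 0.5) ** 2))
compiled_ms = (time.perf_counter() - start) * 1000

print(f"interpreted loop:     {interpreted_ms:.1f} ms")
print(f"compiled (vectorized): {compiled_ms:.1f} ms")
# The compiled path is typically 10-100x faster on the benchmark itself,
# but if one user action only triggers one such operation, both versions
# finish long before I've stopped thinking about the patient.
```

The compiled path wins by a wide margin on paper, but that's exactly my point: the benchmark measures the loop, not what I actually notice at the keyboard.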
So, my question to you both: of your suggestions, which would make a real impact on day-to-day operations? Is there a reliability gain?