Part II of my response to this most interesting thread. I hope Indy and DocMartin will be good sports.
Doc, you hit it on the head. In our office we had a billing and scheduling system that still ran DOS. It was completely obsolete... and completely bulletproof. It was NEVER down, hacked, locked up, or without its addresses.
The anticipated benefits of the EMR have been COMPLETELY erased by the unnecessary costs of updates and modifications to a system that is not yet mature enough to stand alone.
The iPhone, when it first came out, was a technological wonder, but subsequent "upgrades" have rendered it increasingly useless: what used to work doesn't, and what does work requires ever-increasing expenditures of time to relearn the system.
And I have posted this rant, which, like the upgrades we are all struggling with, is mostly useless. Sorry...
When it comes to EMR systems, I agree that most upgrades can be useless. There are two issues:
* The EMR companies are spending more time adding features for MU than they are adding features that make the software more efficient.
* Following up from the first point, there are way too many doctors who care more about MU than they do anything else. Too many docs saw MU as easy cash only to be sorely disappointed.
For those doctors focused solely on medicine and the patient, you are finding yourselves outnumbered and outgunned. Sad but true. I can only hope things will get better. As for your DOS-based program: it didn't have to deal with security, didn't share the processor with other programs, and didn't do any multi-threading at all. While it may have seemed bulletproof in those pre-Internet days, a program like that would not survive today. Indeed, you would see just how non-bulletproof it was.
Mac is different, but developing for Android/iOS/Blackberry/Windows can all be accomplished with the same tool set. Developing a touch-centric app uses a similar UX (User eXperience) regardless of the target OS. As a developer, I'll take that workflow any day over developing desktop specific applications.
The key macro factor that Microsoft has attempted to respond to is that folks' next computer is increasingly a mobile device, not a desktop.
http://www.emarketer.com/Article/Desktop-Search-Decline-14-Billion-Google-Users-Shift-Mobile/1010668
As others have observed, there is a whole generation of developers now who see Microsoft as 'legacy' and develop for mobile OSs; mobile developers are in the largest demand (even in the enterprise market), and Microsoft up to this point has not gotten traction.
http://recode.net/2014/02/13/why-satya-n...ft-from-itself/
Perhaps that will change, or perhaps not, but desktops are a dying use case.
I don't blame you for taking the workflow you mentioned. Making web-based apps using a toolset that hides the underlying differences between browsers most certainly has its appeal. However, like everything else, there are pros and cons. Web-based apps, and the additional abstraction brought on by the added APIs you are using to make your code work across platforms, carry a performance penalty. A serious one at that. It may be this penalty doesn't matter. The app doesn't have to be fast, it just has to be fast enough. That reminds me of one of my friends who bought one of those super-clear, super-HD TVs. And then he proceeded to laugh at me for my 5-year-old HD TV. I then pointed out that the additional clarity of his TV is beyond the human eye's ability to detect. He stopped laughing.
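To make the abstraction-penalty point concrete, here's a toy micro-benchmark. It's entirely illustrative: the `layer` wrappers are hypothetical stand-ins for cross-platform shim layers, not any real API, and a Python function call is far cheaper than what a real browser or compatibility layer does, but the shape of the argument is the same:

```python
# Toy micro-benchmark: each extra function call stands in for one
# layer of cross-platform abstraction sitting between your code and
# the native operation.
import timeit

def native(x):
    # The "real work" the platform would do directly.
    return x * 2

# Hypothetical abstraction layers wrapping the native call.
def layer1(x): return native(x)
def layer2(x): return layer1(x)
def layer3(x): return layer2(x)

direct = timeit.timeit(lambda: native(21), number=500_000)
wrapped = timeit.timeit(lambda: layer3(21), number=500_000)
print(f"direct call:       {direct:.3f}s")
print(f"through 3 layers:  {wrapped:.3f}s")
```

The wrapped path is measurably slower, but, as with the TV, whether that matters depends entirely on whether the user can perceive the difference.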
The issue here, of course, is that while web-based touch apps are all the rage, there are still plenty of client-server apps out there, and they aren't going anywhere. In fact, there are new client-server apps being made all the time. Furthermore, there are plenty of people on this forum who have made it clear they do not wish AC to become web-based. I realize that a web-based app can be installed in-house, but that adds a lot of complexity. Do we really want to teach people how to manage Microsoft Internet Information Services when they can barely handle SQL Server? Also, client-server apps are easy to install: just click Next three times, then click Finish. This process is well understood. Web-based apps may trash each other: "SharePoint crapped out my Amazing Charts!"
What I'm trying to say is that these tools and paradigms have their uses. And, as developers, we use the tools best suited for the jobs our clients demand. Today, I find myself maintaining both client-server and web-based apps. I don't really get to choose which one I want to focus on lest I start turning down clients. While I certainly don't blame you for learning touch-based web development, don't be so quick to abandon everything else.
Regarding a whole generation of developers that see MS as legacy: let them commit career suicide. The new kids coming out of college, raised by liberal professors who still see the Microsoft of 1995, will, of course, see MS as legacy. People go for what's new and sexy. All developers want to be like Steve Jobs: an arrogant asshole who can get away with being one because he has money and is just that smart, and the worse he treats people the more they love him for it. Just like many doctors want to be Gregory House. But that's not how the real world works. MS isn't legacy. MS is the largest software maker in the world, and it can't afford to crap on its existing user base just to chase after the new way of doing things. Just like Windows 95 migrated us to 32-bit applications all the while supporting our old 16-bit applications, which had no idea they were now in a cooperative environment and no longer had total control of the processor, MS has to migrate BILLIONS of people from the old Start Menu to the world of Touch. And they have to do it carefully, be ready to fix mistakes along the way, and spend BILLIONS OF DOLLARS doing it. That's not legacy, that's being responsible.
The desktop is not a dying use case. I realize all the pundits out there are saying it is. I get it. What I have observed in my 20 years in this field is that pundits make lots of money "glorifying" news headlines and saying things that sound trendy. Notice I didn't say anything about truth. Has anyone ever noticed how most pundits start their articles with a question? Instead of writing an article that says, "The Desktop is Dying," and then listing their proof, they start with, "Is the Desktop Dying?", list a few sales numbers, then let everyone debate it in the comments, all the while clicking on the blog owner's ads.
Without a doubt tablet and smartphone sales are going insane and, yes, at the expense of desktops. But there are reasons for that:
* The desktop market is already saturated. Everyone already has a desktop computer, but not everyone yet has a tablet.
* People are keeping their desktops longer. Windows Vista was the last version of Windows that needed way faster procs and way more RAM. Windows 7 and Windows 8 will both run on hardware that ran Vista. Why buy a whole new system? I think people are going to start keeping desktops 5+ years rather than trading up every three years like they did before.
* People are still learning the mobile market, and that means turnover. Turnover means inflated sales. A friend of mine who works at Best Buy (yes, I give him crap about this daily) told me of a local office that came in and bought 20 Samsung Galaxy tablets. A few days before the return policy was up, that same person from the local office brought all 20 of the Galaxys back and exchanged them for 20 Surface Pros. The reason? The Galaxys couldn't do everything they needed them to do (they bought the Galaxys because they were cheap). Do we think Samsung is going to subtract those 20 returns from their sales numbers? No. Is the Best Buy manager sick to his stomach? Yes. The reason Android is beating out the iPad is that a lot of people are getting tired of paying a premium for a device that can do little more than surf the web and check email.
What we are going to see happen is most people are going to have three devices:
* The smartphone. Used by those on the go who need to, obviously, make phone calls, see important emails while waiting in line, and text quick notes to people. Although the Windows-based smartphones come with mobile Office, I don't see people using that. Smartphones may also be used as portable MP3 players and GPS units. I use my smartphone on my Harley as an MP3 player into my Harley's sound system and for GPS.
* The tablet. Used by those who were on the go but have arrived at their off-site destination. Used for more in-depth email (opening attachments, making longer responses), reading and editing Word/Excel documents, giving presentations, and collaboration.
* The desktop. Still the real workhorse. Most desktops now are being sold with multiple monitors. This is where people get their real work done when they are done gallivanting around. With their dual, or even triple, monitors they will have resource-intensive tools open. Right now, I have 6 Remote Desktop connections, 14 Internet Explorer windows (counting individual tabs), Word, Excel, two copies of Visual Studio, and Windows Media Player all running. And since I have three monitors, it's all fairly easy to manage. I could not do all this multi-tasking on a tablet, much less a smartphone. The lines can be blurred, however. Some tablets, like the MS Surface, have docking stations and can run more than one monitor. Of course, one can argue that at that point I have converted the Surface into a desktop. I can go either way on that argument. I'm not particular.
My point is that we all know the news is biased, yet we keep listening to it. We should, in point of fact, look at how we are using computers ourselves. How many of us are ready to stop using our desktops and go full tablet? How many of you really want to stare at that small screen all day? How many hundreds of thousands of apps out there will require a desktop for decades to come? Can any of us see ourselves using Amazing Charts, QuickBooks, our PM system, and all our other heavy-duty stuff on a tablet any time soon? How long before some really smart guy says, "Hey! We have all those desktops out there with 8GB of RAM, big hard drives, and dual CPUs. Why are we paying out the buttocks for per-processor licensing so we can run all of our apps in the datacenter and send only the UI to the desktop? Why not distribute the load?!?" What's old will be new again.
What I think we are seeing is a market adjustment, not the death of the desktop.
I think that's enough for one night. I hope everyone is doing well.
JamesNT