I have come to the conclusion that business owners and managers often unwittingly allow software programmers to determine their long-term technology strategy. Programmers are hired just to write software, so how could they be steering a business's technology strategy?
The explanation is really quite simple. Depending on how the software is written and what tools are used in the process, it can establish some deep-rooted dependencies. And it seems that programmers are often given too much leeway in selecting their tools.
If a program is written with the intent of being installed on a client computer, then there is an obvious lock-in to the platform for which it was designed. It makes sense that a program written for Windows XP will not run on OS X. Most managers would say that is acceptable, since they are currently using the system for which the software was designed. But it has been my experience that software stays around longer than its original designers intended. This was a major factor in the scare leading up to Y2K.
Furthermore, as people become more accustomed to mobile computing platforms, their use will become more widespread, and corporate software will be expected to go mobile as well. In my opinion, this represents the most likely scenario in which a company would want to use its software outside of the platform for which it was designed.
There are ways to avoid platform dependencies. A cross-platform language, such as Java, could be used. Or, as is becoming more common, application servers can be used. The most common application servers are web application servers. In a previous post I explained why I feel web applications represent the future of software applications, and more progressive managers are already looking to exploit them. But simply taking advantage of web applications does not guarantee that your software will be client independent.
I was recently talking to people from a company that had two major web application projects underway. One project was based on LAMP (Linux, Apache, MySQL, PHP), and the other was being built with Microsoft technologies. The LAMP-based project was great: there was a LAMP server that the company had to support, but beyond that there were no requirements. The application could be run from any modern web browser, on any platform. The Microsoft-based project was also very good; it had a great interface, with Microsoft servers on the back end. But there was one major problem that I saw: the application had to be run inside Internet Explorer. So, with this project the company was paying a huge extra expense for the Microsoft technology, and it was locking itself into platforms that had Internet Explorer.
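To make the lock-in concrete, here is a minimal sketch of the kind of choice that creates or avoids it. An IE-only application typically calls a proprietary API such as ActiveXObject directly; feature-detecting the standard API first keeps the same code portable. The `createRequest` function and the `env` parameter (standing in for the browser's global object) are my own illustrative names, not code from either project.

```javascript
// Hypothetical sketch: feature detection instead of assuming one browser.
// An IE-locked app would call `new ActiveXObject("Microsoft.XMLHTTP")`
// unconditionally; this version prefers the cross-browser standard.
function createRequest(env) {
  // `env` stands in for the browser's global object (window).
  if (typeof env.XMLHttpRequest !== "undefined") {
    // Standard API: works in Firefox, Safari, Opera, and modern IE alike.
    return new env.XMLHttpRequest();
  }
  if (typeof env.ActiveXObject !== "undefined") {
    // Legacy fallback for old Internet Explorer only.
    return new env.ActiveXObject("Microsoft.XMLHTTP");
  }
  throw new Error("No XMLHttpRequest support in this environment");
}
```

The point is the ordering: the browser-neutral path is tried first, so the proprietary dependency is a fallback rather than a requirement.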
This situation is not a big deal now, because Microsoft Windows is so ubiquitous. But it severely limits the company's future flexibility. Ideally, a company could make all of its applications web-based and then purchase dirt-cheap netbooks with Linux on them. All it would need is a computer capable of surfing the web, and it could save a lot of money over time. But because of the Internet Explorer requirement, the company will forever be forced to maintain some sort of licensed Microsoft software.
The moral of this story is that you have to be careful about giving contracted programmers too much room to be creative. And while I am using the term programmers, I mean analysts and architects as well. Think of your strategy, think of the impact of client-side dependencies, and don't work with programmers who play down this issue.