Why Web Applications are the Future

Update: This post was written in 2008! So long ago. By now even the dustiest old IT manager knows about enterprise-enabled SaaS solutions in the cloud.

It is largely assumed that most people, especially younger people, know that web-based applications are the future. But there are still many people, particularly in large corporate environments, who I don't think fully appreciate this. Or, if they intuitively know that the web is the future, they don't understand why, or how to get there in their enterprise.

When I speak of large corporate environments I mean corporations that are not focused on developing software: large manufacturing corporations that use custom software to gain a competitive advantage and therefore do not sell the software they develop. That is the type of company I work for, so it is what I am most familiar with.

I have been struggling to get a web-based application development strategy implemented at my company, but so far I have not been successful. I feel there are a few reasons for this. First, we are a manufacturing company with a stated goal of not developing software, so management largely turns a blind eye to internal software development. This has also caused software development to become somewhat decentralized outside of IT, which further complicates the issue. Second, we have so many legacy applications that our limited development resources are consumed by maintenance activities. Of course, in our case, this is really a symptom of a lack of strategy in general. And finally, the point of this article: management does not appreciate what web applications would do for them compared to traditional applications.

At my division of our company, IT management decided many years ago to move to a strategy of intranet applications. The impetus for this switch was that users were creating applications that quickly grew beyond their ability to design and support, so IT was doing more and more work supporting poorly designed, user-generated applications. In many cases a user would start with a simple Excel document, and as the scope grew, so did the complexity of the spreadsheet. Once they reached the limits of Excel they switched to Access and continued growing their application there. This is where the problems started for IT. Microsoft Access is not inherently scalable and performs poorly over a WAN, so the users' solution was to create multiple copies of their Access databases. When IT got involved to assist with problems, we would have to fix the same problem once for each copy of the original. Furthermore, the Access applications themselves were never designed for scalability or extensibility, so fixing issues was overly complicated. Therefore, IT made an upfront investment in rewriting these applications for the intranet in order to save time in the long run. That strategy has worked well.

Another division of our company ran into the same issues with user-created applications, but settled on a different solution: they created what amounts to a custom client-server version of Access. This worked well inside the facility in which it originated. But as it grew, scalability problems began to appear. The program still sent large amounts of data across the network, like Access, and as the number of users grew it started running out of server resources. The server portion of this custom application was designed to scale across servers, but that capability had never been fully tested, and its designer had long since left the company. So this strategy did not resolve all of the problems exhibited by Access.

These two strategies were created before our divisions were brought under the same corporate umbrella. Now we have two competing strategies for handling internally developed applications in North America, or three if you include the official strategy of ignoring the problem. I, of course, am a proponent of web-based applications. So I will explain why, in my opinion, these applications are the proper approach, and why I feel they represent the future.

In my opinion, one of the greatest strengths of web applications is that the server is detached from the client, and the two communicate via standard protocols such as HTTP, HTTPS, and FTP. This flexible interchangeability on both ends of the connection certainly drove the popularity of the Internet to its current ubiquitous state.

Any client hardware that supports these protocols can use a given web application; in practice, any hardware with a web browser. Web applications are considerably "future proof," since next year's client hardware will almost certainly still support web technologies. This gives a business flexibility in choosing the operating system and hardware for its clients. Our company has a team of people who constantly evaluate which IT hardware we will deploy; they have to be sure that the hardware they select will run the various applications our business uses. This process could be simplified if they only had to verify that the hardware came equipped with a modern web browser.

The server infrastructure can also be changed. In theory, Microsoft's IIS could be replaced with Apache or some other web server. In practice the switch is not so simple (the server-side scripting language and existing data would have to be considered), but it is possible. This gives the technology services or infrastructure people flexibility in choosing their operating system as well as the physical hardware. It also opens up the possibility of outsourcing server space.

In my mind, web servers and web clients can be somewhat treated as a commodity, assuming that established standards have been used when designing the applications that run on them.

Properly designed web applications require no additional software on the client equipment. There is nothing for the IT department to install, update, and distribute on a regular basis. When a website's structure changes, standard redirection techniques can be used on the server; no changes are required on the client computer.
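To make the redirection idea concrete, here is a minimal sketch of a server-side permanent redirect as a Python WSGI application. The paths ("/old-report", "/reports/monthly") and the app itself are hypothetical illustrations, not something from our actual systems:

```python
# Minimal WSGI app: when a page moves, the server answers the old URL
# with a 301 redirect, so nothing has to change on any client machine.
# The paths below are made-up examples.

MOVED = {
    "/old-report": "/reports/monthly",
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in MOVED:
        # 301 tells browsers, bookmarks, and caches the move is permanent.
        start_response("301 Moved Permanently", [("Location", MOVED[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from the intranet app"]
```

A sketch like this could be run under any WSGI-capable server (the standard library's `wsgiref.simple_server`, for instance); the point is only that the mapping from old to new locations lives entirely on the server.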

Our company also has a team of people who update software packages and make sure that the latest versions of software are distributed. Switching to web-based applications would reduce this load.

Properly designed web applications are inherently scalable using standard network and web server infrastructure. There are numerous methods for spreading the load of a web-based application; the right strategy depends on how the application is used. Here are some example cases.

If a web server becomes saturated with requests on a regular basis, another server that is essentially a copy of the original can be added. The DNS records can be updated to list both servers under the same web address, and users will be spread roughly at random across the actual servers.
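In DNS zone-file terms, that round-robin arrangement might look something like the following sketch (the hostname and addresses are hypothetical):

```
; Two A records for the same name; resolvers rotate the order on each
; lookup, spreading users across both copies of the web server.
app.intranet.example.   IN  A   10.0.0.11
app.intranet.example.   IN  A   10.0.0.12
```

Round-robin DNS is the simplest form of this; dedicated load balancers give finer control, but the principle of hiding multiple physical servers behind one address is the same.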

If an application serves a lot of media, such as images or videos, dedicated servers can be set up just to serve the large static content.

There would be an upfront cost to adopting a web application strategy, but it would pay off in the long run. A company might have to hire more skilled web developers, but it could reduce the number of people supporting servers and desktop computers. The business would spend less time on IT overhead like selecting hardware and upgrading software, and more time developing solutions that make it more competitive. The keys to this strategy are finding the right web technology developers and having a manager who understands the strengths of web technologies and can exploit them for the company's benefit.
Published: 2008-12-15