Today’s technology is blurring the line between what is an “application” and what is a “web site”. Traditionally, an application was a piece of software installed on your desktop or on a server in your office. These applications had strict system requirements; if they weren’t met, performance would degrade or the application wouldn’t run at all. Applications were very rarely “fun”. Rarely did someone say, “hey, did you check out that new application…”. That was because getting from point A to point B took time. In order to “check out” an application, you had to go through the following steps:
- Find the application – normally physically – either at a store or from a vendor
- Determine whether your current systems met the application’s system requirements and, if not, upgrade them (a whole other issue on its own!).
- Acquire the application – either through a simple transaction or a lengthy sales process (again, a whole other discussion).
- Install the application on the required hardware (usually a server and a client (desktop/workstation) machine).
- Resolve any unforeseen hardware or software conflicts for each machine
- Configure any additional services such as network connectivity
- Start the application
- … wait for the hour-glass to stop spinning…
- Go through the licensing and activation screens
- … and then, ta-da! The application!
- Oh wait, now you need to read the manual to figure out what all the buttons and menus do and when.
This is no easy process, and it only grows more difficult as the rate of technological development increases. After spending months assessing hardware requirements for your company and more months rolling it out, the applications you want to run tomorrow will likely not be compatible with the hardware you now have. Back to the drawing board and months of reviewing additional hardware upgrades, rolling them out, and so on. Rolling out a single application could push the IT investment into the hundreds of thousands, or millions, once hardware, licensing, compatibility testing and more are counted.
Finally, you could answer your friend with a “yes, I’ve just managed to install it” – by which time your friend has moved on to three other new interests and has no idea what application you’re talking about.
While this whole “application acquisition and implementation” process continues to keep a great many IT professionals busy, it simply doesn’t add value to the business and is not a good use of IT’s knowledge or time.
The traditional method of implementing software is outdated. The Cloud is the new platform for business.
More and more business applications are turning into web applications. A web application is, at its simplest, an application that is accessed over a network (or the internet) and used through a web browser, such as Mozilla Firefox, Google Chrome and others. Typically, these applications are hosted in the cloud. The cloud is a loose term for “somewhere on the network that we don’t need to know about”. This means that someone else can worry about the hardware requirements, compatibility issues, security, updates and upgrades – and all we need to do is use it. So when someone says “have you checked out that new application”, we can now go through these steps:
- Search the application name in Google
- Register on the main page
- Log in
There’s no need to roll out the application and install it on hundreds of users’ machines, or configure complex server rules. All you need to do is distribute login credentials to those people who should have access. And since web applications run completely within a web browser, they get to take advantage of all that fun technology such as video, audio and, most importantly, an intuitive user experience. People who can get around the web should, more or less, be able to get around inside a web application.
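To make the contrast concrete, here is a minimal sketch of the web-application model using Python’s standard-library `http.server` (the page contents and port number are illustrative assumptions, not from any particular product). The “application” lives in one place; every user needs only a browser:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative page content – in a real web application this would be
# generated dynamically per user, behind the login step described above.
PAGE = b"<html><body><h1>Hello from the cloud</h1></body></html>"

class AppHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the application as an ordinary HTTP response:
        # nothing is installed on the client beyond the browser itself.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    # Point any browser at http://localhost:8000 – that is the whole
    # "roll-out" from the user's point of view.
    HTTPServer(("", 8000), AppHandler).serve_forever()
```

Upgrading the application means changing the code in that one place; every user simply gets the new version on their next page load.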
What has been your experience with using the Cloud so far? Share your thoughts in the comments below.