The Evolution of the Desktop Computer

Article

Applied Clinical Trials

Applied Clinical Trials-06-01-2008

A look at its early beginnings, where it is today, and what the future could hold.

The evolution of the personal computer from the first commercially available computer in 1951 to our incredibly computer-dependent world of mobile, embedded, and networked devices has been breathtaking. The evolution of both computers and biological species appears to happen through major leaps forward but actually progresses through uncountable twisting paths, false starts, and convergent trends that come together in unexpected ways.

Paul Bleicher

For example, the personal computer (the IBM PC) seemed to have appeared out of nowhere in 1981. Only one year earlier, most professional offices were using dedicated word processing machines, and most home documents were produced on electric typewriters. Within a few years, the desktop PC had all but replaced office word processors, and many nontechnical people had PCs or Macintosh computers at home. But for those who know the industry, the desktop PC revolution has been a slow, steady evolution that has included the integration of many different trends.

Aspects of the PC as we know it today began developing in 1964 at Stanford Research Institute, where the computer mouse and a windowed, graphical style of interface were invented. In the late 1970s, the first consumer computers—the Apple I and II, the TRS-80, the Atari 800, and the Commodore PET (the models for the first PC)—began to be sold, but mostly to hobbyists. Many of the features of these early experiments disappeared from the scene after they were first introduced, only to reappear years and even decades later in the PC (see The Mother of All Demos, http://www.old-computers.com/history/detail.asp?n=59&t=1).

The desktop workstation

Most people who don't work in information technology have the sense that we have reached a relatively stable stage in the evolution of the PC. Although the modern desktop is much more robust and functional than in the past, it has more or less the same "technology stack" as the first consumer computers (e.g., the Commodore Pet or Apple I).

The base of the computer is a hardware layer consisting of a processor connected to a variety of input, output, and memory devices (disks, screens, etc.). The operating system resides on top of this hardware layer, allowing applications to interact with the lower-level functions of the hardware. Applications such as word processors and spreadsheets are stacked on top of the operating system, as are local data and files.
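The layering described above can be modeled in a few lines of code. This is a conceptual sketch only: the class and method names are invented for illustration, and each layer exposes services only to the layer directly above it, just as in the traditional desktop stack.

```python
# Toy model of the classic desktop technology stack.
# All names are illustrative, not any real OS API.

class Hardware:
    """Processor plus input/output/memory devices (disk, screen, etc.)."""
    def read_disk(self, block: int) -> bytes:
        return b"\x00" * 512  # stand-in for a real device driver

class OperatingSystem:
    """Sits on the hardware layer; mediates all application access to it."""
    def __init__(self, hw: Hardware):
        self._hw = hw
    def open_file(self, path: str) -> bytes:
        # A real OS maps paths to disk blocks; here we just read one block.
        return self._hw.read_disk(0)

class WordProcessor:
    """An application stacked on the OS; it never touches hardware directly."""
    def __init__(self, os_: OperatingSystem):
        self._os = os_
    def load_document(self, path: str) -> int:
        return len(self._os.open_file(path))

stack = WordProcessor(OperatingSystem(Hardware()))
print(stack.load_document("letter.txt"))  # 512
```

Note that the entire stack lives on one machine; nothing in it requires a network connection, which is exactly the self-contained quality of the traditional desktop.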


The entire environment of the desktop is contained and could actually be operated independently without any network connection to another computer.

Evolutionary pressures

As desktop computing rapidly grew increasingly complex through more sophisticated programs, operating systems, and networking, the cost of corporate computer system support grew exponentially, and the frustrations of end users followed. These issues, multiplied by thousands of users, have driven corporate IT leaders toward radically different solutions for the desktop computer to control costs, improve the quality of the computer experience for the end user, and increase efficiency in the workforce.

The changing nature of the workforce creates evolutionary pressure on the desktop. The increasingly distributed, global, and mobile workforce may work in one office on one day and on another continent the next. Modern workers demand access to their applications and data, not only from any desk in any country, but on airplanes, on their mobile devices, and at home. Increasingly, groups also want to work collaboratively on documents, often in real time. A single user desktop with local applications and data simply doesn't support this working style.

As more and more crucial data becomes electronic, concern over security and privacy also increases. The current decentralized, distributed nature of data and documents makes it very difficult to track and control their purposeful or accidental distribution. When multiple versions of a document or data are present in many different places, it is impossible to maintain quality and consistency. In addition, the increasing ubiquity of laptops has raised the risk of data loss and security breaches through theft.

We are currently in a maelstrom of software, hardware, and network solutions that are being designed and implemented to address these evolutionary pressures. Some of them turn our entire concept of computing upside down, but none of them have been established as the clear direction forward. It is likely that over the next five to 10 years, some combination of them along with concepts we haven't even yet considered will emerge as the next new mainstream direction in desktop computers. Interestingly, many of these concepts have their origins in the mainframe era of the 1960s.

Uncoupling technology stacks

The overall theme of new desktop solutions is the uncoupling of the traditional technology stack of local hardware, operating system (OS), applications, and data. Starting from nothing more than a mouse, keyboard, and screen, these solutions take many different architectures, with any combination of centralized or local data, applications, and OS.

The most familiar of these solutions is the growing set of Internet applications sufficient to meet the needs of all but the most demanding users. For example, Google Apps has much of the computing software functionality that most offices might need: a word processor, spreadsheet, calendar, and mail application with sophisticated group collaboration features. When using these Web applications, the computer processing takes place on the server, and the data are stored centrally. Your computer could be any configuration, in any location, and still access the same data. Anywhere, that is, but offline.

The idea of a Web application being offline may sound like an oxymoron, but anyone who needs to work on an airplane knows that it is an important concept. Google, Microsoft, and Adobe have each developed platforms to allow Rich Internet Applications (RIA) to work offline as well as online. These virtualized applications may be deployed over the Internet each time they are used or kept locally; they are much smaller than traditional applications, don't have to be installed or uninstalled, and often work with data and documents that are stored centrally. Offline RIAs would allow users to work with documents locally when necessary and online at other times, with almost no perceivable difference to the user.
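The offline/online pattern just described amounts to a local store plus a sync queue. The sketch below illustrates the idea with hypothetical names; it is not how any particular RIA platform is actually implemented.

```python
# Illustrative sketch: edits go to a local store while offline and are
# pushed to the central store when connectivity returns.

class CentralStore:
    def __init__(self):
        self.docs = {}

class OfflineCapableApp:
    def __init__(self, server: CentralStore):
        self.server = server
        self.local = {}        # local copies for offline work
        self.pending = set()   # documents edited while offline
        self.online = True

    def edit(self, name: str, text: str):
        self.local[name] = text
        if self.online:
            self.server.docs[name] = text   # write through immediately
        else:
            self.pending.add(name)          # queue for later sync

    def reconnect(self):
        self.online = True
        for name in self.pending:           # flush queued edits
            self.server.docs[name] = self.local[name]
        self.pending.clear()

server = CentralStore()
app = OfflineCapableApp(server)
app.online = False
app.edit("report.doc", "draft written on a plane")
assert "report.doc" not in server.docs      # not yet synced
app.reconnect()
assert server.docs["report.doc"] == "draft written on a plane"
```

The user-visible behavior is the point: the `edit` call looks identical online and offline, which is the "almost no perceivable difference" the article describes.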

The desktop of the future may have no applications on it—it could download virtualized applications when needed or keep some locally for offline work. The applications would be managed centrally in a network, and any new versions or updates would be seamlessly and invisibly deployed for users with each download without the need to physically install or change the computer.

Desktop virtualization

Rather than virtual applications, why not have an entire virtual desktop? In fact, desktop virtualization is widely available today and has many potential uses. With virtualization software (known as hypervisors), it is possible to create and operate one or more "virtual computers" on a host computer, each running its own "guest OS" within a virtual hardware environment. These virtual computers can use the disks and screens of the host computer but are totally isolated from each other and the host, and can be networked with each other. Using virtual machines (VM), a single computer can house several Windows computers and several Linux computers, all operating independently and simultaneously.
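The key property of this arrangement is isolation: each guest has its own private state even though all guests share one host. The toy model below illustrates that property with invented names; it performs no real virtualization.

```python
# Toy model of a hypervisor hosting several isolated guest "VMs".

class VirtualMachine:
    def __init__(self, name: str, guest_os: str):
        self.name = name
        self.guest_os = guest_os
        self.memory = {}     # each guest's state is private to it

class Hypervisor:
    def __init__(self):
        self.vms = {}        # all guests share the one host
    def create_vm(self, name: str, guest_os: str) -> VirtualMachine:
        vm = VirtualMachine(name, guest_os)
        self.vms[name] = vm
        return vm

host = Hypervisor()
win = host.create_vm("corp-desktop", "Windows")
lin = host.create_vm("dev-box", "Linux")
win.memory["clipboard"] = "quarterly report"

# Isolation: the Linux guest sees none of the Windows guest's state,
# yet both run simultaneously on the same host.
assert "clipboard" not in lin.memory
assert len(host.vms) == 2
```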

VM desktops have another benefit for those who are newly entering the workforce. Many of these new graduates have grown up with the Internet, which they extensively use for entertainment, communication, and collaboration. As such, they are accustomed to fluid access to instant messaging, iTunes, and many other programs that couldn't be installed in a locked-down, corporate IT environment. Furthermore, these users may have grown up with a Mac and may balk at the idea of having to convert over to a PC (or vice versa). The VM desktop would allow them to bring their own computer to work, whether PC, Mac, or Linux based, where they would have access to their personal computing environment and could do their work on a fully locked-down and secure corporate desktop VM. The plague of viruses and other malware that often accompany recreational computing would be isolated from the corporate desktop.

Another type of desktop VM strategy involves a hypervisor that installs "on the bare metal"—that is, directly on the hardware without the need for an underlying operating system. These computers would have to be single purpose, with no local applications or data. A user could log in and immediately have access to a directory of VM desktops. They would choose the one they wished to work with, and it would be opened on the desktop. Again, all application changes and operating system upgrades would be done on a central VM image, and the cost of maintenance and support would be greatly reduced.

The aforementioned VMs all run on the local computer, but many desktop virtualization strategies involve VMs that run on servers, which would be especially useful for workers in industries with sensitive data. When a user selects a centrally located VM, it runs on the central server and the sensitive data remains there, with privacy and security controls intact. Only keystrokes, screen images, and mouse movements are transferred through a secure connection. In a hospital scenario, a physician could log on from home and open an exact duplicate of their hospital desktop that is hosted on the hospital servers. They could work securely with the VMs from several different hospitals, with each simultaneously open on their computer in different windows.
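The division of labor in that scenario can be made concrete with a short sketch: input events travel to the server, only rendered screen frames come back, and the sensitive document itself never leaves the server. All names here are hypothetical; real products use protocols such as RDP or ICA.

```python
# Minimal sketch of a server-hosted VM with a thin client.

class ServerHostedVM:
    def __init__(self):
        self.document = ""           # sensitive data stays server-side

    def handle_keystrokes(self, keys: str) -> str:
        self.document += keys        # processing happens on the server
        # Only a rendered image of the screen is returned to the client.
        return f"[screen frame showing {len(self.document)} chars]"

class ThinClient:
    def __init__(self, vm: ServerHostedVM):
        self.vm = vm
        self.last_frame = ""

    def type(self, keys: str):
        # The client sends keystrokes and displays the returned frame;
        # it never holds a copy of the document itself.
        self.last_frame = self.vm.handle_keystrokes(keys)

client = ThinClient(ServerHostedVM())
client.type("patient notes")
print(client.last_frame)   # [screen frame showing 13 chars]
```

If the client machine is lost or compromised, there is nothing on it to steal, which is the privacy argument the article makes for sensitive-data industries.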

In remote corporate offices it should be easy for a worker to download a VM and begin working on their documents on any computer. However, access to a corporate network or a secure computer may not always be possible, and limited bandwidth may make it difficult to download a large VM image. Virtualization combined with increasingly inexpensive flash storage now allows a mobile worker to carry a complete virtual machine (operating system, applications, and data) as an encrypted VM on a thumb drive, mobile phone, or iPod. Simply plug the storage device into any computer, open the VM, and go to work.
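Encryption is what makes the portable VM safe to carry: a stolen drive is useless without the passphrase. The sketch below shows one standard way to derive and check a key from a passphrase using Python's standard-library PBKDF2; the surrounding "unlock" logic is illustrative only and omits the actual decryption of the VM image.

```python
# Sketch of passphrase-gated access to an encrypted VM image on a thumb drive.
import hashlib
import hmac

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Key stretching makes brute-forcing a stolen drive slow.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

salt = b"fixed-demo-salt!"             # a real tool would use os.urandom(16)
stored_check = derive_key("correct horse", salt)   # written at VM creation

def unlock_vm(passphrase: str) -> bool:
    # Constant-time comparison before decrypting and booting the VM image.
    return hmac.compare_digest(derive_key(passphrase, salt), stored_check)

assert unlock_vm("correct horse")
assert not unlock_vm("wrong guess")
```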

Predicting the future?

The desktop is not evolving in isolation. Along with increasingly sophisticated hardware and networking technology, new concepts such as "cloud computing" and "Web operating systems" will continue to appear and influence the requirements and functionality of the desktop environment. As with any evolutionary process, these dynamic pressures are hard to fully identify, complex, and will lead to an unpredictable outcome. The only thing we can be sure of is that 20 years from now, the desktop of 2008 will seem quaint and archaic.

Paul Bleicher MD, PhD, is the founder and chairman of Phase Forward, 880 Winter Street, Waltham, MA 02451, (888) 703-1122, paul.bleicher@phaseforward.com, www.phaseforward.com

He is a member of the Applied Clinical Trials Editorial Advisory Board.
