Biometrics Comes of Age

Article

Applied Clinical Trials

Applied Clinical Trials, December 1, 2005

Despite accuracy and security concerns, biometrics are gaining in popularity.

When it comes to technology trends, extrapolating along a straight line rarely works. These trends frequently follow very different curves. Some ignite rapidly, consuming all available "oxygen" (witness the iPod explosion), while others smolder for years before catching fire. Smoldering is certainly the best description for the adoption of biometrics: the use of anatomic, physiologic, and/or psychometric measures to identify and authenticate individuals for access to a computer system. However, recent concerns about identity theft, border security, and sensitive data center access, along with the possibility of one-touch point-of-sale purchases (eliminating credit cards, signing, etc.), have brought biometrics to the attention of the general public.

Paul Bleicher

Logging on to a computer system requires authentication, a process akin to the one you use to let someone into your home. After the knock on the door, you often look at or listen to the person to see if you know them. This is single-factor authentication based on something the person IS, that is, an unchangeable physical characteristic of that person. If it is someone you don't know, you use two-factor authentication, usually a photo ID from a recognized authority, where you check something they HAVE plus something they ARE (their match to the photo). Secure authentication requires two factors drawn from something you ARE, HAVE, or KNOW. The obvious examples of the last are a username (UN) and a password (PW). The discerning reader will note that UN/PW is only marginally two-factor, since the UN is commonly not well controlled.
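To make the factor categories concrete, here is a minimal, purely illustrative Python sketch (the category names and the check function are hypothetical, not any real authentication API) that treats a logon as truly two-factor only when it combines factors from two different categories:

```python
# Illustrative only: models the ARE / HAVE / KNOW categories described above.
# The "two different categories" rule is the concept from the text; the
# function and data structures are hypothetical.

FACTOR_CATEGORIES = {"ARE", "HAVE", "KNOW"}

def is_two_factor(presented_factors):
    """presented_factors: iterable of category names, e.g. {"ARE", "KNOW"}.

    Returns True only if at least two *different* categories are presented.
    A username plus a password are both "KNOW", so they do not qualify.
    """
    categories = set(presented_factors) & FACTOR_CATEGORIES
    return len(categories) >= 2

print(is_two_factor({"KNOW"}))          # username/password only -> False
print(is_two_factor({"ARE", "KNOW"}))   # fingerprint + PIN      -> True
print(is_two_factor({"HAVE", "ARE"}))   # photo ID + face match  -> True
```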

The simplest example of a biometric is replacing your standard username/password with a fingerprint to log onto your computer or network. The use of a fingerprint for authentication isn't some futuristic thing; I log onto my computer every day using nothing more than a fingerprint. However, biometric authentication certainly isn't in widespread use today for either home or corporate computing.

Back in the early to mid-1990s, biometric authentication was a major buzz in technology circles. Simple yet expensive biometric devices like fingerprint readers had been in use for years, and many newer approaches were being developed. Some of these included iris and retina scanning, palm prints, sophisticated fingerprint detection with heat/electrical conduction (to prevent the "severed finger" security risk), voiceprints (for telephone authentication), handwriting (various measures of letter shape, pressure, and velocity), facial recognition, and others. These devices were on display at computer trade shows, and it seemed likely that they would become a standard part of computer workstations.

In fact, 21 CFR Part 11 (the FDA Electronic Records; Electronic Signatures regulation) allowed biometric authentication as a method of single-factor authentication, in apparent anticipation of widespread adoption of the technology. Now, over eight years later, the use of biometrics is only incrementally more common than it was back then. The beginnings of an apparent technology trend simply didn't continue at the predicted or expected pace. This may change in the foreseeable future, but first let's explore some of the reasons we aren't all using the technology today.

Understanding biometrics

To understand biometrics, we must start with an explanation of the technology and some key definitions. Biometric identification fundamentally requires a device that can measure a unique characteristic of an individual; for our example we will use the fingerprint for logging onto a computer. In some future era when we can do "on the spot" DNA typing, that may well become the gold standard. Today, alas, we are confined to more mundane methods such as fingerprints. The device analyzes the whorls and ridges of the individual's finger and converts key features into a digital representation. Obviously, the device must be capable of identifying the key features despite many variables, such as the exact positioning of the finger on the device and the presence of cuts and dirt smudges. Once the print is digitized, the reader must match the digital data against a database of previously authenticated fingerprints. This data may reside within the device itself (not the most secure option), on the local computer, or on a central server. Once a match is confirmed, the software sends a message to the system or application that logs the user on.
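As a rough sketch of that enroll-then-match flow (every name here is hypothetical, and the feature extraction is a crude stand-in; real fingerprint matchers derive minutiae and tolerate variation rather than requiring an exact match):

```python
# Toy model of the flow described above: extract features from a scan, store a
# template at enrollment, and compare a later scan against the stored template.
# extract_features() is a stand-in; real readers compute minutiae, not a hash.

import hashlib

def extract_features(raw_scan: bytes) -> bytes:
    """Hypothetical feature extraction: reduce a raw scan to a compact template.
    A real device tolerates finger placement, cuts, and smudges; this does not."""
    return hashlib.sha256(raw_scan).digest()

def enroll(user_id: str, raw_scan: bytes, template_db: dict) -> None:
    """Store the user's template; the DB could live on the device, the PC, or a server."""
    template_db[user_id] = extract_features(raw_scan)

def verify(user_id: str, raw_scan: bytes, template_db: dict) -> bool:
    """1:1 match: compare the fresh template against the one enrolled for this user."""
    candidate = extract_features(raw_scan)
    return template_db.get(user_id) == candidate

db: dict[str, bytes] = {}
enroll("pbleicher", b"ridge-and-whorl-data-from-scanner", db)
print(verify("pbleicher", b"ridge-and-whorl-data-from-scanner", db))  # True
print(verify("pbleicher", b"someone else's finger", db))              # False
```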

This very simple account of one-factor biometric authentication raises several key issues, each of which is worth discussing in some detail.

Biometric accuracy

The first set of issues revolves around the accuracy of the device: How well does it recognize the fingerprint? Can it be easily fooled by an impostor? Is it likely to stubbornly refuse to log on a legitimate user? And if it can be easily fooled, is a fingerprint alone sufficient for security? The answers differ depending on the intended use of the biometric system.

The first concepts to understand are the false positive (FP) rate (the percentage of impostor attempts that are accepted) and the false negative (FN) rate (the percentage of legitimate user attempts that are rejected). Different devices will have fundamentally different FP and FN rates, based on the nature and maturity of the technology. In addition, consumer-oriented devices will have higher FP/FN rates than more ironclad corporate devices.
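A simple way to see where FP and FN rates come from (the match scores and thresholds below are invented purely for illustration): the matcher produces a similarity score for each attempt, and the vendor picks a threshold; raising the threshold lowers the FP rate but raises the FN rate.

```python
# Illustrative computation of false positive (FP) and false negative (FN) rates
# from hypothetical similarity scores (0 = no match, 1 = perfect match).

def error_rates(impostor_scores, genuine_scores, threshold):
    """FP rate: fraction of impostor attempts scoring at/above the threshold.
       FN rate: fraction of genuine attempts scoring below it."""
    fp = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    fn = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return fp, fn

impostors = [0.10, 0.20, 0.35, 0.55, 0.40, 0.15]   # made-up scores
genuines  = [0.80, 0.90, 0.60, 0.95, 0.50, 0.85]

for threshold in (0.45, 0.65):
    fp, fn = error_rates(impostors, genuines, threshold)
    print(f"threshold {threshold}: FP rate {fp:.2f}, FN rate {fn:.2f}")
# A stricter threshold trades a lower FP rate for a higher FN rate.
```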

There are two major reasons for using biometrics: convenience and security. For the average home user, ironclad security isn't necessary. The data on a home computer may be sensitive, but anyone with access to the computer is unlikely to have the ability, the time, or the stamina to break the standard security measures that protect that data. In this setting, it is probably sufficient to use UN/PW authentication (as long as the PW isn't posted on sticky notes) or a single biometric device. My home computer uses a $35 fingerprint reader so that my wife and kids can log on easily without posting a PW sticky note. It is for convenience and light security. To me, the device has no perceptible FP rate and a low FN rate.

The second reason for using biometrics is rigorous security. In this case, two-factor authentication should be used, and the biometric FP rate must be extremely low. Two-factor authentication in this setting could combine the biometric with a PIN code or password (what you ARE and what you KNOW), or with a token of some sort (what you HAVE and what you ARE). This type of security is certainly appropriate for intelligence work, but it wouldn't be overkill for logins to networks where critical, secret, or financial data reside, which describes most corporate networks these days. If you have doubts, check Microsoft's Web site for its fingerprint reader, where it says "the Fingerprint Reader should not be used for protecting sensitive data such as financial information or for accessing corporate networks." Microsoft understands that even a 0.1% or 0.01% FP rate still means that 1 in 1,000 or 1 in 10,000 people could potentially be authenticated as somebody else, and it doesn't believe that this level of protection is sufficient for sensitive and valuable assets. Adding a simple PIN would dramatically improve security by making the authentication two-factor.
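To see why adding a PIN helps so much, here is a back-of-the-envelope calculation. The fingerprint figure is the 1-in-1,000 FP rate from the example above; the assumption of a four-digit PIN and a single guess is purely illustrative.

```python
# Rough arithmetic behind the two-factor argument above.
# Assumptions: fingerprint FP rate of 0.1% (the 1-in-1,000 figure in the text),
# and an attacker allowed a single guess at a 4-digit PIN (1 in 10,000).

fingerprint_fp_rate = 1 / 1_000      # chance an impostor's finger is accepted
pin_guess_chance    = 1 / 10_000     # chance of guessing a 4-digit PIN in one try

combined = fingerprint_fp_rate * pin_guess_chance
print(f"Fingerprint alone: 1 in {1 / fingerprint_fp_rate:,.0f}")
print(f"Fingerprint + PIN: 1 in {1 / combined:,.0f}")   # 1 in 10,000,000
# Requiring both factors multiplies the attacker's odds against success,
# assuming the two failures are independent.
```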

System security

The next set of issues concerns the security of the overall system. Can someone circumvent the fingerprint entirely by simply copying the digital stream from the device, or the authentication message from the software to the system? Can the fingerprint information on the computer itself be compromised and changed to allow illegitimate access? These are all good questions, and the answers depend on the quality, integrity, and robustness of the software supporting the system. There are two weak links in any biometric authentication scheme: how the device talks to the software, and how the software talks to the operating system or program it is authenticating to.

The device must communicate digital information about the fingerprint to the software that controls the authentication. It is possible for a "bad guy" to intercept the data stream from a legitimate logon and simply replay it to log on to the system without ANY authentication. A "defense-level" authentication application would encrypt this data with a session-based encryption key, or would have other mechanisms to prevent cloning of the device stream. A consumer device might have none of these.
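One common way to defeat that kind of replay can be sketched with Python's standard library (the shared key, message layout, and function names are hypothetical; real devices use vendor- or standards-defined protocols): the host issues a fresh random challenge for every logon, and the device must bind its response to that challenge, so a captured stream is worthless the next time around.

```python
# Sketch of a challenge-response scheme that prevents simple replay of the
# device-to-software data stream. Key handling and message format are hypothetical.

import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(32)   # provisioned into device and host at setup

def device_respond(challenge: bytes, template_bytes: bytes) -> bytes:
    """Device binds the biometric data to this session's challenge."""
    return hmac.new(SHARED_KEY, challenge + template_bytes, hashlib.sha256).digest()

def host_verify(challenge: bytes, template_bytes: bytes, response: bytes) -> bool:
    expected = hmac.new(SHARED_KEY, challenge + template_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)              # fresh nonce for this logon only
reply = device_respond(challenge, b"template")
print(host_verify(challenge, b"template", reply))                 # True
print(host_verify(secrets.token_bytes(16), b"template", reply))   # False: stale reply
# Replaying an old response fails because the host's challenge has changed.
```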

The second issue is how the software talks to the operating system and/or application. In a secure setting, the authentication software would interact with them directly, without the need to use a UN/PW at all. This requires sophisticated programming at the operating system or application level. A less secure, consumer-style system might simply store the UN/PW (hopefully in encrypted form) and pass it to the system or application when requested.
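A sketch of that less secure, consumer-style approach, using the third-party cryptography package (the storage layout and function names are hypothetical): the stored username/password is kept encrypted and only released to the application after a successful biometric match.

```python
# Sketch of the consumer-style integration: the fingerprint software simply
# holds an encrypted username/password and passes it along after a match.
# Requires the third-party "cryptography" package; names here are hypothetical.

from cryptography.fernet import Fernet

vault_key = Fernet.generate_key()      # in practice, protected by the OS or hardware
vault = Fernet(vault_key)

stored_credential = vault.encrypt(b"pbleicher:correct horse battery staple")

def release_credential(fingerprint_matched: bool) -> bytes | None:
    """Only decrypt and hand over the UN/PW if the biometric match succeeded."""
    if not fingerprint_matched:
        return None
    return vault.decrypt(stored_credential)

print(release_credential(True))    # the stored UN/PW
print(release_credential(False))   # None
# The weakness: anyone who obtains the vault key and ciphertext bypasses the biometric.
```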

The burden of support

Local biometric authentication on an individual computer requires the installation and support of a program, plus the creation of a biometric template (for matching against an individual logon attempt) for each person who uses the system. This local administration can be a major headache for the IT department of a large corporation. Of course, some of this burden can be removed by using authentication software that resides on a central server. Centralized administration, however, has its own issues. For example, a user couldn't log in with the biometric method unless connected to the central server. In addition, authentication to a single computer has only one or a few fingerprints against which to compare a "challenge" fingerprint; a central server may have thousands, increasing the likelihood of a false positive. This can be offset by the use of higher-quality fingerprint readers and of smart cards that can hold fingerprint data. Finally, many people have significant privacy concerns about centralized biometric data. It is important to note that most biometric fingerprint software doesn't store an image of the actual print, but rather some digital measurements derived from the print, in an encrypted format.
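The "thousands of templates" concern can be quantified with a little probability (the per-comparison false match rate below is an assumed, illustrative figure): if each comparison has false match rate p, then searching against N templates produces at least one false match with probability 1 - (1 - p)^N.

```python
# Why a central database of many templates raises the effective false positive rate.
# The per-comparison false match rate (p) is an assumed, illustrative figure.

p = 1 / 100_000   # assumed chance that any single comparison falsely matches

def chance_of_any_false_match(p: float, n_templates: int) -> float:
    """Probability that at least one of n independent comparisons falsely matches."""
    return 1 - (1 - p) ** n_templates

for n in (1, 1_000, 10_000, 100_000):
    print(f"{n:>7} templates: {chance_of_any_false_match(p, n):.2%}")
# 1:1 matching stays at 0.001%, but searching 100,000 templates approaches 63%,
# which is why central systems need better readers or a 1:1 check via smart cards.
```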

Despite all the issues, biometric authentication is on the rise again. In fact, an advertisement fell out of my morning paper today with a huge fingerprint on it: IBM is pushing a new series of laptops with built-in fingerprint authentication. More and more individuals and companies will consider putting biometric authentication in place at home and at work. It is likely that you will soon see biometric devices in stores and banks. A significant milestone will be the long-awaited release of Microsoft's new Vista operating system (previously code-named Longhorn), which will have built-in support for biometrics.

If you find yourself considering whether biometric authentication may be valuable for you or your company, you should consider the sensitivity of the data you are protecting, the risk of compromise, and the right balance of security and cost-effectiveness for your particular circumstance. It is no different than buying a front door lock for your home.

Paul Bleicher, MD, PhD, is the founder and chairman of Phase Forward, 880 Winter Street, Waltham, MA 02451, (888) 703-1122, paul.bleicher@phaseforward.com, www.phaseforward.com. He is a member of the Applied Clinical Trials Editorial Advisory Board.
