While it’s well established that the “zoom and enhance” voodoo found on popular crime shows is nothing more than Hollywood magic, real technologies exist that help solve mysteries in fascinating ways. Caruso’s lasers and billions of dollars’ worth of equipment capture our imagination, but even at a rudimentary level, evidence analysis leans on compelling scientific knowledge and progressive technology that identifies criminals and helps get them off the streets. One of the oldest and, incredibly, still most relevant pieces of evidence in practice is the lowly fingerprint.
But while booths full of super-glue vapors give the practice a cool visual, what does the average CSI viewer actually know about fingerprints? A closer look reveals a technology with a history as fascinating as its utilization. Furthermore, understanding the roots of fingerprinting illuminates a concept guiding the advent of new biometric technologies today.
The history of fingerprinting began long before its use in criminal proceedings. According to historians, Babylonians would press their fingers into wet clay to record business transactions. The Chinese adapted this system, but held onto its benefit as a unique identifier, using ink on paper to conduct business transactions and identify their children. Even hundreds of years later, the practice was still in use when, in 1858, an Englishman named Sir William Herschel, then Chief Magistrate of the Hooghly district in Jungipoor, India, required residents to record their fingerprints when signing business documents.
It was upon these foundations that the first fingerprinting system came into the mind of a Scottish doctor named Henry Faulds. The doctor, while working in Japan, discovered fingerprints left on ancient pieces of clay. In 1880, Faulds wrote a letter to Charles Darwin, asking for help developing a classification system. Darwin declined at the time, but forwarded the request on to his cousin, Sir Francis Galton. Galton was a eugenicist who collected copious amounts of data on people’s physical characteristics in order to determine the mechanics behind the inheritance of genetic traits. After gathering 8,000 fingerprint samples, Galton published what would become the first fingerprint classification system in history in his 1892 book “Fingerprints.” The system would not see popular adoption at the time, but its legacy lies in its surprising longevity.
At the same time, others around the world were having similar ideas. When “Fingerprints” was published, a Frenchman by the name of Alphonse Bertillon was working on his own system, involving the measurement of hands, feet, and other distinguishing body parts. The practice, named anthropometry, was adopted by the British Indian Police in the 1890s. Across the world in Argentina, a police officer named Juan Vucetich had developed his own system. When called in to assist with the investigation of the murder of two boys in a village near Buenos Aires, his system played a pivotal part. After comparing samples from the crime scene, he identified the killer: Francisca Rojas, the boys’ mother. She confessed to the crime, and comparative dactyloscopy was born.
It wasn’t until 1896 that the modern system of fingerprint identification came to be. Sir Edward Henry, then Inspector General of Police in Bengal and later commissioner of the Metropolitan Police of London, created his own classification system building on the pioneering work of Galton. His system used the now-familiar whorls, loops, and arches of the friction ridges on the fingertip to identify individuals. The Henry Classification System replaced Bertillonage, and modern fingerprinting began. So successful was the technique that Scotland Yard established its own Fingerprint Bureau in 1901, presenting fingerprint evidence in court for the first time in 1902. In 1903, the system spread to New York state prisons, further cementing its use as an investigatory tool.
[youtube_sc url="http://www.youtube.com/watch?v=ZKi1CKTRCQM" modestbranding="1"]
Unfortunately, the system was cumbersome. Records had to be compared manually, requiring hours or days to yield a match, if the search was successful at all. The Japanese National Police Agency answered this problem with the advent of computers in the 1980s. Their system, called the Automated Fingerprint Identification System (AFIS), allowed millions of prints to be cross-checked simultaneously. In a testament to Galton and Henry’s legacy, the digital system relies on the same identifying characteristics as their late-19th-century system when determining a match.
Prior to 1999, however, fingerprint identification systems could only “speak” to other computers on the same private network. If a criminal was arrested in Utah, for example, there was little way for Salt Lake police to cross-check his or her records with databases in New York. In response, the FBI’s Criminal Justice Information Services Division introduced the Integrated AFIS (IAFIS), which allows for the categorization, search, and retrieval of fingerprints from anywhere in the U.S. in as little as 30 minutes. In addition, the system displays mug shots and criminal histories for persons in the system. Approximately 70 million records are in the IAFIS, including 34 million civil prints. This same system is used for employment checks, license issuance, and enrollment in social services programs, making it one of the most used and most valuable tools in the world.
Biometrics and Consumer Products
This same concept of utilizing truly unique identifiers in criminal investigation has sparked a wave of other “fingerprints” called biometrics. These technologies enable more thorough identification of potential suspects with even greater accuracy. In Quantico, VA, the FBI’s Audio Lab helps identify the voiceprints of major international criminals. In homage to Alphonse Bertillon’s system, ear scans, which examine the distinct size, shape, and structure of ears, are also seeing use. But the most profound, of course, is the DNA fingerprint, which looks at the chemical code that constructs us in order to differentiate one person from another.
But fingerprints are not going away. With hackers’ increasing proficiency at cracking character-based passwords, fingerprints, among other biometrics, are seeing increased use in consumer products. In particular, Apple’s new iPhone was released with a built-in fingerprint scanner in an effort to offer a more secure alternative to conventional authentication methods.
And while fingerprinting is no rocket science, its consistency, ubiquity, and continued development leave no question as to its importance for investigative purposes. From the humble beginnings of ancient Babylonian business practices to scanners built into our mobile devices, it is clear that fingerprinting has left its mark on our society. Only time will tell what the next great personal identifier will be, but regardless of the specific characteristic, it will have big shoes to fill if it is to supplant the lowly fingerprint.