Fingerprints

Going beyond the tech – why data innovations need to be normal


At 7 o’clock on the morning of March 27, 1905, Thomas Farrow, the proprietor of a paint shop in Deptford, south London, was getting dressed in his room above the shop, ready for a new working day. The 71-year-old heard a knock at the door downstairs and, though still not fully dressed, went down to open up for a couple of early customers; then, as now, the building and decorating trades like to start work early.

He opened the door to find two young men, but they weren’t interested in his paint; it was money they were after. He tried to stop them from coming in and going upstairs to find it, but he was smashed over the head with an iron bar, and his assailants dashed upstairs. There they found the cash box, and also Thomas’ wife, Ann, whom they bludgeoned too. The thieves escaped with the money – about £10.

An hour later, Thomas’ shop assistant arrived for work to find the door locked, and went to get the police. Together, they broke in to discover Thomas’ body lying in a pool of blood; his last act had been to lock the shop door to stop the robbers returning. They then found Ann, still alive but moaning and drifting in and out of consciousness. She was taken to hospital but never recovered, and died 4 days later.

Local people had seen two men running down the High Street at around that time, and they were identified as brothers Alfred and Albert Stratton. However, there were no witnesses who could link them to the events in the paint shop, and the police’s evidence was circumstantial.

But the Metropolitan Police had also sent their new Fingerprint Branch to assist with the investigation. The Branch found a thumb print on the cash box, which turned out to be a perfect match for Alfred’s thumb.

The Branch had only been set up by Sir Edward Henry, head of the Metropolitan Police’s CID, in 1901, and beyond solving a robbery case in 1902 it had met with little success. The principle behind fingerprinting had originally been proposed in 1880 by Henry Faulds, a Scottish doctor, but it had been dismissed until Henry set up the team. But in a high-profile case like this, the science was as much on trial as the two brothers. The defence brought in a doctor to argue against the validity of fingerprinting, but when the prosecution pointed out that he had offered his services to both legal teams on the same day, his position was discredited. Even so, anything less than a perfect match of the thumb print would probably have left room for doubt in such an important case for a pioneering methodology.

As it was, though, fingerprinting was indeed the crucial factor in the case, and the brothers were convicted of double murder in early May 1905. They were hanged on May 23, 1905 – the first men to be convicted of a capital offence using the new (and literally digital) technology. Fairly rapidly, fingerprinting became accepted as normal procedure in the UK and across the world.

In the rapidly evolving world of technology, there is a lot of focus on when a technology becomes mainstream; it is the entire premise of things like the Gartner Hype Cycle. There are many ways to identify whether a new technology has become “mainstream”; these are mostly about adoption, ubiquity and location.

But for a data or information innovation, there is perhaps a more fundamental milestone – when it becomes normal. These innovations are often underpinned by new technologies, but for them to be successful, people need not just to use or interact with them, but to believe them. And that takes longer to happen.

The moment a data innovation (such as a new scientific or statistical technique) becomes normal is when it is so common that people simply accept it as part of life – when its output is accepted as “true”.

One of the best yardsticks for that is when something becomes admissible in court. That shows the innovation or technique has gone so far beyond “just” being used that 12 people, drawn from any part of society, can agree that its output is enough to demonstrate that someone should, or shouldn’t, be deprived of their liberty or, in some cases, their life.

Identifying The Normality Cycle

If we assume that fingerprinting was pioneered by Henry Faulds and then Francis Galton in the 1880s, it took about 15-20 years to reach the mainstream (that is, when Sir Edward Henry set up the Fingerprint Branch in 1901). From there, it took another 4 years to reach “normality”.

Similarly, DNA and genome identification was first investigated in the 1960s and 1970s by scientists such as Sir John Sulston and Sydney Brenner. One branch of this work was developed into DNA profiling by Sir Alec Jeffreys in 1984. It was first used to secure a conviction in 1987, so it took around 3 years to reach “normality”. We now live in a world where DNA and genome matching is so common, it’s almost an article of faith.

Digital technologies (in their modern sense) are also now commonly relied upon – emails, phone messages, GPS signals and internet browser histories are all used in courts of law as evidence so regularly that it is just normal activity, even though digital technologies have only been mainstream for 15-20 years. Just this week, there was news that police in Connecticut have brought murder charges against a victim’s husband because the victim’s Fitbit showed her being active an hour after her husband said she had been killed by a house intruder. Fitbit was launched just 10 years ago (nearly to the day) but only really went mainstream 3-4 years ago.

So, interestingly, we appear to be no faster at going from “mainstream” to “normality” for new data innovations, even though new technologies become mainstream faster than ever before and digital technologies continue to proliferate. There’s probably an entire research programme in investigating why this is the case. But it seems likely that, even as we become more technologically savvy (so that technologies become mainstream more quickly), it still takes a certain amount of time for new innovations to become truly normalised.

But setting aside how long it takes now or may take in the future, perhaps the really interesting question is: “What will be the mainstream digital technologies of 20 or 40 years from now?” Which scientific techniques or advances will our children (or our children’s children) accept as normal while we struggle to keep up?

