Cataract Surgery Could Confuse Biometric Identification

As the use of iris recognition systems expands around the world, there are likely to be more conditions that affect the technology’s accuracy.

Cataract surgery can sometimes alter the iris, leading to errors in biometric verification, according to the results of a new study. If a problem arises, the only way out would be to re-enrol for biometric identification after surgery.

A cataract arises when protein deposits cloud the eye’s lens, causing blurry vision. It is a common eye disorder and the leading cause of blindness worldwide. Fortunately, the surgical procedure to fix the problem – whereby a doctor removes the clouded lens and replaces it with a new artificial lens – is widely available today.

However, whether and how the surgery affects biometric recognition has not been studied in depth. A team of scientists from India and the US set out to resolve this: they studied the texture of the affected iris – the coloured part of the eye surrounding the pupil – with the help of the cataract surgery database of the Indraprastha Institute of Information Technology, Delhi (IIITD).

This database has 132 entries and was collected between 2012 and 2015. It is the largest publicly available database of iris images collected from patients with cataract, according to the scientists.

All of the patients had been operated on using a widely used method called phacoemulsification. Here, surgeons use ultrasound to break the eye’s clouded lens into tiny fragments, suction the fragments out, and fit the eye with a new artificial lens.

Cataract and Aadhaar

After an initial analysis, the Indian and American scientists used three iris sensors and two commercial iris biometric matchers to check if the new irises passed biometric authentication. They found that the iris sensors’ success rate dropped to 75% after surgery. The biometric matchers did better, authenticating 93% of the irises.
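
To see how a matcher arrives at such a verdict, here is a minimal sketch in the style of Daugman’s classic iris-code comparison, which commercial matchers broadly follow. The binary codes, masks and the 0.32 decision threshold below are illustrative assumptions, not details of the systems the study tested:

```python
# A minimal sketch of how an iris matcher compares two samples, loosely
# following Daugman's iris-code approach. The codes, masks and the 0.32
# threshold are illustrative assumptions, not the study's actual system.
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of usable bits that disagree between two binary iris codes."""
    usable = mask_a & mask_b                    # bits valid in both images
    disagreements = (code_a ^ code_b) & usable
    return disagreements.sum() / max(usable.sum(), 1)

def is_match(code_a, code_b, mask_a, mask_b, threshold=0.32):
    """Accept the pair as the same eye if the distance falls below threshold."""
    return hamming_distance(code_a, code_b, mask_a, mask_b) < threshold

# Example: a post-surgery iris whose texture has shifted produces a larger
# distance, so a genuine pair can land above the threshold and be rejected.
rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048, dtype=np.uint8)
mask = np.ones(2048, dtype=np.uint8)
post_surgery = enrolled.copy()
flipped = rng.choice(2048, size=700, replace=False)  # surgery-altered texture
post_surgery[flipped] ^= 1
print(is_match(enrolled, post_surgery, mask, mask))  # False: rejected
```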

Cataract in a human eye. Photo: Rakesh Ahuja, MD/Wikimedia Commons, CC BY-SA 3.0

While 93% sounds like a high success rate, India’s large population amplifies the error in absolute terms: 7% of 100 million people is nearly the population of Chennai. Working at this scale also means dealing with a very large number of unexpected results. “Developing a technology that can be used successfully with 1.3 or 1.4 billion people is not an easy task because there will be so much variety and so many special cases involved across the whole group of people,” Kevin Bowyer, a professor of computer science and engineering at the University of Notre Dame, Indiana, and an author of the study, said.

Also read: Four Reasons You Should Worry About Aadhaar’s Use of Biometrics

Indeed, the results are particularly relevant in India, whose government has been rolling out its Aadhaar unique identity programme with over 1.2 billion registrations. India is also projected to have about eight million cataract patients undergoing surgery every year by 2020, according to a report in the Indian Journal of Ophthalmology.

“The results indicate that cataract surgery affects the discriminative nature of the iris texture pattern,” the Indian and American scientists reported in a peer-reviewed paper published on July 31. “This finding raises concerns about the reliability of iris-based biometric recognition systems in the context of subjects undergoing cataract surgery.”

“The use of the iris pattern as a means of large-scale authentication,” the paper continues, “has also resulted in the need for understanding the effects of common ophthalmic disorders and medical procedures on identification performance.”

Two types of errors

Phacoemulsification seems to induce a change in the iris pattern that impedes segmentation and matching algorithms. Segmentation is the process by which the iris is demarcated from the pupil, the sclera, the eyelids and the eyelashes in a larger image of the eye. If the segmentation step fails or performs poorly, the system could misidentify the person to whom the iris belongs.
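
As a rough illustration, a segmentation step might look like the sketch below, which uses OpenCV’s Hough circle transform to locate the pupil and iris boundaries. The parameter values are illustrative guesses, not those of any commercial system:

```python
# A minimal sketch of iris segmentation: locate the pupil (inner) and iris
# (outer) boundaries as circles before any matching happens. Parameters are
# illustrative assumptions, not a commercial system's settings.
import cv2

def segment_iris(gray_image):
    """Return (pupil_circle, iris_circle) as (x, y, r), or None on failure."""
    blurred = cv2.GaussianBlur(gray_image, (7, 7), 0)
    # Inner boundary: the dark pupil is a small, high-contrast circle.
    pupils = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                              param1=100, param2=30, minRadius=15, maxRadius=60)
    # Outer boundary: the iris/sclera edge is a larger, fainter circle.
    irises = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                              param1=100, param2=30, minRadius=60, maxRadius=140)
    if pupils is None or irises is None:
        return None   # segmentation failure: the system cannot proceed
    return tuple(pupils[0][0]), tuple(irises[0][0])
```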

“We found that most of the commercial, off-the-shelf systems failed to segment the iris of an eye,” Mayank Vatsa, one of the study’s authors and the head of the Infosys Centre for Artificial Intelligence at IIITD, told The Wire.

According to Kiran Raja, an associate professor at the department of computer science at the Norwegian University of Science and Technology, the odds of a mistake depend on the nature of the incision made during surgery. With costly, high-precision surgical procedures in developed countries, the chances of error may be low – although, he clarified, this idea hasn’t been experimentally confirmed. The challenge is to verify that iris patterns are restored in a predictable way after the procedure. Raja and his colleagues have published several papers on biometric verification.

Generally speaking, every commercial sensor has quality-measurement systems built in to track the features of an iris, Raja said. If surgery has degraded the boundary between the sclera and the iris, or between the iris and the pupil, the image’s quality is flagged as low – as is the iris’s potential to identify the person to whom it belongs.

Segmentation algorithms generally rely on the elliptical shape of the pupil’s boundary. However, a new lens can reflect light differently, as can punctures in the iris and other post-surgery complications, throwing the algorithms off.
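
As a rough illustration of the kind of built-in quality gate Raja describes, the sketch below flags a captured frame when specular reflections – for instance, off a new artificial lens – saturate too much of the segmented iris region. The highlight level and cutoff are assumptions for illustration:

```python
# A minimal sketch of a capture-quality gate: reject frames where bright
# reflections would mislead segmentation or matching. The 240 highlight
# level and 5% cutoff are illustrative assumptions.
import numpy as np

def reflection_fraction(gray_iris_region, highlight_level=240):
    """Fraction of pixels in the segmented iris region that are saturated."""
    saturated = gray_iris_region >= highlight_level
    return saturated.mean()

def acceptable_quality(gray_iris_region, max_reflection=0.05):
    """Flag the frame as low quality if reflections cover too much iris."""
    return reflection_fraction(gray_iris_region) <= max_reflection
```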

When this happens, one of two types of errors is said to occur: a false match or a false non-match, according to Bowyer. A false match is when two images are said to be from the same eye when they are actually not. “This is not the kind of error that you would expect to result from cataract surgery,” Bowyer told The Wire.

A false non-match is when two images of the same eye are said to not be from the same eye. And “this is the kind of error that you would expect to be caused by comparing an image of the eye from before cataract surgery with an image of the same eye after,” Bowyer explained. “This would be an inconvenience to someone trying to use the biometric system. They would have to take another image and try again or [have] to establish their identity with a different biometric.”
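
The two error types can be made concrete with a small sketch that counts them from labelled comparison scores, where lower scores mean a closer match; the scores and threshold here are invented for illustration:

```python
# A minimal sketch of Bowyer's two error types, computed from labelled
# comparison scores. "Genuine" pairs are two images of the same eye;
# "impostor" pairs are images of different eyes. All values illustrative.
def error_rates(genuine_scores, impostor_scores, threshold):
    """Return (false_non_match_rate, false_match_rate) at a given threshold.

    Lower scores mean a closer match (e.g. a Hamming distance)."""
    # False non-match: a genuine pair scored too far apart to be accepted.
    fnm = sum(s >= threshold for s in genuine_scores) / len(genuine_scores)
    # False match: an impostor pair scored close enough to be accepted.
    fm = sum(s < threshold for s in impostor_scores) / len(impostor_scores)
    return fnm, fm

# Pre- vs post-surgery images of the same eye drift apart, pushing genuine
# scores up and the false non-match rate with them.
print(error_rates([0.20, 0.28, 0.41], [0.45, 0.48, 0.50], threshold=0.32))
# -> (0.333..., 0.0): one genuine pair rejected, no impostors accepted
```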

But if they simply took another image and re-registered with it, there would likely be no record of the first error. Iris recognition is a highly accurate biometric, and each eye is independent. “So it is like having two identifiers with you. If there is a cataract that is making biometrics difficult with one eye, it may not be difficult with the other eye.”

Also read: Aadhaar Conundrum: Three Tests for a Fool-Proof Identity

More exceptions

The new findings suggest officials will have to improve their algorithms to keep from being misled by the consequences of cataract surgery. Otherwise, “the iris texture pattern” may no longer be a “robust biometric characteristic”. Other ophthalmic complications that could cause recognition systems to fail include inflammation of the front portion of the eye and birth defects in which some parts of the eye tissue don’t form normally.

Older studies haven’t been able to agree on the precise effects of cataract surgery on iris biometric recognition. Their conclusions vary by geography and socio-economic status, so it hasn’t been possible to compare their results directly with each other. This is also why the findings of the present study are limited to the Indian context.

But as the use of iris recognition systems expands around the world, there are likely to be more conditions, apart from cataract, that could affect the technology’s accuracy.

That said, the study’s findings are in fact consistent with those of previous assessments, according to Raja. And “the current solution is to re-enrol the iris after surgery to reduce the chances of a false acceptance or a false rejection.”

Vatsa said that “the key insight from this study is to alert policymakers that, for any iris-based biometric authentication,” it might be a good idea to require people to re-register themselves after surgeries of the eye “to ensure correct authentication”.

T.V. Padma is a freelance science journalist.

Why Victorians Feared Modern Technology Would Make Everyone Blind

In the 1800s, the rise of mass print was both blamed for an increase in eye problems and was responsible for dramatising the fallibility of vision too.

From concerns over blue light to digital strain and dryness, headlines today often worry how smartphones and computer screens might be affecting the health of our eyes. But while the technology may be new, this concern certainly isn’t. Since Victorian times people have been concerned about how new innovations might damage eyesight.

In the 1800s, the rise of mass print was both blamed for an increase in eye problems and was responsible for dramatising the fallibility of vision too. As the number of known eye problems increased, the Victorians predicted that without appropriate care and attention Britain’s population would become blind. In 1884, an article in The Morning Post newspaper proposed that:

The culture of the eyes and efforts to improve the faculty of seeing must become matters of attentive consideration and practice, unless the deterioration is to continue and future generations are to grope about the world purblind.

The 19th century was the time when ophthalmology became a more prominent field of healthcare. New diagnostic technologies, such as test charts, were introduced, and spectacles became a more viable treatment for a range of vision errors. But though more sight problems were being treated effectively, this very increase created alarm, and a perceived need to curtail the trend.

In 1889, the Illustrated London News questioned:

To what are we coming? … Now we are informed by men of science that the eyes used so effectively by our forefathers will not suffice for us, and that there is a prospect of England becoming purblind.

The article continued, considering potential causes for this acceleration, and concluded that it could be partly explained by evolution and inheritance.

The 19th century was the time when ophthalmology became a more prominent field of healthcare. Credit: Pixabay

Urban myopia

Other commentators looked to “modern life” for an explanation, and attributed the so-called “deterioration of vision” to the built environment, the rise of print, compulsory education, and a range of new innovations such as steam power. In 1892, an article, published in The Nineteenth Century: A Monthly Review, reflected that the changing space of Victorian towns and lighting conditions were an “inestimable benefit” that needed to be set against a “decidedly lower sight average”. Similarly, a number of other newspapers reported on this phenomenon, headlining it as “urban myopia”.

Also Read: The Victorians Had the Same Concerns About Technology as We Do

In 1898, a feature published in The Scottish Review – ironically entitled “The Vaunts of Modern Progress” – proposed that defective eyesight was “exclusively the consequence of the present conditions of civilised life”. It highlighted that many advances being discussed in the context of “progress” – including material prosperity, expansion of industry and the rise of commerce – had a detrimental effect on the body’s nervous system and visual health.

Another concern of the time – sedentariness – was also linked to the rise in eye problems. Better transport links and new leisure activities that required people to be seated meant they had more time to read. Work changed as well: lower-class jobs moved away from manual labour, and the written word was thought to have superseded the spoken one. While we now focus on “screen time”, newspapers and periodicals emphasised the negative effects of a “reading age” (the spread of the book and popular print).

Education to blame

In a similar manner to today, schools were blamed for the problem too. Reading materials, lighting conditions, desk space, and the advent of compulsory education were all linked to the rise in diagnosed conditions.

English ophthalmologist Robert Brudenell Carter, in his government-led study, Eyesight in Schools, reached the balanced conclusion that while schooling conditions may be a problem, more statistics were required to fully assess the situation. Though Carter did not wish to “play the part of an alarmist”, a number of periodicals dramatised their coverage with phrases such as “The Evils of Our School System”.

The problem with all of these new environmental conditions was that they were considered “artificial”. To emphasise this point, medical men frequently compared their findings of poor eye health against the superior vision of “savages” and the effect of captivity on the vision of animals.

While we now focus on “screen time”, newspapers and periodicals emphasised the negative effects of a “reading age”. Credit: Pixabay

This, in turn, gave a more negative interpretation of the relationship between civilisation and “progress”, and conclusions were used to support the idea that deteriorating vision was an accompaniment of the urban environment and modern leisure pursuits – specific characteristics of the Western world.

Also Read: Alice in the Asylum: Wonderland and the Real Mad Tea Parties of the Victorians

And yet the Victorians were undeterred, continuing with the very modern progress they blamed for eyesight problems. At the same time, new protective eyewear was developed to shield the eye from dust and flying particles, as well as from the bright lights at seaside resorts and artificial lighting in the home.

Despite their fears, the country did not become “purblind”. Neither is Britain now an “island full of round-backed, blear-eyed bookworms” as predicted. While stories reported today tend to rely on more rigorous research when it comes to screen time and eye health, it just goes to show that “modernity” has long been a cause for concern.

Gemma Almond is a PhD researcher at Swansea University.

This article is republished from The Conversation under a Creative Commons license. Read the original article.