Algorithms Could Adjust Screens to Your Vision

Researchers are developing vision-correcting displays for computer monitors that would let people see text and images clearly without their glasses or contact lenses.

The technology could potentially help hundreds of millions of people who currently need corrective lenses to use their smartphones, tablets, and computers.

One common problem, for example, is presbyopia, a type of farsightedness in which the ability to focus on nearby objects is gradually diminished as the aging eyes’ lenses lose elasticity.


The blurred image on the left shows how a farsighted person would see a computer screen without corrective lenses. In the middle is how that same person would perceive the picture using a display that compensates for visual impairments. The picture on the right is a computer simulation of the best picture quality possible using the new prototype display. The images were taken by a DSLR camera set to simulate hyperopic vision. (Houang Stephane/Flickr, modified by Fu-Chung Huang/UC Berkeley)

More importantly, the displays could one day aid people with more complex visual problems, known as higher-order aberrations, which cannot be corrected by eyeglasses, says Brian Barsky, professor of computer science and vision science and affiliate professor of optometry at the University of California, Berkeley.

“We now live in a world where displays are ubiquitous, and being able to interact with displays is taken for granted,” says Barsky, who is leading this project.

“People with higher order aberrations often have irregularities in the shape of the cornea, and this irregular shape makes it very difficult to have a contact lens that will fit. In some cases, this can be a barrier to holding certain jobs because many workers need to look at a screen as part of their work.”

Computation, Not Optics

The UC Berkeley researchers teamed up with Gordon Wetzstein and Ramesh Raskar, colleagues at the Massachusetts Institute of Technology, to develop their latest prototype of a vision-correcting display.

The setup enhances image sharpness by adding a printed pinhole screen, sandwiched between two layers of clear plastic, to an iPod display. Each tiny pinhole is 75 micrometers across, and the pinholes are spaced 390 micrometers apart.
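
To give a rough sense of why that mask matters, here is a minimal back-of-the-envelope sketch in Python. The 75-micrometer pinhole size and 390-micrometer spacing come from the article; the panel's pixel pitch (assumed to be roughly 78 micrometers, as on a 326 ppi iPod-class screen) and the gap between the mask and the panel are illustrative assumptions, not figures from the paper.

```python
import math

# Figures from the article
PINHOLE_DIAMETER_UM = 75      # diameter of each pinhole
PINHOLE_PITCH_UM = 390        # center-to-center spacing of the pinholes

# Illustrative assumptions (not stated in the article)
PIXEL_PITCH_UM = 25400 / 326  # ~78 um, assuming a 326 ppi iPod-class panel
GAP_MM = 5.0                  # assumed offset between pinhole mask and panel

# Several display pixels sit behind each pinhole, and each one is emitted
# toward the viewer in a slightly different direction -- this is what lets
# the display control light per direction, not just per pixel.
pixels_per_pinhole = PINHOLE_PITCH_UM / PIXEL_PITCH_UM

# Half-angle of the cone of directions one pinhole can address,
# given the assumed mask-to-panel gap.
half_angle_deg = math.degrees(math.atan((PINHOLE_PITCH_UM / 2 / 1000) / GAP_MM))

print(f"~{pixels_per_pinhole:.1f} pixels sit behind each pinhole")
print(f"each pinhole addresses roughly a ±{half_angle_deg:.1f}° cone of directions")
```

With these assumed numbers, about five pixels sit behind each pinhole, which is what gives the display directional control over the light it emits.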

The research team presented this computational light field display on August 12 at the International Conference and Exhibition on Computer Graphics and Interactive Techniques, or SIGGRAPH, in Vancouver, Canada. A paper on their findings is available in ACM Transactions on Graphics.

“The significance of this project is that, instead of relying on optics to correct your vision, we use computation,” says lead author Fu-Chung Huang, who worked on this project as part of his computer science PhD dissertation at UC Berkeley under the supervision of Barsky and Austin Roorda, professor of vision science and optometry.

“This is a very different class of correction, and it is non-intrusive.”

Pixels Through Pinholes

The algorithm, developed at UC Berkeley, adjusts the intensity of the light each pixel emits in each direction according to the user’s specific visual impairment. The image is precorrected in a process called deconvolution so that, once the light passes through the pinhole array, the user perceives a sharp image.

“Our technique distorts the image such that, when the intended user looks at the screen, the image will appear sharp to that particular viewer,” says Barsky. “But if someone else were to look at the image, it would look bad.”
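
As a rough illustration of that idea, the sketch below precompensates an image with a regularized inverse (Wiener) filter against an assumed point spread function for the viewer's eye, so that the eye's own blur largely undoes the distortion. It is a simplified, single-image stand-in for the team's light-field prefiltering, not their method: the Gaussian blur kernel, image sizes, and regularization constant are placeholders.

```python
import numpy as np

def gaussian_psf(size=31, sigma=4.0):
    """Toy point spread function standing in for a viewer's refractive blur."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def pad_psf(psf, shape):
    """Embed the PSF in an image-sized array with its center at the FFT origin."""
    padded = np.zeros(shape)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    return np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

def wiener_precompensate(image, psf, k=0.01):
    """Pre-distort a grayscale image (values in [0, 1]) so that the eye's blur,
    modeled by `psf`, brings it back toward the original."""
    H = np.fft.fft2(pad_psf(psf, image.shape))
    W = np.conj(H) / (np.abs(H) ** 2 + k)   # regularized (Wiener) inverse
    pre = np.real(np.fft.ifft2(np.fft.fft2(image) * W))
    # Clipping back into the displayable range is where naive prefiltering loses contrast.
    return np.clip(pre, 0.0, 1.0)

if __name__ == "__main__":
    # A checkerboard stands in for on-screen content.
    target = np.kron((np.indices((16, 16)).sum(axis=0) % 2).astype(float), np.ones((16, 16)))
    psf = gaussian_psf()
    predistorted = wiener_precompensate(target, psf)
    # Simulate the viewer (or the camera in the experiment below) re-blurring
    # the precompensated image with the same PSF.
    perceived = np.real(np.fft.ifft2(np.fft.fft2(predistorted) * np.fft.fft2(pad_psf(psf, target.shape))))
    print("residual error:", float(np.mean((perceived - target) ** 2)))
```

The clipping step in this toy version is the reason deconvolution alone tends to wash out the picture, which is the contrast problem the pinhole light-field optics are designed to address.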

In the experiment, the researchers displayed images that appeared blurred to a camera, which was set to simulate a person who is farsighted. When using the new prototype display, the blurred images appeared sharp through the camera lens.

This latest approach improves upon earlier versions of vision-correcting displays that resulted in low-contrast images. The new display combines light field display optics with novel algorithms.

Huang, now a software engineer at Microsoft Corp. in Seattle, notes that the research prototype could easily be developed into a thin screen protector, and that continued improvements in eye-tracking technology would make it easier for the displays to adapt to the position of the user’s head.

“In the future, we also hope to extend this application to multi-way correction on a shared display, so users with different visual problems can view the same screen and see a sharp image,” says Huang.

The National Science Foundation helped support this work.

Source: UC Berkeley. Republished from Futurity.org under Creative Commons License 3.0.