
Edges and gradients in lightness illusions: Role of optical veiling glare / J.J. McCann, V. Vonikakis, A. Rizzi. - In: FRONTIERS IN PSYCHOLOGY. - ISSN 1664-1078. - 13:(2022), pp. 958787.1-958787.23. [10.3389/fpsyg.2022.958787]

Edges and gradients in lightness illusions: Role of optical veiling glare

A. Rizzi (last author)
2022

Abstract

Lightness Illusions (Contrast, Assimilation, and Natural Scenes with Edges and Gradients) show that appearances do not correlate with the light sent from the scene to the eye. Lightness Illusions begin with a control experiment that includes two identical Gray Regions-Of-Interest (GrayROIs) that have equal appearances in uniform surrounds. The Illusion experiment modifies “the-rest-of-the-scene” to make these GrayROIs appear different from each other. Our visual system performs complex spatial transformations of scene-luminance patterns using two independent spatial mechanisms: optical and neural. First, optical veiling glare transforms scene luminances into a different pattern of light on the receptors, called retinal contrasts. This article provides a new Python program that calculates retinal contrast. Equal scene luminances become unequal retinal contrasts. Uniform scene segments become nonuniform retinal gradients; darker regions acquire substantial scattered light; and the retinal range of light changes. The glare on each receptor is the sum of the individual contributions from every other scene segment. Glare responds to the content of the entire scene: it is a scene-dependent optical transformation. Lightness Illusions are intended to demonstrate how our “brain sees” using simple, uniform patterns. However, the after-glare pattern of light on the receptors is a morass of high- and low-slope gradients. Quantitative measurements and pseudocolor renderings are needed to appreciate the magnitude and spatial patterns of glare; glare’s gradients are invisible on direct inspection. Illusions are generated by neural responses to “the-rest-of-the-scene.” The input to the neural network is the simultaneous array of all receptors’ responses. Neural processing performs vision’s second scene-dependent spatial transformation, and it generates appearances in both Illusions and Natural Scenes. “Glare’s Paradox” is that glare adds more redistributed light to GrayROIs that appear darker, and less light to those that appear lighter. This article describes nine experiments in which neural spatial-image processing overcompensates for the effects of glare. The article studies the first step in imaging: scene-dependent glare. Despite its near invisibility, glare modifies all quantitative measurements of images. The article reveals glare’s modification of the input data used in quantitative image analysis, models of vision, and visual image-quality metrics. Glare redefines the challenges in modeling Lightness Illusions. Neural spatial processing is more powerful than we realized.
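The abstract describes a Python program that computes retinal contrast, where the glare on each receptor is the sum of scattered-light contributions from every other scene segment. The sketch below is a minimal illustration of that idea, not the authors' published program: it assumes a radially symmetric glare-spread kernel with a Stiles-Holladay-style 1/θ² falloff and an arbitrary scattered-light fraction, convolves it with a scene-luminance array, and renders the result in pseudocolor; the function names and parameter values are hypothetical.

```python
# Minimal sketch of scene-dependent glare (assumptions, not the published code):
# every pixel receives a fraction of light redistributed from every other pixel,
# weighted by a 1/theta^2 falloff with eccentricity.
import numpy as np
from scipy.signal import fftconvolve
import matplotlib.pyplot as plt

def glare_kernel(size_px, px_per_degree, fraction_scattered=0.1):
    """Radially symmetric kernel: a central peak for direct light plus a
    1/theta^2 skirt carrying `fraction_scattered` of the energy (assumed value)."""
    half = size_px // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    theta = np.hypot(x, y) / px_per_degree            # eccentricity in degrees
    skirt = np.zeros_like(theta, dtype=float)
    mask = theta > 0
    skirt[mask] = 1.0 / theta[mask] ** 2              # Stiles-Holladay-style falloff
    skirt *= fraction_scattered / skirt.sum()         # normalize scattered energy
    kernel = skirt
    kernel[half, half] += 1.0 - fraction_scattered    # direct (unscattered) light
    return kernel

def retinal_contrast(scene_luminance, px_per_degree=30, kernel_px=121):
    """Approximate retinal image: each receptor's glare is summed from all other
    scene segments, so equal scene luminances become unequal retinal values."""
    k = glare_kernel(kernel_px, px_per_degree)
    return fftconvolve(scene_luminance, k, mode="same")

# Example: a bright square in a uniform field adds glare everywhere; a
# pseudocolor rendering on a log scale makes the low-slope gradients visible.
scene = np.ones((256, 256)) * 10.0
scene[96:160, 96:160] = 1000.0
plt.imshow(np.log10(retinal_contrast(scene)), cmap="jet")
plt.colorbar(label="log10 retinal contrast (arbitrary units)")
plt.show()
```

The published program presumably uses a physiologically measured glare-spread function rather than this simplified kernel, but the structure is the same: a scene-dependent convolution that redistributes light across the whole image before any neural processing begins.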
glare’s paradox; HDR and LDR scenes; lightness illusions; neural spatial processing; python code-retinal contrast; retinal glare; scene content; visibility of glare
Settore INF/01 - Informatica (Computer Science)
Settore ING-INF/05 - Sistemi di Elaborazione delle Informazioni (Information Processing Systems)
2022
Article (author)
Files in this record:
fpsyg-13-958787.pdf (Publisher's version/PDF; open access; Adobe PDF; 2.94 MB)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/949274
Citations
  • PMC: 0
  • Scopus: 3
  • Web of Science (ISI): 0