Color constancy mechanisms in virtual reality environments

Authors: Rodríguez, R.G., Hedjar, L., Toscani, M., Guarnera, D., Guarnera, G.C. and Gegenfurtner, K.R.

Journal: Journal of Vision

Volume: 24

Issue: 5

Pages: 1-28

eISSN: 1534-7362

DOI: 10.1167/jov.24.5.6

Abstract:

Prior research has demonstrated high levels of color constancy in real-world scenarios featuring single light sources, extensive fields of view, and prolonged adaptation periods. However, exploring the specific cues humans rely on becomes challenging, if not unfeasible, with actual objects and lighting conditions. To circumvent these obstacles, we employed virtual reality technology to craft immersive, realistic settings that can be manipulated in real time. We designed forest and office scenes illuminated by five colors. Participants selected a test object most resembling a previously shown achromatic reference. To study color constancy mechanisms, we modified scenes to neutralize three contributors: local surround (placing a uniform-colored leaf under test objects), maximum flux (keeping the brightest object constant), and spatial mean (maintaining a neutral average light reflectance), employing two methods for the latter: changing object reflectances or introducing new elements. We found that color constancy was high in conditions with all cues present, aligning with past research. However, removing individual cues led to varied impacts on constancy. Local surrounds significantly reduced performance, especially under green illumination, showing strong interaction between greenish light and rose-colored contexts. In contrast, the maximum flux mechanism barely affected performance, challenging assumptions used in white balancing algorithms. The spatial mean experiment showed disparate effects: Adding objects slightly impacted performance, while changing reflectances nearly eliminated constancy, suggesting human color constancy relies more on scene interpretation than pixel-based calculations.
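The "maximum flux" and "spatial mean" cues examined in the abstract correspond to the two classic white-balancing heuristics in computational color constancy: white-patch (scale by the brightest value per channel) and gray-world (scale so the average reflectance is achromatic). The sketch below is illustrative only and is not code from the paper; the function names `white_patch` and `gray_world` and the toy scene are assumptions for demonstration.

```python
import numpy as np

def white_patch(img):
    # White-patch (maximum-flux) heuristic: assume the brightest value
    # in each channel reflects the illuminant, and rescale it to white.
    scale = img.reshape(-1, 3).max(axis=0)
    return np.clip(img / scale, 0.0, 1.0)

def gray_world(img):
    # Gray-world (spatial-mean) heuristic: assume the average scene
    # reflectance is achromatic, and equalize the channel means.
    means = img.reshape(-1, 3).mean(axis=0)
    return np.clip(img * (means.mean() / means), 0.0, 1.0)

# Toy scene: mid-gray surfaces plus one bright white patch,
# viewed under a greenish illuminant.
rng = np.random.default_rng(0)
scene = rng.uniform(0.2, 0.8, size=(16, 16, 3))
scene[0, 0] = [1.0, 1.0, 1.0]             # brightest object in the scene
cast = scene * np.array([0.7, 1.0, 0.7])  # greenish illuminant cast
balanced = white_patch(cast)              # recovers the original scene here
```

Under this toy illuminant, white-patch recovers the scene exactly because the brightest object is a true white; the paper's finding that the maximum-flux cue barely affects human performance is what challenges this assumption.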

https://eprints.bournemouth.ac.uk/39853/

Source: Scopus

Color constancy mechanisms in virtual reality environments.

Authors: Gil Rodríguez, R., Hedjar, L., Toscani, M., Guarnera, D., Guarnera, G.C. and Gegenfurtner, K.R.

Journal: J Vis

Volume: 24

Issue: 5

Pages: 6

eISSN: 1534-7362

DOI: 10.1167/jov.24.5.6


Source: PubMed

Color constancy mechanisms in virtual reality environments

Authors: Rodríguez, R.G., Hedjar, L., Toscani, M., Guarnera, D., Guarnera, G.C. and Gegenfurtner, K.R.

Journal: Journal of Vision

Volume: 24

Issue: 5

ISSN: 1534-7362

DOI: 10.1167/jov.24.5.6


Source: Web of Science (Lite)

Color constancy mechanisms in virtual reality environments.

Authors: Gil Rodríguez, R., Hedjar, L., Toscani, M., Guarnera, D., Guarnera, G.C. and Gegenfurtner, K.R.

Journal: Journal of Vision

Volume: 24

Issue: 5

Pages: 6

eISSN: 1534-7362

ISSN: 1534-7362

DOI: 10.1167/jov.24.5.6


Source: Europe PubMed Central

Color constancy mechanisms in virtual reality environments.

Authors: Gil Rodríguez, R., Hedjar, L., Toscani, M., Guarnera, D., Guarnera, G.C. and Gegenfurtner, K.R.

Journal: Journal of Vision

Volume: 24

Issue: 5

ISSN: 1534-7362


Source: BURO EPrints