The effect of luminance contrast between sign and surrounding object on gaze behavior: A study in virtual metro station environment with rendered static/dynamic panorama

Xiaoqun Ai, Yufei Liu, Zhendong Wu, Jingchun Chai, Xinping Ju, Wenxiang Duan, Lintao Zhao, Ying Liang

Abstract


This paper investigated the effect of luminance contrast between a sign and its surrounding objects on pedestrians' gaze behavior. Static and dynamic 360° panoramas rendered in a virtual environment were used to simulate pedestrian wayfinding in metro stations. Fifty-five participants (31 in the static condition, 24 in the dynamic condition) observed sign posters and advertisement boards with distinct luminance contrasts and were asked to point out graphic or textual changes in the area after 30 s. An eye tracker recorded ocular data, and glare perception was assessed by questionnaire. The results of t-tests and regression analysis revealed that luminance contrast is a saliency feature that distinguishes visual targets from surrounding objects. Luminance contrast was negatively correlated with both the number of fixations and the fixation durations on the sign: each one-unit increase in luminance contrast reduced the mean number of fixations by 0.826 and was accompanied by a stronger sensation of glare, indicating a strategic adaptation of visual attention. The study contributes a new perspective on lighting design for public transit spaces and confirms that lighting simulation in an immersive virtual environment can effectively support the analysis of visual perception.

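To illustrate the type of analysis described above, the following minimal sketch (not the authors' code) shows how an independent-samples t-test and a simple linear regression of fixation counts on luminance contrast could be fitted in Python; the data file and column names are hypothetical assumptions, not taken from the study.

# Minimal sketch, assuming a hypothetical CSV with per-trial columns
# 'luminance_contrast', 'fixations' (fixation count on the sign) and
# 'group' ('static' or 'dynamic' panorama condition).
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("gaze_data.csv")  # hypothetical file name

# Independent-samples t-test: fixation counts under static vs. dynamic panoramas
static = df.loc[df["group"] == "static", "fixations"]
dynamic = df.loc[df["group"] == "dynamic", "fixations"]
t_stat, p_value = stats.ttest_ind(static, dynamic, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

# Simple linear regression of fixations on luminance contrast; a slope of
# about -0.826 would correspond to the effect size reported in the abstract.
model = smf.ols("fixations ~ luminance_contrast", data=df).fit()
print(model.params["luminance_contrast"], model.pvalues["luminance_contrast"])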

DOI: https://doi.org/10.31449/inf.v45i7.3660

This work is licensed under a Creative Commons Attribution 3.0 License.