Competition for Attention Among Spatial Modulations of Brightness Gradients

Keywords

brightness gradients
spatial modulation
visual filters
contrast
orientation
spatial frequency
envelope selection
eye movements
attention control
simulation

Abstract

Introduction. The pre-attentive stage of visual processing includes a stage of selection and coding of brightness gradients and a stage of spatial integration of this information. The first operation is performed by first-order visual filters (neurons of the striate cortex), the second by second-order visual filters (neurons of the extrastriate cortex). Second-order filters transmit image areas with spatial modulations of contrast, orientation, and spatial frequency (modulation dimensions) and can function as attention gates. This study aims to establish priorities among these modulations in their competition for attention.
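The two-stage scheme described above is commonly modelled as a filter-rectify-filter (FRF) cascade: a first-order linear filter, a pointwise nonlinearity, and a coarser second-stage pooling filter. A minimal sketch of such a cascade follows; the Gabor carrier, the squaring nonlinearity, and all parameter values are illustrative assumptions and do not reproduce the authors' model (described in the Appendix).

import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def gabor_kernel(size, freq, theta, sigma):
    """First-order filter: an odd-phase Gabor patch."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.sin(2 * np.pi * freq * xr)

def second_order_response(image, carrier_freq, carrier_theta, env_sigma=16.0):
    """FRF cascade: linear filtering, rectification, coarse-scale pooling.
    The result estimates the local modulation envelope, i.e. the signal a
    second-order filter is assumed to transmit."""
    # 1. First stage: convolve with a carrier-tuned Gabor filter.
    kernel = gabor_kernel(21, carrier_freq, carrier_theta, 4.0)
    linear = fftconvolve(image, kernel, mode="same")
    # 2. Rectification: a pointwise nonlinearity (here, squaring).
    rectified = linear ** 2
    # 3. Second stage: pooling at a much coarser scale extracts the
    #    contrast/orientation/spatial-frequency modulation envelope.
    return gaussian_filter(rectified, sigma=env_sigma)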

Methods. Each stimulus in the study consisted of three images of the same object. These images were constructed from areas of the original object and contained modulations of (a) contrast (first image), (b) orientation (second image), and (c) spatial frequency (third image). The areas had been isolated from the object beforehand using the model of second-order filters developed by the authors. In the first experiment, images composed of modulations of different dimensions competed for participants’ attention. In the second experiment, the authors used images of the same dimension that differed in spatial frequency. The focus of attention was determined by recording eye movements.
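A common way to score the focus of attention from eye-movement records is to assign each fixation to whichever competing image it lands in and to accumulate fixation counts and dwell time per image. The sketch below illustrates this kind of scoring; the rectangular regions and the (x, y, duration) fixation format are assumptions for illustration, not the authors' actual recording or analysis setup.

from dataclasses import dataclass

@dataclass
class Region:
    name: str   # e.g. "contrast", "orientation", "spatial_frequency"
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def score_attention(fixations, regions):
    """Count fixations and total dwell time per image region.
    `fixations` is an iterable of (x, y, duration_ms) tuples."""
    counts = {r.name: 0 for r in regions}
    dwell = {r.name: 0.0 for r in regions}
    for x, y, duration in fixations:
        for region in regions:
            if region.contains(x, y):
                counts[region.name] += 1
                dwell[region.name] += duration
                break
    return counts, dwell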

Results. The findings suggest that, in the competition among dimensions, modulations of contrast and orientation had an advantage in attracting attention. When the choice was made between images of the same dimension, images formed from middle-range spatial frequencies had the advantage in the competition for attention.

Discussion. The present study is the first to address priorities among modulations in their competition for attention. Higher average output signal rates in the second-order filters that form a test image increase the probability that this image will attract participants’ attention.
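The predictor discussed here, the average output of the second-order filters that form an image, can be illustrated by pooling the responses of a small filter bank over that image. The sketch below reuses the second_order_response function from the Introduction sketch; the carrier frequencies and orientations are arbitrary assumptions.

import numpy as np

def mean_filter_output(image, carrier_freqs, carrier_thetas):
    """Average second-order responses over a bank of carrier filters."""
    responses = [second_order_response(image, f, theta)
                 for f in carrier_freqs
                 for theta in carrier_thetas]
    return float(np.mean(responses))

# The image with the larger mean output is predicted to win the competition
# for attention, e.g.:
#   freqs = (0.1, 0.2, 0.4)
#   thetas = (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)
#   winner = max(("contrast", "orientation", "frequency"),
#                key=lambda name: mean_filter_output(images[name], freqs, thetas))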

Conclusion. Second-order filters can play the role of attention gates. In addition, filters with higher output signal rates have an advantage in the competition for attention during visual search. The Appendix describes the models of second-order visual mechanisms that the authors used to prepare the study stimuli.

https://doi.org/10.21702/rpj.2018.3.8
