Temporal limits of visual segmentation based on temporal asynchrony in luminance, color, motion direction, and their mixtures

Poster Presentation 53.310: Tuesday, May 23, 2023, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Perceptual Organization: Segmentation, grouping, similarity


Yen-Ju Chen1, Shin’ya Nishida1,2; 1Graduate School of Informatics, Kyoto University, Japan, 2Human Information Science Laboratory, NTT Communication Science Laboratories, Nippon Telegraph and Telephone Corporation, Japan

It is known that synchronous/asynchronous changes in visual attribute values at different spatial locations are effective cues for perceptual grouping/segmentation. To test whether changes in any visual attribute produce similar grouping/segmentation effects, and how this phenomenon relates to the perceptual asynchrony between different attributes, we measured segregation performance under single-attribute and dual-attribute conditions for three visual attributes: luminance, color, and motion direction. Our stimulus consisted of an array of 16×16 elements divided into four quadrants. Each element was a Gaussian blob for luminance or color changes, and a Gabor patch for direction changes. The temporal patterns of stimulus change, A and B, were repetitive alternations at a given temporal frequency (1–10 Hz) with a 180-deg phase shift between A and B. One of the four quadrants (the target) followed temporal pattern A, and the others followed B. The observer’s task was to detect the target quadrant (4AFC). In the single-attribute condition, all elements changed in the same attribute. In the dual-attribute condition, elements in two diagonally opposite quadrants changed in one attribute, while the rest changed in another. The results of the single-attribute conditions showed that the temporal frequency limit for luminance was 8 Hz or higher, while that for color was only slightly lower. When luminance and color were paired in the dual-attribute condition, the temporal limit was reduced (4–5 Hz) relative to the single-attribute conditions. In contrast, segmentation based on asynchronous motion direction changes was very difficult both in the single-attribute condition and in dual-attribute conditions where direction was paired with luminance or color. Our results indicate that the performance of asynchrony-based segregation depends strongly on which visual attributes convey the temporal pattern information.
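The stimulus logic described above can be sketched in a few lines. This is a minimal illustrative reconstruction, not the authors' actual stimulus code: the display rate, value coding (±1 for the two attribute states), and function names are assumptions for demonstration only.

```python
import numpy as np

def antiphase_patterns(freq_hz, duration_s=1.0, fs=240):
    """Return two square-wave temporal patterns A and B that alternate
    between two states (+1/-1) at freq_hz, with B phase-shifted 180 deg
    relative to A. fs is an assumed display frame rate."""
    t = np.arange(int(duration_s * fs)) / fs
    # Square-wave alternation between the two attribute states
    a = np.where(np.sin(2 * np.pi * freq_hz * t) >= 0, 1.0, -1.0)
    b = -a  # 180-deg phase shift: B is always in the opposite state
    return a, b

def quadrant_map(target, n=16):
    """Label each element of an n x n array with pattern 'A' (target
    quadrant) or 'B' (the other three quadrants). target in {0,1,2,3}
    indexes quadrants row-major: 0=top-left ... 3=bottom-right."""
    labels = np.full((n, n), 'B')
    half = n // 2
    r0 = (target // 2) * half
    c0 = (target % 2) * half
    labels[r0:r0 + half, c0:c0 + half] = 'A'
    return labels

# Example: a 4 Hz alternation and a target in the top-right quadrant
a, b = antiphase_patterns(freq_hz=4)
labels = quadrant_map(target=1)
```

On each frame, elements labeled 'A' would take the state given by `a` and the rest the state given by `b`; in the dual-attribute condition, the two diagonal quadrant pairs would map these states onto different attributes (e.g., luminance vs. color).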

Acknowledgements: This study was supported by MEXT/JSPS KAKENHI (Japan) 20H00603, 20H05605