Poster Sessions

Tuesday Afternoon Posters, Pavilion

Poster Session: Tuesday, May 21, 2024, 2:45 – 6:45 pm, Pavilion

Abstract# | Poster Title | First Author | Session
56.415 | Enhancing eye movement control in Virtual Reality: Ocular Biofeedback Training and its potential implications for Attention Deficits | Ben Joseph, Tchiya | Eye Movements: Clinical
56.444 | Responses in human early visual cortex are more sensitive to task difficulty than object category | Kim, June Hee | Object Recognition: Structure of categories
56.452 | Spatial scrambling in human vision: investigating efficiency for discriminating scrambled letters using convolutional neural networks and confusion matrices | Zhu, Xingqi Raffles | Spatial Vision: Machine learning, neural networks
56.435 | Influence of background on the spatial-frequency channel for object recognition | Subramanian, Ajay | Object Recognition: Basic features
56.426 | No correlation between interocular delay and stereosensitivity in healthy adults | Reynaud, Alexandre | Binocular Vision: Disparity, stereopsis and suppression
56.408 | Age-related preservation of proprioception- and vision-guided virtual hand movements | Reynoso, Jose | Action: Clinical, neural
56.401 | Walking impacts vision: Spatial and temporal frequency dependent changes to vision during locomotion | Blencowe, Marlon | Action: Locomotor, flow, steering
56.409 | Development of gross and fine visuo-motor ability: Insights from late-sighted children | Ben-Ami, Shlomit | Action: Clinical, neural
56.416 | How contextual information modulates eye movements during natural sequential behavior in a dynamic scene | Pomè, Antonella | Eye Movements: Clinical
56.427 | Binocular combination under asynchronous viewing conditions | Gurman, Daniel | Binocular Vision: Disparity, stereopsis and suppression
56.436 | Improvement of acuity following motion adaptation: the role of low spatial frequencies | Dakin, Steven | Object Recognition: Basic features
56.453 | Towards determining the location of the Preferred Retinal Locus of patients with macular disease: A deep learning-based simulation | Soans, Rijul Saurabh | Spatial Vision: Machine learning, neural networks
56.402 | Walking while performing visual, auditory and crossmodal tasks produces oscillations entrained to the gait cycle | Alais, David | Action: Locomotor, flow, steering
56.445 | Investigating the impact of Gaussian noise on face recognition performance for humans and convolutional neural networks | Jeon, Ikhwan | Object Recognition: Structure of categories
56.437 | More predictive, easier to detect? Contrast sensitivities in different predictability contexts | Song, Seyoon | Object Recognition: Basic features
56.410 | How much vision impairment does it take to decrease performance in freestyle swimming? | Mann, David | Action: Clinical, neural
56.417 | Decoding reading challenges: Eye movement patterns in Italian-speaking poor readers | Pasqualotto, Angela | Eye Movements: Clinical
56.454 | Multitask Machine Learning of Contrast Sensitivity Functions | Barbour, Dennis | Spatial Vision: Machine learning, neural networks
56.428 | Temporal latencies and position uncertainty in stereoscopic and luminance motion using a continuous eye-tracking task | Serrano-Pedraza, Ignacio | Binocular Vision: Disparity, stereopsis and suppression
56.403 | Spatio-temporal collision envelope in virtual reality walking with colliding pedestrians | Doyon, Jonathan K. | Action: Locomotor, flow, steering
56.446 | Using texture synthesis to identify the features supporting coarse and fine object categorization | Henderson, Margaret | Object Recognition: Structure of categories
56.418 | Microsaccades and Ocular Drift in Ophthalmic and Neurologic Disease | Abozid, Ola | Eye Movements: Clinical
56.429 | Global internal disparity noise increases with rising levels of disparity pedestal | Ding, Jian | Binocular Vision: Disparity, stereopsis and suppression
56.447 | Common representational format in object-selective visual cortex for photographs and dynamic sketches | Saleki, Sharif | Object Recognition: Structure of categories
56.411 | Functional connectivity of attention network related to individual differences in visual and proprioceptive weighting | Folco, Kess | Action: Clinical, neural
56.438 | Which shape features determine detectability of camouflaged radial frequency patterns? | Lew, Wei Hau | Object Recognition: Basic features
56.404 | Can covert and explicit “leaders” steer and split real human crowds? | Yoshida, Kei | Action: Locomotor, flow, steering
56.455 | GramStatTexNet: Using the Gram Matrix of Multi-Scale Pyramids to Contrastively Learn Texture Model Statistics | DuTell, Vasha | Spatial Vision: Machine learning, neural networks
56.412 | Neurophysiological cross-task similarities between metacontrol states | Wang, Xi | Action: Clinical, neural
56.448 | DreamOn: Enhancing Deep Learning in Medical Imaging with REM-Dream-Inspired Data Augmentation | Lerch, Luc | Object Recognition: Structure of categories
56.419 | Saccade Profiles Across Tasks After Childhood Hemispherectomy | Chroneos, Maria Z. | Eye Movements: Clinical
56.456 | Recurrence is needed to account for the sharper orientation-tuned surround suppression for oblique versus cardinal orientations | Miao, Huiyuan | Spatial Vision: Machine learning, neural networks
56.439 | Features for visual object recognition | Arguin, Martin | Object Recognition: Basic features
56.405 | How many moving obstacles do we respond to at once? A temporal threshold model best accounts for collision avoidance in a crowd | Veprek, Kyra | Action: Locomotor, flow, steering
56.430 | The mechanisms of crossed and uncrossed disparities in coarse stereopsis | Wang, Penghan | Binocular Vision: Disparity, stereopsis and suppression
56.420 | Differences in smooth pursuit characteristics in different types of strabismus | Sanz, Elena | Eye Movements: Clinical
56.449 | Pattern of associations across categories in visual recognition | Soen, Laura | Object Recognition: Structure of categories
56.440 | Object-substitution masking: The role of low-level chromatic similarity | Lange, Ryan | Object Recognition: Basic features
56.413 | Order-Dependent Functional Brain Connectivity in a Cue-Separation Grasp Task | Luabeya, Gaelle | Action: Clinical, neural
56.406 | Control theoretical models for visuomotor control explain brain activity during naturalistic driving | Zhang, Tianjiao | Action: Locomotor, flow, steering
56.457 | Mapping models of V1 and V2 selectivity with local spectral reverse correlation | Oleskiw, Timothy D. | Spatial Vision: Machine learning, neural networks
56.431 | Unraveling the impact of stereoscopic vision on daily tasks in younger and older adults | Chopin, Adrien | Binocular Vision: Disparity, stereopsis and suppression
56.421 | Fine spatial vision is optimally adapted to the abnormal fixational eye movements of people with amblyopia | Chung, Susana T. L. | Eye Movements: Clinical
56.450 | Exploring mental representation of visual emoji symbols through human similarity judgments | Yun, Yiling | Object Recognition: Structure of categories
56.458 | Possible Optimal Strategies for Orientation Coding in Macaque V1 Revealed with a Self-Attention Deep Neural Network (SA-DNN) Model | Wang, Xin | Spatial Vision: Machine learning, neural networks
56.414 | Visual cortex encodes spatially specific reward information during closed-loop naturalistic interaction | Kim, Royoung | Action: Clinical, neural
56.407 | Anticipatory control of steering through multiple waypoints | Jansen, AJ | Action: Locomotor, flow, steering
56.432 | Partitioning the effects of distinct natural-scene properties on visual performance | White, David | Binocular Vision: Disparity, stereopsis and suppression
56.441 | Composite Object Representation Makes Tracking Through Rotation Deficient | Wu, Qihan | Object Recognition: Basic features
56.451 | Time-resolved brain activation patterns reveal hierarchical representations of scene grammar when viewing isolated objects | Kallmayer, Aylin | Object Recognition: Structure of categories
56.422 | Reduced visual acuity due to defocus cannot fully account for the abnormal fixational eye movements of persons with amblyopia | Kwon, Sunwoo | Eye Movements: Clinical
56.442 | An Occipitotemporal Region that Identifies Relevant Features | Zeng, Yuxuan | Object Recognition: Basic features
56.433 | Performance metrics of real-world perception in augmented reality | Duijnhouwer, Jacob | Binocular Vision: Disparity, stereopsis and suppression
56.459 | Visual Inputs Reconstructing through Enhanced 3T fMRI Data from Optimal Transport Guided Generative Adversarial Network | Xiong, Yujian | Spatial Vision: Machine learning, neural networks
56.434 | The Dichopter | Brecher, Kenneth | Binocular Vision: Disparity, stereopsis and suppression
56.443 | Spiky and Stubby Objects in Human Visual Perception | Au, Ashley | Object Recognition: Basic features
56.423 | The effect of image size and defocus on children’s reflex vergence eye movements to natural images | Marella, Bhagya L | Eye Movements: Clinical
56.424 | Measuring refractive error using continuous psychophysics and eye tracking | Pirso, Ethan | Eye Movements: Clinical
56.425 | Behavioral and oculomotor effects of scotoma awareness training in patients with central vision loss | Maniglia, Marcello | Eye Movements: Clinical

Undergraduate Just-In-Time Poster Submissions

VSS 2024 is pleased to announce that the “Just-In-Time” poster sessions for undergraduate students working on independent research projects are now open for submissions. Posters will be presented in person at the annual meeting in one of two sessions, either Saturday, May 18 or Monday, May 20.

VSS welcomes and encourages submissions from a diverse group of eligible students across the globe. To help accomplish this goal, we ask that you share this information with any programs within your institution that sponsor or promote research for undergraduate students.

Eligibility

Submissions to these sessions are limited to students who:

  • Are currently enrolled in a 3-year or 4-year program leading to a bachelor’s degree, or
  • Have earned a bachelor’s degree in a 3-year program and are currently in their first year of study in a program leading to a master’s degree. (Students studying at European universities may fall into this category.)

Those who already have an abstract accepted for VSS 2024 are not eligible.

Space is limited. Submissions will be accepted from March 1 through April 1, and presenters will be notified of acceptance by April 11.

You must be a current student member (for 2024) to submit an abstract.

A limited number of travel grants are available for undergraduate students who submit abstracts during the Just-in-Time submission period. Travel application information will be available upon submission of the student’s abstract.

For details and to submit an abstract, go to the Undergraduate Just-In-Time Poster Submission Guidelines.

Submission Policies

  • A student may submit only one abstract to the Just-In-Time session.
  • The student must be a current VSS member (for 2024).
  • The student must be registered to attend VSS.
  • Those who already have an abstract accepted for VSS 2024 are not eligible to submit to the Just-In-Time session.
  • Abstracts must describe work that has not been published or accepted for publication at the time of submission.
  • Poster presenter substitutions are not permitted.

Abstract Format

Abstracts are limited to 300 words. This does not include title, authors, and affiliations. Additional space is provided for funding acknowledgments and for declaration of commercial interests and conflicts.

Your abstract should consist of an introduction, methods and results sections, and a conclusion. The sections do not need to be explicitly labeled as such, but each abstract must contain sufficiently detailed descriptions of the methods and the results. Please do not submit an abstract of work that you are only planning to do, or of work without sufficient results to reach a clear conclusion. Such abstracts will not be accepted.

Per the VSS Disclosure of Conflict of Interest Policy, authors must reveal any commercial interests or other potential conflicts of interest that they have related to the work described. Any conflicts of interest must be declared on your poster or talk slides.

Please complete your submission carefully. All abstracts must be in final form. Abstracts are not proofread or corrected in any way prior to publication. Typos and other errors cannot be corrected after the deadline. You may edit your abstract as much as you like until the submission deadline.

Given the just-in-time deadline, some aspects will differ from regular VSS submissions. Submissions will be reviewed by members of the VSS Board of Directors and designates. Accepted abstracts will appear in the VSS 2024 program, but unlike submissions accepted following the December review, “Just-In-Time” abstracts will not appear in the Journal of Vision.

If you have any questions, please contact our office.

Submission Schedule

Submissions Open: March 1, 2024
Submissions Close: April 1, 2024
Undergraduate Travel Award Application Deadline: April 5, 2024
Notification of Accepted Abstracts: April 11, 2024

How to Submit

Undergraduate Just-in-Time Poster Submissions are Closed.