2025 Exhibitors

Please visit our exhibitors in the Pavilion.

Exhibit Hours
Saturday, May 17, 8:00 am – 5:30 pm
Sunday, May 18, 8:00 am – 5:30 pm
Monday, May 19, 8:00 am – 12:30 pm
Tuesday, May 20, 8:00 am – 5:30 pm

J.S. Held Silver Sponsor, Booths 9 & 10

J.S. Held is a global consulting firm that combines technical, scientific, financial, and strategic expertise to advise clients seeking to realize value and mitigate risk. Our professionals serve as trusted advisors to organizations facing high-stakes matters demanding urgent attention, staunch integrity, proven experience, clear-cut analysis, and an understanding of both tangible and intangible assets.

Among myriad other specialized services, we provide product safety and user experience (UX) research across the entire product lifecycle based on five decades of human factors and failure analysis. We are always looking for qualified PhDs, postdocs, and early-career faculty interested in technical consulting.

Rogue Research, Inc. Silver Sponsor, Booths 11 & 12

Rogue Research has been your partner in neuroscience research for over 20 years. As developers of the Brainsight® family of neuronavigation systems for non-invasive brain stimulation, we have helped make transcranial magnetic stimulation more accurate and more reproducible while keeping it simple and effective. Twenty years and over 1000 laboratories later, Brainsight® continues to evolve to meet the needs of researchers in non-invasive brain stimulation.

Rogue Research has expanded beyond navigation to develop our own next-generation TMS device: Elevate™ TMS. Elevate™ TMS offers control over the pulse shape to ensure more reproducible excitatory or inhibitory effects on the targeted network. While Brainsight® ensures accurate targeting and Elevate™ TMS ensures reliable circuit interaction, Rogue Research is also developing a robotic positioner to ensure that the plan is accurately and efficiently carried out. The unique design ensures accuracy, repeatability, and simplicity.

Rogue Research also offers our Brainsight® Vet line of neurosurgical and neuronavigation tools for animal research. Come see our navigated microsurgical robot, which is the most accurate animal stereotaxic system on the market. We also offer custom MRI-compatible implants and a line of MRI coils and testing platforms.

SR Research Ltd Silver Sponsor, Booths 18 & 19

SR Research produces the EyeLink family of high-speed eye trackers and has been enabling scientists to perform cutting-edge research since the early 1990s. EyeLink systems are renowned for their outstanding technical specifications, temporal precision, and superb accuracy. The EyeLink 1000 Plus has the world’s lowest spatial noise and can be used in the laboratory and in EEG/MEG/MRI environments. The EyeLink Portable Duo offers the same high levels of data quality in a small, portable package. SR Research also provides sophisticated experiment delivery and analysis software, and a truly legendary support service.

VPixx Technologies Silver Sponsor, Booths 1 & 2

VPixx has been developing innovative vision research hardware for over 24 years. Our PROPixx video projector, supporting refresh rates up to 1440Hz, has become the standard for neuroimaging, neurophysiology, and behavioral vision research applications. The PROPixx RIFT (Rapid Invisible Frequency Tagging) paradigm is enabling ground-breaking MEG and EEG research: https://pubmed.ncbi.nlm.nih.gov/35452080/.

The TRACKPixx3 2kHz binocular eye tracker and the DATAPixx3 I/O hub offer microsecond-precise data acquisition synchronized to stimulus presentation. Our new LabMaestro software is making these instruments even easier to use, while Pack&Go is your solution for quickly running psychophysics experiments on remote subject populations.

2025 is a very special year, in which VPixx releases several innovative displays that the vision research community has been requesting. Visit our booth for exciting live demonstrations!

ANT North America Inc Bronze Sponsor, Booth 17

ANT Neuro is a technology leader in high-density EEG, offering state-of-the-art systems tailored to the demanding requirements of cutting-edge research. ANT’s eego line of EEG devices and new saline-based EEG nets enable efficient collection of high-density EEG data (8–256 channels) either at rest or during movement. ANT also offers specialized EEG caps for the study of neonates and infants.

BrainVision LLC Bronze Sponsor, Booth 15

Brain Vision LLC is the leading team for EEG in Vision Science. We offer full integration of EEG with many leading eye-tracking and video systems, and we also provide flexible and robust solutions for both stationary and mobile EEG. All of our systems are available with a variety of electrode types, such as saline-sponge nets, active gel, passive, and dry electrodes, which are easily expandable with bio-sensors like GSR, ECG, respiration, and EMG. Our team specializes in using EEG with other modalities such as fMRI, fNIRS, MEG, TMS, and tDCS/HD-tDCS.

If you want to know how EEG and Vision Science improve each other, please feel free to contact us:
Phone: +1.877.EEG 4 MRI, Email: 

Cambridge Research Systems Bronze Sponsor, Booth 4

At Cambridge Research Systems, our reputation is founded on values of scientific rigour and integrity. For over 30 years, our unique range of Tools for Vision Science, Functional Imaging and Clinical Research has been ubiquitous in laboratories throughout the world, and cited in thousands of papers.

We design and develop innovative new tools that enable the advancement of science by combining engineering expertise with innovation, cutting edge technology, and ongoing collaboration with our valued academic partners. Our products are market leaders, our people committed and knowledgeable. Our ambition is to continue setting standards in the vision science community, of which we are proud to be a part.

We look forward to seeing you again at VSS! Please call at our booth to see our latest products for visual stimulation, eye tracking, vision assessment, and MRI; or contact .

Cortech Solutions, Inc. Bronze Sponsor, Booth 5

Come see the new ActiveThree EEG system, the Brite wireless fNIRS system, and the new SAGA EEG system, all world-class research instruments with the most advanced features, designed specifically for science. We are also the US representative for Cambridge Research Systems, providing the BOLDscreen calibrated display for fMRI, the LiveTrack Lightning high-speed eye tracker, the Display++ calibrated LCD display, and more.

Exponent Bronze Sponsor, Booth 7

In an era of radically accelerating change, Exponent is the only premium engineering and scientific consulting firm with the depth and breadth of expertise to solve your most profoundly unique, unprecedented, and urgent challenges. Exponent brings together 90+ technical disciplines and 950+ consultants to help our clients navigate the increasing complexity of more than a dozen industries, connecting decades of pioneering work in failure analysis to develop solutions for a safer, healthier, more sustainable world.

NIRx Medical Technologies, LLC Bronze Sponsor, Booth 6

NIRx Medical Technologies, LLC is a leading provider of comprehensive solutions for functional near-infrared spectroscopy (fNIRS) research. Our non-invasive and user-friendly fNIRS technology enables the measurement of neural activity in the cortex and large-scale cortical networks, providing insights into the neural mechanisms underlying perception and cognition.

Our complete range of research solutions includes a versatile multimodal hardware platform, advanced online and offline analysis software, expert technical and scientific support, and comprehensive training programs. We are dedicated to supporting fNIRS researchers through our offices in Orlando, New York, and Berlin, Germany.

Whether you’re investigating changes in neural activity during development, researching disorders and their treatments, or exploring new applications in neuroscience, NIRx has the expertise and solutions to help you achieve your research goals. For more information, please contact us at +49 308 1453 5990 (EU), (+1) 321-352-7570 (US/Canada), or email us at .

Psychology Software Tools Bronze Sponsor, Booth 8

Psychology Software Tools – Developers of E-Prime stimulus presentation software. E-Prime includes E-Prime Go for remote data collection! Integrate E-Prime with eye tracking, EEG, fMRI and fNIRS with E-Prime Extensions for Tobii Pro, EyeLink, Net Station, Brain Products, fMRI, and NIRx. Use Chronos for millisecond-accurate responses, sound output, and triggers to external devices. Chronos Adapters provide a simple connection to external devices, including Brain Products, ANT Neuro, BIOPAC, BioSemi, Neuroscan, Magstim EGI, NIRx, Zeto, g.tec, and more. PST also provides solutions for fMRI research, such as Fiber Optic and Wireless Response Systems, a NEW Hyperion Projection System, and MRI Simulators with head motion tracking. PST has a 37-year company history with 100,000+ users in 75 countries!

Pupil Labs Bronze Sponsor, Booth 13

At Pupil Labs we build eye tracking hardware and software. Our latest eye tracker – Neon – is modular and adapts to your research needs. We have a solution for almost every requirement, whether you do research with kids, in a quiet lab, climbing a mountain, or venturing into virtual realms.

Neon uses deep learning algorithms to provide you with robust, accurate, and precise gaze and pupil data. Neon performs well in all environments – from complete darkness to direct sunlight. It works with all subjects: age, gender, ethnicity, eye make-up, contact lenses – no problem!

Our technology is open and accessible and seamlessly integrates with your existing toolkit: LSL, PsychoPy, MATLAB, and more. Over the last 10 years, we have developed strong relationships in academic research and have a dedicated team of specialists to help you achieve your research goals.

Come get hands-on with Neon and meet our team at VSS 2025, or get in touch with us on Discord or via email .

SilicoLabs Bronze Sponsor, Booth 14

SilicoLabs builds tools that allow researchers to capture and decode behaviour to reveal the foundations of learning, decision-making, and actions.

Their flagship software, Labo, allows anyone to quickly and easily create interactive experiences that simulate the real world. Labo captures high-fidelity behavioural data, like hand, face, and eye-tracking when using XR devices, as well as data from biosensors like EEG.

Vivid Vision Bronze Sponsor, Booth 3

Vivid Vision builds vision tests and treatments for VR headsets. Vivid Vision’s visual field test is being used in basic science and in clinical trials of treatments for progressive eye disease, such as AMD and glaucoma. Our treatment for amblyopia is undergoing a clinical trial sponsored by the NEI. Vivid Vision works closely with scientists and clinicians in academia and industry around the world, including in low and middle income countries.

WorldVizVR Bronze Sponsor, Booth 16

WorldViz will present SightLab VR, a plugin for Vizard (WorldViz’s Python platform for building VR applications) that uses either a GUI-based interface or extended Python code to let users easily set up VR eye-tracking experiments (with little or no code), then collect, visualize, review, and analyze eye-tracking data. It supports all major PC-based XR eye-tracking devices, including the Vive Focus Vision, Varjo XR-4, Meta Quest Pro, and more.

It allows drag-and-drop addition of videos and 3D models, and many of the most widely used analysis methods are included in the provided templates (visual search, reaction time, memory tasks, and much more).

Build a scene, run your experiment, and review it in minutes. The system is fully expandable and modifiable through the GUI configurator or Python code, and it also supports multiple users.

The WorldViz components allow integration of highly targeted VR labs, and we are happy to help customers configure their own labs, tailored to their specific needs.

Opening Night Reception

Friday, May 16, 2025, 7:30 – 9:30 pm, RumFish Beach

Save Friday evening for the spectacular VSS Opening Night Reception! The reception will take place on the beach and lawn at the RumFish Hotel from 7:30 – 9:30 pm.

Don’t forget your drink tickets, which can be found in the back of your badge. Your drink tickets are also good at Demo Night, Club Vision and Chill Vision. Friends and family may accompany you with the purchase of a Friends and Family Pass. See the Registration Desk to purchase passes.

Prepare to sink your toes into the sand and enjoy this fantastic event! Please remember to wear your badge.

Student Volunteer Form

Thank you for your interest in volunteering at VSS 2025. We have received an overwhelming number of applications and have now closed the application form.

Honoring the Contributions of Eileen Kowler: Eye Movements as Windows to the Mind

Eileen Kowler, 1952-2024

Friday, May 16, 2025, 5:15 – 7:15 pm, Talk Room 2

Organizers: Preeti Verghese1, Marisa Carrasco2, David Melcher3 (1Smith-Kettlewell Eye Research Institute, 2New York University, 3New York University, Abu Dhabi)
Speakers: Preeti Verghese, Marisa Carrasco, Mike Landy, Barbara Anne Dosher, David Melcher, Mary Hayhoe, Rich Krauzlis, Jie Z. Wang, Jacob Feldman

Introduction: 5:15 pm

Preeti Verghese1, Marisa Carrasco2; 1Smith-Kettlewell Eye Research Institute, 2New York University

Eileen & VSS: 5:20 pm

Michael Landy, New York University

Talk 1: 5:25 pm

 “Cogito Ergo Moveo”—The role of cognition and attention in eye movements in the work of Eileen Kowler

Barbara Anne Dosher, University of California, Irvine

“I think therefore I move”, she titled one review paper (Kowler, 1996). One key strand of Eileen Kowler’s research revealed how attention and expectation engage eye movements to serve the needs of vision. Her experimental interventions expanded models of the control of eye movements beyond early models that “assume[d] that eye movements are driven by low-level sensory signals, such as retinal image position or retinal motion”.  She investigated how higher-level cognitive knowledge and goals influence behavior to optimize information acquisition by the eye. Key examples of this work include the role of attention and selection in smooth pursuit, the interaction of attention and perception in single eye movements, and the dynamics of attention used to guide sequences of eye movements. This talk considers some of these findings.

Talk 2: 5:40 pm

What visual representation guides saccades? Reflections on “Shapes, Surfaces and Saccades” (Melcher & Kowler, 1999)

David Melcher, New York University, Abu Dhabi

Back when eye tracking required an entire room full of machinery, pioneering research on the oculomotor system investigated fixational and saccadic eye movements for simple targets, like fixation points, crosses or disks. A series of studies in the 1990s indicated that, for simple outline shapes, saccades landed near the center-of-gravity. These studies had suggested a relatively primitive representation of the visual target, prior to the linking of elements into contours and shapes. In a series of six experiments, we showed that the saccadic landing position was predicted by the center-of-area of a surface defined by the shape boundary. This finding followed a line of Eileen Kowler’s research showing that eye movements are not merely reflexive, but instead reflect complex visual and cognitive processing. As research has now progressed into the 21st century, key ideas from this 1999 paper have been expanded into studies of natural and 3D scene perception, trans-saccadic object feature prediction, ensemble processing, smooth pursuit, and grasping movements, among other topics. Still, there remain fundamental questions about how sensory and motor systems interact, and to what extent the oculomotor system reflects, and differs from, our conscious visual perceptual experience.

Talk 3: 5:55 pm

Understanding Natural Vision

Mary Hayhoe, University of Texas, Austin

At a time when much eye movement research was dominated by a stimulus driven, linear systems approach, Eileen Kowler demonstrated that eye movement control is intrinsically connected to a range of cognitive processes such as attention, memory, prediction, planning, and scene understanding. She also understood that this is a natural consequence of the fact that eye movements are embedded in ongoing actions, and argued for measuring eye movements in the context of unconstrained behavior. As the eye and body tracking technology have developed, we can measure the operation of these cognitive processes in more diverse contexts, and this has allowed a more unified view of visuo-motor control. If we assume that the job of vision is to provide information for selecting suitable actions, we can view gaze control as part of complex sequential decision processes in the service of goal-directed behavior. In natural behavior, even the simplest actions involve both long and short-term memory, evaluation of sensory and motor uncertainties and costs, and planning that takes place over time scales of seconds in the context of action sequences. Consequently, a decision theoretic context allows a more coordinated approach to understanding natural visually guided behavior.

Talk 4: 6:10 pm

Opening the window from eye movements to cognitive expectations and visual perception

Rich Krauzlis, Laboratory of Sensorimotor Research, National Eye Institute

There was a time not so long ago when eye movements were not widely appreciated as providing windows into visual cognition and perception. Instead, they were viewed mostly as motor reactions to visual “error” signals. This engineering perspective was spectacularly successful in ferreting out the basic principles for smooth pursuit and other eye movements, but it did not easily accommodate non-sensory and non-motor factors. Against this backdrop, Eileen made a series of seminal observations starting with her thesis work, showing that cognitive expectations exert strong influences on smooth pursuit eye movements. Her experimental designs were wonderfully creative and established that there is much more going on in the pursuit system than can be found slipping across the retina. Her results sparked controversy at the time, but her conclusions are now broadly accepted: smooth pursuit is guided not only by low-level visual inputs, but also by higher-level visual processes related to expectations, memory, and cognition. These conclusions now seem almost self-evident, but in fact they took a great deal of perseverance and ingenuity. Eileen should be lauded not only for the significance of her scientific accomplishments, but also for the example she provided of an independent and courageous intellect.

Talk 5: 6:25 pm

Predictive smooth pursuit eye movements reflect knowledge of Newtonian mechanics

Jie Z. Wang1, Abdul-Rahim Deeb2, Fulvio Domini3, Eileen Kowler4

1 University of Rochester, 2 Johns Hopkins University, 3 Brown University, 4 Rutgers University-New Brunswick

Smooth pursuit employs a variety of cues to predict the future motion of a moving target, enabling timely and accurate tracking. Since real-world motions often obey Newtonian mechanics, an implicit understanding of these laws should be a particularly effective cue for facilitating anticipation in pursuit. In this study, we focus on understanding how 2-D smooth pursuit incorporates Newtonian mechanics to interpret and predict future motion. We examined the tracking of a “target object” whose motion path appeared to be due to a collision with a moving “launcher object”. The direction of post-collision target motion was either consistent with or deviated from the Newtonian prediction. Newtonian and non-Newtonian paths were run in separate blocks, allowing observers the opportunity to predict and learn the target’s path based on the launcher’s movement. Anticipatory pursuit was found to be faster and more precise when post-collision paths conformed to predictions of Newtonian mechanics. Even when there was ample opportunity to learn the non-Newtonian motion paths, there was evidence of a bias in the direction of the Newtonian prediction. These findings support the idea that smooth pursuit can leverage the regularities in everyday physical events to formulate predictions about future motion. These predictive capabilities of smooth pursuit result in increased compatibility with natural motions and thereby allow for more accurate and efficient tracking of real-world movements.

Talk 6: 6:40 pm

Decisions and eye movements in a dynamic naturalistic VR task (response to Kowler, 1995, personal communication)

Jacob Feldman, Rutgers University

(Joint work with Jakub Suchojad, Sam Sohn, Michelle Shlivko, and Karin Stromswold) 

One of the main goals of cognitive research, continually emphasized by Eileen Kowler, is to understand behavior in realistic, natural contexts. In this talk I’ll talk about a ubiquitous natural task that we have recently studied in virtual reality (VR): social wayfinding. Social wayfinding refers to the way people navigate through environments that contain other people, like a crowded train station. In addition to various generic motivations, like the desire to minimize time and energy expended, this task involves a number of specifically social goals, like avoiding colliding with or rudely cutting off other people. We have been studying this problem in VR, asking our subjects to navigate around both static obstacles (e.g. couches) and dynamic ones (e.g. people walking around). We have also been collecting eye movements so as to better understand how subjects handle the very complex series of decisions they need to make as they move through the environment. Broadly speaking, we find that their eye movements reflect the hierarchical nature of the task, sometimes fixating on “local” obstacles and at other times on “global” features such as the target gate. I’ll end by commenting on how this work addresses (and also fails to address) a question that Eileen posed to me many years ago.

Open Mic: 6:55 — 7:15 pm

Abstract Numbering System

Each abstract is assigned a unique 4- or 5-digit number based on when and where it is being presented. Talk presentations receive a 4-digit number and poster presentations a 5-digit number (the last two digits of a poster number are the poster board number).

The format of the abstract numbering is DT.RN, where D is the Day, T is the Time, R is the Room and N is the presentation Number.

First Digit – Day
1 Friday
2 Saturday
3 Sunday
4 Monday
5 Tuesday

Second Digit – Time
1 Early AM talk session
2 Late AM talk session
3 AM poster session
4 Early PM talk session
5 Late PM talk session
6 PM poster session

Third Digit – Room
1 Talk Room 1
2 Talk Room 2
3 Banyan Breezeway
4 Pavilion

Fourth–Sixth Digits – Number
1, 2, 3… for talks
01, 02… for posters

Examples
21.16 Saturday, early AM talk session in Talk Room 1, 6th talk
36.313 Sunday, PM poster session in Banyan Breezeway, poster board 13
53.496 Tuesday, AM poster session in the Pavilion, poster board 96
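The scheme above can also be decoded programmatically. The following is a minimal Python sketch (a hypothetical helper, not part of the official program; the lookup tables are transcribed directly from the numbering scheme):

```python
# Decode a VSS abstract number of the form DT.RN, where D is the day,
# T is the time slot, R is the room, and N is the presentation number
# (one digit for a talk's order, two digits for a poster board).

DAYS = {"1": "Friday", "2": "Saturday", "3": "Sunday", "4": "Monday", "5": "Tuesday"}
TIMES = {
    "1": "Early AM talk session",
    "2": "Late AM talk session",
    "3": "AM poster session",
    "4": "Early PM talk session",
    "5": "Late PM talk session",
    "6": "PM poster session",
}
ROOMS = {"1": "Talk Room 1", "2": "Talk Room 2", "3": "Banyan Breezeway", "4": "Pavilion"}

def decode_abstract(code: str) -> str:
    """Turn an abstract number like '36.313' into a readable description."""
    dt, rn = code.split(".")
    day, time, room = DAYS[dt[0]], TIMES[dt[1]], ROOMS[rn[0]]
    number = rn[1:]
    kind = "poster board" if len(number) == 2 else "talk"
    return f"{day}, {time} in {room}, {kind} {int(number)}"

print(decode_abstract("36.313"))
# Sunday, PM poster session in Banyan Breezeway, poster board 13
```

Applied to the examples above, `decode_abstract("21.16")` yields the Saturday early AM talk session in Talk Room 1, talk 6, and `decode_abstract("53.496")` yields the Tuesday AM poster session in the Pavilion, poster board 96.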

Newcomer Lemonade Social

Friday, May 16, 2025, 7:15 – 8:15 pm, Pirate Island

Organizers: Noah Britt, McMaster University; Brady Roberts, University of Chicago; Student-Postdoc Advisory Committee

New to VSS? Traveling to the conference alone? Come on out to the Lemonade Social to make new friends! This event will offer a chance to meet new people you can hang out with for the remainder of the conference. We will then all walk over to the Opening Night Reception together afterwards.

Cold lemonade and light snacks will be provided.

Visibility: A Gathering of LGBTQ+ Vision Scientists and Friends

Tuesday, May 20, 2025, 8:30 – 10:00 pm, Banyan/Citrus

Organizers: Michael Grubb, Trinity College; Alex White, Barnard College

LGBTQ students are disproportionately likely to drop out of science early. Potential causes include the lack of visible role models and the absence of a strong community. This social event is one small step towards filling that gap and will bring awareness to continuing challenges for queer scientists.

Please join us on Tuesday night in Banyan/Citrus (located in Jacaranda Hall), before Club Vision.

All are welcome. Snacks, drinks, and camaraderie will be provided.

21st Annual Demo Night

Demos: Monday, May 19, 2025, 7:00 – 10:00 pm, Talk Room 1-2

Please join us Monday evening for the 21st VSS Demo Night, a spectacular night of imaginative demos solicited from VSS members. The demos highlight the important role of visual displays in vision research and education.

This year’s Demo Night will be organized and curated by Daw-An Wu, Caltech; Peter Kohler, York University; Anna Kosovicheva, University of Toronto Mississauga; and Gideon Caplovitz, University of Nevada, Reno.

Demos are free to view for all registered VSS attendees and their families and guests.

The following demos will be presented from 7:00 to 10:00 pm in the Island Ballroom and Jacaranda Hall.

The Anne Boleyn Illusion and Other Mirror-Based Bodily Illusions

Grant Fairchild, Zhihan (Hannah) Guo, Stephanie Dietz, Jared Medina, Emory University
The Anne Boleyn Illusion uses a mirror box setup to produce a robust perception of a phantom sixth finger. This illusion and related mirror box illusions show that perception of the body’s location, orientation, and even its organization can be distorted by bottom-up sensory cues, including visuotactile and visuoproprioceptive synchrony.

La Hire Phenomenon: Seeing One’s Own Blind Spots and Retinal Blood Vessels

Charles Wu, Independent Researcher
I will demo the La Hire phenomenon: seeing one’s own blind spots as black, white, or colored holes in the visual field. Linking this phenomenon to the neuroanatomical fact that the blind spot is represented in V1-L4, I claim that V1-L4 is the neural substrate for visual sensation.

Beuchet Chair

Tim Andrews, University of York
A classic illusion setup that shrinks or enlarges the people and objects around the chair. Two parts of the chair, made at very different size scales, are placed at just the right distance to match in the eye. The failure of the visual system’s size constancy makes people appear shrunken or gigantic. Explore the space around it, and add your own visual coincidences.

Travelling wave paradoxes

Christopher W. Tyler1, Josh Solomon2, and Stuart Anstis3; 1Smith-Kettlewell Eye Research Institute; 2City St George’s, University of London; 3UC San Diego
Longitudinal wave propagation is a well-known concept in physical acoustics, but unstudied as a perceptual phenomenon (unlike transverse waves, as commonly used for motion perception). Viewed visually, we show that it is strikingly nonlinear, has paradoxical forward and reverse phases that can be attentionally segregated, and generates no motion aftereffect.

Try out a spatio-spectral-temporal light logger

Geoffrey Aguirre, Zachary Kelly, Samantha Montoya, University of Pennsylvania
Try out a compact, all-day light-logger, and share your demo night experience with an audience. Wander the demo hall wearing the unit (which features prescription lenses), and data from the world and pupil camera, mini-spectrometer, and accelerometer will be broadcast to a screen for the audience to see.

Contour Erasure and Filling-in

Yih-Shiuan Lin1, Chien-Chung Chen2, Mark W. Greenlee1, Stuart Anstis3; 1University of Regensburg; 2National Taiwan University; 3University of California, San Diego
Here in our demos, you will see several examples of the fascinating contour erasure effect: objects of various shapes and sizes completely disappear into the background or merge together after only a short adaptation period on their contours. We will show you some old and new variations of contour erasure since its discovery.

Additive contrast and motion blur – unique perceptual aspects with Augmented Reality and head-mounted displays

Xiuyun Wu, Saeideh Ghahghaei Nezamabadi, Takahiro Doi, James Wilmott; Meta Platforms, Inc.
Check out the unique perceptual challenges with Augmented Reality and head-mounted displays, regarding how additivity and head movements affect the visual quality of virtual contents! In this demo, we showcase how virtual faces and text look on top of different backgrounds, and how motion blur changes with your head movements.

Eye Duel: Balloon Burst Showdown

Kurt Debono, Marcus Johnson, SR Research Ltd.
Take on a collaborative eye movement challenge. Experience synchronised tracking of both you and your opponent’s gaze. Inflate your balloon by looking at it and burst it to win. Deflate your opponent’s balloon with your gaze to slow them down.

Celebrity EYE-Q: Holistic face processing in a tabletop game

Didi Dunin, Ben van Buren, The New School
Here we introduce a tabletop card game called Celebrity EYE-Q, in which players guess celebrities from their eyes, and learn about holistic face processing. Players must guess celebrities while viewing their eyes in isolation or held up to other players’ eyes to elicit disruptive processing by surrounding facial features.

Is your central foveal parvo system long wavelength (red) cone dominant or medium wavelength (green) cone dominant?

Lingyu Gan, George Sperling, University of California, Irvine
Traditional methods of determining isoluminance determine only magno isoluminance, which is trivial because the magno system, like the rod system, is monochromatic. We demonstrate unimpaired grating resolution in parvo isoluminance, which can be long wavelength (red) cone dominant, medium wavelength (green) cone dominant, or mixed. The demonstration determines this classification for individual viewers.

Magic Metamers and Saccadic Suppression, Hidden in Plain Sight

Peter April, Jean-Francois Hamelin, Dr. Jonathan Tong, VPixx Technologies
Can visual information be hidden in plain sight? We use the PROPixx multispectral projector to display a secret message hidden in a uniform field using chromatic metamers. Look through a filter to reveal the hidden message! Back by popular demand, the PROPixx 1440Hz projector demonstrates visual processing during saccades. We present a word which is only visible during your saccades. The player with the fastest word sighting wins a drink ticket!

Visual Phenomena from the Journal of Illusion

Arthur G Shapiro1, Stuart Anstis2, Alex Gokan1; 1American University; 2UC San Diego
The Journal of Illusion has been in operation since 2019 with Akiyoshi Kitaoka as founder and editor. Here we will present some phenomena that have been published in JoI since that time, including some illusions from the authors of this demo.

Level Up Your Aim: Feel Your Way to Perfect Shots in VR!

Ailene Chan, Caltech
Tired of missing your shots? We’ve got you covered! Experience an FPS game with unparalleled precision using vibro-tactile feedback. Feel the difference as adaptive haptic feedback sharpens your aim, and compete to top the leaderboard. Perfect your shots – immersion redefined!

A world without color: Monochromatic light room

Helen E. Feibes1, Spencer R. Loggia1,2, Karthik Kasi1; 1National Eye Institute, National Institutes of Health; 2Department of Neuroscience, Brown University
We provide an immersive experience of the world without color using monochromatic sodium light (589 nm). The demo highlights the myriad benefits color provides in natural vision. It also showcases a surprising finding: that faces, and only faces, provoke a paradoxical memory color, appearing greenish (Hassantash et al., 2019).

Compete with your colleagues for the focused visual attention prize!

Ian Buscay, Robert Lee, Cambridge Research Systems Ltd.
We will use the Brite functional near-infrared spectroscopy (fNIRS) system and the highly precise Display++ visual display to measure your ability to focus your visual attention in this game of skill. Bring your “A” game (as in Attention), or risk being shown up by your colleagues.

Caricature Effect in Data Visualization

Jeremy Wilmer1, Sarah Kerns2; 1Wellesley College; 2Dartmouth College
A hands-on exploration of a striking phenomenon in data visualization: the Caricature Effect.

The Bar Cross Ellipse Illusion

Gideon Caplovitz, University of Nevada Reno
Use your powers of mentalization to take control over your phenomenological experiences in this dynamic quad-stable stimulus!

Immersive Insights: Integrating eye-tracking and biosensors in XR with SilicoLabs and Brain Vision

Kyla Alsbury-Nealy, SilicoLabs
A physical exploration at the frontier of visual neuroscience research, combining eye-tracking from Pupil Labs and brain signals to create a unique experience. Witness real-time neuro-gaze interactions in an immersive Extended Reality (XR) environment powered by LABO, offering a glimpse into the future of research.

The UW Virtual Brain Project

Melissa Schoenlein1, Ross Treddinick2, Nathaniel Miller3, Chris Racey4, Simon Smith2, Kudirat Alimi2, Yash Sancheti2, Chris Castro5, Bas Rokers6, & Karen B. Schloss2,7 ; 1Department of Psychology, High Point University; 2Wisconsin Institute for Discovery, University of Wisconsin-Madison; 3University of Minnesota Medical School; 4Psychology, University of Sussex; 5College of Engineering, University of Wisconsin-Madison; 6Department of Psychology, New York University, Abu Dhabi; 7Department of Psychology, University of Wisconsin-Madison
Take a tour through the sensory systems of the human brain in the UW Virtual Brain Project™. The VR lessons provide immersive experiences of information flow from sensory input to cortical processing. Evidence suggests these experiences are fun and easy to use, which can advance neuroscience education.

Strobe Hallucinations: A Window into Altered Visual Perception

Nathan H. Heller, Johns Hopkins Center for Psychedelic and Consciousness Research
Open your mind to the wild world of Strobe Hallucinations! This interactive demo uses controlled stroboscopic stimulation to induce dynamic, kaleidoscopic percepts that mimic visual effects experienced during altered states of consciousness. Discover how simple, flickering light can profoundly transform perception and reveal surprising features about the brain’s visual processing mechanisms.

Five Illusions Challenge Our Understanding of Visual Experience

Paul Linton
Try the illusions from my VSS poster “experiential3d”: [1]. Perceived stereo depth reflects retinal disparities, not 3d geometry (Linton stereo illusion). [2]. Visual scale is governed by horizontal disparities (Linton scale illusion). [3]. Perceived real-world depth is not inverted in the hollow face illusion (Linton un-hollow face illusion). [4]. Size constancy does not affect perceived angular size (Linton size constancy illusion). [5]. Color constancy does not affect perceptual appearance (Linton color constancy illusion).

Magnetic Sand Illusions: Action Capture

Shinsuke Shimojo, Shengjie Zheng, Eiko Shimojo, Caltech
Move your finger or hand across a dynamic white-noise display. When you draw a letter slowly, it leaves a trace that then fades. When your hand moves back and forth over the display, nearby dots seem to be captured by it, following the hand’s direction. In all of these illusions, the display appears to interact with your actions.

Me and my Shadow

Stuart Anstis, UC San Diego
Your elongated shadow at sunset is a perfect, vertically stretched replica of your body. But size-constancy failure makes your shadow’s head look tiny compared with its feet. A long generic shadow is projected from above onto the ground at your feet. You adjust the shadow’s taper until it looks “right”.

Strobo-Pong

VSS Demo Night Staff
Experience the chaos of table tennis under conditions of motion perception breakdown. Recreate a live demo of the classic flash-lag illusion (but please, no smoking). Note for the photosensitive: the room will be illuminated only by a flashing strobe light.

Science across Countries and Cultures: Does Difference make a Difference?

Saturday, May 17, 2025, 12:45 – 2:30 pm EDT, Palm/Sabal/Sawgrass

Organizers: Anya Hurlbert (Newcastle University); Shin’ya Nishida (Kyoto University); Rich Krauzlis (Salk Institute); Jes Parker (University of Tennessee-Knoxville)
Moderator: Anya Hurlbert (Newcastle University)
Speakers: Yuko Yotsumoto (University of Tokyo); Reuben Rideaux (University of Sydney); Rosa Lafer-Sousa (University of Wisconsin-Madison); Jenny Bosten (University of Sussex)

This workshop looks at how vision science is done across cultures and countries, recognising and celebrating the fact that VSS is an international community. We aim to explore differences in barriers to collaboration and success, and to consider the variety of directives, initiatives, and biases that influence the practice of science across different institutions. The discussion will be led by speakers from around the world: Yuko Yotsumoto (University of Tokyo), Reuben Rideaux (University of Sydney), Rosa Lafer-Sousa (University of Wisconsin-Madison), and Jenny Bosten (University of Sussex).

All attendees are warmly invited. We want to hear your views on how differences between individual backgrounds, cultures, and countries influence the practice and profile of science, and how we can collectively make a stronger, more cohesive and impactful community.

Refreshments and light lunch will be available.

Yuko Yotsumoto, PhD

University of Tokyo, Japan

Yuko Yotsumoto is a Professor in the Department of Life Sciences at the University of Tokyo and a Director of the UTokyo Institute for Diversity and Adaptation of Human Mind. She received her B.S. and M.S. from the University of Tokyo and earned her Ph.D. in Psychology from Brandeis University in 2005. Following her Ph.D., she conducted postdoctoral research at Boston University and Massachusetts General Hospital before returning to Japan to establish her lab. Her research investigates time and timing perception across timescales from milliseconds to minutes, using psychophysics, neural measurements, and computational modeling. She actively encourages her students to pursue international careers, and many graduates from her lab have gone on to conduct academic research around the world.

Reuben Rideaux, PhD

University of Sydney, Australia

Reuben Rideaux is a Senior Lecturer in the School of Psychology at the University of Sydney, and an Honorary Senior Research Fellow in the University of Queensland’s Brain Institute. Prior to this, he was a Leverhulme Early Career Fellow at the University of Cambridge and a PhD student at the Australian National University. He combines computational modelling, brain imaging, and psychophysics to study perception and cognition. He has a particular interest in developing new methods for understanding brain function and dysfunction, such as bio-inspired artificial intelligence systems, high resolution functional MR spectroscopy, and neural decoding.

Rosa Lafer-Sousa, PhD

University of Wisconsin-Madison, USA

Rosa Lafer-Sousa received her B.A. in Neuroscience from Wellesley College in 2009, and her Ph.D. in Brain and Cognitive Sciences from MIT under the supervision of Nancy Kanwisher. Her postbaccalaureate and doctoral work with Bevil Conway and Nancy Kanwisher aimed to shed light on the functional architecture of the primate visual system and establish links between neural activity, perception, and behavior, with a focus on color as a model system. Rosa is currently a postdoctoral research fellow at NIMH in the Laboratory of Neuropsychology, working with Dr. Arash Afraz in the Unit on Neurons, Circuits, and Behavior, where she investigates the causal role of mid- and high-level visual regions in perception and behavior using optogenetics and electrophysiology in macaques. She will soon join the Department of Psychology at the University of Wisconsin-Madison as an Assistant Professor.

Jenny Bosten, PhD

University of Sussex, UK

Jenny Bosten is Associate Professor in the School of Psychology at the University of Sussex, specialising in colour vision and individual differences, using neuroimaging, psychophysics and statistical modelling. Her PhD research was with Professor John Mollon in the Department of Experimental Psychology at the University of Cambridge on the influence of spatial context on visual perception. She worked as a Research Fellow in Neuroscience at Gonville and Caius College, Cambridge (2008-2010 and 2012-2014) on the genetics of individual variation in visual traits, and as a post-doctoral researcher at UC San Diego in the lab of Professor Donald MacLeod (2010-2012), where she used psychophysics to investigate colour perception and visual adaptation. She holds major funding from the EU and is highly active in the UK and international colour vision societies.

Anya Hurlbert, MD, PhD

Newcastle University, UK

Anya Hurlbert is a VSS Board member, and Professor of Visual Neuroscience at Newcastle University, where she co-founded the former Institute of Neuroscience and now steers the Centre for Transformative Neuroscience. From Texas originally, with Latvian heritage, she holds degrees from US (Princeton, MIT and Harvard) and UK (Cambridge) institutions, in physics, physiology, brain and cognitive sciences, and medicine. Her research interests include colour perception and its role in cognition and behaviour, with applications in imaging, lighting, and visual art, and the use of AI in ophthalmology. Through her work as Dean of Advancement at Newcastle University and in other roles she supports and promotes opportunities in science and education for students, early career researchers and the public, especially those from underserved backgrounds.

Undergrad Meetup – it all starts here!

Friday, May 16, 2025, 7:15 – 8:15 pm, Banyan/Citrus

VSS welcomes its undergraduate attendees! This event is designed as an opportunity to get to know each other and find our way through the meeting. Let’s gather in Banyan/Citrus (located in Jacaranda Hall) for a playful break after the afternoon science and before walking over to the opening night reception.

Vision Sciences Society