Monday, May 21, 2018, 6:00 – 10:00 pm
Beach BBQ: 6:00 – 8:00 pm, Beachside Sun Decks
Demos: 7:00 – 10:00 pm, Talk Room 1-2, Royal Tern, Snowy Egret, Compass, Spotted Curlew and Jacaranda Hall
Please join us Monday evening for the 16th Annual VSS Dinner and Demo Night, a spectacular night of imaginative demos solicited from VSS members. The demos highlight the important role of visual displays in vision research and education. This year’s Demo Night will be organized and curated by Gideon Caplovitz, University of Nevada, Reno; Arthur Shapiro, American University; Gennady Erlikhman, University of Nevada, Reno; and Karen Schloss, University of Wisconsin–Madison.
Demos are free to view for all registered VSS attendees and their families and guests. The Beach BBQ is free for attendees, but YOU MUST WEAR YOUR BADGE to receive dinner. Guests and family members must purchase a VSS Friends and Family Pass to attend the Beach BBQ. You can register your guests at any time at the VSS Registration Desk, located in the Grand Palm Colonnade. Guest passes may also be purchased at the BBQ event, beginning at 5:45 pm.
The following demos will be presented from 7:00 to 10:00 pm, in Talk Room 1-2, Royal Tern, Snowy Egret, Compass, Spotted Curlew and Jacaranda Hall:
Paradoxical memory color for faces
Rosa Lafer-Sousa, MIT; Maryam Hasantash, Institute for Research in Fundamental Sciences, Iran; Arash Afraz, National Institute of Mental Health, NIH; Bevil R. Conway, National Institute of Mental Health, NIH and National Eye Institute, NIH
In this demo we use monochromatic sodium light (589 nm), which renders vision objectively achromatic, to elicit memory colors for familiar objects in a naturalistic setting. The demo showcases a surprising finding: faces, and only faces, provoke a paradoxical memory color, appearing greenish.
Vision in the extreme periphery: Perceptual illusions of flicker, selectively rescued by sound
Daw-An Wu, California Institute of Technology; Takashi Suegami, California Institute of Technology and Yamaha Motors Corporation; Shinsuke Shimojo, California Institute of Technology
Synchronously pulsed visual stimuli, when spread across central and peripheral vision, appear to pulse at different rates. When spread bilaterally into extreme periphery (70˚+), the left and right stimuli can also appear different from each other. Pulsed sound can cause some or all of the stimuli to become perceptually synchronized.
Don’t Go Chasing Waterfalls
Matthew Harrison and Matthew Moroz, University of Nevada Reno
In ‘High Phi’ in VR, illusory motion jumps are perceived when the random noise texture of a moving 3D tunnel is replaced with new random textures. In 2D, these illusory jumps tend to be perceived in the direction opposite the preceding motion, but in 3D, this is not always the case!
The UW Virtual Brain Project: Exploring the visual system in immersive virtual reality
Chris Racey, Bas Rokers, Nathaniel Miller, Jacqueline Fulvio, Ross Tredinnick, Simon Smith, and Karen B. Schloss, University of Wisconsin – Madison
The UW Virtual Brain Project allows you to explore the visual system in virtual reality. It helps to visualize the flow of information from the eyes to visual cortex. The ultimate aim of the project is to improve neuroscience education by leveraging our natural abilities for space-based learning.
Augmented Reality Art
Jessica Herrington, Australian National University
Art inspired by vision science! Come and explore augmented reality artworks that contain interactive, digital sculptures. Augmented reality artworks will be freely available for download as iPhone apps.
Staircase Gelb effect
Alan Gilchrist, Rutgers University
A black square suspended in midair and illuminated by a spotlight appears white. Now successively lighter squares are added within the spotlight. Each new square appears white and makes the other squares appear to get darker. This demonstrates the highest luminance rule of lightness anchoring and gamut compression.
Hidden in Plain Sight!
Peter April, Jean-Francois Hamelin, Stephanie-Ann Seguin, and Danny Michaud, VPixx Technologies
Can visual information be hidden in plain sight? We use the PROPixx 1440Hz projector, and the TRACKPixx 2kHz eye tracker, to demonstrate images which are invisible until you make a rapid eye movement. We implement retinal stabilization to show other images that fade during fixations. Do your eyes deceive?
Do I know you? Discover your eye gaze strategy for face recognition
Janet Hsiao and Cynthia Chan, University of Hong Kong
At VSS, do you often wonder whether you’ve seen someone before? Are you using good gaze strategies for face recognition? Try our hidden Markov modeling approach (EMHMM; http://visal.cs.cityu.edu.hk/research/emhmm/) to summarize your gaze strategy in terms of personalized regions-of-interest and transition patterns, and quantitatively assess its similarity to commonly used strategies.
Virtual Reality reconstruction of Mondrian’s ‘Salon for Madame B’
Johannes M. Zanker and Jasmina Stevanov, Royal Holloway University of London; Tim Holmes, Tobii Pro Insight
We present the first Virtual Reality realisation of Mondrian’s design for a salon painted in his iconic style, which was never realised in his lifetime. Visitors can explore the VR space whilst their eye movements are tracked, allowing the researcher to evaluate possible reasons why Mondrian did not pursue his plan.
Hidden Stereo: Hiding phase-based disparity to present ghost-free 2D images for naked-eye viewers
Shin’ya Nishida, Takahiro Kawabe, and Taiki Fukiage, NTT Communication Science Lab
When a conventional stereoscopic display is viewed without 3D glasses, image ghosts are visible due to the fusion of stereo image pairs including binocular disparities. Hidden Stereo is a method to hide phase-based binocular disparities after image fusion, and to present ghost-free 2D images to viewers without glasses.
Quick estimation of contrast sensitivity function using a tablet device
Kenchi Hosokawa and Kazushi Maruya, NTT Communication Science Laboratories
Contrast sensitivity functions (CSFs) are useful but are sometimes impractical to measure due to time limitations. We demonstrate web-based applications that measure the CSF in a short time (<3 min) with moderate precision. These applications allow CSF data to be collected from many types of observers and experimental circumstances.
The optical illusion blocks: Optical illusion patterns in a three dimensional world
Kazushi Maruya, NTT Communication Science Laboratories; Tomoko Ohtani, Tokyo University of the Arts
The optical illusion blocks are a set of toy blocks whose surfaces have particular geometric patterns. When combined, the blocks induce various types of optical illusion such as shape from shading, cafe wall, and subjective contour. With the blocks, observers can learn rules behind the illusions through active viewpoint changes.
Dis-continuous flash suppression
Shao-Min (Sean) Hung, Caltech; Po-Jang (Brown) Hsieh, Duke-NUS Medical School; Shinsuke Shimojo, Caltech
We report a novel variant of continuous flash suppression (CFS): dis-continuous flash suppression (dCFS), in which the suppressor and the suppressed stimulus are presented intermittently. Our findings suggest approximately two-fold suppression power, as evidenced by lower breaking rates and longer suppression durations. dCFS thus may be suitable for future investigations of unconscious processing.
Virtual Reality Collaboration with interactive outside-in and tether-less inside-out tracking setup
Matthias Pusch, Dan Tinkham, and Sado Rabaudi, WorldViz
Multiple participants can interact with both local and remote participants in VR. The demo combines an outside-in tracking paradigm for some participants with inside-out integrated tracking for others. Importantly, the inside-out system is entirely tether-less (using so-called consumer backpack VR), and the user is free to explore the entire indoor floor plan.
The illusion of floating objects caused by light projection of cast shadow
Takahiro Kawabe, NTT Communication Science Laboratories
We demonstrate an illusion wherein objects in pictures and drawings appear to float in the air due to the light projection of cast shadow patterns onto them. We also demonstrate a light projection method that makes an opaque colored paper appear to be a transparent color film floating in the air.
Extension of phenomenal phenomena toward printed objects
Takahiro Kawabe, NTT Communication Science Laboratories
We demonstrate that the phenomenal phenomena (Gregory and Heard, 1983) can be extended toward printed objects placed against a background with luminance modulation. In our demo, the audience experiences not only the illusory translation of the printed objects but also their illusory expansion/contraction and rotation.
Stereo Illusions in Augmented Reality
Moqian Tian, Meta Company
Augmented Reality with environmental tracking and real-world lighting projection can uncover new perspectives on some classical illusions. We will present the Hollow Face Illusion, the Necker Cube, and the Crazy Nuts Illusion in multiple conditions, while observers interact with the holograms through the Meta 2 AR headset.
A Color-Location Misbinding Illusion
Cristina R. Ceja and Steven L. Franconeri, Northwestern University
Illusory conjunctions, formed by misbound features, can be formed when attention is overloaded or diverted (Treisman & Schmidt, 1982). Here we provide the opportunity to experience a new illusory conjunction illusion, using even simpler stimulus displays.
Thatcherize your face
Andre Gouws, York Neuroimaging Centre, University of York; Peter Thompson, University of York
The Margaret Thatcher illusion is one of the best-loved perceptual phenomena. Here you will have the opportunity to see yourself ‘thatcherized’ in real time, and we will print a copy of the image for you to take away.
The Ever-Popular Beuchet Chair
Peter Thompson, Rob Stone and Tim Andrews, University of York
A favorite at Demo Night for the past few years, the Beuchet chair is back again. The two parts of the chair are at different distances, and the visual system fails to apply size constancy appropriately. The result is that people can be shrunk or made into giants.
William F. Broderick, New York University
By windowing a large two-dimensional sinusoidal grating, a perpendicular illusory grating is created. This illusion is quite strong, and depends on the overall size of the image, as well as the relative size of the grating and windows.
Look where Simon says without delay
Katia Ripamonti, Cambridge Research Systems; Lloyd Smith, Cortech Solutions
Can you beat the Simon effect using your eye movements? Compete with other players to determine who can look where Simon says without delay. All you need to do is to control your eye movements before they run off. It sounds so simple and yet so difficult!
Chromatic induction from achromatic stimulus
Leone Burridge, Artist / Medical practitioner in private practice
These are acrylic paintings made with only black and white pigments. On sustained gaze, subtle colours become visible.
Katerina Malakhova, Pavlov Institute of Physiology
If we could find a grandma cell, what kind of information would this cell code? Artificial neural networks allow us to study the latent representations that activate neurons. I choose a unit with the highest selectivity for grandmother images and visualize the percept that drives this neuron.
Planarian Eyespot(s) – Amazing redundancy in visual-motor behavior
Kensuke Shimojo, Chandler School, Pasadena, CA; Eiko Shimojo, California Institute of Technology
Dissected planarian body parts, even with incomplete eyespots, show ‘light-avoiding behavior’ long before the entire body (and its sensory-motor organs) is complete. We will demonstrate this live (in Petri dishes) and on video.
Real-Life Continuous Flash Suppression – Suppressing the real world from awareness
Uri Korisky, Tel Aviv University
‘Real life CFS’ is a new method for suppressing real life stimuli. Using augmented reality goggles, CFS masks (“mondrians”) are presented to your dominant eye, causing whatever is presented to your non-dominant eye to be suppressed from awareness – even real objects placed in front of you.
The Motion Induced Contour Revisited
Gideon Caplovitz and Gennady Erlikhman, University of Nevada, Reno
As a tribute to Naomi Weisstein (1939–2015), we recreate and introduce some novel variants of the Motion Induced Contour, which was first described in a series of papers published in the 1980s.
Illusory Apparent Motion
Allison K. Allen, Nicolas Davidenko and Nathan H. Heller, University of California, Santa Cruz
When random textures are presented at a moderate pace, observers report experiencing coherent percepts of apparent motion, which we term Illusory Apparent Motion (IAM). In this demo, we will cue observers to experience different types of motion percepts from random stimuli by using verbal suggestion, action commands, and intentional control.
Illusory color in extreme-periphery
Takashi Suegami, California Institute of Technology and Yamaha Motors Corporation; Daw-An Wu and Shinsuke Shimojo, California Institute of Technology
Our new demo shows that a foveal color cue can induce illusory color in the extreme periphery (approx. 70°–90°), where cone cells are sparse. One can experience, for example, a clear red color percept for an extreme-peripheral green flash, given isoluminant red foveal pre-cueing (or vice versa).
Silhouette Zoetrope
Christine Veras, University of Texas at Dallas; Gerrit Maus, Nanyang Technological University
The Silhouette Zoetrope is a contemporary reinvention of the traditional zoetrope. In this new device, an animation of moving silhouettes is created by sequential cutouts placed outside a rotating empty cylinder, with slits illuminating the cutouts successively from behind. The device combines motion, mirroring, depth, and size illusions.
Spinning reflections on depth from spinning reflections
Michael Crognale, University of Nevada, Reno
A trending novelty toy, when spun, induces a striking depth illusion from disparity in specular reflections from point sources. Normally, “specular” disparity from static curved surfaces is discounted or attributed to surface curvature. Here, motion obscures the surface features that compete with depth cues, resulting in a strong depth illusion.
High Speed Gaze-Contingent Visual Search
Kurt Debono and Dan McEchron, SR Research Ltd.
Try to find the target in a visual search array that is continuously updated based on the location of your gaze. High-speed video-based eye tracking combined with the latest high-speed monitors makes for a compelling challenge.
The photoreceptor refresh rate
Allan Hytowitz, Dyop Vision Associates
A dynamic optotype, the Dyop (a segmented spinning ring), provides a much more precise, consistent, efficient, and flexible means of measuring acuity. Adjusting the rotation rate of the segmented ring determines the optimum rate, as well as the photoreceptor refresh rate for perceived retrograde motion.
Stereo psychophysics by means of continuous 3D target-tracking in VR
Benjamin T. Backus and James J. Blaha, Vivid Vision Labs, Vivid Vision, Inc.; Lawrence K. Cormack and Kathryn L. Bonnen, University of Texas at Austin
What’s your latency for tracking binocular disparity? Let us cross-correlate your hand motion with our flying bugs to find out.
Motion-based position shifts
Stuart Anstis, University of California, San Diego; Patrick Cavanagh, Glendon College, York University
Motion-based position shifts are awesome!
Back by popular demand. Strobe lights and ping pong!