Rapid Assessment of Stereo Thresholds Using an Interactive Random Dot Stereogram in Virtual Reality

Poster Presentation 43.403: Monday, May 18, 2026, 8:30 am – 12:30 pm, Pavilion
Session: Spatial Vision: Binocular vision

Triya Belani1, Chengyi Ma1, Gabriel J. Diaz1; 1Rochester Institute of Technology

Stereovision provides precise depth perception and supports accurate judgments during visually guided actions. Approximately 7% of the population younger than 60 years is stereoblind [Chopin et al., 2019], yet despite its essential role in daily activities, stereovision is often under-assessed in clinical practice, largely due to limitations of current methods, including residual monocular cues in test stimuli [Chopin et al., 2019; O'Connor et al., 2018]. To address this issue, we developed a method for rendering virtual reality (VR) environments as real-time random-dot stereograms, isolating binocular disparity information during interactive VR tasks. Here, we demonstrate how this tool can be used in a stereo-threshold assessment task. Fine stereoacuity thresholds were measured in VR using a three-disc “Randot” test. A 2-up 1-down adaptive staircase procedure provided a rough individual stereoacuity estimate and was used to select the disparity range. The staircase started with an initial disparity of 150 arcseconds and a step size of 50 arcseconds; the step size was halved after each incorrect response, and the procedure ended after four incorrect responses. Thresholds were then measured for eight adults (ages 18–37 years) with self-reported normal binocular vision and no corrective lenses. Participants completed 2–3 blocks of 100 trials each, with 10 fixed disparities around the threshold estimate obtained from the staircase procedure. We estimated disparity thresholds by fitting a Weibull psychometric function (guess rate = 1/3, lapse rate = 0.05) and defined the threshold as the disparity yielding 64% correct performance. Each block required approximately 5 minutes. The mean within-participant standard deviation of thresholds was 3.15 arcseconds, suggesting that the tool can measure fine stereoacuity while isolating disparity information and reducing monocular cues.
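The staircase described above can be sketched as follows. This is a hypothetical implementation, not the authors' software: the 2-up 1-down rule is interpreted here as decreasing disparity after two consecutive correct responses and increasing it after each incorrect response, with the step halving and four-incorrect stopping rule taken directly from the abstract; the `observer` callback is an assumption standing in for a real trial.

```python
def run_staircase(observer, start=150.0, step=50.0, max_incorrect=4):
    """Rough stereoacuity estimate via an adaptive staircase.

    observer(disparity_arcsec) -> True if the response was correct.
    Returns the final disparity and the full (disparity, correct) history.
    """
    disparity = start
    correct_streak = 0
    n_incorrect = 0
    history = []
    while n_incorrect < max_incorrect:
        correct = observer(disparity)
        history.append((disparity, correct))
        if correct:
            correct_streak += 1
            if correct_streak == 2:            # two in a row: make task harder
                disparity = max(disparity - step, 1.0)
                correct_streak = 0
        else:
            correct_streak = 0
            n_incorrect += 1
            disparity += step                  # error: make task easier
            step /= 2.0                        # halve step after each error
    return disparity, history
```

With a simulated deterministic observer (correct whenever disparity exceeds some internal limit), the final disparity settles near that limit, which is the rough estimate used to pick the fixed-disparity range for the method-of-constant-stimuli blocks.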
In addition, we present preliminary data on whether these stereoacuity measures predict performance in a visually guided interception task.
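The threshold definition used in the abstract (a Weibull psychometric function with guess rate 1/3 and lapse rate 0.05, with threshold taken as the disparity yielding 64% correct) can be written down explicitly. This is a minimal sketch of the function and its inverse, not the authors' fitting code; the shape parameters `alpha` and `beta` would in practice come from a maximum-likelihood fit to the trial data.

```python
import numpy as np

GUESS = 1.0 / 3.0   # guess rate for the three-alternative task
LAPSE = 0.05        # lapse rate

def weibull(x, alpha, beta):
    """Weibull psychometric function: P(correct) at disparity x (arcsec)."""
    return GUESS + (1 - GUESS - LAPSE) * (1 - np.exp(-(x / alpha) ** beta))

def threshold_at(p, alpha, beta):
    """Invert the Weibull to find the disparity yielding proportion correct p."""
    q = (p - GUESS) / (1 - GUESS - LAPSE)
    return alpha * (-np.log(1 - q)) ** (1 / beta)
```

For example, `threshold_at(0.64, alpha, beta)` returns the 64%-correct disparity for fitted parameters, and `weibull` applied to that value recovers 0.64.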