PLfest: feasibility data for an open science tool for reliable perceptual learning research
Poster Presentation: Monday, May 19, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Plasticity and Learning: Perceptual learning
Marcello Maniglia1, Jaap Munneke2, Diya Anand2, Samyukta Jayakumar1, C. Shawn Green3, Aaron Seitz2; 1University of California, Riverside, USA, 2Northeastern University, Boston, USA, 3University of Wisconsin-Madison, USA
Perceptual learning (PL), the practice-induced improvement in perceptual tasks, holds the promise of expanding our understanding of brain plasticity mechanisms and of informing effective interventions for visual pathologies such as myopia and amblyopia. However, research in this field is often hindered by issues of reproducibility and accessibility. Concerning reproducibility, observed inconsistencies may stem from methodological or environmental differences across studies. Regarding accessibility, despite offering a potentially cost-effective intervention for visual pathologies, PL research remains largely confined to university laboratories or specialized clinics. To overcome these challenges, enhance the potential of PL as a tool for studying learning-related neural processes, and broaden its application for interventions, we recently launched PLfest, a Unity-powered application that enables data collection across different sites and platforms, including computers and tablets. PLfest currently supports several PL training paradigms, as well as visual, attentional, and cognitive assessments. Here, we present preliminary data from a large sample of healthy individuals as a proof of concept for using PLfest as a framework to test a variety of training paradigms within a controlled and reproducible environment across multiple testing locations. Specifically, we report data from participants trained on variations of a contrast detection task, a classic PL paradigm, conducted across four experimental sites. Training types varied in the parameters of the adaptive procedure (long versus short staircases and sessions) and in the stimulus configuration (contrast detection with flankers, contrast detection in noise, stimulus variety). Before and after training, participants completed a battery of assessments of perceptual, attentional, and cognitive performance. The results demonstrate that PLfest is a reliable tool for cross-site data collection, highlighting its potential as a platform for large-scale, multi-site, and remote studies.
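To make the adaptive-procedure manipulation concrete, the sketch below simulates a simple 3-down-1-up staircase for a contrast detection task. It is a generic illustration under assumed parameters (starting contrast, step size, simulated observer model), not the staircase implemented in PLfest, whose settings differed between the long and short training variants described above.

```python
# Illustrative sketch only: a minimal 3-down-1-up adaptive staircase for a
# contrast detection task. All names and parameter values are assumptions
# chosen for illustration, not the PLfest implementation.
import math
import random

def run_staircase(n_trials=80, start_contrast=0.5, step=0.05,
                  true_threshold=0.1, slope=10.0):
    """Run a simulated 3-down-1-up staircase and return the trial history."""
    contrast = start_contrast
    correct_streak = 0
    history = []
    for _ in range(n_trials):
        # Simulated observer: probability of a correct detection increases
        # with contrast relative to an assumed true threshold (logistic model).
        p_correct = 1.0 / (1.0 + math.exp(-slope * (contrast - true_threshold)))
        correct = random.random() < p_correct
        history.append((contrast, correct))
        if correct:
            correct_streak += 1
            if correct_streak == 3:        # three correct in a row: lower contrast
                contrast = max(contrast - step, 0.01)
                correct_streak = 0
        else:                              # one error: raise contrast
            contrast = min(contrast + step, 1.0)
            correct_streak = 0
    return history

if __name__ == "__main__":
    trials = run_staircase()
    # Average contrast over the last 20 trials as a rough threshold estimate.
    tail = [c for c, _ in trials[-20:]]
    print(f"Estimated contrast threshold: {sum(tail) / len(tail):.3f}")
```

A longer staircase of this kind yields a more stable threshold estimate at the cost of session duration, which is the trade-off the long versus short training types probe.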