The Claim that Pre-School Children are Insensitive to Nonaccidental vs. Metric Shape Properties Challenged by Biologically-Based Shape Scaling
26.31, Saturday, 17-May, 2:45 pm - 6:45 pm, Jacaranda Hall
Ori Amir1, Irving Biederman1,2; 1Psychology Department, University of Southern California, 2Neuroscience Program, University of Southern California
Nonaccidental properties (NAPs) are image properties that are invariant over orientation in depth, e.g., straight vs. curved, and are distinguished from metric properties (MPs), e.g., degree of curvature, that change continuously with rotation in depth. The reliance on NAPs allows facile identification of objects at novel orientations. Greater sensitivity to NAPs than to MPs has been demonstrated in adults (in both developed and undeveloped cultures), infants, and non-human organisms, e.g., pigeons and macaque IT cells. Two studies, Abecassis et al. (2001) and Sera and Millett (2011), however, concluded that pre-school children had not yet developed increased sensitivity to NAPs. These studies reported that adults--but not young children--were more likely to generalize a name for a novel object (a "wug") with, say, a slight degree of curvature of the axis of a part (a geon), to another object differing in an MP (greater curvature) than to one that differed in a NAP (straight), a result that led the authors to infer a relatively late onset for NAP sensitivity. To compare sensitivity to NAPs vs. MPs, however, the physical differences between a standard (the "wug") and its NAP and MP variants must be equated in terms of the physical coding of the early stages of the visual system. Here we show, with a model that computes shape similarity based on V1 simple cells, that such differences (from the wug) were greater for the MP than for the NAP stimuli, thus likely counteracting the greater sensitivity to NAPs. This inference was supported by an independent study in which every preschool child showed greater sensitivity to NAPs than to MPs when the two kinds of shape variation were equated according to the V1 model of shape similarity. Taken together, NAP sensitivity and the V1 model explain 97% of adults' and 90% of children's classification choices.
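The abstract does not specify how the V1-based similarity model is implemented. As a rough, hedged illustration of the general approach (not the authors' code), shape dissimilarity between two stimulus images can be sketched as the distance between their rectified responses to a small bank of Gabor filters, the standard receptive-field model of V1 simple cells. All function names, filter sizes, and parameters below are illustrative assumptions:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """2-D Gabor filter: a Gaussian envelope times an oriented cosine carrier,
    the standard model of a V1 simple cell's receptive field."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotate coordinates to theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def convolve2d_valid(image, kern):
    """Direct 'valid' cross-correlation (no padding); slow but dependency-free."""
    kh, kw = kern.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kern)
    return out

def v1_response_vector(image, n_orientations=8, wavelengths=(4, 8)):
    """Concatenate rectified filter responses across orientations and scales."""
    responses = []
    for wavelength in wavelengths:
        for k in range(n_orientations):
            kern = gabor_kernel(size=15, wavelength=wavelength,
                                theta=k * np.pi / n_orientations,
                                sigma=wavelength / 2)
            resp = convolve2d_valid(image, kern)
            responses.append(np.abs(resp).ravel())   # rectify responses
    return np.concatenate(responses)

def v1_dissimilarity(image_a, image_b):
    """Euclidean distance between the two images' V1 response vectors."""
    return np.linalg.norm(v1_response_vector(image_a) - v1_response_vector(image_b))
```

Under this kind of metric, a standard, its MP variant, and its NAP variant can be re-scaled so that both variants sit at the same model distance from the standard, which is the equating step the abstract describes.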