A scaling study of novel auditory stimuli for creating artificial events
Research in categorization, memory, and visual cognition typically employs isolated, static stimuli, whereas most events experienced in life are extended in time and potentially overlapping. A challenge in studying these more ecologically valid kinds of events is that it is difficult to relate their physical dimensions to their psychological representations. We begin to address this gap by developing a set of novel auditory stimuli with experimentally controlled physical features that can be related to psychological representations. The auditory modality is not only integral to many real-life events (e.g., speech and music) but also well-suited for combining stimuli both within and across time. Our stimuli were generated by manipulating the frequency bands above a 200 Hz fundamental in seven electronically generated sounds such that they exhibited different degrees of spectral overlap. In a scaling study, participants listened to pairs of these sounds and rated their subjective similarity. We used non-metric multidimensional scaling to obtain a three-dimensional psychological representation of these stimuli. One dimension appears to correspond to timbral roughness, while the other dimensions do not admit simple verbal labels. There were individual differences in the degree to which participants attended to these dimensions, potentially related to musical expertise. Implications for using these stimuli in memory and categorization paradigms are discussed, particularly in relation to how they may be combined either sequentially or simultaneously to create artificial “events” that mimic the complexity of more naturalistic events.
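The scaling analysis described above can be sketched in code. The study used non-metric multidimensional scaling on pairwise similarity ratings; the minimal sketch below instead implements the simpler classical (metric, Torgerson) variant, which conveys the same idea of recovering a low-dimensional spatial configuration from a dissimilarity matrix. The seven-point configuration and the dissimilarities here are synthetic, for illustration only, and do not reflect the study's actual ratings.

```python
import numpy as np

def classical_mds(dissim, n_dims=3):
    """Embed a symmetric dissimilarity matrix in n_dims dimensions via
    classical (metric) MDS: double-center the squared dissimilarities
    and project onto the top eigenvectors of the resulting Gram matrix."""
    n = dissim.shape[0]
    d2 = dissim ** 2
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ d2 @ j                    # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:n_dims]  # largest eigenvalues first
    pos = np.clip(vals[order], 0, None)      # guard against tiny negatives
    return vecs[:, order] * np.sqrt(pos)

# Toy example: a hypothetical latent 3-D configuration for 7 sounds,
# from which we compute pairwise dissimilarities and then recover a
# 3-D embedding (up to rotation/reflection).
rng = np.random.default_rng(0)
pts = rng.normal(size=(7, 3))
dissim = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
coords = classical_mds(dissim, n_dims=3)
print(coords.shape)  # (7, 3)
```

Non-metric MDS, as used in the study, replaces the squared-distance fit with a monotone-regression fit to the rank order of the dissimilarities (e.g., via the SMACOF algorithm), which is more appropriate for ordinal similarity ratings.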