Author: Seyfarth, R.M.; Cheney, D.L.
Title: The acoustic features of vervet monkey grunts
Type: Journal Article
Year: 1984
Publication: The Journal of the Acoustical Society of America
Abbreviated Journal: J Acoust Soc Am
Volume: 75
Issue: 5
Pages: 1623-1628
Keywords: *Acoustics; Animals; Auditory Perception; Cercopithecus/*physiology; Cercopithecus aethiops/*physiology; Cues; Dominance-Subordination; Female; Male; Social Behavior; Sound Spectrography; *Vocalization, Animal
Abstract: East African vervet monkeys give short (125 ms), harsh-sounding grunts to each other in a variety of social situations: when approaching a dominant or subordinate member of their group, when moving into a new area of their range, or upon seeing another group. Although all these vocalizations sound similar to humans, field playback experiments have shown that the monkeys distinguish at least four different calls. Acoustic analysis reveals that grunts have an aperiodic fundamental frequency (F0) of roughly 240 Hz. Most grunts exhibit a spectral peak close to this irregular F0. Grunts may also contain a second, rising or falling frequency peak between 550 and 900 Hz. The location of, and changes in, these two frequency peaks are the cues most likely to be used by vervets when distinguishing different grunt types.
Language: English
ISSN: 0001-4966
Notes: PMID:6736426
Approved: no
Call Number: refbase @ user @
Serial: 703
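The acoustic measurements in this abstract (an aperiodic F0 around 240 Hz plus a second peak between 550 and 900 Hz) can be illustrated with a minimal peak-finding sketch. This is not the authors' analysis (they used sound spectrography); it assumes a hypothetical mono sample array `signal` and sampling rate `fs`, and simply reports the strongest spectral component near the F0 region and in the 550-900 Hz band.

```python
# Hedged sketch, not the paper's method: locate the two spectral peaks the
# abstract describes in a short grunt-like clip.
import numpy as np

def grunt_peaks(signal: np.ndarray, fs: int):
    """Return the dominant peak (Hz) near the F0 region and in the 550-900 Hz band."""
    windowed = signal * np.hanning(len(signal))          # Hann window to reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(windowed), d=1.0 / fs)

    def peak_in(lo_hz, hi_hz):
        band = (freqs >= lo_hz) & (freqs <= hi_hz)
        return float(freqs[band][np.argmax(spectrum[band])])

    low_peak = peak_in(100, 400)    # expected near the aperiodic ~240 Hz F0
    mid_peak = peak_in(550, 900)    # the second, rising/falling peak
    return low_peak, mid_peak

# Synthetic 125 ms "grunt" with energy at 240 Hz and 700 Hz, just to exercise the function.
fs = 16000
t = np.arange(int(0.125 * fs)) / fs
demo = np.sin(2 * np.pi * 240 * t) + 0.6 * np.sin(2 * np.pi * 700 * t)
print(grunt_peaks(demo, fs))        # approximately (240.0, 700.0)
```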
 

 
Author: Gentner, T.Q.; Fenn, K.M.; Margoliash, D.; Nusbaum, H.C.
Title: Recursive syntactic pattern learning by songbirds
Type: Journal Article
Year: 2006
Publication: Nature
Abbreviated Journal: Nature
Volume: 440
Issue: 7088
Pages: 1204-1207
Keywords: Acoustic Stimulation; *Animal Communication; Animals; Auditory Perception/*physiology; Humans; *Language; Learning/*physiology; Linguistics; Models, Neurological; Semantics; Starlings/*physiology; Stochastic Processes
Abstract: Humans regularly produce new utterances that are understood by other members of the same language community. Linguistic theories account for this ability through the use of syntactic rules (or generative grammars) that describe the acceptable structure of utterances. The recursive, hierarchical embedding of language units (for example, words or phrases within larger sentences) that is part of the ability to construct new utterances minimally requires a 'context-free' grammar that is more complex than the 'finite-state' grammars thought sufficient to specify the structure of all non-human communication signals. Recent hypotheses make the central claim that the capacity for syntactic recursion forms the computational core of a uniquely human language faculty. Here we show that European starlings (Sturnus vulgaris) accurately recognize acoustic patterns defined by a recursive, self-embedding, context-free grammar. They are also able to classify new patterns defined by the grammar and reliably exclude agrammatical patterns. Thus, the capacity to classify sequences from recursive, centre-embedded grammars is not uniquely human. This finding opens a new range of complex syntactic processing mechanisms to physiological investigation.
Address: Department of Organismal Biology and Anatomy, University of Chicago, Chicago, Illinois 60637, USA. tgentner@ucsd.edu
Language: English
ISSN: 1476-4687
Notes: PMID:16641998
Approved: no
Call Number: refbase @ user @
Serial: 353
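The abstract's contrast between 'finite-state' and 'context-free' patterns can be made concrete with a small sketch. This is not the authors' stimulus code; 'A' and 'B' stand in for the two starling motif classes (rattles and warbles), and the two recognizers accept the (AB)ⁿ family, which a finite-state machine handles, and the centre-embedded AⁿBⁿ family, which requires counting (a context-free grammar) when n is unbounded. For any fixed n, such as the n = 2 patterns used in the experiments, a finite-state machine could also memorize AⁿBⁿ; the formal distinction concerns the unbounded families.

```python
# Hedged illustration of the two pattern families discussed in the abstract.
import re

def is_finite_state(seq: str) -> bool:
    """(AB)^n, n >= 1 -- recognizable by a finite-state machine."""
    return re.fullmatch(r"(AB)+", seq) is not None

def is_context_free(seq: str) -> bool:
    """A^n B^n, n >= 1 -- the centre-embedded, context-free family."""
    n = len(seq) // 2
    return n > 0 and seq == "A" * n + "B" * n

for s in ["ABAB", "AABB", "AAABBB", "AABBB"]:
    print(s, is_finite_state(s), is_context_free(s))
# ABAB   True  False
# AABB   False True
# AAABBB False True
# AABBB  False False
```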
 

 
Author: Zentall, S.S.; Zentall, T.R.
Title: Activity and task performance of hyperactive children as a function of environmental stimulation
Type: Journal Article
Year: 1976
Publication: Journal of Consulting and Clinical Psychology
Abbreviated Journal: J Consult Clin Psychol
Volume: 44
Issue: 5
Pages: 693-697
Keywords: Achievement; Acoustic Stimulation; *Arousal; Auditory Perception; Child; Humans; Hyperkinesis/*etiology; Photic Stimulation; Visual Perception
Language: English
ISSN: 0022-006X
Notes: PMID:965541
Approved: no
Call Number: refbase @ user @
Serial: 272
 

 
Author: Lampe, J.F.; Andre, J.
Title: Cross-modal recognition of human individuals in domestic horses (Equus caballus)
Type: Journal Article
Year: 2012
Publication: Animal Cognition
Volume: 15
Issue: 4
Pages: 623-630
Keywords: Cross-modal; Recognition of humans; Horse; Equus caballus; Human–horse interaction; Animal cognition; Visual recognition; Auditory recognition; Voice discrimination; Interspecific
Abstract: This study has shown that domestic horses are capable of cross-modal recognition of familiar humans. It was demonstrated that horses are able to discriminate between the voices of a familiar and an unfamiliar human without seeing or smelling them at the same moment. Conversely, they were able to discriminate the same persons when exposed only to their visual and olfactory cues, without being stimulated by their voices. A cross-modal expectancy-violation setup was employed; subjects were exposed both to trials with incongruent auditory and visual/olfactory identity cues and to trials with congruent cues. It was found that subjects responded more quickly, for longer, and more often in incongruent trials, exhibiting heightened interest in unmatched identity cues. This suggests that the equine brain is able to integrate multisensory identity cues from a familiar human into a person representation that allows the brain, when deprived of one or two senses, to maintain recognition of this person.
Publisher: Springer-Verlag
Language: English
ISSN: 1435-9448
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 5698
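A hedged sketch of how expectancy-violation trials like these are typically scored (not the authors' analysis; the trial records and field names below are hypothetical): response measures are averaged within the congruent and incongruent conditions, where the abstract reports faster, longer and more frequent responding on incongruent trials.

```python
# Compare mean response measures between congruent and incongruent trials.
# Data are illustrative placeholders, not results from the study.
from statistics import mean

trials = [
    {"condition": "congruent",   "latency_s": 3.1, "look_s": 1.2, "responses": 1},
    {"condition": "congruent",   "latency_s": 2.8, "look_s": 0.9, "responses": 0},
    {"condition": "incongruent", "latency_s": 1.6, "look_s": 2.4, "responses": 2},
    {"condition": "incongruent", "latency_s": 1.9, "look_s": 2.1, "responses": 1},
]

def condition_means(trials, measure):
    by_cond = {}
    for t in trials:
        by_cond.setdefault(t["condition"], []).append(t[measure])
    return {cond: mean(vals) for cond, vals in by_cond.items()}

for measure in ("latency_s", "look_s", "responses"):
    print(measure, condition_means(trials, measure))
```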
 

 
Author: Heffner, R.S.; Heffner, H.E.
Title: Hearing in large mammals: Horses (Equus caballus) and cattle (Bos taurus)
Type: Journal Article
Year: 1983
Publication: Behavioral Neuroscience
Volume: 97
Issue: 2
Pages: 299-309
Keywords: auditory range & sensitivity, horses vs cattle
Abstract: Determined behavioral audiograms for 3 horses and 2 cows. Horses' hearing ranged from 55 Hz to 33.3 kHz, with a region of best sensitivity from 1 to 16 kHz. Cattle hearing ranged from 23 Hz to 35 kHz, with a well-defined point of best sensitivity at 8 kHz. Of the 2 species, cattle proved to have more acute hearing, with a lowest threshold of −21 dB (re 20 μN/m²), compared with the horses' lowest threshold of 7 dB. Comparative analysis of the hearing abilities of these 2 species with those of other mammals provides further support for the relation between interaural distance and high-frequency hearing and between high- and low-frequency hearing. (39 ref) (PsycINFO Database Record (c) 2012 APA, all rights reserved)
Publisher: American Psychological Association
Place of Publication: US
ISSN: 1939-0084 (electronic); 0735-7044 (print)
Approved: no
Call Number: Equine Behaviour @ team @ 1983-29540-001
Serial: 5633
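The thresholds quoted in the abstract are sound pressure levels referenced to 20 μN/m² (i.e. 20 μPa), so the −21 dB and 7 dB figures can be converted to pressures with the standard relation p = p_ref · 10^(L/20). A short worked example:

```python
# Worked arithmetic for the thresholds quoted above.
P_REF = 20e-6  # reference pressure, 20 uPa (the abstract's 20 uN/m^2), in pascals

def pressure_from_db(level_db: float) -> float:
    """Convert a sound pressure level in dB (re 20 uPa) to pascals."""
    return P_REF * 10 ** (level_db / 20)

cattle_pa = pressure_from_db(-21)   # best cattle threshold in the abstract
horse_pa = pressure_from_db(7)      # best horse threshold in the abstract
print(f"cattle: {cattle_pa:.2e} Pa, horse: {horse_pa:.2e} Pa, "
      f"ratio ~{horse_pa / cattle_pa:.0f}x")   # the 28 dB gap is roughly a 25-fold pressure difference
```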
 

 
Author: Lemasson, A.; Koda, H.; Kato, A.; Oyakawa, C.; Blois-Heulin, C.; Masataka, N.
Title: Influence of sound specificity and familiarity on Japanese macaques' (Macaca fuscata) auditory laterality
Type: Journal Article
Year: 2010
Publication: Behavioural Brain Research
Abbreviated Journal: Behav. Brain Res.
Volume: 208
Issue: 1
Pages: 286-289
Keywords: Auditory processing; Hemispheric specialisation; Specificity; Familiarity; Head-turn paradigm; Macaque
Abstract: Despite attempts to generalise the left hemisphere-speech association of humans to animal communication, the debate remains open. More studies on primates are needed to explore the potential effects of sound specificity and familiarity. Familiar and non-familiar nonhuman primate contact calls, bird calls and non-biological sounds were broadcast to Japanese macaques. Macaques turned their heads preferentially towards the left (right hemisphere) when hearing conspecific or familiar primate calls, supporting hemispheric specialisation. Our results support the role of experience in brain organisation and the importance of social factors for understanding laterality evolution.
ISSN: 0166-4328
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 5081
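Head-turn data of this kind are often summarized with a laterality index; the sketch below (not code from the paper, and with hypothetical counts) computes LI = (L − R)/(L + R) per stimulus category, so a positive value indicates the left-turn (right-hemisphere) bias reported for conspecific and familiar primate calls.

```python
# Laterality index from counts of left and right head turns (hypothetical data).
def laterality_index(left_turns: int, right_turns: int) -> float:
    total = left_turns + right_turns
    return (left_turns - right_turns) / total if total else 0.0

head_turns = {                      # hypothetical (left, right) counts per stimulus category
    "conspecific calls": (14, 6),
    "familiar primate calls": (12, 8),
    "non-biological sounds": (9, 11),
}
for category, (left, right) in head_turns.items():
    print(f"{category}: LI = {laterality_index(left, right):+.2f}")
```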
 

 
Author: Friederici, A.D.; Alter, K.
Title: Lateralization of auditory language functions: a dynamic dual pathway model
Type: Journal Article
Year: 2004
Publication: Brain and Language
Abbreviated Journal: Brain Lang
Volume: 89
Issue: 2
Pages: 267-276
Keywords: Auditory Pathways/physiology; Brain Mapping; Comprehension/*physiology; Dominance, Cerebral/*physiology; Frontal Lobe/*physiology; Humans; Nerve Net/physiology; Phonetics; Semantics; Speech Acoustics; Speech Perception/*physiology; Temporal Lobe/*physiology
Abstract: Spoken language comprehension requires the coordination of different subprocesses in time. After the initial acoustic analysis, the system has to extract segmental information such as phonemes, syntactic elements and lexical-semantic elements, as well as suprasegmental information such as accentuation and intonational phrases, i.e., prosody. According to the dynamic dual pathway model of auditory language comprehension, syntactic and semantic information are primarily processed in a left-hemispheric temporo-frontal pathway, including separate circuits for syntactic and semantic information, whereas sentence-level prosody is processed in a right-hemispheric temporo-frontal pathway. The relative lateralization of these functions occurs as a result of stimulus properties and processing demands. The observed interaction between syntactic and prosodic information during auditory sentence comprehension is attributed to dynamic interactions between the two hemispheres.
Address: Max Planck Institute of Cognitive Neuroscience, P.O. Box 500 355, 04303 Leipzig, Germany. angelafr@cns.mpg.de
Language: English
ISSN: 0093-934X
Notes: PMID:15068909
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 4722
 

 
Author: Shettleworth, S.J.
Title: Stimulus relevance in the control of drinking and conditioned fear responses in domestic chicks (Gallus gallus)
Type: Journal Article
Year: 1972
Publication: Journal of Comparative and Physiological Psychology
Abbreviated Journal: J Comp Physiol Psychol
Volume: 80
Issue: 2
Pages: 175-198
Keywords: Acoustic Stimulation; Animals; Auditory Perception; Chickens; *Conditioning (Psychology); Conditioning, Classical; Discrimination Learning; *Drinking Behavior; Electroshock; *Fear; *Light; Motor Activity; Photic Stimulation; Punishment; Quinine; *Sound; Taste; Visual Perception
Language: English
ISSN: 0021-9940
Notes: PMID:5047826
Approved: no
Call Number: refbase @ user @
Serial: 390
 

 
Author: Parr, L.A.
Title: Perceptual biases for multimodal cues in chimpanzee (Pan troglodytes) affect recognition
Type: Journal Article
Year: 2004
Publication: Animal Cognition
Abbreviated Journal: Anim. Cogn.
Volume: 7
Issue: 3
Pages: 171-178
Keywords: Acoustic Stimulation; *Animal Communication; Animals; Auditory Perception/physiology; Cues; Discrimination Learning/*physiology; Facial Expression; Female; Male; Pan troglodytes/*psychology; Perceptual Masking/*physiology; Photic Stimulation; Recognition (Psychology)/*physiology; Visual Perception/physiology; *Vocalization, Animal
Abstract: The ability of organisms to discriminate social signals, such as affective displays, using different sensory modalities is important for social communication. However, a major problem for understanding the evolution and integration of multimodal signals is determining how humans and animals attend to different sensory modalities, and how these different modalities contribute to the perception and categorization of social signals. Using a matching-to-sample procedure, chimpanzees discriminated videos of conspecifics' facial expressions that contained only auditory or only visual cues by selecting one of two facial expression photographs that matched the expression category represented by the sample. Other videos were edited to contain incongruent sensory cues, i.e., visual features of one expression but auditory features of another. In these cases, subjects were free to select the expression that matched either the auditory or the visual modality, whichever was more salient for that expression type. Results showed that chimpanzees were able to discriminate facial expressions using only auditory or visual cues, and when these modalities were mixed. However, in these latter trials, depending on the expression category, clear preferences for either the visual or auditory modality emerged. Pant-hoots and play faces were discriminated preferentially using the auditory modality, while screams were discriminated preferentially using the visual modality. Therefore, depending on the type of expressive display, the auditory and visual modalities were differentially salient in ways that appear consistent with the ethological importance of that display's social function.
Address: Division of Psychobiology, Yerkes National Primate Research Center, Emory University, 954 Gatewood Road, Atlanta, GA 30329, USA. parr@rmy.emory.edu
Language: English
ISSN: 1435-9448
Notes: PMID:14997361
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2544
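On the incongruent trials described above, the chosen photograph matches either the auditory or the visual component of the sample, so modality salience can be read off a per-category tally. A minimal sketch, using hypothetical trial tuples rather than the study's data:

```python
# Per-expression-category tally of which modality the match followed on
# incongruent trials. Trial tuples are illustrative, not the study's results.
from collections import Counter, defaultdict

incongruent_choices = [
    ("pant-hoot", "auditory"), ("pant-hoot", "auditory"), ("pant-hoot", "visual"),
    ("play face", "auditory"), ("play face", "auditory"),
    ("scream", "visual"), ("scream", "visual"), ("scream", "auditory"),
]

by_category = defaultdict(Counter)
for category, modality in incongruent_choices:
    by_category[category][modality] += 1

for category, counts in by_category.items():
    total = sum(counts.values())
    print(category, {m: f"{n}/{total}" for m, n in counts.items()})
```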
 

 
Author: Harland, M.M.; Stewart, A.J.; Marshall, A.E.; Belknap, E.B.
Title: Diagnosis of deafness in a horse by brainstem auditory evoked potential
Type: Journal Article
Year: 2006
Publication: The Canadian Veterinary Journal. La Revue Veterinaire Canadienne
Abbreviated Journal: Can Vet J
Volume: 47
Issue: 2
Pages: 151-154
Keywords: Acoustic Stimulation/veterinary; Animals; Deafness/congenital/diagnosis/*veterinary; Evoked Potentials, Auditory, Brain Stem/*physiology; Horse Diseases/congenital/*diagnosis; Horses; Male; Pigmentation/physiology; Sensitivity and Specificity
Abstract: Deafness was confirmed in a blue-eyed, 3-year-old, overo paint horse by brainstem auditory evoked potential. Congenital inherited deafness associated with lack of facial pigmentation was suspected. Assessment of hearing should be considered, especially in paint horses, at the time of pre-purchase examination. Brainstem auditory evoked potential assessment is well tolerated and accurate.
Address: Department of Clinical Sciences, College of Veterinary Medicine, Auburn University, Wire Road, Auburn, Alabama, USA
Language: English
ISSN: 0008-5286
Notes: PMID:16579041
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 5680