Author: Gácsi, M.; Miklósi, Á.; Varga, O.; Topál, J.; Csányi, V.
Title: Are readers of our face readers of our minds? Dogs (Canis familiaris) show situation-dependent recognition of human's attention
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 3; Pages: 144-153
Keywords: Animals; Association Learning; *Attention; Bonding, Human-Pet; Cognition; *Concept Formation; Cues; Dogs/*psychology; *Facial Expression; Female; Humans; Male; *Nonverbal Communication; *Recognition (Psychology); Social Behavior
Abstract: The ability of animals to use behavioral/facial cues in detection of human attention has been widely investigated. In this test series we studied the ability of dogs to recognize human attention in different experimental situations (ball-fetching game, fetching objects on command, begging from humans). The attentional state of the humans was varied along two variables: (1) facing versus not facing the dog; (2) visible versus non-visible eyes. In the first set of experiments (fetching), the owners were told to take up different body positions (facing or not facing the dog) and to either cover or not cover their eyes with a blindfold. In the second set of experiments (begging), dogs had to choose between two eating humans based on either the visibility of the eyes or the direction of the face. Our results show that the efficiency of dogs in discriminating between “attentive” and “inattentive” humans depended on the context of the test, but they could rely on the orientation of the body, the orientation of the head and the visibility of the eyes. With the exception of the fetching-game situation, they brought the object to the front of the human (even if he/she turned his/her back towards the dog), and preferentially begged from the facing (or seeing) human. There were also indications that dogs were sensitive to the visibility of the eyes because they showed increased hesitative behavior when approaching a blindfolded owner, and they also preferred to beg from the person with visible eyes. We conclude that dogs are able to rely on the same set of human facial cues for detection of attention, which form the behavioral basis of understanding attention in humans. By showing the ability to recognize human attention across different situations, dogs proved to be more flexible than chimpanzees investigated in similar circumstances.
Address: Comparative Ethology Research Group, Hungarian Academy of Sciences, Pazmany P. 1/c., 1117, Budapest, Hungary. gm.art@axelero.hu
Language: English
ISSN: 1435-9448
Notes: PMID:14669075
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2547

Author: Uller, C.
Title: Disposition to recognize goals in infant chimpanzees
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 3; Pages: 154-161
Keywords: Analysis of Variance; Animals; Female; Fixation, Ocular; *Goals; *Intention; Male; Pan troglodytes/*psychology; Pattern Recognition, Visual; *Problem Solving; *Recognition (Psychology)
Abstract: Do nonhuman primates attribute goals to others? Traditional studies with chimpanzees provide equivocal evidence for “mind reading” in nonhuman primates. Here we adopt looking time, a methodology commonly used with human infants, to test infant chimpanzees. In this experiment, four infant chimpanzees saw computer-generated stimuli that mimicked a goal-directed behavior. The baby chimps performed as well as human infants, namely, they were sensitive to the trajectories of the objects, thus suggesting that chimpanzees may be endowed with a disposition to understand goal-directed behaviors. The theoretical implications of these results are discussed.
Address: Department of Psychology, University of Essex, Wivenhoe Park, CO4 3SQ, Colchester, UK. uller40@yahoo.com
Language: English
ISSN: 1435-9448
Notes: PMID:14685823
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2546

Author: Gothard, K.M.; Erickson, C.A.; Amaral, D.G.
Title: How do rhesus monkeys (Macaca mulatta) scan faces in a visual paired comparison task?
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 1; Pages: 25-36
Keywords: Animals; Eye Movements/*physiology; *Facial Expression; Macaca mulatta/*physiology; Male; Pattern Recognition, Visual/*physiology; *Task Performance and Analysis
Abstract: When novel and familiar faces are viewed simultaneously, humans and monkeys show a preference for looking at the novel face. The facial features attended to in familiar and novel faces were determined by analyzing the visual exploration patterns, or scanpaths, of four monkeys performing a visual paired comparison task. In this task, the viewer was first familiarized with an image, and then the familiar image was presented simultaneously with a novel image. A looking preference for the novel image indicated that the viewer recognized the familiar image and hence differentiated between the familiar and the novel images. Scanpaths and relative looking preference were compared for four types of images: (1) familiar and novel objects, (2) familiar and novel monkey faces with neutral expressions, (3) familiar and novel inverted monkey faces, and (4) faces from the same monkey with different facial expressions. Looking time was significantly longer for the novel face, whether it was neutral, expressing an emotion, or inverted. Monkeys did not show a preference, or an aversion, for looking at aggressive or affiliative facial expressions. The analysis of scanpaths indicated that the eyes were the most explored facial feature in all faces. When faces expressed emotions such as a fear grimace, monkeys scanned the features of the face that contributed to the uniqueness of the expression. Inverted facial images were scanned similarly to upright images. Precise measurement of eye movements during the visual paired comparison task allowed a novel and more quantitative assessment of the perceptual processes involved in the spontaneous visual exploration of faces and facial expressions. These studies indicate that non-human primates carry out the visual analysis of complex images such as faces in a characteristic and quantifiable manner.
Address: Department of Psychiatry, University of California Davis, 2230 Stockton Blvd., Sacramento, CA 95817, USA. kgothard@email.arizona.edu
Language: English
ISSN: 1435-9448
Notes: PMID:14745584
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2545

Author: Plotnik, J.; Nelson, P.A.; de Waal, F.B.M.
Title: Visual field information in the face perception of chimpanzees (Pan troglodytes)
Type: Journal Article
Year: 2003
Publication: Annals of the New York Academy of Sciences (Ann N Y Acad Sci)
Volume: 1000; Pages: 94-98
Keywords: Animals; *Facial Expression; Pan troglodytes; Recognition (Psychology); Visual Fields/*physiology; Visual Perception/*physiology
Abstract: Evidence for a visual field advantage (VFA) in the face perception of chimpanzees was investigated using a modification of a free-vision task. Four of six chimpanzee subjects previously trained on a computer joystick match-to-sample paradigm were able to distinguish between images of neutral face chimeras consisting of two left sides (LL) or right sides (RR) of the face. While an individual's ability to make this distinction would be unlikely to determine their suitability for the VFA tests, it was important to establish that distinctive information was available in test images. Data were then recorded on their choice of the LL vs. RR chimera as a match to the true, neutral image; a bias for one of these options would indicate an hemispatial visual field advantage. Results suggest that chimpanzees, unlike humans, do not exhibit a left visual field advantage. These results have important implications for studies on laterality and asymmetry in facial signals and their perception in primates.
Address: Department of Animal Science, Cornell University, Ithaca, New York 14853, USA. jmp63@cornell.edu
Language: English
ISSN: 0077-8923
Notes: PMID:14766624
Approved: no
Call Number: refbase @ user @
Serial: 175

Author: Parr, L.A.
Title: Perceptual biases for multimodal cues in chimpanzee (Pan troglodytes) affect recognition
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 3; Pages: 171-178
Keywords: Acoustic Stimulation; *Animal Communication; Animals; Auditory Perception/physiology; Cues; Discrimination Learning/*physiology; Facial Expression; Female; Male; Pan troglodytes/*psychology; Perceptual Masking/*physiology; Photic Stimulation; Recognition (Psychology)/*physiology; Visual Perception/physiology; *Vocalization, Animal
Abstract: The ability of organisms to discriminate social signals, such as affective displays, using different sensory modalities is important for social communication. However, a major problem for understanding the evolution and integration of multimodal signals is determining how humans and animals attend to different sensory modalities, and how these different modalities contribute to the perception and categorization of social signals. Using a matching-to-sample procedure, chimpanzees discriminated videos of conspecifics' facial expressions that contained only auditory or only visual cues by selecting one of two facial expression photographs that matched the expression category represented by the sample. Other videos were edited to contain incongruent sensory cues, i.e., visual features of one expression but auditory features of another. In these cases, subjects were free to select the expression that matched either the auditory or visual modality, whichever was more salient for that expression type. Results showed that chimpanzees were able to discriminate facial expressions using only auditory or visual cues, and when these modalities were mixed. However, in these latter trials, depending on the expression category, clear preferences for either the visual or auditory modality emerged. Pant-hoots and play faces were discriminated preferentially using the auditory modality, while screams were discriminated preferentially using the visual modality. Therefore, depending on the type of expressive display, the auditory and visual modalities were differentially salient in ways that appear consistent with the ethological importance of that display's social function.
Address: Division of Psychobiology, Yerkes National Primate Research Center, Emory University, 954 Gatewood Road, Atlanta, GA 30329, USA. parr@rmy.emory.edu
Language: English
ISSN: 1435-9448
Notes: PMID:14997361
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2544

Author: Izumi, A.; Kojima, S.
Title: Matching vocalizations to vocalizing faces in a chimpanzee (Pan troglodytes)
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 3; Pages: 179-184
Keywords: Acoustic Stimulation; *Animal Communication; Animals; *Discrimination Learning; *Facial Expression; Female; Individuality; Pan troglodytes/*psychology; Photic Stimulation; *Recognition (Psychology); *Vocalization, Animal
Abstract: Auditory-visual processing of species-specific vocalizations was investigated in a female chimpanzee named Pan. The basic task was auditory-visual matching-to-sample, where Pan was required to choose the vocalizer from two test movies in response to a chimpanzee's vocalization. In experiment 1, movies of vocalizing and silent faces were paired as the test movies. The results revealed that Pan recognized the status of other chimpanzees whether they vocalized or not. In experiment 2, two different types of vocalizing faces of an identical individual were prepared as the test movies. Pan recognized the correspondence between vocalization types and faces. These results suggested that chimpanzees possess crossmodal representations of their vocalizations, as do humans. Together with the ability of vocal individual recognition, this ability might reflect chimpanzees' profound understanding of the status of other individuals.
Address: Primate Research Institute, Kyoto University, Kanrin, Inuyama, 484-8506, Aichi, Japan. izumi@pri.kyoto-u.ac.jp
Language: English
ISSN: 1435-9448
Notes: PMID:15015035
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2541

Author: Anderson, J.R.; Kuroshima, H.; Kuwahata, H.; Fujita, K.
Title: Do squirrel monkeys (Saimiri sciureus) and capuchin monkeys (Cebus apella) predict that looking leads to touching?
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 3; Pages: 185-192
Keywords: Animals; Association Learning; *Attention; Cebus/*psychology; Cognition; *Concept Formation; Cues; Fixation, Ocular; Humans; *Nonverbal Communication; Recognition (Psychology); Saimiri/*psychology; Social Behavior; Species Specificity
Abstract: Squirrel monkeys (Saimiri sciureus) and capuchin monkeys (Cebus apella) were tested using an expectancy violation procedure to assess whether they use an actor's gaze direction, signaled by congruent head and eye orientation, to predict subsequent behavior. The monkeys visually habituated to a repeated sequence in which the actor (a familiar human or a puppet) looked at an object and then picked it up, but they did not react strongly when the actor looked at an object but then picked up another object. Capuchin monkeys' responses in the puppet condition were slightly more suggestive of expectancy. There was no differential responding to congruent versus incongruent look-touch sequences when familiarization trials were omitted. The weak findings contrast with a strongly positive result previously reported for tamarin monkeys. Additional evidence is required before concluding that behavior prediction based on gaze cues typifies primates; other approaches for studying how they process attention cues are indicated.
Address: Department of Psychology, University of Stirling, FK9 4LA, Stirling, Scotland. jra1@stir.ac.uk
Language: English
ISSN: 1435-9448
Notes: PMID:15022054
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2540

Author: Schwartz, B.L.; Meissner, C.A.; Hoffman, M.; Evans, S.; Frazier, L.D.
Title: Event memory and misinformation effects in a gorilla (Gorilla gorilla gorilla)
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 2; Pages: 93-100
Keywords: Animals; *Deception; *Discrimination Learning; Gorilla gorilla/*psychology; Male; *Pattern Recognition, Visual; Photography; *Recognition (Psychology)
Abstract: Event memory and misinformation effects were examined in an adult male gorilla (Gorilla gorilla gorilla). The gorilla witnessed a series of unique events, involving a familiar person engaging in a novel behavior (experiment 1), a novel person engaging in a novel behavior (experiment 2), or the presentation of a novel object (experiment 3). Following a 5- to 10-min retention interval, a tester gave the gorilla three photographs mounted on wooden cards: a photograph depicting the correct person or object and two distractor photographs drawn from the same class. The gorilla responded by returning a photograph. If correct, he was reinforced with food. Across three experiments, the gorilla performed significantly above chance at recognizing the target photograph. In experiment 4, the gorilla showed at-chance performance when the event was followed by misinformation (a class-consistent, but incorrect photograph), but significantly above-chance performance when no misinformation occurred (either correct photograph or no photograph). Although familiarity can account for these data, they are also consistent with an episodic-memory interpretation.
Address: Florida International University, University Park, Miami, FL 33199, USA. schwartb@fiu.edu
Language: English
ISSN: 1435-9448
Notes: PMID:15069608
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2532

Author: Fortes, A.F.; Merchant, H.; Georgopoulos, A.P.
Title: Comparative and categorical spatial judgments in the monkey: “high” and “low”
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 2; Pages: 101-108
Keywords: Animals; *Classification; Cognition; *Discrimination Learning; Form Perception; Macaca mulatta/*parasitology; Male; *Pattern Recognition, Visual; Semantics; *Space Perception
Abstract: Adult human subjects can classify the height of an object as belonging to either of the “high” or “low” categories by utilizing an abstract concept of midline that divides the vertical dimension into two halves. Children lack this abstract concept of midline, do not have a sense that these categories are directional opposites, and their categorical and comparative usages of high(er) or low(er) are restricted to the corresponding poles. We investigated the abilities of a rhesus monkey to perform categorical judgments in space. We were also interested in the presence of the congruity effect (a decrease in response time when the objects compared are closer to the category pole) in the monkey. The presence of this phenomenon in the monkey would allow us to relate the behavior of the animal to the two major competing hypotheses that have been suggested to explain the congruity effect in humans: the analog and semantic models. The monkey was trained in delayed match-to-sample tasks in which it had to categorize objects as belonging to either a high or low category. The monkey was able to generate an abstract notion of midline in a fashion similar to that of adult human subjects. The congruity effect was also present in the monkey. These findings, taken together with the notion that monkeys are not considered to think in propositional terms, may favor an analog comparison model in the monkey.
Address: Brain Sciences Center, Veterans Affairs Medical Center, One Veterans Drive, Minneapolis, MN 55417, USA
Language: English
ISSN: 1435-9448
Notes: PMID:15069609
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2531

Author: Goto, K.; Wills, A.J.; Lea, S.E.G.
Title: Global-feature classification can be acquired more rapidly than local-feature classification in both humans and pigeons
Type: Journal Article
Year: 2004
Publication: Animal Cognition (Anim. Cogn.)
Volume: 7; Issue: 2; Pages: 109-113
Keywords: Adult; Animals; Behavior, Animal/physiology; *Classification; Columbidae/*physiology; *Discrimination Learning; Form Perception; Humans; *Mental Processes; *Pattern Recognition, Visual; Species Specificity
Abstract: When humans process visual stimuli, global information often takes precedence over local information. In contrast, some recent studies have pointed to a local precedence effect in both pigeons and nonhuman primates. In the experiment reported here, we compared the speed of acquisition of two different categorizations of the same four geometric figures. One categorization was on the basis of a local feature, the other on the basis of a readily apparent global feature. For both humans and pigeons, the global-feature categorization was acquired more rapidly. This result reinforces the conclusion that local information does not always take precedence over global information in nonhuman animals.
Address: School of Psychology, Washington Singer Laboratories, University of Exeter, EX4 4QG, Exeter, UK. K.Goto@exeter.ac.uk
Language: English
ISSN: 1435-9448
Notes: PMID:15069610
Approved: no
Call Number: Equine Behaviour @ team @
Serial: 2530