Dyer, F. C. (2002). Animal behaviour: When it pays to waggle. Nature, 419.
Griffin, D. R. (2001). Animals know more than we used to think. Proceedings of the National Academy of Sciences, 98.
Cohen, J. (2007). Animal behavior: The world through a chimp's eyes. Science, 316.
Anderson, J. R. (1995). Self-recognition in dolphins: Credible cetaceans; compromised criteria, controls, and conclusions. Consciousness and Cognition, 4(2), 239–243.
Menzel, E. W., Jr. (1971). Communication about the environment in a group of young chimpanzees. Folia Primatologica, 15(3), 220–232.
de Waal, F. B. M. (2003). Animal communication: Panel discussion. Annals of the New York Academy of Sciences, 1000, 79–87.
Kaminski, J., Call, J., & Tomasello, M. (2004). Body orientation and face orientation: Two factors controlling apes' begging behavior from humans. Animal Cognition, 7(4), 216–223.
Abstract: A number of animal species have evolved the cognitive ability to detect when they are being watched by other individuals. Precisely what kind of information they use to make this determination is unknown. There is particular controversy in the case of the great apes because different studies report conflicting results. In experiment 1, we presented chimpanzees, orangutans, and bonobos with a situation in which they had to request food from a human observer who was in one of various attentional states. She either stared at the ape, faced the ape with her eyes closed, sat with her back towards the ape, or left the room. In experiment 2, we systematically crossed the observer's body and face orientation so that the observer could have her body and/or face oriented either towards or away from the subject. Results indicated that apes produced more behaviors when they were being watched. They did this not only on the basis of whether they could see the experimenter as a whole, but were also sensitive to her body and face orientation separately. These results suggest that body and face orientation encode two different types of information: whereas face orientation encodes the observer's perceptual access, body orientation encodes the observer's disposition to transfer food. In contrast to the results on body and face orientation, only two of the tested subjects responded to the state of the observer's eyes.
Scheibe, K. M., & Gromann, C. (2006). Application testing of a new three-dimensional acceleration measuring system with wireless data transfer (WAS) for behavior analysis. Behavior Research Methods, 38.
Abstract: A wireless acceleration measurement system was applied to free-moving cows and horses. Sensors were available as a collar and a flat box for measuring leg or trunk movements. Results were transmitted simultaneously by radio or stored in an 8-MB internal memory. As analytical procedures, frequency distributions with standard deviations, spectral analyses, and fractal analyses were applied. By means of the collar sensor, basic behavior patterns (standing, grazing, walking, ruminating, drinking, and hay uptake) could be identified in cows. Lameness could be detected in cows and horses by means of the leg sensor: the portion of basic and harmonic spectral components was reduced, and the fractal dimension was reduced. The system can be used for the detection and analysis of even small movements of free-moving humans or animals over several hours. It is convenient for the analysis of basic behaviors, emotional reactions, or events causing flight or fright, or for comparing different housing elements, such as floors or fences.
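The spectral analysis described above can be illustrated with a minimal sketch: rhythmic behaviors such as walking or ruminating produce a strong peak at the stride or chewing frequency in the acceleration power spectrum. This is an illustrative example with synthetic data and assumed parameter values (50 Hz sampling, 2 Hz stride rhythm), not the published WAS algorithm.

```python
import numpy as np

def spectral_features(accel, fs):
    """Return the dominant frequency (Hz) and total spectral power of a
    1-D acceleration trace. Features of this kind separate rhythmic
    behaviors (walking, ruminating) from quiet standing. Sketch only."""
    accel = accel - np.mean(accel)                 # remove gravity/DC offset
    spectrum = np.abs(np.fft.rfft(accel)) ** 2     # power spectrum
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return dominant, spectrum.sum()

# Synthetic example: a 2 Hz stride rhythm sampled at 50 Hz for 10 s
fs = 50
t = np.arange(0, 10, 1.0 / fs)
walking = np.sin(2 * np.pi * 2.0 * t)
dom, power = spectral_features(walking, fs)
print(round(dom, 1))  # dominant frequency: 2.0 Hz
```

A reduced share of such basic and harmonic components, as the abstract notes for lame animals, would show up here as a flatter spectrum with a weaker dominant peak.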
Maros, K., Gácsi, M., & Miklósi, Á. (2008). Comprehension of human pointing gestures in horses (Equus caballus). Animal Cognition, 11(3), 457–466.
Abstract: Twenty domestic horses (Equus caballus) were tested for their ability to rely on different human gesticular cues in a two-way object choice task. An experimenter hid food under one of two bowls and, after baiting, indicated the location of the food to the subjects by using one of four different cues. Horses could locate the hidden reward on the basis of the distal dynamic-sustained, proximal momentary, and proximal dynamic-sustained pointing gestures, but failed to perform above chance level when the experimenter performed a distal momentary pointing gesture. The results revealed that horses could rely spontaneously on those cues that could have a stimulus or local enhancement effect, but the possible comprehension of the distal momentary pointing remained unclear. The results are discussed with reference to the involvement of various factors such as predisposition to read human visual cues, the effect of domestication and extensive social experience, and the nature of the gesture used by the experimenter in comparative investigations.
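"Above chance level" in a two-way object-choice task is conventionally established with an exact binomial test against p = 0.5. The sketch below shows that standard criterion with made-up numbers (15 correct choices out of 20 trials); it is not the authors' actual analysis.

```python
from math import comb

def binomial_p_above_chance(successes, trials, p=0.5):
    """One-sided exact binomial test: P(X >= successes) under chance.
    In a two-way choice task chance is p = 0.5; performance is called
    above chance if this tail probability falls below 0.05. Sketch only."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(successes, trials + 1))

# Hypothetical subject: 15 correct choices out of 20 trials
p_val = binomial_p_above_chance(15, 20)
print(p_val < 0.05)  # True: p ~ 0.021, unlikely by chance
```

The same criterion applies per cue type, which is why a subject can be above chance for sustained pointing yet at chance for momentary pointing.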
Giret, N., Miklósi, Á., Kreutzer, M., & Bovet, D. (2009). Use of experimenter-given cues by African gray parrots (Psittacus erithacus). Animal Cognition, 12(1), 1–10.
Abstract: One advantage of living in a social group is the opportunity to use information provided by other individuals. Social information can be based on cues provided by a conspecific or even by a heterospecific individual (e.g., gaze direction, vocalizations, pointing gestures). Although the use of human gaze and gestures has been extensively studied in primates, and is increasingly studied in other mammals, there is no documentation of birds using these cues in a cooperative context. In this study, we tested the ability of three African gray parrots to use different human cues (pointing and/or gazing) in an object-choice task. We found that one subject spontaneously used the most salient pointing gesture (looking and steady pointing with the hand at about 20 cm from the baited box). The two others were also able to use this cue after 15 trials. None of the parrots spontaneously used the steady gaze cues (combined head and eye orientation), but one learned to do so effectively after only 15 trials when the distance between the head and the baited box was about 1 m. However, none of the parrots were able to use either the momentary pointing or the distal pointing and gazing cues. These results are discussed in terms of sensitivity to joint attention as a prerequisite to understanding pointing gestures, as it is to the referential use of labels.