Linklater, W. L., Henderson, K. M., Cameron, E. Z., Stafford, K. J., & Minot, E. O. (2000). The robustness of faecal steroid determination for pregnancy testing Kaimanawa feral mares under field conditions. N Z Vet J, 48(4), 93–98.
Abstract: AIMS: To investigate the utility of faecal oestrone sulphate (OS) concentrations for detecting pregnancy in mares during behavioural studies of feral horses, in which the collection and preservation of samples is not immediate. METHODS: Oestrone sulphate concentrations were measured in fresh dung samples collected from 153 free-roaming Kaimanawa mares throughout the year. In addition, multiple samples were taken from the same pile to investigate the reliability of diagnosis from a single sample, as well as the influence of time until preservation on OS concentrations. Samples were also taken before and after a 10 mm simulated rainfall event to test for dilution of OS concentrations by rain. Oestrone sulphate concentrations in all samples were measured using an enzyme immunoassay. RESULTS: From approximately 150 to 250 days of gestation, OS concentrations were consistently >80 ng/g in mares which subsequently foaled. Mares which did not foal and had low faecal OS concentrations in multiple samples throughout the year had faecal OS concentrations of 31 ± 13 ng/g (mean ± s.d.) with an upper 95% confidence limit of 57 ng/g. Mares sampled from 1 week before to 1 month after behavioural oestrus, and that did not foal in the previous and subsequent seasons, had OS concentrations of 37 ± 32 ng/g (mean ± s.d.) with an upper 95% confidence limit of 100 ng/g. The standard error of oestrone sulphate concentrations in multiple samples from the same dung pile ranged from 1 to 37% of the mean. This large within-pile variation, however, did not result in incorrect diagnoses from single samples unless mares were within 18 days of parturition. Keeping samples at ambient temperatures for up to 16 hours did not affect OS concentrations. Simulated rainfall caused a 17% mean reduction in OS concentrations, but did not change pregnancy diagnoses. CONCLUSIONS: Faecal OS concentrations >100 ng/g were indicative of pregnancy in Kaimanawa mares. For mares more than 150 days post-mating, OS concentrations <57 ng/g were indicative of non-pregnancy, while concentrations between 57 and 100 ng/g provided an inconclusive diagnosis. A single sample from each dung pile collected within 16 hours of defecation was sufficient to accurately diagnose pregnancy in mares 150–250 days post-conception. CLINICAL RELEVANCE: Measurement of OS concentrations in dung samples was a reliable and robust indicator of pregnancy status in feral mares 150–250 days post-mating. This corresponds approximately to the period from May to August, given the seasonal breeding pattern in this population. This method of determining pregnancy status is suitable for field use in behavioural and demographic studies of wild horse populations.
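The diagnostic thresholds in the conclusions above amount to a simple three-way classification rule. A minimal sketch of that rule follows; the function name, return labels, and handling of the validated gestation window are illustrative choices of mine, not part of the paper:

```python
def diagnose_pregnancy(os_ng_per_g: float, days_post_mating: int) -> str:
    """Apply the abstract's faecal oestrone sulphate (OS) thresholds.

    Illustrative sketch only: the rule is reported as validated for
    mares 150-250 days post-mating.
    """
    if not 150 <= days_post_mating <= 250:
        return "outside validated window"
    if os_ng_per_g > 100:       # >100 ng/g indicative of pregnancy
        return "pregnant"
    if os_ng_per_g < 57:        # <57 ng/g indicative of non-pregnancy
        return "not pregnant"
    return "inconclusive"       # 57-100 ng/g is inconclusive


if __name__ == "__main__":
    print(diagnose_pregnancy(120, 200))  # pregnant
    print(diagnose_pregnancy(30, 200))   # not pregnant
    print(diagnose_pregnancy(80, 200))   # inconclusive
```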
Linklater, W. L., Cameron, E. Z., Stafford, K. J., & Austin, T. (1998). Chemical immobilisation and temporary confinement of two Kaimanawa feral stallions (Vol. 46).
Collery, L. (1974). Observations of equine animals under farm and feral conditions. Equine Vet J, 6(4), 170–173.
Harman, A. M., Moore, S., Hoskins, R., & Keller, P. (1999). Horse vision and an explanation for the visual behaviour originally explained by the 'ramp retina'. Equine Vet J, 31(5), 384–390.
Abstract: Here we provide confirmation that the 'ramp retina' of the horse, once thought to result in head-rotating visual behaviour, does not exist. We found a 9% variation in axial length of the eye between the streak region and the dorsal periphery. However, the difference was in the opposite direction to that proposed for the 'ramp retina'. Furthermore, acuity in the narrow, intense visual streak in the inferior retina is 16.5 cycles per degree compared with 2.7 cycles per degree in the periphery. Therefore, it is improbable that the horse rotates its head to focus onto the peripheral retina. Rather, the horse rotates the nose up high to observe distant objects because binocular overlap is oriented down the nose, with a blind area directly in front of the forehead.
|
Mills, D. S. (1998). Applying learning theory to the management of the horse: the difference between getting it right and getting it wrong. Equine Vet J Suppl, (27), 44–48.
Abstract: Horses constantly modify their behaviour as a result of experience. This involves the creation of an association between events or stimuli. The influence of people on the modification and generation of certain behaviour patterns extends beyond the intentional training of the horse. The impact of any action depends on how it is perceived by the horse, rather than the motive of the handler. Negative and positive reinforcement increase the probability of specific behaviours recurring i.e. strengthen the association between events, whereas punishment reduces the probable recurrence of a behaviour without providing specific information about the desired alternative. In this paper the term 'punishers' is used to refer to the physical aids, such as a whip or crop, which may be used to bring about the process of punishment. However, if their application ceases when a specific behaviour occurs they may negatively reinforce that action. Intended 'punishers' may also be rewarding (e.g. for attention seeking behaviour). Therefore, contingency factors (which define the relationship between stimuli, such as the level of reinforcement), contiguity factors (which describe the proximity of events in space or time) and choice of reinforcing stimuli are critical in determining the rate of learning. The many problems associated with the application of punishment in practice lead to confusion by both horse and handler and, possibly, abuse of the former. Most behaviour problems relate to handling and management of the horse and can be avoided or treated with a proper analysis of the factors influencing the behaviour.
|
Cooper, J. J. (1998). Comparative learning theory and its application in the training of horses. Equine Vet J Suppl, (27), 39–43.
Abstract: Training can best be explained as a process that occurs through stimulus-response-reinforcement chains, whereby animals are conditioned to associate cues in their environment with specific behavioural responses and their rewarding consequences. Research into learning in horses has concentrated on their powers of discrimination and on primary positive reinforcement schedules, where the correct response is paired with a desirable consequence such as food. In contrast, a number of other learning processes that are used in training have been widely studied in other species, but have received little scientific investigation in the horse. These include: negative reinforcement, where performance of the correct response is followed by removal of, or decrease in, intensity of an unpleasant stimulus; punishment, where an incorrect response is paired with an undesirable consequence, but without consistent prior warning; secondary conditioning, where a natural primary reinforcer such as food is closely associated with an arbitrary secondary reinforcer such as vocal praise; and variable or partial conditioning, where once the correct response has been learnt, reinforcement is presented according to an intermittent schedule to increase resistance to extinction outside of training.
|
Goodwin, D., Davidson, H. P. B., & Harris, P. (2002). Foraging enrichment for stabled horses: effects on behaviour and selection. Equine Vet J, 34(7), 686–691.
Abstract: The restricted access to pasture experienced by many competition horses has been linked to the exhibition of stereotypic and redirected behaviour patterns. It has been suggested that racehorses provided with more than one source of forage are less likely to perform these patterns; however, the reasons for this are currently unclear. To investigate this in 4 replicated trials, up to 12 horses were introduced into each of 2 identical stables containing a single forage, or 6 forages, for 5 min. To detect novelty effects, in the first and third trials the single forage was hay. In the second and fourth, it was the preferred forage from the preceding trial. Trials were videotaped and 12 mutually exclusive behaviour patterns compared. When hay was presented as the single forage (Trials 1 and 3), all recorded behaviour patterns were significantly different between stables; e.g. during Trial 3 in the 'Single' stable, horses looked over the stable door more frequently (P<0.001), moved for longer (P<0.001), foraged on straw bedding longer (P<0.001), and exhibited behaviour indicative of motivation to search for alternative resources (P<0.001) more frequently. When a previously preferred forage was presented as the single forage (Trials 2 and 4), behaviour was also significantly different between stables, e.g. in Trial 4 horses looked out over the stable door more frequently (P<0.005) and foraged for longer in their straw bedding (P<0.005). Further study is required to determine whether these effects persist over longer periods. However, these trials indicate that enrichment of the stable environment through provision of multiple forages may have welfare benefits for horses, in reducing straw consumption and facilitating the expression of highly motivated foraging behaviour.
|
Marlin, D. J., Schroter, R. C., White, S. L., Maykuth, P., Matthesen, G., Mills, P. C., et al. (2001). Recovery from transport and acclimatisation of competition horses in a hot humid environment. Equine Vet J, 33(4), 371–379.
Abstract: The aims of the present field-based study were to investigate changes in fit horses undergoing acclimatisation to a hot humid environment and to provide data on which to base recommendations for safe transport and acclimatisation. Six horses (age 7–12 years) were flown from Europe to Atlanta and underwent a 16-day period of acclimatisation. Exercise conditions during acclimatisation (wet bulb globe temperature index 27.6 ± 0.0 [mean ± s.e.]) were more thermally stressful compared with the European climate from which the horses had come (22.0 ± 1.8, P<0.001). Following the flight, weight loss was 4.1 ± 0.8% bodyweight and took around 7 days to recover. Water intake during the day was significantly increased (P<0.05) compared with night during acclimatisation. Daily mean exercise duration was 72 ± 12 min and the majority of work was performed with a heart rate below 120 beats/min. Respiratory rate (fR) was increased (P<0.05) throughout acclimatisation compared with in Europe, but resting morning (AM) and evening (PM) rectal temperature (TREC), heart rate (fC) and plasma volume were unchanged. White blood cell (WBC) count was significantly increased at AM compared with in Europe on Days 4 and 10 of acclimatisation (P<0.01), but was not different by Day 16. In conclusion, horses exposed to hot humid environmental conditions without prior acclimatisation are able to accommodate these stresses and, with appropriate management, remain fit and clinically healthy, without significant risk of heat illness or heat-related disorders, provided they are allowed sufficient time to recover from transport, acclimatisation is undertaken gradually and they are monitored appropriately.
Keywords: Acclimatization/*physiology; Animals; Body Temperature; Body Weight; Breeding; Feeding Behavior; Female; Heart Rate; Heat; Heat Stroke/prevention & control/veterinary; Horse Diseases/prevention & control; Horses/*physiology; Humidity; Male; Respiration; Sports; *Transportation; Tropical Climate
|
Goodwin, D. (1999). The importance of ethology in understanding the behaviour of the horse. Equine Vet J Suppl, (28), 15–19.
Abstract: Domestication has provided the horse with food, shelter, veterinary care and protection, allowing individuals an increased chance of survival. However, the restriction of movement, limited breeding opportunities and a requirement to expend energy, for the benefit of another species, conflict with the evolutionary processes which shaped the behaviour of its predecessors. The behaviour of the horse is defined by its niche as a social prey species, but many of the traits which ensured the survival of its ancestors are difficult to accommodate in the domestic environment. There has been a long association between horses and man, and many features of equine behaviour suggest a predisposition to interspecific cooperation. However, the importance of dominance in human understanding of social systems has tended to overemphasise its importance in the human-horse relationship. The evolving horse-human relationship, from predation to companionship, has resulted in serial conflicts of interest for equine and human participants. Only by understanding the nature and origin of these conflicts can ethologists encourage equine management practices which minimise deleterious effects on the behaviour of the horse.
|
Pell, S. M., & McGreevy, P. D. (1999). Prevalence of stereotypic and other problem behaviours in thoroughbred horses. Aust Vet J, 77(10), 678–679.