DOI: 10.1145/3544548.3581166
Research Article · Open Access · Honorable Mention

Birds of a Feather Video-Flock Together: Design and Evaluation of an Agency-Based Parrot-to-Parrot Video-Calling System for Interspecies Ethical Enrichment.

Published: 19 April 2023

Abstract

Over 20 million parrots are kept as pets in the US, often lacking appropriate stimuli to meet their high social, cognitive, and emotional needs. After reviewing bird perception and agency literature, we developed an approach to allow parrots to engage in video-calling other parrots. Following a pilot experiment and expert survey, we ran a three-month study with 18 pet birds to evaluate the potential value and usability of a parrot-parrot video-calling system. We assessed the system in terms of perception, agency, engagement, and overall perceived benefits. With 147 bird-triggered calls, our results show that 1) every bird used the system, 2) most birds exhibited high motivation and intentionality, and 3) all caretakers reported perceived benefits, some arguably life-transformative, such as learning to forage or even to fly by watching others. We report on individual insights and propose considerations regarding ethics and the potential of parrot video-calling for enrichment.

Figure 1: Ten instances of pet parrots in video calls with other parrots. Each bird triggered the call with corroboration. From top left to right: P16 calling P18, P1 calling P2, P8 calling P7, P14 calling P4, and P7 calling P6. From bottom left to right: P13 calling P4, P6 calling P8, P4 calling P3, P18 calling P16, and P17 calling P18.


1 INTRODUCTION

Found mostly in the wild in tropical and subtropical regions, parrots are birds of the order Psittaciformes and comprise roughly 400 species. Most species are highly social and live paired within large flocks. In captivity, they are the fourth most popular animal kept as pets, with over 20.6 million pet parrots in the United States alone [3]. Although pet parrots may reduce loneliness in humans [39], they themselves often suffer from a lack of companionship and species-specific socialization. Often described as ornamental, parrots are not considered domesticated animals even when raised and bred in captivity, and they retain complex cognitive and behavioral needs [6]. When kept in homes, they rarely receive appropriate levels of socialization and cognitive stimulation [6], resulting in high rates of stereotypical behaviors. Stereotypies are abnormal behaviors observed exclusively in captive animals and are indicative of poor psychological well-being. For parrots, these include pacing, rocking, hanging on the sides of cages, walking in circles, excessive vocalization or sleeping, self-mutilation, and feather-picking [22]. The complex origin of these behaviors is posited to relate to physical pain and distress, genetic predisposition, psychological well-being [17, 79], and age or species [42]. Certain captive birds have high rates of stereotypies and maladaptive behaviors. For example, nearly 40% of cockatoos (Cacatuidae) and African grey parrots (Psittacus erithacus) engage in feather destructive behavior, a form of self-injury [37], and some engage in self-mutilation [38], aggressive behavior [81], and excessive vocalization [23]. As a result, parrots are frequently put up for adoption due to behavior problems arising from their unmet needs [23].

Various approaches have been recommended to captive bird caregivers to alleviate the issues of pet parrot loneliness. One strategy consists of implementing environmental enrichment [19, 23], such as curated enclosure design [23], foraging [51], toys [14], and adapted puzzles [21] that can help cognitively stimulate the birds. As issues often persist, bird welfare specialists also recommend positive reinforcement-based behavior modification [23, 81] or even the use of restraints/collars to prevent stereotypical excessive feather plucking and self-injuries [5, 68]. Pharmacological approaches such as medication are sometimes used as a last resort [40, 57]. However, some of these methods have drawbacks: long-term use of collars can damage the neck, leading to paralysis, and long-term use of medication can be harmful [57]. Additionally, none of these approaches appear to replace the importance of socialization with other parrots, which has become a recommended best practice in research [6, 84] and is required by law in some countries, such as Germany [65].

Theoretically, co-housing parrots with other birds can help improve bird well-being; however, there are many drawbacks. Some caretakers keep multiple birds hoping that they form a “flock” as seen in their wild counterparts, which often does not happen in captive forced socialization [6]. Local avian groups have attempted to provide opportunities for socialization with other birds through weekly or monthly gatherings of bird owners and their birds [52]. However, parrots carry many incurable transmissible diseases. One of them is avian ganglioneuritis, a terminal wasting disease that afflicts approximately 43% of the parrot population, limiting the feasibility of in-person socialization with other birds [59]. In addition to the practical difficulties of arranging in-person social gatherings for birds, in-person parrot socialization also poses ethical questions. Forced socialization through conspecific housing (i.e., co-housing several birds of the same species) can lead to aggression and injuries. Furthermore, when birds are forced to interact socially with others, they lack agency around their relationships, whereas, in the wild, parrots retain a great degree of agency around mate selection and flock interactions. Thus, the parrots’ agency is central to optimal parrot socialization.

In recent decades, the Human-Computer Interaction (HCI) community has worked on providing tools, approaches and applications to help support the well-being and socialization of captive animals [54]. These include systems to allow dogs to remotely interact [35], Tinder-like apps for orangutans [72] and music players for zoo-housed birds to interact with visitors [44]. Such research has provided important grounding for agency-driven approaches for animals in managed care. This research also suggests the possibility of leveraging new technologies to help improve the well-being and socialization of animals.

Given the unmet social needs of parrots and their high cognitive abilities, this project tackles the issues of pet bird loneliness and lack of species-specific social stimulation by exploring the potential of using video calls to provide birds with the agency to socialize with other birds. This approach could alleviate several of the drawbacks of other methods (such as disease transmission and unintended consequences) while mitigating risks. Online bird interactions may also provide a simple way to introduce different birds to each other and help identify potential matching companion birds. In this research, we do not aim to demonstrate parrot cognitive ability; rather, we assess the potential value of video-calling other parrots as a form of enrichment. Although grounded in animal cognitive and behavioral research, our approach takes an HCI perspective in assessing the usability and value of the intervention for both the bird and the human caregiver.

In this regard, our investigation was guided by the following research question: Would parrots freely engage in making video calls to other parrots? To assess this question, following a pilot experiment and a survey of parrot experts, we ran a three-month, two-phase experiment with 18 birds. In Phase 1 (“meet-and-greet”), each bird learned the association between ringing a bell, touching the photo of another bird on their tablet, and being connected with that bird on a video call. Caretakers were trained to end the call if their bird showed signs of stress, disengagement, or left the space. Within Phase 1’s two-week period, every bird remotely met twice with all the other birds in their group (three to four members per group) and received treats as rewards for the bell/screen/call association but not during calls. Phase 2 (“open calls”) lasted up to ten weeks, during which the birds in each group were simultaneously provided with the bell for extended periods. If one rang their bell and then selected a bird’s image on the tablet, they would trigger a video call. No treats/rewards were provided during Phase 2, and calls were ended after a maximum of five minutes, or earlier if a bird showed disengagement. To assess the usability and value of the system for the birds, we constructed a methodology assessing the following:

First, do the parrots display evidence that they perceive and attend to the video calls?

Second, if given the agency to request calls, would the birds engage in video calls with other birds? This question is considered quantitatively in terms of the number of calls.

Third, can the birds’ agency be further supported by quantitative measurements of their engagement during calls?

Finally, beyond immediate enjoyment, can we assess the potential value of such calls in the birds’ overall well-being, socialization, and the benefit as perceived by the bird owners?

The contributions of this project are 1) an ethical framework and set of rules to set up and run bird-triggered video calls between pet parrots, 2) an 18-bird longitudinal study over the course of three months investigating the potential of tablet-based video calling for parrots, 3) analyses and insights into how birds use their new agency and engage in the video-calling process, 4) interviews and feedback from the human caretakers (the participants self-reported the perceived benefits and behavior changes in their birds during the study), and 5) new perspectives and a design proposal for future parrot-centered video-calling systems. This project was approved by the Ethics Committee at Glasgow University for Animal Use as Subjects (number EA 01/22) and Human Use as Subjects (number 300210172).


2 BACKGROUND

Prior work from animal cognitive and behavioral research, as well as the literature on animal-computer interaction (ACI) and HCI, grounds our work and supports our investigations and methods. In this section, we first explore the feasibility of using screens and video calls to tackle the issue of bird loneliness by reviewing the literature on bird perception, physiology, and biology, and prior studies on birds and screen interactions. Second, we present some of the historical challenges in animal agency research. Third, we frame our work within the fields of HCI and ACI and present prior art in systems design for animal video calls and parrot agency. Finally, we explore the methodologies from previous works and best practices in assessing animal agency, engagement, and intentionality.

2.1 Bird Perception and Screen Interaction

The first step in validating the feasibility of using off-the-shelf tablets to enable birds to video-call each other is to assess parrots’ biological and physiological abilities to perceive and make sense of screen-based stimuli. Although parrot vision differs significantly from human vision, and varies across parrot species, there is strong evidence that parrots can perceive and make sense of screen-based stimuli [28].

Trained birds have been interacting with screens since before the advent of the human internet. In the 1940s Project Orcon, Skinner used trained pigeons (Columbidae) lodged inside missile capsules to guide the missiles’ trajectory by pecking at a dot on a screen [74]. Since then, using operant conditioning and leveraging birds’ excellent visual and tactile accuracy, researchers have used pecking behavior on screens to explore birds’ cognitive abilities. Researchers have used touchscreen devices to measure pigeons’ abilities to learn associations [24], discriminate quantities [71], engage in complex match-to-sample tasks [85], and recognize familiar and unfamiliar faces [77].

Perceiving single dots on screens is very different from making sense of complex scenes presented on automated learning devices in laboratories [8, 70]. Various factors can undermine birds’ abilities to make sense of complex screen-based stimuli designed specifically for human vision, including the color spectrum and the critical flicker fusion frequency (CFF). Many parrot species are known to perceive colors in the UV spectrum [49], which could affect their sense-making abilities for screen-based images. Some bird species appear to use UV for sexual signaling [31], hunting, and foraging [83]. However, prior work indicates that birds – and specifically pet parrots – can make sense of complex images without UV, especially as UV often appears to carry redundant information or time-sensitive signaling [83]. Pigeons have been shown to recognise faces in photographs [77], and Goffin’s cockatoos (Cacatua goffiniana) have corroborated complex discrimination selections on touchscreen devices [12]. In addition, pet birds may have become accustomed to indoor housing with less UV, as artificial indoor lighting is not a source of UV light.

The second, and perhaps most critical, factor that might affect birds’ abilities to make sense of moving images is the critical flicker fusion frequency (CFF), defined as the frequency at which flickering light is perceived as continuous (50–90 Hz in humans). To support their efficient visual systems, most flying animals are thought to have a higher CFF threshold than humans [56] (e.g., pigeons at 143 Hz or peregrine falcons (Falco peregrinus) at 129 Hz). A parrot’s CFF is about twice that of humans [63], meaning that most parrots perceive screen displays as flickering rather than continuous. Moreover, research has shown a correlation between CFF and animal size [32]. In the case of parrots, the high variability in body size may lead to highly variable CFF among parrot species, and thus variability in their ability to make sense of screen-based stimuli.
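To make the flicker issue concrete, the short sketch below (our illustration, not from the cited studies) compares a typical tablet refresh rate against the approximate CFF values mentioned above; the parrot value is an assumed placeholder derived from the “about twice that of humans” estimate.

```python
# Illustrative sketch: is a display's refresh rate above a viewer's CFF?
# CFF values are approximations taken from the text; "parrot_approx" is an
# assumed placeholder (~2x a mid-range human CFF), not a measured value.

CFF_HZ = {
    "human": 60,             # humans: roughly 50-90 Hz
    "pigeon": 143,
    "peregrine_falcon": 129,
    "parrot_approx": 120,    # assumption: about twice the human value
}

def appears_continuous(refresh_rate_hz: float, cff_hz: float) -> bool:
    """A display is likely perceived as continuous only if its refresh
    rate exceeds the viewer's critical flicker fusion frequency."""
    return refresh_rate_hz > cff_hz

tablet_refresh_hz = 60  # typical off-the-shelf tablet display
for species, cff in CFF_HZ.items():
    verdict = "continuous" if appears_continuous(tablet_refresh_hz, cff) else "likely flickering"
    print(f"{species}: {tablet_refresh_hz} Hz display appears {verdict} (CFF ~{cff} Hz)")
```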

However, despite the reduced color spectrum and the likelihood of flicker, there is evidence that birds can still make sense of complex moving scenes on screens, as a completely faithful reproduction of color or movement is not necessary for the viewing animal to produce a meaningful response [13]. Several studies have suggested sense-making based on regular display technology. A study by Hämäläinen showed that tits (Cyanistes caeruleus & Parus major) could learn new behaviors by watching prerecorded videos of other birds [30], even though their very small body mass would imply a very high CFF. Video images can provide effective stimuli for some birds, as pigeons can respond to apparent motion as though it were real motion [47]. Video recordings for birds can also effectively substitute for social behavior such as alarm calling [18]. In this context, animal cognition research highlights the importance of interactivity and appropriate response to bird behavior as factors that increase engagement with screens, and underscores the need to confirm that the animal’s response to a video image is comparable to its response to a real stimulus [13]. A widely adopted validation test compares the subject’s attention to the object of interest on the screen with attention to objects in real life [7, 9]. Attention must be inferred from an animal’s behavior [7]. In our work, for the initial exposure to video calls in Phase 1, we assess each bird’s attention through their timely response to the other bird appearing on screen, moving off screen, or moving within the screen.

2.2 Animal-Human-Computer Interactions and Animal Video-Calling

In recent years, the HCI and ACI communities have worked on developing tools to provide enrichment and socialization to captive animals, such as dogs [35] and cats [80], zoo animals [33, 44], farm animals like chickens [45, 48] and cows [29], as well as undomesticated animals such as elephants [20] and marine mammals [67, 69]. New ubiquitous technologies and a more grounded understanding of animal needs have led to the development of new tools to automatically recognise animal behaviors [86], support pet-owners in their care [36] or help interspecies rapport building [41].

An estimated 59% of households in the United States keep pets, many of which are often left home alone for long periods of time [60]. This has motivated research to support pets’ physical and cognitive needs [35]. Recently, some devices have shifted to camera and voice interfaces connecting pets and owners remotely. Research in this area has included creating video-call interfaces for dogs to ring their owners [36] and investigating how dogs can use the internet to interact with each other [35] and with humans [46]. When looking at how animals use technology, both as a user and a usee (Baumer’s term for being entangled within internet devices) [4], interactivity and agency appear as two important considerations to ensure an ethical and meaningful animal experience. A notable risk of using technology is blind trust: Lawson et al. [46] have highlighted that pet owners are willing to trust technology over professional veterinary specialists and other experts regarding their pets’ health and well-being, wrongly believing that technology can replace advice from professionals.

One way this has been addressed is by requiring that the animal consent to the interaction – where “consent” in animals is generally agreed to be reflected in choice behavior – by exerting agency and choosing to use technology in an uncoerced manner. However, for animals to express consent, they must be given the capacity to do so. Within this context of technology, agency can be understood as consent plus capacity. One approach to measuring the strength of intention consists in finding out how hard an animal is willing to “work” or “pay” to access a requested outcome [15]. This economics-based approach, which aims to assess how highly an animal values a proposed option, is sometimes implemented in the context of operant conditioning by observing how many times the animal is willing to press a button to access a reward or how long it takes the animal to make a choice [43]. As such, a predetermined, quantifiable corroboration task can help provide additional evidence confirming the animal’s intentional choice [12]. Corroboration consists of either adding an extra step for the animal to confirm their agency, or observing behavior or time-on-task, which may further substantiate the animal’s choice [16]. In this study, we implement a corroboration task – the participant bird not only choosing to ring a bell, but also choosing to touch the photo of the bird to call – to increase the value of consent from the participating birds, as discussed below.

2.3 Agency and Engagement

Some interactive systems have been shown to provide animals with agency in shaping their social or physical surroundings, which may yield substantial well-being benefits, including reduced stereotypies and improvements in behavior [26, 61, 64]. For example, researchers have developed choice-based technology for environmental control such as dogs choosing to call their caretakers [36], dolphins choosing food and toys [66], as well as a selection of music enrichment [28, 34, 44]. While the definition of “choice” is debatable [54], the perception of self-agency can be beneficial to animals [64]. Thus, there is potential for interactive tools that maintain the animal’s control and agency.

Prior research on agency-based technology for animals proposes frameworks for assessing agency use (intentionally using one’s capacities to express consent and gain control of one’s environment) and engagement (interest and focus on the chosen task). A recent study has proposed categorizing animal agency into four levels: 1) passive/reactive agency, distinguishing between passive and purely reactive behaviors; 2) action-driven agency, in which the animal behaviorally pursues desirable outcomes; 3) competence-building agency, where the animal engages with the environment to gain skills and information; and 4) aspirational agency, in which the animal achieves long-term goals through planning and reflection [75]. Similar agency scales exist for focused animal activities, such as making music [27].

Our current work fits between the second and third levels of agency. Although during the training “meet-and-greet” phase, the birds were invited to simply react to the bell and screen being presented to them, during the "open call" phase (Phase 2) they had to take action by going to the bell to trigger the call. This also required the birds to have built the required competency (through the association of bell and screen touching) in order to obtain the call. Indeed, the concept of competency is central to understanding agency [76]. In our case, we chose to assess agency quantitatively. After training and validation by several experts, we considered animals who have grasped the association between bell/screen/call to have acquired the appropriate competency. We assess whether the birds use their agency by counting the number of outgoing calls they choose to make independently during Phase 2.

Figure 2: Pilot protocol and resulting number of calls requested and made by the birds.

Beyond trigger-based agency, engagement can also be assessed quantitatively through time-on-task [2]. In this context, longitudinal repeated sessions are preferred to cross-sectional studies, as animal preferences can take time to settle, and momentary choices may bias the overall preference response [53]. Research on fish [25], birds [73], and mice [50] has used such time-based proxies for motivation. Social recognition and mate preference in mice are often assessed using a three-chambered protocol in which three acrylic boxes are placed next to each other and the individual in the middle chamber can access the two other chambers; engagement with the individuals in the outer chambers is commonly assessed by measuring the time spent on either side [82]. In our research, we considered time-on-call (limited to five minutes) as a quantitative proxy for measuring engagement, as sketched below. If the caller bird disengaged from the call before the five minutes were over, this would indicate low engagement. If the call lasted the full five minutes or if the other bird ended the call, we consider the caller bird to have been engaged in the call. Measured positive engagement provides further validation of the birds’ agency and intent around making the video calls.
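As a minimal sketch of this time-on-call rule (our illustration; the record fields and threshold handling are assumptions, not the study’s actual data pipeline):

```python
from dataclasses import dataclass

MAX_CALL_S = 300  # five-minute cap used in the protocol

@dataclass
class CallRecord:
    duration_s: float  # time on call, capped at 300 s
    ended_by: str      # "caller", "callee", "timeout", or "technical"

def caller_engaged(call: CallRecord) -> bool:
    """The caller counts as engaged unless they disengaged before the cap."""
    return not (call.ended_by == "caller" and call.duration_s < MAX_CALL_S)

# A call the callee ended at three minutes still counts as caller engagement;
# a call the caller left after 90 seconds does not.
print(caller_engaged(CallRecord(duration_s=180, ended_by="callee")))  # True
print(caller_engaged(CallRecord(duration_s=90, ended_by="caller")))   # False
```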


3 PILOT EXPERIMENT AND EXPERT SURVEY

3.1 Expert Survey

To assess the potential of leveraging technology to reduce parrot loneliness, we also collected feedback from four professional parrot experts: E1 (a Lead Wildlife Care Specialist with 30 years of experience working with parrots), E2 (a professional bird trainer with 47 years of experience working with parrots), E3 (a PhD scientist in the field of parrot communication and cognition with 45 years of experience working with parrots), and E4 (a professional animal trainer and consultant with 25 years of experience working with parrots). Three of the four experts reported the potential of using screen-based interaction with parrots: “I have seen parrots interact/get excited when seeing other parrots on screen, either through computer or tablet devices” (E1); “Certainly video connection and interaction could be very advantageous for birds” (E2); “Zoom meetings are a start” (E4). Expert E3 emphasized potential perceptual issues such as lack of UV information from the screen, motivating our investigation of parrot perception and sense-making.

The experts broadly approved our approach and hypothesis around improving engagement with technology: “I do believe this could provide meaningful enrichment”(E1). “I think it could be beneficial” (E2). They also provided feedback regarding the risk of destructive behaviors: “larger macaws could potentially still break the device”(E1). “You have already identified the biggest challenges, which include safety, destruction of equipment, and training time. But if those issues could be solved, I think there could be great benefits in this" (E2). “Human caregivers would be crucial in habituation” (E3). “Absolutely! As long as the tablet is safe from the birds destroying it, falling, or falling on them/hurting them in some way, I don’t see any issues.” (E4). This feedback supports the need for monitoring and human presence.

3.2 Pilot Experiment

To first tackle the feasibility of parrot video-calling, we ran an 11-week pilot experiment with four birds: B1, a Goffin’s cockatoo (aged 9, F); B2, an African grey (aged 11, M); B3, a Senegal parrot (aged 20, M); and B4, a cockatiel (aged 1, M). Training for the protocol lasted three weeks, and the birds were tracked for an additional two months to see if additional calls were made. All parrots had prior experience interacting with a tablet for preference selection. The birds were recruited through Parrot Kindergarten, an online coaching and educational program for parrots and their owners led by the second author. The setup occurred in the birds’ regular play area, and the tablets were placed so the birds could interact through direct touch. The caretakers were provided training to handle possible fear/aggression situations.

The pilot protocol, summarized in Figure 2, comprised initial introduction sessions during which caretakers used food reinforcers (seeds) to teach their parrot to select another bird’s photo on an interactive communication board on their tablet. The second author, a researcher and parrot training expert, provided the training protocol. When a bird successfully touched the photo on the tablet, the caretakers gave them a treat and initiated a video call to the other bird via a video messaging application. Each bird had two sessions with each other bird of their group until all of the parrots had met, for a total of six sessions per bird over six days. Afterwards, the birds had access to the preference selection program on their tablets for two weeks. The birds already used their tablets daily during dedicated “tablet times,” but during the study, they were given additional access to this new calling feature. By navigating the call menu themselves on their board app, they could request calls by pressing another bird’s photo. No treats were given for calls during these two weeks, and caregivers were instructed to limit their encouragement to reduce confounding motivation. We counted each time a bird requested a call during these two weeks. When both caretakers were available to facilitate the calls between the two birds, they were asked to video record the sessions and to fill out post-session diaries. All calls requested by the birds and all calls that took place in the following two months are summarized in Figure 2 (bottom right). Because of time-zone differences and synchronization issues, not all requested calls could actually be made.

Despite a low number of completed calls, the pilot was successful from an ethical and experiential perspective. The calls that were made appeared positive and engaging for the birds. Indeed, during calls and training, most of the behaviors exhibited by the birds were observed as positive (moving towards the screen, vocalizing to the other bird, touching the screen with their beak, relaxed resting, preening, and beak grinding). At times, the birds appeared ambivalent about the video call or walked/flew to another location. No fear or aggression behavior was observed. However, some birds exhibited anxious behavior when the caregivers moved away from the screen per the protocol (e.g., pacing, looking at the caregiver and vocalizing, flying/walking to the caregiver, and moving away from the tablet). Additionally, when caregivers adjusted the tablet setup during calls, some birds backed away from the screen. Based on these observations, we adjusted the protocol to allow the caregivers to be in close proximity during the calls and encouraged them to interact with their birds during the calls (e.g., pointing to the screen and verbally encouraging their bird). This led to a reduction in frustration/anxious behavior.

3.3 Learnings from Expert Survey and Pilot Study

The learnings from the pilot and expert survey led to various refinements of the protocol, such as:

Human in the loop: The presence of and encouragement from caretakers appeared instrumental in reducing birds’ neophobia (fear of new objects and experiences). Humans also helped ensure safety by taking the screen away in case of destructive behavior. This does raise the question of intrinsic vs. extrinsic motivation; however, our approach is intended as a form of enrichment that should fit within the animal’s existing social and living context.

Time zone synchronization: To ensure that all calls requested by the birds could be answered, groups were optimized for availability. We grouped birds by bird size, species, and caregiver timezone for the study.

Screen exposure-agnostic: Since not all birds in a larger study would be touch-screen trained, we adjusted the protocol for the full study to include ringing a bell to request the bird selection screen and then selecting a picture of the bird they wanted to call, either via a tablet or a physical photo.

Corroboration: In addition to ringing the bell, we included a second level of corroboration of the bird’s intention: touching the image of the other bird to call. This served both as corroboration of intention and as an indication of choice when the group comprised more than two individuals.

Limiting number and duration of calls: We limited the number and duration of calls to reduce the risk of caretaker fatigue as well as to alleviate the unknown risk of overattachment or pair bonding in the birds.


4 MATERIALS AND METHODS

To our knowledge, a longitudinal at-home study following a large number of birds is unique in the literature. Some studies have included home-based birds, including a single African grey [11] and a single cockatoo studied for one month [12]. Parrots have also been studied in laboratory settings, e.g., one [10] or three African greys over a five-day period [1], 12 cockatoos participating in a touch-screen study over 14 weeks [62], or 62 orange-winged Amazons studied in a lab for stereotypies over four weeks [58]. In this project, we worked longitudinally over three months with 18 individuals in a home setting. Each bird was followed closely and individually supported.

4.1 Participants

To recruit home-based pet parrot participants, we advertised through social media and within the Parrot Kindergarten network. Regarding inclusion criteria, each parrot was required to be over one year old, to have no known behavioral issues, and to be comfortable with looking at screens and touching objects. The parrot caretakers were required to have some prior training working with animals and to have sufficient available time to facilitate the interactions. Eighteen parrots were selected to participate in the study (P1–P18), none of whom had participated in the pilot experiment. Participants were grouped according to time zone and availability as well as bird size/species. During Phase 1, the birds were divided into five groups of three to four birds each. After Phase 1, some participants were excluded (P3, P5, and P15) as they appeared uncomfortable during calls and stayed as far away as possible from the tablet, not engaging in any interaction despite additional support. As these types of behavior, if repeated, could lead to further frustration and possible trauma, we excluded them from the study for ethical reasons. This left a total of 15 remaining participants, reorganized into three groups of three and three groups of two birds. Figure 3 summarizes the participants’ IDs, names, species, sexes, and prior experiences with tablets, as well as their groupings during Phases 1 and 2.

Figure 3: Subject demographics including participant ID, species, age, sex, social history, Phase 1 grouping (groups of 4 birds), Phase 2 grouping (groups of 2 to 3 birds), and device used for bird video-calling. Three birds (P3, P5, and P15) were released from the study at the end of Phase 1 and did not participate in Phase 2.

4.2 Housing and Setup

Participant birds resided in family homes and stayed in their regular environment during calling sessions. Bird caretakers were instructed not to change anything in their bird’s regular schedule, activities, and feeding during the study, except for the study session times themselves. The birds had access to food and water ad libitum, including during calling sessions. Special treats were only provided during the bell/screen-touch/call association training in Phase 1, but not during calls nor during Phase 2. The setup, illustrated in Figure 4, included the following physical items:

a tablet or cell phone (bird device) with Facebook Messenger that the bird felt comfortable with

a second device (phone or camera) to record the interactions

a tripod or kickstand for the bird device to keep it stationary

a tripod or support to hold the recording device

a dedicated toy bell, not previously used by the bird

For the calling session location, the participants used play areas or cages to which the birds were already acclimated, including perch trees, cages, sofas, tables, and other household furniture. The birds were perched in a location from which they could approach the screen to within a few centimeters as well as retreat to at least one meter away. The calling device had to be protected by a solid case to avoid possible injuries. Participants used devices they already had at home, including various touch-screen tablets and cell phones ranging in size from six to 11 inches, adapted to the bird’s size (summarized in Figure 3). Caretakers were instructed to use the bird device at 75% brightness. Facebook Messenger was used for group communication and bird video calls. Participants received emailed communications and live training sessions from the researchers. They were provided with instructions, scripts, visual illustrations for setup, and several live remote presentations and meetings to prepare them and answer their questions. The participants were also provided with 24/7 text support through a group chat with the research team to ensure all their questions were answered.

Figure 4: Setup illustrations: during the first few calls of Phase 1, the caregiver was instructed to hold the calling device (top or bottom middle image, depending on bird size); during later calls, when the bird was more comfortable, the device had to be placed on a stationary attachment (bottom left or bottom right image, depending on bird size).

4.3 Protocol

4.3.1 Phase 1: Meet-and-Greet.

During Phase 1, the parrots were introduced to other birds on video and received training on the bell/touch-screen/call association. Study sessions for Phase 1 comprised four to six 30-minute meet-and-greet sessions. Based on caretaker availability and matching species, five groups were formed (and later reorganized into six smaller groups during Phase 2). For communication and coordination of the study, group chats with the researchers were created for each group. Within each group, parrots were assigned an order in which to meet individually with the other birds of their group, as illustrated in Figure 5 (bottom left). The system was set up to show a page containing photos of the other birds in the group, from which the caller bird had to choose, providing both corroboration and the choice of callee bird. The photo order was shuffled between sessions to avoid side preference. As illustrated in Figure 5 (top left), the meet-and-greet sessions served as training for the bell/screen/call association. To learn to initiate calls, the birds were encouraged by their caretaker to touch the bell three times and then the target bird’s picture on the tablet three times. They received a treat after each touch. Following the bell-picture touch sequence, the caregiver video-called the target bird with the sound muted and the tablet at a distance. Once the bird was displaying comfort behaviors (watching the screen, feather preening, etc.), the sound was turned up slowly, and the tablet was moved to within touching range of the parrot. Each bird met twice with all the other birds in their group.

Figure 5: Experimental protocol. The protocol consists of two phases: Phase 1 (meet-and-greet) comprised six individual sessions for each bird; the script and steps are described in the top left image. During Phase 2 (open calls), the birds were given access to their bell for a three-hour period; the protocol is described in the top right image.

Caregivers were instructed to end calls if the birds displayed “discomfort” behavior (walking/flying/looking away from the device or pacing). Each video was reviewed in a timely manner for missed disengagement cues (leading to further support and training for caretakers), any stress behavior from the bird when interacting with the bell/screen (leading to a supplemental stress-reduction protocol designed by a bird behaviorist, including the bird becoming the first caller, limiting calls to one minute, and switching from touching the screen to touching a printed paper version), or any fear from the other bird (leading to a supplemental fear-reduction protocol and reorganizing groups for Phase 2). A total of 18 parrots (P1–P18) completed Phase 1. Three birds (P3, P5, and P15) were released from the study due to a lack of engagement and potential stress behavior despite the fear-reduction protocols.

4.3.2 Phase 2: Open Calls.

For Phase 2, the 15 remaining participants were reorganized into six groups (see Figure 3) to account for the released birds and for affinities. The participants scheduled eight open-call sessions of three continuous hours each, during which all of the birds in their group were available to receive and make calls and the bell was made available to each bird. If a bird rang the bell, the caregiver presented the picture(s) of possible callee birds, and if the bird selected one of them by touching their picture, a call was made. For a sustainable time commitment and caretaker effort, each bird was allowed up to two outgoing calls of a maximum of five minutes each. As there is no prior research on how frequently birds would choose to make video calls, we limited the birds to two outgoing calls per session to mitigate potential risk and fatigue. Since the birds were in groups of two to three, this limited each bird’s total calls per session to between four and six (up to two outgoing, plus up to two incoming from each other bird), preventing fatigue for the birds receiving and making calls and for the caregivers managing the calls.
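The sketch below illustrates the Phase 2 call limits described above (two outgoing calls per bird per session, five minutes per call). The class and method names are our own illustrative choices; in the study, calls were facilitated manually by caretakers rather than by software enforcing these rules.

```python
MAX_OUTGOING_PER_SESSION = 2
MAX_CALL_MINUTES = 5

class OpenCallSession:
    """Minimal bookkeeping for one three-hour open-call session."""

    def __init__(self, group_birds):
        self.outgoing_counts = {bird: 0 for bird in group_birds}

    def can_call(self, caller: str) -> bool:
        return self.outgoing_counts[caller] < MAX_OUTGOING_PER_SESSION

    def register_call(self, caller: str, callee: str, minutes: float) -> None:
        if not self.can_call(caller):
            raise ValueError(f"{caller} has already made {MAX_OUTGOING_PER_SESSION} outgoing calls")
        if minutes > MAX_CALL_MINUTES:
            raise ValueError("calls are ended at five minutes at the latest")
        self.outgoing_counts[caller] += 1

session = OpenCallSession(["P16", "P17", "P18"])
session.register_call("P17", "P18", minutes=5)
session.register_call("P17", "P16", minutes=2)
print(session.can_call("P17"))  # False: P17 has used both outgoing calls
```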

Participants were provided with demo videos and scripts (see Figure 5, top right). Similarly to most of Phase 1, the tablets were placed in a stationary location 15–45 centimeters away from the birds, depending on the bird’s size. Participants were reminded to look for disengagement criteria that should lead to ending the call before five minutes (i.e., walking or flying away from the call, sharp vocalizations, wing fluttering, and moving/flying away from the screen). After each session, caretakers uploaded the video recordings and submitted a post-session diary. Videos were reviewed in a timely manner to ensure participants’ adherence to the protocol and the birds’ continued engagement. At the end of Phase 2, each participant filled out a post-study survey and was given an individual 30-minute post-study interview.

4.4 Ethical Standards and Enrichment Focus

In alignment with HCI and ACI ethical standards [55, 78], we sought to minimize any discomfort or fear during the training process while emphasizing free choice and consent in the animals’ interactions. Several rules were adopted as part of the protocol to ensure animal comfort and reduce stress both for the birds and the humans. Four axes were chosen based both on prior works [55, 78] and insights from the pilot to establish the ethics guidelines for the study protocol: 1) training, 2) support, 3) stress & fatigue, and 4) synchronization. Despite unavoidable power differences between birds and human caretakers, this parallel approach aims to create more balanced dynamics. For each axis, we present specific rules adopted to ensure ethical and positive experiences for both human and bird participants.

Training
- Human participants: Instructions & feedback (setup, tablet handling); training to recognize their bird’s fear/aggression reactions
- Bird participants: 2 weeks of training on the bell/screen/call association; individually meeting all birds from their group; groups changed if interpersonal tensions developed; tablet muted & distant until the bird showed comfort

Support
- Human participants: Live meetings and webinars during Phases 1 and 2; personal support from a parrot training expert; access to a dedicated website (videos, instructions, FAQs); regularly monitored group chat
- Bird participants: Extra fear-free training protocols if hesitancy/fear arose; accommodations if the bird was uncomfortable with touch (printed pictures); accommodations if afraid of the bell (trained to land on the tablet instead)

Stress/Fatigue
- Human participants: Group rearrangement to accommodate schedules; only 8 Phase 2 sessions with a clear end date; regularly monitored group chat
- Bird participants: Maximum of five-minute calls; videos checked to ensure ongoing comfort of birds; maximum of two outgoing calls per bird to limit activity level

Schedule/Synchronization
- Human participants: Birds grouped to fit time zones & availability; support in scheduling sessions; extra time allowed if there were scheduling conflicts; regularly monitored group chat
- Bird participants: 3-hour sessions to give birds time to use their agency to call; calls timed for the birds’ optimum activity level; two outgoing calls max per bird to limit activity level; bell placement synchronized to reduce frustration; in trios, bell removed during calls for the third bird

Table 1: Ethics-based rules developed and followed during the protocol design and study


5 ANALYSIS AND RESULTS

Based on prior work and best practices in the fields of animal behavior and cognition and ACI/HCI, we focused our data analysis on assessing the calls along four relevant variables: perception, agency, engagement, and perceived benefits. Dropout decisions were made at the end of Phase 1; consequently, dropout parrot data were included in the perception analysis, which only takes into account Phase 1 data. None of their data was included in the agency, engagement, or perceived-benefits analyses. We later discuss potential negative outcomes, which include the dropout parrots’ experience.

5.1 Data

We collected the video submissions and diary inputs. The processed data was gathered from the videos and verified using the diary inputs. Based on these submissions, we measured the parrots’ interaction times with the bell, video, and tablet, as well as their engagement times as caller and callee birds. Regarding the lengths of calls and disengagements, although human participants were instructed to stop the calls when their birds exhibited disengagement behaviors (walking or flying away, facing away from the screen for an extended period of time, etc.), the end-call time was also confirmed by the second author, an expert parrot behaviorist with 10 years of professional avian experience, to further verify whether the birds showed earlier signs of disengagement that should have ended the call earlier but may have gone unnoticed by the human caretakers. We noted eight such cases and adjusted them in our analysis. When such disengagement signs were overlooked, the caretakers were notified to further inform them of behaviors that indicated they should end the call.

5.2 Perception Analysis (Phase 1)

To assess whether the bird participants appeared to perceive the other bird on the screen and potentially make sense of the video calls, we looked at the meet-and-greet sessions (Phase 1), as they represented the birds’ first introduction to video calling. For each bird, we encoded behaviors that could denote perception and sense-making, drawing from the prior works presented in the Background. We did not aim to evaluate the birds’ cognitive ability to understand the situation of the video call, but instead looked for signs of visual and auditory perception as well as interactive behaviors. As reviewed in Section 2.1, we specifically looked at changes in behavior between instances when the other bird was present versus absent from the screen. Thus, we focused on timely responses (i.e., responses less than two seconds after the event) to the other bird appearing or disappearing from the screen. We specifically took note of the following:

Distance the bird chose to stand from the screen (close/medium/far). Each bird was given one meter of leeway on their perch to get close to or far from the screen. If the bird stayed on the far end of their perch or flew away, it was counted as “far”; if the bird moved back and forth or did not approach the screen to the closest point, it was counted as “medium”; if the bird spent most of their time as close as they could to the screen, leaned towards it, or touched it, it was counted as “close”.

Presence of a timely response to the other bird either appearing on screen, entering or leaving the view frame.

Clear indication that the bird was “following” the other bird on the screen, either by continuously touching the location of the other bird, or by clearly following the other bird’s movements visually on screen.

Presence of timely responses to auditory stimuli coming from the device.

Two authors independently labelled each video and obtained an inter-rater reliability of 89%, measured as percent agreement between raters. The main disagreements were about closeness to the screen, as some birds did not have a way to be closer to the screen due to their perch positions. Two other researchers subsequently reviewed this categorization, and any disagreements were resolved through discussion among all four authors.
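For reference, percent agreement between two raters is simply the share of items assigned identical labels; a toy sketch (with invented labels, not the study’s data) follows:

```python
def percent_agreement(rater_a, rater_b):
    """Share of items (in %) on which the two raters assigned the same label."""
    assert len(rater_a) == len(rater_b), "raters must label the same items"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# Invented example labels for the distance category:
rater_a = ["close", "medium", "close", "far", "close"]
rater_b = ["close", "medium", "medium", "far", "close"]
print(f"{percent_agreement(rater_a, rater_b):.0f}% agreement")  # 80% on this toy example
```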

ID | Distance | Reaction to bird appearing on screen | Reaction to bird going off frame | Follows bird on screen | Reaction to audio
P1 | close | yes (headbobs, vocalises, looks intensely) | yes (rings bell, looks around screen) | yes (visually, with movements) | yes (N, V, TT)
P2 | medium | no | no | no | yes (V, TT, says "hello")
P3 | far | yes (head twists) | no | no | no
P4 | medium | no | no | no | yes (N, V)
P5 | close | yes (looking, walking away) | yes (looks behind screen) | no | yes (N, V)
P6 | close | yes (vocalises, leans in, tries to touch) | yes (looks behind where bird left) | yes (follows, touches) | yes (RC, vocalises back)
P7 | close | yes (comes closer, then focuses on caregiver) | no | yes (touches) | no
P8 | close | yes (vocalises, headbobs) | no | yes (touches) | yes (N)
P9 | close | yes (movement, puffed head feathers) | no | no | yes (V)
P10 | medium | yes (looks, flies away, returns with P9) | no | no | yes (N)
P11 | medium | yes (vocalisations, looks when bird appears) | no | no | yes (V, N, TT)
P12 | close | no | no | no | yes (vocalises back)
P13 | close | yes (touches, puts head on screen) | yes (touches where bird disappears) | yes (constant touch) | yes (N, V)
P14 | close | yes (touches, vocalises when bird appears) | yes (comes closer, looks behind tablet) | no | yes (V)
P15 | medium | no | no | no | yes (listens, V, TT)
P16 | medium | yes (looks, headbobs) | no | no | yes (N, V, RC)
P17 | medium | yes (looks, comes closer, vocalises) | no | no | yes (N, V, RC)
P18 | medium | yes (flies away when bird appears) | no | no | no

Table 2: Perception-related results during Phase 1: Meet-and-greet, including IDs, bird names, distances to screen, timely reactions (<2 seconds) to the other bird appearing on screen or moving off frame, following the other bird’s movements on screen, reaction to audio (V: reaction to vocalisations, N: reaction to hearing their own name from screen, RC: responds to commands, and TT: turn taking).

Table 2 summarizes the perception-related results for each bird participant. Regarding standing distance from the screen, 9 out of 18 (50%) birds stood close to the screen, exhibiting frequent touching and leaning behaviors (P1, P5, P6, P7, P8, P9, P12, P13, and P14).

Regarding timely responses to the other bird moving into and off the frame, 14 out of 18 (78%) birds exhibited clear timely responses to the other bird appearing on screen or entering the frame. Among these 14 birds, three reacted by looking and then walking/flying away to create some distance (P5, P10, and P18), while six reacted by coming closer, leaning toward, and/or touching the screen (P6, P7, P8, P13, P14, and P17). Six birds (P1, P6, P8, P11, P14, and P17) reacted by vocalizing, and three (P1, P8, and P16) reacted with head-bobbing behaviors.

Five out of 18 (28%) parrots reacted when the other bird moved off-screen (P1, P5, P6, P13, and P14), all of which also reacted to the other bird appearing on-screen. Reactions included looking around/behind the screen (P1, P5, P6, and P14) or touching the location where the bird disappeared and immediately returning to ring the bell (P13). Moreover, five out of 18 (28%) birds appeared to “follow” the other bird on the screen (P1, P6, P7, P8, and P13), all of which also reacted to the other bird appearing on screen. These following behaviors included repeatedly touching the screen at the exact location of the other bird (P6, P7, P8, and P13) or staring at the other bird’s position (P1).

Besides reactions to visual stimuli, 15 out of 18 (83%) birds showed timely responses to the audio coming from the device (all except P3, P7, and P18). Out of these 15 birds, four reacted to the audio but did not show any reaction to the other bird visually appearing or disappearing from the screen (P2, P4, P12, and P15). Nine birds reacted to hearing their own names uttered by the caretaker on the other end of the call, although two of them (P8 and P10) did not appear to react to any other sounds from the calls. Thirteen birds (72%) reacted to vocalizations from the other bird on the call.

5.3 Agency (Phase 2)

To evaluate whether birds made use of their newfound agency during Phase 2 (open calls), we adopted a quantitative approach to measuring agency. For each bird and each session, we measured how many times the bird triggered an outgoing call (up to two outgoing calls per bird per session) and looked at the distribution and variability between and within groups. We measured each bird’s individual social score as the ratio between the number of calls triggered and the total potential number of calls over the eight sessions. Pearson’s correlation coefficient was used to test the linear relationship between the number of calls the parrots made and the number of calls they received.
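A minimal sketch of these two measures follows; the per-bird call counts are invented for illustration, and SciPy’s pearsonr is used here simply as a convenient stand-in for whatever statistics package was actually employed.

```python
import numpy as np
from scipy.stats import pearsonr

def social_score(calls_triggered: int, sessions: int = 8, max_per_session: int = 2) -> float:
    """Ratio of outgoing calls triggered to the maximum possible over the study."""
    return calls_triggered / (sessions * max_per_session)

# Hypothetical per-bird counts of outgoing and incoming calls (not the study's data).
calls_made = np.array([16, 3, 10, 12, 7, 14])
calls_received = np.array([14, 5, 9, 13, 6, 15])

scores = [round(social_score(n), 2) for n in calls_made]
r, p = pearsonr(calls_made, calls_received)
print("social scores:", scores)
print(f"Pearson r = {r:.3f} (r^2 = {r**2:.3f}), p = {p:.4f}")
```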

During Phase 2, every bird had the possibility to trigger up to two calls per session for a maximum of five minutes per call. All six groups ran eight sessions except for Group 1, which only ran seven. Out of a total of 234 possible calls, the birds made 147 calls. All of the birds triggered at least one call during Phase 2. Over the eight sessions, the minimum number of outgoing calls was three (P4), and the maximum was 16 (P1 and P17), which was also the maximum possible number allowed by our protocol.

Figure 6: Overall illustration of all calls made during Phase 2. For each group, the calls triggered by every bird are color-coded and represented on the timeline of the session’s three hours (horizontal axis: time in h:m). Five-minute calls are represented by two colored cells, and shorter calls are represented by single colored cells.

Figure 6 illustrates all calls made during Phase 2. Each call was triggered by the bird ringing their bell and then successfully corroborating their intention to make a call by touching/pointing at the photo of the bird to call on their tablet or printed paper. Unlike in the pilot, in which some calls went unanswered, during the study every call triggered was answered by the other bird/human. Figure 7 (left) summarizes the number of calls for each bird and each session and their social score. On average across all participants, the birds triggered their maximum of two outgoing calls 53% of the time, one call 21% of the time, and no calls 25% of the time; in essence, at least one call was triggered 74% of the time on average across all birds.

Figure 7: Call distribution per bird and outgoing vs. incoming calls.

The broad distribution of calls made, depending on the bird, suggests high variability between birds and between groups. While Groups 3, 4, and 5 triggered calls most frequently (81%, 94%, and 90% of all possible calls, respectively), Group 1 triggered calls parsimoniously (26% of possible calls). Group 2 showed an evolution from few to more calls, and, inversely, Group 6 started with a high call count that slightly decreased over time. Although the variability in the number of outgoing calls triggered between groups was high (SD = 4.3), the variability within groups was relatively low for all groups except Group 2 (SD_G1 = 0.6, SD_G2 = 4.0, SD_G3 = 0.0, SD_G4 = 1.4, SD_G5 = 1.5, and SD_G6 = 0.7). A Pearson’s correlation test showed a significant positive correlation between the number of calls received and the number of calls made (R² = 0.667, p = .0072 < 0.05) (see Figure 7, right). This indicates that birds who were called frequently also made a high number of calls, suggesting a balancing of social motivation within groups, with some groups showing high social motivation and others less. This highlights the social dynamic created by the use of the system.

5.4 Engagement (Phase 2)

Aside from the number of outgoing calls triggered, we also looked quantitatively at the birds’ engagement during the calls, indicative of their interest, by measuring how long they remained engaged in calls they themselves triggered, using the corrected duration of calls. We examined the correlation between the use of their agency (individual social scores) and their engagement during calls. In addition to engagement within calls, we also investigated engagement throughout the sessions by measuring the time it took each bird to trigger their first call once their bell became available. Time-to-first-call was measured as the duration between the session start and the first call, corrected by subtracting the duration of previous calls triggered by other birds, as during that time the bird could not trigger new calls.
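A small sketch of this correction (our illustration; argument names are assumptions, not the study’s code):

```python
def corrected_time_to_first_call(session_start_min: float,
                                 first_call_start_min: float,
                                 prior_call_durations_min: list[float]) -> float:
    """Waiting time before a bird's first outgoing call, excluding time during
    which the line was occupied by calls triggered earlier by other birds."""
    raw_wait = first_call_start_min - session_start_min
    return raw_wait - sum(prior_call_durations_min)

# Example: the bird's first call starts 42 minutes into the session, but group
# mates' earlier calls occupied 5 + 3 minutes, so the corrected wait is 34 minutes.
print(corrected_time_to_first_call(0, 42, [5, 3]))  # 34
```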

Looking at the call lengths, 59.8% of the calls lasted the maximum call duration of five minutes without disengagement by either bird, 13.6% of calls ended before the maximum five minutes because the callee bird disengaged, and 4.7% of calls ended because of unrelated technical issues (three instances of battery issues and four instances of network issues). In total, 21.7% of the calls ended before the maximum five minutes because the caller bird disengaged. With regard to the engagement of the bird who triggered the call, the caller appears to have stayed engaged in 86.0% of the calls. This was measured by adding up the cases in which the call lasted five minutes, the calls in which the callee bird disengaged, and the calls that ended due to technical difficulties.

Reason for call ending | Full 5 minutes | Callee bird disengaged | Caller bird disengaged | Other (tech issues)
Number | 88 | 20 | 37 | 7
Percentage | 59.8% | 13.6% | 21.7% | 4.7%

Table 3: Reason for call endings summarizing the number of calls that ended due to reaching the maximum time of five minutes, cases where the caller or callee bird disengaged, and calls that ended due to technical difficulties

For each bird, we computed an engagement score: the number of calls they triggered during which they did not disengage, divided by the total number of calls they made. An engagement score of one means that none of the bird’s triggered calls ended with the caller disengaging, and a score of zero means that all of their triggered calls ended with the caller disengaging. We observed a significant correlation between a bird’s social score (i.e., use of their agency) and their engagement score (p = 0.02 < 0.05), meaning that the more a bird used their agency, the more likely they were to stay engaged during the calls they triggered. This consistency further supports the birds’ interest in the calls and the potential benefits and enrichment from the experience of using the system.

Besides engagement in individual calls, we also measured how fast the birds triggered their first call. The box plot in Figure 8 illustrates the distribution of time-to-first-call per bird for every session. Although not significant, we observe a trend of lower average time-to-first-call for birds with higher social scores (p = 0.099 > 0.05), suggesting a possible link between how many calls birds trigger and how fast they trigger them once their bell becomes available.

Figure 8:

Figure 8: Box plot of corrected time-to-first-call in minutes for each bird throughout all the sessions, with participant pairwise p-values (a), and social score versus average corrected time-to-first-call with linear correlation (R² = 0.176, p = .099 > .05) (b)

5.5 Bird Caregiver Survey and Interview

At the end of Phase 2, during the individual 30-minute semi-structured interviews, caretakers were asked about perceived benefits for their birds, potential behavior changes, and their interest in continuing to provide parrot calls for their birds. Caution is needed when considering caretakers’ comments, as human perception of animal experience may be less objective than behavior measured from the videos by trained parrot experts.

Although many participants commented on the steep learning curve and on the difficulties of time commitment and schedule synchronization, all of them considered the experience worthwhile for their birds and themselves. When asked about the perceived benefits and value of the system for the birds, 100% of the participants reported positive experiences: 71.4% of the caretakers responded that they believed their bird had a very positive experience during calls, while the remaining 28.5% believed their bird had a moderately positive experience, depending on the call. None of the caretakers reported a negative experience. Likewise, when asked whether they believed their birds benefited from the calls beyond immediate enjoyment, all but one (92.9%) answered yes, with the remaining caretaker responding “maybe.” Various benefits were reported for the bird participants, the human participants themselves, and their relationships. We thematically organize the insights in terms of 1) benefits for the birds, through general behavior and newly learned or mirrored behaviors, and 2) benefits for the humans and interspecies bonding.

5.5.1 Benefits for the Birds.

The caretakers reported various newly observed behaviors supporting a beneficial outcome for the birds; e.g., P7’s caretaker reported, “She came alive during the calls.” Caretakers also commented on their bird’s intrinsic interest in the experience. P10’s caretaker stated, “He still got treats for several of the apps, but not for Messenger. Yet he would choose Messenger above the others every time when it was available.” P2’s caretaker wrote, “He was engaged because he was watching the screen, talking to [P1], inviting [P1] to come play with him. When [P1] walked off the screen, he called for [P1] to come back.” Additional feedback regarding interest included, “He enjoyed the learning process and would come to the play stand when I set up the equipment and be very interested” (P16) and “He would clearly ring the bell, and very decisively select a friend” (P17). Some of the caretakers noted that the birds behaved toward the screen the way they would react to a real person or real bird: “[She] did what she does when someone comes in the room” (P8); “It surprised me that he treated the call differently than anything else on the tablet” (P18). These statements reinforce the birds’ apparent sense-making of on-screen stimuli.

Caretakers’ observations of their birds during the study included more confidence (P10, P11), calmer behavior (P17, P10, P13, P14), new foraging behaviors (P10), and new flying behaviors (P11). No negative behavior was reported. P11 seemed to become more confident, as the caretaker reported: “After the calls with [P10], she started flying more. I think she realized that flying is an okay behavior. [...] I’m very happy about this!” Similarly, P13’s human reported: “She seemed calmer in general during the weeks of the study. I believe she benefited from [it].”

Caretakers also reported interactive mirroring behaviors between the birds during calls (P8, P10, P9, P11, P1, P2, and P16), including both birds playing with toys, foraging, head-bobbing, dancing, waving, saying “hello,” vocalizing, learning new sounds, and “singing” together. For example, “[P7] waved back, that’s something she had not done before” (P7). This was particularly relevant in the P10/P9 and P2/P1 pairs, both of which had high social scores: “He learned foraging behaviors that I have tried and tried for a year to teach him. This is the first time he would forage” (P10); “He would preen with [P10]. He would play with his toys as if to show [P10] his toys.”; or “If [P1] disappeared from the screen, [P2] looked for him - under the iPad - and also tried to call him back. The birds sang and danced together, [...] they also preened at the same time.” (P2)

5.5.2 Benefits for the Caretakers and Interspecies Bonding.

Human participants expressed having learned more about their birds and being better able to recognise behaviors: “It was wonderful to meet new birds and human friends. The feedback we were able to share with one another helped me to learn more about [P9] and other behaviors.” (P9) “I enjoyed the study. I enjoyed watching [P2] try to figure out where [P1] actually was, looking behind the iPad and off to the side, when [P1] went off-screen.” (P2). The sessions also provided special bonding time between humans and birds, mediated by technology. Although some birds appeared to get intrinsic enjoyment from the calls themselves, for others the enjoyment might also come from the special attention they received during calls, particularly in multi-bird households. Caretakers’ improved understanding of their animals may provide healthy ground for interspecies bonding. Some birds were more involved and comfortable during calls when perched on their owners (P10, P4). Reports included: “We did something together” (P7), “We got a stronger relationship” (P8), and “It was engaging for us to learn something new, it helped strengthen our bond through learning” (P6). Despite the time-consuming work involved for caretakers, a majority (64%) reported intending to continue providing their bird with opportunities to call other birds.


6 DISCUSSION

6.1 Perception

We analyzed specific, timely behaviors to determine whether the birds demonstrated a perceptible response to the activity and audio from the screens. Our results support the idea that parrots engage in a diverse range of “sense-making” of the screen, triggered by visual and auditory stimuli. Most birds appeared to react to the presence of another bird on screen visually (78%) and/or aurally (72%). Given this distribution, touching the other bird on the screen or reacting to the other bird leaving the frame may be seen as a higher-order form of “sense-making” than reacting to the other bird appearing on screen. Indeed, all of the birds that followed the other bird on screen, or that reacted to their correspondent disappearing, also reacted to the other bird appearing on screen.

Finding 1: Most parrots reacted to, and appeared to make sense of, the presence of another bird on the screen.

Although these results do not provide evidence of acute visual or auditory perception, nor a full picture of the birds’ understanding of the system, they support the potential of screen-based parrot-parrot video-calling for social enrichment. While most birds reacted to another bird’s appearance on the screen by touching the screen, vocalizing, greeting, or head bobbing, three of the birds reacted by flying away. In most calls, one bird in particular, P10, a 10cm parrotlet, flew away and then flew back only in the presence of one particular caller, P9. P1 and P2, both older male macaws, also reacted to the other bird’s presence or absence on the screen. When calls were initiated, P2 often vocalized, “Hi! Come here! Hello!” If P1 left the screen, P2 rang his bell, ostensibly asking him to return. Notwithstanding the uncertainty about birds’ ability to engage with screens due to CFF, these behaviors suggest some form of sensory perception from screen interactions.

Finding 2: Parrots reacted differently depending on who was on the call and on their personal preferences for call behaviour.

6.2 Agency

Every bird in Phase 2 triggered calls and participated in the video calls. One key finding was the significant correlation between the number of outgoing calls and the number of received calls, which led to the development of a social score for each bird and each group. This connection may be explained by several factors; those that could increase motivation to call include increased familiarity with the other bird leading to rapport, and increased special attention from caretakers. Another interesting dimension is “who called whom.” Finally, the groups of two bird participants had a higher call frequency than the groups of three, as coordinating schedules was easier, leading us to believe that in future deployments, groups of two may allow for more and easier enrichment opportunities through socialization.

Finding 3: The more the parrot received calls, the more the parrot made video calls.

6.3 Engagement

The validity of the proposed social scores is further supported by the correlation between high social scores and low disengagement from calls. Various factors may also influence engagement, including size, age, household composition, and human daily availability. Small size and a higher critical flicker fusion frequency (CFF) lead to faster visual processing, which may result in shorter engagement times without necessarily reflecting a lower value of the calls. For instance, P10 generally flew away as soon as the call began but then returned only if the callee was P9. However, no significant correlation was found between bird size and engagement. Additional longitudinal validation and conspecific grouping are needed to establish the effects of specific factors. Regardless of individual factors, we observed that the more a bird uses their agency, the more likely they are to stay engaged during the calls they trigger.

Finding 4: The more a bird uses their agency, the more likely they are to stay engaged during the calls they trigger.

6.4 Caretaker Involvement

In both the pilot and the full study, the role of the human in the loop appeared instrumental, from initial training in the calling system to continued presence and encouragement during the calls. The video-call system was intended to enrich the bird’s everyday context, in which humans already play an important role. However, this required balance and monitoring, as caretakers’ enthusiasm and desire for their bird to engage could be stressful for the parrots. This reinforces the idea that, in any future system, caretaker training and protocol adherence are paramount to the successful deployment of a calling device centered on birds’ agency; otherwise, birds may choose to disengage because they feel forced into the interactions.

Finding 5: The role of the human is paramount to an ethical, successful usage of the system that benefits bird welfare.

6.5 Potential Risks

Although the research team provided timely feedback and corrections, we observed three potential negative aspects of bird-calling devices. The first was the risk of caretakers’ over-enthusiasm, which had to be managed to avoid forcing interaction on the birds; enthusiastic caretakers tend to miss cues indicating a bird’s discomfort, creating stress for the animal. A second potential risk relates to stimuli on the screen that may frighten the bird (surprise reactions to sound or sudden visual changes). These were anticipated in the protocol but required additional stress-reducing measures; every bird that exhibited a fear reaction responded well to these measures and continued through the study. The third potential drawback relates to habituation and/or frustration arising from the virtual nature of the experience. Although for all the birds in Phase 2 the observed benefits were deemed to outweigh the drawbacks (given their inability to ever meet other birds in person and the risks of violent behavior noted in the introduction), long-term outcomes need to be established longitudinally before recommending such an approach for long-term use.

Finding 6: Potential risks include birds’ discomfort from human enthusiasm or screen stimuli, and unknown long-term effects.

6.6 Study Limitations

Our approach required a large time commitment from caretakers and the need to synchronize schedules. While all participants reported interest in continuing the calls, it is unclear whether the calls are sustainable long-term without the continual support and structure provided by the research team. The limited field of view of the touchscreen devices’ built-in cameras also introduced limitations, as birds regularly went off-screen for the other bird (out of camera range), making the viewing experience inconsistent in some settings.

One could also argue that the birds might have responded similarly to a video replay of a parrot call rather than a live bird. Future work could investigate this question; however, several factors kept us from attempting that experiment in the current study. First, although it is ubiquitous for humans to be exposed to played-back stimuli (audio, video, etc.), for animals it is more natural to treat all stimuli as live, present, and interactive. To the question “Would the birds react the same way to pre-recorded videos of other birds?”, rather than seeking an answer that speaks to their intelligence, we are more interested in questioning what it means to expose them to pre-recordings. The birding and ACI communities have both raised ethical concerns about deceiving animals. Although playback experiments may be scientifically meaningful for better decoding species signalling, they rely on the animal responding genuinely to stimuli they may believe to be live. This study aimed to bring meaningful enrichment to the birds, not to evaluate their cognitive processing. Finally, although some may argue that a bird might be more easily entertained by watching videos than by interacting with another live bird, we believe not only that live interaction may support an appropriate species identity, but also that by connecting two parrots live, we provide enrichment and socialisation to two birds with one call.

6.7 Design Implications

Based on our experiment and prior research, we derived a set of design implications for implementing a video-calling system for parrot-parrot remote interaction. Regarding the overall setup, the bird should have ample choice over their distance from the screen. This can be realized using a long perch in an area that allows approaching, retreating, and even hiding. In our study, not all birds used a perch, and some preferred to perch on their humans during the interaction; this should be established case-by-case to meet each parrot’s needs. To optimize parrot perception, the “selfie” camera and the display should ideally run at a high frame rate, support UV capture, and offer a wide viewing angle so the bird can move around their environment. The camera should be placed far enough from the screen to keep the bird in view while they lean toward and touch the screen. The screen itself should ideally be curved, to suit lateral vision, and sized to the bird; this could initially be implemented using three synchronized tablet displays. The screen should not be mobile but fixed relative to the perch. To protect the privacy of the human home, the physical setup could include a backdrop, or the system should use an effective virtual-background system adapted to bird recognition. The audio system should capture and play at a high sampling rate, and a cardioid microphone can be used to minimize ambient noise and limit the capture of private household audio. The speaker’s frequency response should be adapted to the species and allow for left/right panning. Upon answering a call, the audio should start at low volume and gradually increase to the optimal level. Triggering calls could be done through a dedicated button or another automatic trigger. Special consideration should be given to cases where the other bird is unavailable to respond; this could include showing an enlarged photo of that bird for a few minutes.
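To gather these recommendations in a machine-readable form, the sketch below collects them into a simple configuration object with a gradual volume ramp on call answer; every parameter name and default value is an assumption for illustration, not a specification from the study.

```python
# Illustrative configuration capturing the design recommendations above.
# All names and default values are assumptions, not specifications from the study.
from dataclasses import dataclass

@dataclass
class ParrotCallConfig:
    camera_fps: int = 120              # high frame rate to suit avian temporal vision
    wide_angle_camera: bool = True     # keep the bird in view near and far from the screen
    mic_pattern: str = "cardioid"      # reduce ambient pickup and protect household privacy
    max_call_minutes: int = 5          # call-length cap used in the study sessions
    answer_volume_ramp_s: float = 10.0 # seconds to fade audio from low to target volume

def playback_volume(seconds_since_answer: float, cfg: ParrotCallConfig,
                    target: float = 1.0) -> float:
    """Linear fade-in of playback volume after a call is answered."""
    if seconds_since_answer >= cfg.answer_volume_ramp_s:
        return target
    return target * seconds_since_answer / cfg.answer_volume_ramp_s

cfg = ParrotCallConfig()
print(playback_volume(5.0, cfg))  # -> 0.5 of target volume halfway through the ramp
```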

6.8 Future Work

While this study provides a foundation for exploring live parrot-to-parrot video calls, many open questions remain. Chief among them is whether a significant difference arises between birds grouped with others of the same species and birds paired across species. Similarly, do different parrot species find video calls more (or less) meaningful as enrichment? It would also be important to examine optimal call durations and frequencies for parrots, and whether the “high-pay” training method involving a bell ring and a selection should be lowered to an easier threshold. Research could also explore the role of video calls between birds and absent caretakers, for instance for birds that need to be physically distant from their human due to travel, adoption, etc. The current study assesses agency and engagement quantitatively; however, a wealth of behavioral information was also captured in the session video recordings, which could provide another angle for understanding the animal experience during the calls. Finally, this study expands the potential of technology-based enrichment to improve animal well-being. Crucially, it provides a framework for force-free training and agency-oriented explorations of online video interactions for other animals as well. In all of this work, choice and control are central to animal agency and well-being. This approach also opens the door to a broader “animal internet” centered on animal agency, ethics, and interspecies relationships.


7 CONCLUSION

Following a pilot experiment and an expert survey, we ran a three-month study with 18 pet birds to evaluate the potential value and usability of a parrot-parrot video-calling system. We assessed the system in terms of perception, agency, motivation, engagement, and overall perceived benefits. With 147 corroborated bird-triggered calls, our results show that every bird used the system, that most of them exhibited high engagement, and that all caretakers reported perceived benefits. The role of the caretaker appeared central in the birds’ interaction with the technology. The experience appeared to provide benefits not only to the parrots, through bird-specific interactions and behavior learning, but also to the human participants, who learned more about their animal companions. Leveraging web technology ultimately allowed the birds to bond socially with other birds while also reconnecting intimately with their human companions.


ACKNOWLEDGMENTS

We wish to thank all the birds and their humans for their participation. We also thank Dr. Irene Pepperberg, Ken Ramirez, Helen Dishaw, Cassie Malina, and Jenna Duarte Stallard for their expert feedback and support. The ethics committees of Glasgow University approved this project for animal subjects (number EA 01/22) and human subjects (number 300210172). This work was partially supported by a Royal Society of Edinburgh Fellowship [grant 2514].


Supplemental Material

3544548.3581166-talk-video.mp4

Pre-recorded Video Presentation

3544548.3581166-video-figure.mp4

Video Figure

