
Advancements in combining electronic animal identification and augmented reality technologies in digital livestock farming


Scientific Reports volume 13, Article number: 18282 (2023)


Modern livestock farm technologies give operators access to a multitude of data thanks to the high number of mobile and fixed sensors available on both livestock farming machinery and the animals. These data can be consulted via PC, tablet, and smartphone, which must be handheld by the operators, increasing the time needed for on-field activities. In this scenario, the use of augmented reality smart glasses could allow the visualization of data directly in the field, providing a hands-free environment in which the operator can work. Nevertheless, to visualize specific animal information, a connection between the augmented reality smart glasses and electronic animal identification is needed. Therefore, the main objective of this study was to develop and test a wearable framework, called SmartGlove, able to link RFID animal tags and augmented reality smart glasses via a Bluetooth connection, allowing the visualization of specific animal data directly in the field. Moreover, another objective of the study was to compare different levels of augmented reality technologies (assisted reality vs. mixed reality) to assess the most suitable solution for livestock management scenarios. For this reason, the developed framework and the related augmented reality smart glasses applications were tested in the laboratory and in the field. Furthermore, the stakeholders' point of view was analyzed using two standard questionnaires, the NASA Task Load Index and the IBM Post-Study System Usability Questionnaire. The outcomes of the laboratory tests underlined promising results regarding the operating performances of the developed framework, showing no significant differences compared to a commercial RFID reader. During the on-field trial, all the tested systems were capable of performing the task in a short time frame. Furthermore, the operators underlined the advantages of using the SmartGlove system coupled with the augmented reality smart glasses for the direct on-field visualization of animal data.


In recent decades, many technologies have been introduced into the livestock farming sector. Among these, one of the earliest technologies was radio frequency identification (RFID), which is used for the electronic identification (EID) of animals1. RFID systems are composed of two parts: a transponder or tag (ear tags, rumen boluses, or injectable glass tags) and a transceiver (portable or fixed)2. The use of EID is mandatory in Europe for sheep and goats (Reg. CE n. 21/2004), while it is voluntary for cattle. RFID tags can use two communication protocols, half-duplex (HDX) and full-duplex (FDX). As described in ISO 11785:19963, these two technologies differ in the modulation of the response signal, return frequencies, encoding and bit rate of transmission. However, even if the response telegram structure differs for HDX and FDX systems, the structure of the unique animal code (64 bits) is the same and is regulated by ISO 11784:19964. Even if this technology is considered established for identification, tags can only store a few bits (128 bits in FDX and 112 bits in HDX), which correspond to the unique identification code of the animal, giving no additional information.

Moreover, a large variety of data are collected in modern farms from different sensors thanks to the spread of precision livestock farming (PLF) technologies. The most common definition of PLF is "individual animal management by continuous real-time monitoring of health, welfare, production/reproduction, and environmental impact"5. However, PLF is not the only term used to describe this kind of approach to livestock farming; among the others, "smart livestock farming" and "smart animal agriculture" are the most commonly used6. All these terms refer to the use of process engineering principles or technologies to manage livestock production through smart sensors, monitoring animal growth, production, diseases, behavior and components of the macroenvironment7. PLF sensors can be fixed (e.g., cameras or weather stations) or wearable by the animal (e.g., boluses, collars or ear tags)8. Fixed sensors, especially cameras, can be used for non-invasive monitoring of animals, thanks to the use of machine vision. This technology can provide very specific and precise information concerning animal behavior and health status, but its use for individual identification still requires further investigation9. In general, PLF technologies have made available a large amount of data; however, their consultation and interpretation by farmers are often considered a difficult and time-consuming task10. Often, the interoperability between different sensors is limited and data are stored in different databases accessible through PCs or mobile devices, such as smartphones and tablets. However, the visual presentation of raw or summarized data, especially directly on-field, represents a crucial part of the effective use of sensor outcomes11. Moreover, even when the data can be consulted through mobile devices, the process implies stopping normal farm management activities because the smartphone or tablet occupies the operator's hands.
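For illustration, the commonly documented layout of the ISO 11784 code (a 10-bit country code and a 38-bit national identification number within the 64-bit telegram) can be decoded as in the following Kotlin sketch. This is an explanatory example only, assuming the identifier has already been received as a 64-bit integer; it is not part of the SmartGlove software described later, and real transponder telegrams also involve HDX/FDX-specific bit ordering and header and CRC fields.

```kotlin
// Illustrative decoder for a 64-bit ISO 11784 animal identification code.
// Assumed layout (commonly documented): bits 0-37 national ID, bits 38-47 country code,
// bit 63 animal flag; the remaining bits are reserved or used as flags.
// This is an explanatory sketch, not the SmartGlove firmware.

data class Iso11784Id(
    val isAnimal: Boolean, // true when the tag identifies an animal
    val countryCode: Int,  // ISO 3166 numeric country code (10 bits)
    val nationalId: Long   // national identification number (38 bits)
)

fun decodeIso11784(raw: ULong): Iso11784Id = Iso11784Id(
    isAnimal = ((raw shr 63) and 1uL) == 1uL,
    countryCode = ((raw shr 38) and 0x3FFuL).toInt(),
    nationalId = (raw and 0x3F_FFFF_FFFFuL).toLong()
)

fun main() {
    // Hypothetical tag: country 380 (Italy), national ID 123456789012, animal flag set.
    val raw = (1uL shl 63) or (380uL shl 38) or 123_456_789_012uL
    val id = decodeIso11784(raw)
    println("animal=${id.isAnimal} country=${id.countryCode} id=${id.nationalId}")
}
```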

In this scenario, smart glasses for augmented reality (ARSG), if connected to EID, could allow specific animal databases to be consulted directly on-field, leaving the operator hands-free12. ARSG are wearable head-up displays, connected to or integrating a miniaturized computer, that add virtual information to the user's reality. There are many types of ARSG that can be grouped according to price, weight, powering system (internal or external battery), visualization systems (video, optical or retinal), operating system (Android-based, Windows, etc.), interaction methods and resistance to bumps, dust and water13. Augmented Reality (AR) consists of the visualization of digital information superimposed on the real environment, providing additional information to users and helping them to solve tasks at the same time14. At the moment, AR is not a widespread technology in the agricultural domain; it is more commonly used in manufacturing15,16,17, industrial sectors18,19, medicine20,21, psychology22,23 and education24,25,26. Several studies have already shown that AR can be a useful technology in agricultural contexts27,28. Azuma (1997)29 defined the three basic characteristics of the AR system and expanded the definition in 2001. However, in recent years, with the advancement of technology and the diversification of devices that can implement AR technology, a new definition of AR was needed. In fact, Rauschnabel et al.30 redefined AR as a hybrid experience consisting of context-specific virtual content that is merged into a user's real-time perception of the physical environment through computing devices. AR can be further refined based on the level of integration of the digital elements in the real world. The level of integration defines a specific AR spectrum that ranges from assisted reality (low integration) to mixed reality (high integration). This more comprehensive definition allows us to include a wider variety of technologies in the AR spectrum, as in the case of assisted reality. This technology consists of the visualization of head-stable content not connected to real-world objects, and it is commonly implemented in mobile devices, such as smartphones, and in smart glasses (SG).

Another level of the AR spectrum is mixed reality (MR), which was first described by Milgram and Kishino31 as "a subclass of virtual reality (VR) related technologies that involve the merging of the real and virtual worlds". Unlike assisted reality systems, MR increases the user's spatial and visual interaction possibilities32. The MagicBook is considered one of the first examples of an MR system33. It can be read and held like a normal book but, through a dedicated MR display, shows a set of digital information and 3D models aligned with the real-world book. The device also allows the user to be immersed in the virtual scene, exploring the entire reality-virtuality continuum with one system. Currently, one of the most advanced MR devices is the Microsoft HoloLens (Microsoft, USA), which has the capability to show digital information and virtual objects in the form of holograms. These objects are aligned to the real world thanks to real-time environment scanning and can be manipulated with bare hands. The principal situation in which MR should be chosen over other AR solutions is when virtual objects need to be manipulated and physically interacted with34.

The aim of this study was to design and develop a framework that allows the connection of animal electronic identification to different types of ARSG and to evaluate the performance of the developed system in the laboratory and on-field. Moreover, a comparison of assisted and mixed reality systems was carried out to assess the most suitable solution in livestock farm environments.

SmartGlove (SMGL; Fig. 1) is currently a TRL-3 (technology readiness level-3: analytical and experimental critical function and/or characteristic proof of concept) prototype that reads the unique animal code from RFID tags and sends it to the ARSG to display all the information related to that specific animal. It is composed of an Arduino control unit with integrated Bluetooth and an RFID reader board connected to a 125 kHz antenna. All the components are enclosed in a 3D-printed plastic case that can be worn as a bracelet, with the antenna extending over the back of the hand, attached to a glove (Fig. 1).

SmartGlove hardware components: (A) 3.7 V 2000 mAh Li-Ion battery, (B) 3D-printed case, (C) connection status LED, (D) 125 kHz copper antenna, (E) Arduino Bluetooth motherboard, (F) FDX-B RFID reader controller, (G) support glove.

The SMGL is connected, via Bluetooth, to custom software for SG, which displays the following animal information related to the tag's identification code: the animal ID code, group (A or B), name, date of the last parturition, age (in years), milk production of the last day (kg), milk production of the last week (kg), the number of parturitions and presence of mastitis (P for positive or N for negative). The SMGL can be connected to different types of ARSG. The first ARSG adopted in this study was the Epson Moverio BT-300 (BT300), an Android-based assisted reality device with an optical, binocular, see-through display. The second was the Microsoft HoloLens 2 (HL), a Microsoft Windows-based MR device with a holographic display. A complete list of the characteristics of both SG can be found in Table 1.
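For clarity, the record shown for each scanned tag can be thought of as a simple data structure keyed by the tag's identification code. The following Kotlin sketch mirrors the fields listed above; the class and field names are illustrative and not taken from the SmartGlove source code.

```kotlin
import java.time.LocalDate

// Illustrative model of the per-animal record displayed on the smart glasses.
// Field names are hypothetical; the fields themselves follow the list above.
data class AnimalRecord(
    val idCode: String,          // unique RFID identification code (primary key)
    val group: Char,             // management group: 'A' or 'B'
    val name: String,
    val lastParturition: LocalDate,
    val ageYears: Int,
    val milkLastDayKg: Double,   // milk production of the last day (kg)
    val milkLastWeekKg: Double,  // milk production of the last week (kg)
    val parturitions: Int,       // total number of parturitions
    val mastitis: Boolean        // true = positive ('P'), false = negative ('N')
)
```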

To design and develop the supporting applications that run on the BT-300 and HL devices, we used the Android SDK for the former and the Universal Windows Platform (UWP) for the latter. Both SDKs enable the developer to create the same experience for their respective target devices, with no noticeable difference in the use of the resulting applications. For both application versions, the same implementation architecture was adopted, so the main differences between them are platform-dependent and at the code level.

The development environments enabled us to create the same user interface in both the resulting applications. Indeed, both APIs share some common design practices:

The management of the sensors of the device is handled by a Hardware Abstraction Layer, which lets the developer interact with them without knowing how to access them directly;

It is possible to visually design the interfaces using the built-in tools in the Integrated Development Environment applications.

Database management can be handled in the same way since both of the APIs implement the same database engine (SQLite).

Before accessing any device feature, both APIs require the developer to declare beforehand which features the application will ask the user to authorize (either at application install or at runtime); a minimal Kotlin sketch of such a runtime request follows this list.
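On the Android side, a runtime check and request for the Bluetooth permissions needed to talk to the glove could look like the following sketch. It uses the Android 12+ permission names; the request code is arbitrary, and the snippet is an illustration of the general mechanism rather than code from the BT-300 application.

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val REQUEST_BLUETOOTH = 42  // arbitrary request code

// Ask the user for the Bluetooth permissions required to scan for and connect to
// the SmartGlove (the same features must also be declared in AndroidManifest.xml).
fun ensureBluetoothPermissions(activity: Activity) {
    val needed = arrayOf(
        Manifest.permission.BLUETOOTH_SCAN,
        Manifest.permission.BLUETOOTH_CONNECT
    ).filter {
        ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
    }
    if (needed.isNotEmpty()) {
        ActivityCompat.requestPermissions(activity, needed.toTypedArray(), REQUEST_BLUETOOTH)
    }
}
```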

Despite these similarities, the APIs differ in some other features:

The programming languages are different (the Android SDK supports Java and Kotlin for the majority of use cases, with the possibility to use C++ as a performance-critical native language; UWP supports C#, Visual Basic and C++ with WinRT as possible alternatives).

The operating systems the APIs target are different (Android SDK targets Android devices, UWP targets Windows devices).

The user interface, while it can be designed visually in both SDKs, requires different markup languages for customization from the underlying code (Android SDK uses XML-based layouts, UWP uses XAML-based pages).

The notification system includes on-screen popout messages (“toast notifications”), bottom application messages (“Snackbar notifications”) or system notifications on Android SDK, and only system notifications on UWP.

Moreover, regarding the target devices themselves, the interaction techniques and the visualization of the interface differ in a key way: the BT-300 shows a semi-transparent Android interface in front of the user that remains stationary with respect to the user's head movements and is operated through a handheld touchpad controller, whereas the HL uses a billboarding technique to keep the interface in 3D space (with the possibility to enable a "tag-along" feature that moves the interface with the user) and is operated through air gestures and voice recognition.

The software solution consists of three modules (Fig. 2): the glove hardware manager, the database, and the headset application. The SMGL is controlled through an Adafruit board connected to an RFID antenna, which reads rumen boluses or ear tags to recognize the animal and sends the identifier to the headset through a Bluetooth connection. When the interface receives a new identifier, it displays all the information about the animal stored in the database.

Overview of the software solution for smart glasses.
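On the headset side, receiving a new identifier from the glove amounts to subscribing to a notification on a Bluetooth Low Energy GATT characteristic, as described below. A minimal Android/Kotlin sketch of this step follows; the service and characteristic UUIDs are placeholders, and the snippet illustrates the mechanism rather than the actual application code (the HoloLens version relies on the equivalent UWP Bluetooth APIs).

```kotlin
import android.bluetooth.*
import android.content.Context
import java.util.UUID

// Placeholder UUIDs: the real SmartGlove service and characteristic IDs are not published here.
val GLOVE_SERVICE: UUID = UUID.fromString("0000ffe0-0000-1000-8000-00805f9b34fb")
val TAG_CHARACTERISTIC: UUID = UUID.fromString("0000ffe1-0000-1000-8000-00805f9b34fb")

// Connect to the glove and get notified whenever a new RFID identifier is read.
fun connectToGlove(context: Context, device: BluetoothDevice, onTagRead: (String) -> Unit) {
    device.connectGatt(context, false, object : BluetoothGattCallback() {

        override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
            if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
        }

        override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
            val characteristic = gatt.getService(GLOVE_SERVICE)
                ?.getCharacteristic(TAG_CHARACTERISTIC) ?: return
            // Ask to be notified when the characteristic value changes.
            // On real devices the Client Characteristic Configuration descriptor (0x2902)
            // must also be written to enable notifications on the peripheral side.
            gatt.setCharacteristicNotification(characteristic, true)
        }

        override fun onCharacteristicChanged(
            gatt: BluetoothGatt,
            characteristic: BluetoothGattCharacteristic
        ) {
            // In this sketch the glove is assumed to send the tag code as an ASCII string.
            onTagRead(characteristic.value.toString(Charsets.US_ASCII))
        }
    })
}
```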

In the current prototype, the database structure is straightforward. It consists of a Google spreadsheet that farmers can modify to adapt to their needs. This means they can add the information they would like to store about the animals by adding columns to the spreadsheet. The only assumption is that the RFID identifier is the first column in the spreadsheet, serving as the primary key. A copy of the shared spreadsheet can be loaded into the interface application as Comma-Separated Values (CSV) for offline work, and the changes can be synchronized when the network is available. The categories are not fixed in the code. Indeed, they are defined by the CSV file once it has been downloaded from the Google spreadsheet, making it possible to receive data from different database models. Farmers can access the database in the field using the headset interface, and it can be displayed as an overlay on their field of view in the real world through the headset's screen.

The application pairs with the glove at startup by showing an interface reporting the Bluetooth scanning results and connecting with the device through the "Connect" button. Once the device is selected, the application establishes a connection and exchanges the required information using the GATT protocol for Bluetooth Low Energy devices, which preserves the glove battery. After that, the glove can be used to scan the bolus or the animal's ear tag. If the RFID code matches an entry in the database, the application retrieves all the related data, presenting it in a tabular view. Otherwise, the application shows a descriptive message (e.g., that the RFID code is not included in the database). Farmers can customize the visualization by hiding columns of the table they are not interested in. By pressing the "Parameters" button, the application shows a list of all the columns in the spreadsheet, and farmers can select or deselect them.
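The dynamic, CSV-defined database described above can be reduced to a small lookup structure: the header row names the categories, the first column holds the RFID identifier used as the primary key, and every other cell is displayed or hidden on request. The following Kotlin sketch illustrates this idea under those assumptions; class and function names are illustrative and not taken from the application source code.

```kotlin
import java.io.File

// Illustrative in-memory copy of the farm spreadsheet exported as CSV.
// Assumption (as in the prototype): the first column is the RFID identifier (primary key),
// and the remaining columns are free-form categories defined by the farmer.
// Naive CSV splitting is used here; quoted fields with commas are not handled.
class AnimalDatabase(csvFile: File) {

    val categories: List<String>
    private val rows: Map<String, List<String>>

    init {
        val lines = csvFile.readLines().filter { it.isNotBlank() }
        val header = lines.first().split(',').map { it.trim() }
        categories = header.drop(1)                       // everything after the RFID column
        rows = lines.drop(1)
            .map { line -> line.split(',').map { it.trim() } }
            .associateBy({ it.first() }, { it.drop(1) })  // key = RFID identifier
    }

    // Return the requested categories for a scanned tag, or null if the code is unknown.
    fun lookup(rfid: String, visibleCategories: Set<String> = categories.toSet()): Map<String, String>? {
        val values = rows[rfid] ?: return null
        return categories.zip(values)
            .filter { (category, _) -> category in visibleCategories }
            .toMap()
    }
}
```

In this sketch, hiding columns through the "Parameters" button would correspond to calling lookup with a reduced set of visible categories.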

The application allows offline work by locally caching the changes to the database and synchronizing them back to the cloud when the "Update Database" button is pressed. The application notifies the farmer in case of database update issues or when the application has no locally saved data.

To evaluate the operative performances of the different systems, two sets of trials were designed. The first one completed the experiments described in Todde et al.35 and was carried out in the laboratory to assess the complete operativity of the SMGL and to evaluate its RFID tag reading performance and its connection with the ARSG in a controlled environment. The second one was carried out at the experimental sheep farm of the University of Sassari to assess the operating capabilities of the developed framework in a real farm scenario. During all the tests, a commercial RFID reader (F1 Reader produced by Datamars, Spain) was used to compare the developed systems with a conventional tool.

All the procedures of this research, including the trials involving animals (sheep), were carried out in accordance with relevant guidelines and regulations. All animal data used in this study were collected as part of standard farming practices, where the animals were handled exclusively by the technicians of the farm. As such, no part of this research was subject to approval under Italian or EU legislation (D.L. 26/2014 and Council Directive 2010/63/EU) or to the approval of an ethics committee. Moreover, based on previous research protocols, the ATS Sardegna ethical committee reported that, when no healthcare investigators interact with the procedures of the study, the adopted protocol does not require an opinion from the ethics committee. Finally, informed consent was obtained from all participating human subjects for the publication of identifying information and images in an online open-access publication.

To evaluate the operative performance of the SMGL, preliminary tests were carried out in the laboratory of the Department of Agriculture of the University of Sassari. First, the activation distance of the tags with the SMGL was measured. The transponders were placed for 3 s at predetermined distances (10 cm, 5 cm, 4 cm, 3 cm, 2 cm, 1 cm), with 50 repetitions for each distance. For this test, the SMGL was disconnected from the ARSG; however, valid activation of the transponder was confirmed by the vibration feedback of the SMGL. Second, the tag reading process time was measured with a stopwatch, from the activation of the transponder at 2 cm to the complete visualization of the data on the ARSG display. The measurement was repeated 50 times for each ARSG. All the tests were performed with two types of tags, an FDX ear tag (ET) and an FDX rumen bolus (RB), and with both ARSG (BT300 and HL).

On-field trials were carried out to evaluate the performance of the SMGL systems in real livestock activity (Fig. 3). Specifically, the task consisted of the identification of a specific animal through the reading of the RB and the subsequent retrieval of specific information from the farm database. In this case, the information to retrieve was the group to which the animal was allocated (A or B). A total of 18 people were recruited for the study. The group was gender balanced and consisted of students, graduates and local stakeholders. Their mean (minimum–maximum) age and body mass were 34 (22–60) years and 66.4 (49–93) kg, respectively.

On-field tests with operators reading sheep bolus with the developed systems and consulting individual animal information through smart glasses: (A) operator wearing SmartGlove + Epson Moverio BT-300, (B) operator wearing SmartGlove + Microsoft HoloLens 2.

Additionally, 83% of the participants had at least a bachelor's degree, and 59% stated that they had at least basic knowledge of AR and SG concepts and functions. All the participants reported being healthy, having normal or corrected vision with contact lenses or glasses and having no current or recent injuries that limited their normal physical ability. Before any data collection, participants were asked to complete and sign an informed consent form approved by the University of Sassari. All the participants were asked to perform the task with three different types of hardware: HL connected to the SMGL, BT300 connected to the SMGL and, finally, the commercial RFID reader with a paper list. Five animals were placed in a rack, and the participants could choose from which side of the rack to begin the task. When the supervisor gave the signal, the operator could start to scan the animals' RB. To verify the correct identification of the animal, the participants had to communicate to the supervisor the last three digits of the animal ID and the group to which it was allocated. The task was considered completed when all five animals were identified. During the trials, the times for each scan, the total amount of time needed to complete the task, the number of activation failures of the RFID tags and the number of incorrect database consultations by the operator were measured. Tests were performed to evaluate the influence of the hardware type on the execution time, usability and workload. Prior to the test session, participants received a training session to become familiar with both the devices and the methods. After each task simulation, they completed a set of questionnaires for the given condition. The first questionnaire used in the research was the NASA Task Load Index (NASA-TLX), which is considered a valid tool to assess the perceived workload of a task36,37. This questionnaire uses six categories to assess the workload: mental demand, physical demand, temporal demand, performance, effort and frustration. Participants were asked to rate these categories on a bipolar twenty-step scale. The second questionnaire was the IBM Post-Study System Usability Questionnaire (PSSUQ)38, in which participants were provided with different statements (in this case, 6 of the 19 original ones) and asked how much they agreed with each statement on a five-point Likert scale (from strongly disagree to strongly agree). After completion of all the experimental conditions, the participants were asked to provide any opinion or thought about the tested devices and to select the most and least preferred system.
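For reference, the overall NASA-TLX workload score reported in the Results is, in the standard procedure, a weighted combination of the six subscale ratings, with weights obtained from 15 pairwise comparisons between subscales. The following Kotlin sketch illustrates that standard scoring formula; it is not the analysis script used in the study, and the example values are hypothetical.

```kotlin
// Standard NASA-TLX scoring: each of the six subscale ratings (0-100) is multiplied by the
// number of times that subscale was chosen in the 15 pairwise comparisons (weight 0-5), and
// the weighted sum is divided by 15 to obtain the overall workload score.
fun nasaTlxOverall(ratings: Map<String, Double>, weights: Map<String, Int>): Double {
    require(weights.values.sum() == 15) { "Pairwise-comparison weights must sum to 15" }
    return ratings.entries.sumOf { (scale, rating) -> rating * (weights[scale] ?: 0) } / 15.0
}

fun main() {
    // Hypothetical participant, not data from the study.
    val ratings = mapOf(
        "Mental" to 30.0, "Physical" to 20.0, "Temporal" to 25.0,
        "Performance" to 15.0, "Effort" to 35.0, "Frustration" to 10.0
    )
    val weights = mapOf(
        "Mental" to 4, "Physical" to 1, "Temporal" to 3,
        "Performance" to 2, "Effort" to 4, "Frustration" to 1
    )
    println("Overall workload: ${nasaTlxOverall(ratings, weights)}")
}
```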

For the analysis of the influence of the different types of systems on operative performances, a one-way ANOVA was used. Descriptive statistics (arithmetic mean, standard deviation) were calculated for each of the weighted scores of the NASA-TLX and for each category of the PSSUQ. The Kruskal-Wallis test was used to compare the overall scores of the NASA-TLX and PSSUQ due to the nonparametric distribution of the data. RStudio (version 2022.07.2 build 576) was used to perform the statistical analysis.

Table 2 shows the results of the activation distance for the commercial RFID reader and the SMGL with both types of transponders (ET and RB). A maximum activation distance of 5 cm was recorded for both the SMGL (5% ET and 25% RB) and the commercial RFID reader (95% ET and 65% RB) with both types of transponders, although with different success rates. Regarding the SMGL, an acceptable success rate was observed at a distance of 3 cm (70% ET and 60% RB), while a 100% success rate was obtained at 1 cm for both types of transponders. The F1 reader obtained similar results, with an acceptable success rate (95% for ET and 65% for RB) at 5 cm and reliable activation of both ET and RB transponders at 2 cm.

In Table 3, the results of the reading process time are presented. The reading time recorded with the BT300 was 2.16 s with ET and 2.22 s with RB, while for the HL it was 4.30 s with ET and 3.80 s with RB. The HL showed a higher variability in the reading process time than the BT300, with a difference between the minimum and maximum times of 7.90 s for the HL and 2.87 s for the BT300.

The type of system used for animal identification significantly affected the operating performances (p < 0.001). Participants performed the task in a shorter time with the conventional system (mean = 59.79, SD = 15.40) than with the ARSG system (mean = 82.72, SD = 32.81) or with the MRSG system (mean = 98.33, SD = 35.04). The same trend was observed for the average time per tag reading, where the conventional method was the fastest (mean = 12.45, SD = 2.93), followed by the ARSG system (mean = 16.86, SD = 4.34) and finally the MRSG system (mean = 21.32, SD = 7.39). With all the tested systems, no errors occurred (Table 4).

The results of the analyses of the NASA-TLX did not highlight significant differences among the types of system used for any category of the questionnaire (Table 5). However, differences in the scores between the types of system were appreciable. Specifically, the mental demand, temporal demand and frustration levels were lower for the BT300 + SMGL and HL + SMGL systems than for the conventional system (F1 reader + paper list), while physical demand and performance satisfaction showed no particular differences. Finally, the overall workload score was highest for the conventional system (29.72), followed by the HL + SMGL system (27.56) and the BT300 + SMGL system (25.19).

Similar to the outcomes of the NASA-TLX questionnaire, the results obtained with the PSSUQ showed no significant differences. However, appreciable differences in the scores of some categories of the questionnaire can be found (Fig. 4). Regarding the speed of the system, the conventional system showed the best score (4.67) in comparison to the BT300 + SMGL system (4.22) and the HL + SMGL system (3.83). Moreover, the ease of finding information showed substantial differences between the systems, with BT300 + SMGL and HL + SMGL obtaining higher scores (4.56 and 4.50, respectively) than the conventional method (4.00).

Mean score by type of system (F1 reader + paper list, BT300 + SMGL, HL + SMGL) per question of the IBM-PSSUQ questionnaire.

This study aimed to develop a functional framework that allows the visualization of specific animal data directly on-field thanks to the connection between animal RFID tags and smart glasses for augmented reality. The retrieval of specific information on an animal in a livestock farm could improve specific on-farm activities, such as animal grouping based on the number of lambs or the average milk yield39. Moreover, the operative performances of the systems developed were evaluated during laboratory tests and on-field livestock management operations. In addition, as the system is capable of connecting with different types of ARSG, the suitability of various types of SG and different levels of AR technology in the agricultural environment was evaluated. The laboratory tests confirmed the complete capability of the SMGL system to function in a controlled environment, showing a performance similar to that of a commercial RFID reader. These outcomes allowed the upgrade of the SMGL from a TRL-3 (analytical and experimental critical function and/or characteristic proof of concept) to a TRL-4 prototype (component and/or breadboard functional verification in a laboratory environment). Moreover, the on-field task showed that the developed systems can be used as a possible tool for the identification of livestock animals in agricultural environments, upgrading the TRL of the device to level 5 (component and/or breadboard critical function verification in a relevant environment). However, the use of ARSG led to longer operative times compared to conventional systems, consistent with similar studies in other industrial sectors (e.g., assembly), where the use of AR systems showed longer assembly times compared to the use of paper, video, or tablet systems40,41. In this study, however, the longer operative time in the identification task is attributable to the lower success rate of tag activation of the SMGL in comparison to the conventional RFID reader. Additionally, the low level of optimization is underlined by the difference, in terms of time per reading, between the laboratory (Table 3) and the on-field trials (Table 4). In addition, it must be considered that for the conventional methods, paper lists have to be prepared before the start of operations, while both developed systems can be considered ready-to-go. As observed by Drouot et al.42, the level of familiarity of the user with a new technology may have an impact on user performance. In fact, even if the participants received brief training, most of them (12 out of 18) reported no, or low, previous knowledge of AR or SG. Moreover, as confirmed by the NASA-TLX scores, the animal identification task was straightforward, while AR systems showed better results in complex operations43,44. A possible explanation for the difference between BT300 + SMGL and HL + SMGL, in terms of operative performances, could be related to the spatialization of the information in the MR system. In fact, while in assisted reality systems the display with all the information is always visible in front of the operator, in the MR system the information is placed at a precise position in the real environment and, without familiarity with this technology, can be difficult to find45.
Additionally, the different mental workload scores could also suggest that the use of ARSG improves the level of situational awareness (the ability to perceive, understand, and effectively respond to a specific situation), which is one of the most relevant elements for the evaluation of operator safety46. Furthermore, the scores regarding the statements for "simplicity of use" and "speed of the system" in the IBM PSSUQ were higher for the conventional system. In fact, the high level of familiarity of participants with the conventional tools allowed them to complete the task in a shorter time. Nevertheless, as shown by the "easiness of finding information" statement, the ARSG systems permitted a faster localization of the information. The elements discussed previously (i.e., the low optimization level of the SMGL, the lack of familiarity with AR technology and the simplicity of the task) may also have contributed to the small differences among the scores of the three systems in the post-study questionnaires. However, participants underlined the advantages of AR systems in comparison to conventional systems, such as the benefits of voice commands and the possibility of hands-free operation. The participants were also asked to select the most and least preferred systems: 11 out of 18 selected HL + SMGL, 7 out of 18 selected BT300 + SMGL and no one selected the conventional method, which was indicated as the least preferred by 12 out of 18 participants.

This study presented the development and testing of a smart wearable RFID reader that aims to bridge the technological gap between the electronic identification of livestock animals and real-time access to individual information. The device, named SmartGlove, was developed by combining the functionalities of an RFID reader with smart glasses for augmented reality. In addition, to connect the SMGL and the ARSG, two specific applications for the Microsoft HoloLens 2 and the Epson Moverio BT-300 were coded. The laboratory and on-field tests underlined promising operating performances, which allowed the device's TRL to be upgraded from level 3 to level 5. The on-field trials allowed stakeholders to visualize animal information directly on-farm, superimposed on the specific animal, in a short time interval. The participants' feedback confirmed a low cognitive impact and a high usability level for the use of the SMGL connected to ARSG. Finally, the operators underlined their preference for the SMGL systems over conventional systems for animal identification and the consequent visualization of animal data. However, the SMGL is still in a prototype phase, and further improvements are needed, focusing on the miniaturization of the hardware and the design of a more ergonomic and comfortable tool shape. Future work will also focus on upgrading the ARSG software, with a more complete and intuitive user interface and a more comprehensive and automated system for the precision management of animal data.

The datasets generated during the current study are available from the corresponding author on reasonable request.

Aquilani, C., Confessore, A., Bozzi, R., Sirtori, F. & Pugliese, C. Precision Livestock Farming technologies in pasture-based livestock systems. Animal 16(1), 100429 (2022).


Kampers, F. W. H., Rossing, W. & Eradus, W. J. The ISO standard for radiofrequency identification of animals. Comput. Electron. Agric. 24(1–2), 27–43 (1999).

International Organization for Standardization. Radio frequency identification of animals: Technical concept (ISO standard No. 11785:1996). https://www.iso.org/standard/19982.html.

International Organization for Standardization. Radio frequency identification of animals—code structure (ISO standard No. 11784:1996). https://www.iso.org/standard/25881.html.

Berckmans, D. General introduction to precision livestock farming. Anim. Front. 7(1), 6–11 (2017).

Tedeschi, L. O. et al. A glimpse of the future in animal nutrition science. 1. Past and future challenges. Rev. Bras. Zootec. 46, 438–451 (2017).

Tedeschi, L. O., Greenwood, P. L. & Halachmi, I. Advancements in sensor technology and decision support intelligent tools to assist smart livestock farming. J. Anim. Sci. https://doi.org/10.1093/jas/skab038 (2021).


Andonovic, I., Michie, C., Cousin, P., Janati, A., Pham, C. & Diop, D. Precision livestock farming technologies. In 2018 Global Internet of Things Summit (GIoTS), Bilbao, Spain 1–6 (2018).

Oliveira, D. A. B., Pereira, L. G. R., Bresolin, T., Ferreira, R. E. P. & Dorea, J. R. R. A review of deep learning algorithms for computer vision systems in livestock. Livest. Sci. 253, 104700 (2021).

Werkheiser, I. Technology and responsibility: A discussion of underexamined risks and concerns in precision livestock farming. Anim. Front. 10(1), 51–57 (2020).


Van Hertem, T. et al. Appropriate data visualization is key to Precision Livestock Farming acceptance. Comput. Electron. Agric. 138, 1–10 (2017).

Caria, M., Sara, G., Todde, G., Polese, M. & Pazzona, A. Exploring smart glasses for augmented reality: A valuable and integrative tool in precision livestock farming. Animals 9(11), 903 (2019).


Syberfeldt, A., Danielsson, O. & Gustavsson, P. Augmented reality smart glasses in the smart factory: Product evaluation guidelines and review of available products. IEEE Access 5, 9118–9130 (2017).

Hurst, W., Mendoza, F. R. & Tekinerdogan, B. Augmented reality in precision farming: Concepts and applications. Smart Cities 4(4), 1454–1468 (2021).

Runji, J. M., Lee, Y. J. & Chu, C. H. Systematic literature review on augmented reality-based maintenance applications in manufacturing centered on operator needs. Int. J. Precis. Eng. Manuf. Green Technol. 10, 1–19 (2022).

Sahu, P., & Balfour, D. (2022). Smart Manufacturing with Augmented Reality (No. 2022-26-0026). SAE technical paper.

Etonam, A. K., Di Gravio, G., Kuloba, P. W. & Njiri, J. G. Augmented reality (AR) application in manufacturing encompassing quality control and maintenance. Int. J. Eng. Adv. Technol. 9(1), 197–204 (2019).

Boboc, R. G., Gîrbacia, F. & Butilă, E. V. The application of augmented reality in the automotive industry: A systematic literature review. Appl. Sci. 10(12), 4259 (2020).

Mladenov, B., Damiani, L., Giribone, P., & Revetria, R. (2018). A short review of the SDKs and wearable devices to be used for AR applications for industrial working environments. In Proceedings of the World Congress on Engineering and Computer Science, Vol. 1, 23–25.

Moro, C., Štromberga, Z., Raikos, A. & Stirling, A. The effectiveness of virtual and augmented reality in health sciences and medical anatomy. Anat. Sci. Educ. 10(6), 549–559 (2017).

Barsom, E. Z., Graafland, M. & Schijven, M. P. Systematic review on the effectiveness of augmented reality applications in medical training. Surg. Endosc. 30(10), 4174–4183 (2016).


Jun, H., & Bailenson, J. (2020). Effects of behavioral and anthropomorphic realism on social influence with virtual humans in AR. In 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) 41–44. IEEE.

Radu, I. (2012). Why should my students use AR? A comparative review of the educational impacts of augmented-reality. In 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 313–314. IEEE.

Saritha, R. C., Mankad, U., Venkataswamy, G., & Bapu, S. B. (2018). An Augmented Reality ecosystem for learning environment. In 2018 IEEE International Conference on Advanced Networks and Telecommunications Systems (ANTS) 1–6. IEEE.

Wu, H. K., Lee, S. W. Y., Chang, H. Y. & Liang, J. C. Current status, opportunities and challenges of augmented reality in education. Comput. Educ. 62, 41–49 (2013).

Dunleavy, M., Dede, C. & Mitchell, R. Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. J. Sci. Educ. Technol. 18(1), 7–22 (2009).

Sara, G., Todde, G. & Caria, M. Assessment of video see-through smart glasses for augmented reality to support technicians during milking machine maintenance. Sci. Rep. 12(1), 15729 (2022).


Katsaros, A., Keramopoulos, E., Salampasis, M. A prototype application for cultivation optimization using augmented reality. In 8th International Conference on Information and Communication Technologies in Agriculture, Food & Environment, September 2017, Crete, Greece.

Azuma, R. T. A survey of augmented reality. Presence Teleoperators Virtual Environ. 6(4), 355–385 (1997).

Rauschnabel, P. A., Felix, R., Hinsch, C., Shahab, H. & Alt, F. What is XR? Towards a framework for augmented and virtual reality. Comput. Hum. Behav. 133, 107289 (2022).

Milgram, P. & Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 77(12), 1321–1329 (1994).

Holz, T. et al. Mira-mixed reality agents. Int. J. Hum. Comput. Stud. 69(4), 251–268 (2011).

Billinghurst, M., Kato, H. & Poupyrev, I. The MagicBook: A transitional AR interface. Comput. Graph. 25(5), 745–753 (2001).

Maas, M. J. & Hughes, J. M. Virtual, augmented and mixed reality in K–12 education: A review of the literature. Technol. Pedagogy Educ. 29(2), 231–249 (2020).

Todde, G., Sara, G., Pinna, D., Artizzu, V., Spano, L.D., & Caria, M. Smart glove: Development and testing of a wearable RFID reader connected to mixed reality smart glasses, in AIIA 2022: Biosystems Engineering Towards the Green Deal. AIIA 2022. Lecture Notes in Civil Engineering (eds Ferro, V., Giordano, G., Orlando, S., Vallone, M., Cascone, G., & Porto, S.M.C.), Vol. 337. Springer (2023).

Hart, S. G., & Staveland, L. E. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In Advances in Psychology, Vol. 52 139–183. North-Holland (1988).

Hart, S. G. (2006). NASA-task load index (NASA-TLX); 20 years later. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 50, No. 9 904–908. Sage Publications.

Lewis, J. R. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int. J. Hum. Comput. Interact. 7(1), 57–78 (1995).

Caria, M., Todde, G., Sara, G., Piras, M. & Pazzona, A. Performance and usability of smartglasses for augmented reality in precision livestock farming operations. Appl. Sci. 10(7), 2318 (2020).

Min, J. S., Kwak, G. & Hwang, W. Comparison among types of assembly manuals based on paper, video and augmented reality. ICIC Express Lett. 14(3), 303–310 (2020).

Funk, M., Kosch, T., & Schmidt, A. (2016). Interactive worker assistance: comparing the effects of in-situ projection, head-mounted displays, tablet, and paper instructions. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing 934–939.

Drouot, M., Le Bigot, N., de Bougrenet, J. L., & Nourrit, V. (2021). Effect of context and distance switching on visual performances in augmented reality. In 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) 476–477. IEEE.

Havard, V., Baudry, D., Jeanne, B., Louis, A. & Savatier, X. A use case study comparing augmented reality (AR) and electronic document-based maintenance instructions considering tasks complexity and operator competency level. Virtual Real. 25(4), 999–1014 (2021).

Wiedenmaier, S., Oehme, O., Schmidt, L. & Luczak, H. Augmented reality (AR) for assembly processes design and experimental evaluation. Int. J. Hum. Comput. Interact. 16(3), 497–514 (2003).

Drouot, M., Le Bigot, N., Bricard, E., De Bougrenet, J. L. & Nourrit, V. Augmented reality on industrial assembly line: Impact on effectiveness and mental workload. Appl. Ergon. 103, 103793 (2022).

Grabowski, M., Rowen, A. & Rancy, J. P. Evaluation of wearable immersive augmented reality technology in safety-critical systems. Saf. Sci. 103, 23–32 (2018).

This work was supported by the Agritech National Research Center and received funding from the European Union Next-GenerationEU (NATIONAL RECOVERY AND RESILIENCE PLAN (PNRR)—MISSION 4 COMPONENT 2, INVESTMENT 1.4—DD 1032 17/06/2022, CN00000022). This manuscript reflects only the authors' views and opinions; neither the European Union nor the European Commission can be considered responsible for them. INTERDISCIPLINARY RESEARCH PROJECTS—DM 737/2021 RESOURCES 2021–2022 "Knowledge and sustainable management of agricultural and forestry systems with the sustainable improvement of primary production: the case of cattle breeding in Sardinia".

Department of Agricultural Sciences, University of Sassari, Viale Italia 39/A, 07100, Sassari, Italy

Daniele Pinna, Gabriele Sara, Giuseppe Todde, Alberto Stanislao Atzori & Maria Caria

Department of Mathematics and Computer Science, University of Cagliari, Via Ospedale 72, 09124, Cagliari, Italy

Valentino Artizzu & Lucio Davide Spano


D.P.: Conceptualization, Methodology, Data curation, Formal analysis, Investigation, Writing—original draft, Writing—Reviewing and Editing; G.S.: Conceptualization, Methodology, Data curation, Formal analysis, Investigation, Writing—original draft, Writing—Reviewing and Editing; G.T.: Conceptualization, Methodology, Data curation, Formal analysis, Investigation, Writing—original draft, Writing—Reviewing and Editing; A.S.A. Conceptualization, Methodology, Writing—Reviewing and Editing; V.A.: Software, Writing—Reviewing and Editing; L.D.S.: Conceptualization, Software, Writing—Reviewing and Editing; M.C.: Conceptualization, Methodology, Formal analysis, Writing—Reviewing and Editing, Supervision and Funding acquisition.

The authors declare no competing interests.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Pinna, D., Sara, G., Todde, G. et al. Advancements in combining electronic animal identification and augmented reality technologies in digital livestock farming. Sci Rep 13, 18282 (2023). https://doi.org/10.1038/s41598-023-45772-2



