BioTracker: An Open-Source Computer Vision Framework for Visual Animal Tracking


Hauke J. Mönck1, Andreas Jörg1,2, Tobias von Falkenhausen1, Julian Tanke1, Benjamin Wild1, David Dormagen1, Jonas Piotrowski1, Claudia Winklmayr1,3, David Bierbach4, Tim Landgraf1,*

1 Freie Universität Berlin, FB Mathematik u. Informatik, Arnimallee 7, 14195 Berlin, Germany

2 University of Applied Sciences Kempten, Bahnhofstraße 61, D-87435 Kempten, Germany

3 Humboldt-Universität zu Berlin, Bernstein Center for Computational Neuroscience, Unter den Linden 6, 10099 Berlin, Germany

4 Leibniz-Institute of Freshwater Ecology and Inland Fisheries, Müggelseedamm 310, 12587 Berlin, Germany

*corresponding author: tim.landgraf@fu-berlin.de

Keywords: computer vision, tracking software, framework, animal tracking, animal behavior, ecology, guppy, molly, zebrafish

Abstract

The study of animal behavior increasingly relies on (semi-)automatic methods for the extraction of relevant behavioral features from video or picture data. To date, several specialized software products exist to detect and track animals’ positions in simple (laboratory) environments. Tracking animals in their natural environments, however, often requires substantial customization of the image processing algorithms to the problem-specific image characteristics. Here we introduce BioTracker, an open-source computer vision framework that provides programmers with core functionalities that are essential parts of a tracking software, such as video I/O, graphics overlays, and mouse and keyboard interfaces. BioTracker additionally provides a number of different tracking algorithms suitable for a variety of image recording conditions. The main feature of BioTracker, however, is the straightforward implementation of new problem-specific tracking modules and vision algorithms that can build upon BioTracker’s core functionalities. With this open-source framework, the scientific community can accelerate its research and focus on the development of new vision algorithms.

Introducing BioTracker

Animal tracking, i.e., extracting the spatial position or even specific movement patterns of animals from a video source, has become increasingly popular in biological research [Dell et al., 2014], which is reflected in the increasing number of available software solutions. Many of these programs can handle only a specific tracking problem and are tailored to the image processing tasks defined by the experimental conditions and the animal models under observation.

Throughout the last decade we encountered several image processing tasks that could not be handled by existing tracking algorithms: cascades of ripples on water, fish shoals in the wild, biomimetic robots, and bumblebees or honey bees in their colonies, either with or without markers [Landgraf and Rojas, 2007; Hussaini et al., 2009; Jin et al., 2014; Landgraf et al., 2014; Wario et al., 2015; Landgraf et al., 2016; Wario et al., 2017; Bierbach et al., 2018; Wild et al., 2018]. While each tracking task was solved with a custom algorithm, a significant portion of the source code comprised functionalities required in all applications, such as the user interface, interaction handling, reading and writing video data, and exporting tracking results. Developing new tracking solutions therefore involved reimplementing the same components, consuming development resources that could otherwise have gone into the actual image processing algorithm.
We therefore developed the open-source computer vision framework “BioTracker”, which implements several functionalities for recurring tasks and allows problem-specific tracking algorithms to be loaded dynamically; these algorithms can call framework components to display results or handle user interaction.

Such a separation of core components from dynamically loaded modules is not only beneficial for developers but also helps users find the most suitable tracking solution for the problem at hand. With BioTracker, testing several existing algorithms (or even developing new ones) for a given use case does not require installing new applications or learning new interaction procedures. Users with several different use cases thus benefit from the same look-and-feel and potentially save the time of adapting their subsequent analyses to different tracking output formats.

A similar approach is taken by “SwisTrack” [Correll et al., 2006] and “AnTracks” (www.antracks.org), which allow constructing a custom processing pipeline via a graphical user interface. Graphical programming may serve developers in exploring new algorithms, but it does not provide the same flexibility as a native coding environment. Also, using these building blocks meaningfully may require technical knowledge, which constrains the target audience.

The Java-based framework “ImageJ” [Schindelin et al., 2015] offers developers similar functionality but focuses on processing single images rather than tracking objects in a sequence of images. It nonetheless allows the implementation of tracking algorithms, but some core features, such as visualization and data export, have to be implemented on the module level.

The BioTracker has been employed successfully in some of the aforementioned tracking tasks. For instance, it was used to track fish in an experimental tank in [Bierbach et al., 2018], where it served as the live tracking instance in a closed-loop interaction scenario. Details on how to obtain and operate the program can be found at: https://github.com/BioroboticsLab/biotracker_core/wiki

Software Design

BioTracker consists of four components.
The “core” component provides the shared functionality and is the key to reducing implementation overhead. It loads external trackers as plugins, reads videos, images, and camera streams, and exports tracking data to a generic format. It also provides standard graphical user interfaces for playback, seeking, and recording of videos, zooming images, and manipulating tracking objects. Multiple objects can be tracked simultaneously, and basic trajectory manipulations, such as adding, deleting, moving, correcting, and annotating track points, are supported.
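To illustrate, plugin loading can be realized with Qt’s QPluginLoader. The helper below is a minimal sketch with a hypothetical function name; BioTracker’s core wraps this mechanism in its own module management.

    #include <QDebug>
    #include <QObject>
    #include <QPluginLoader>
    #include <QString>

    // Hypothetical helper: load a tracking module from a shared library
    // and hand its root object to the core. Returns nullptr on failure.
    QObject *loadTrackerPlugin(const QString &path) {
        QPluginLoader loader(path);
        QObject *plugin = loader.instance();
        if (!plugin)
            qWarning() << "Could not load tracker:" << loader.errorString();
        return plugin;
    }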
The “tracking module” component implements the actual tracking and image processing algorithm. This is done by implementing an interface that is loaded by the core component. The “interfaces” component acts as the connector between core and tracking module by defining the interaction methods between the two.
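A tracking module could then take the following shape. This is a sketch under assumed names: ITrackerPlugin, track(), and currentPositions() are hypothetical stand-ins for BioTracker’s actual interface, which is documented in the project wiki.

    #include <opencv2/opencv.hpp>
    #include <cstddef>
    #include <vector>

    // Hypothetical connector as the "interfaces" component might define it.
    class ITrackerPlugin {
    public:
        virtual ~ITrackerPlugin() = default;
        // Called by the core for every frame of the video or camera stream.
        virtual void track(const cv::Mat &frame, std::size_t frameNumber) = 0;
        // Queried by the core to draw the generic overlay and export data.
        virtual std::vector<cv::Point2f> currentPositions() const = 0;
    };

    // A trivial module: report the brightest image point as the tracked position.
    class BrightestPointTracker : public ITrackerPlugin {
    public:
        void track(const cv::Mat &frame, std::size_t /*frameNumber*/) override {
            cv::Mat gray;
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            cv::Point maxLoc;
            cv::minMaxLoc(gray, nullptr, nullptr, nullptr, &maxLoc);
            positions_ = {cv::Point2f(maxLoc)};
        }
        std::vector<cv::Point2f> currentPositions() const override {
            return positions_;
        }
    private:
        std::vector<cv::Point2f> positions_;
    };

In such a design, the core calls track() for every frame and queries currentPositions() to feed the generic overlay and the data export.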
The optional “utility” component provides functionality for the most common tracking issues to simplify tracking module implementation.
The core and interfaces components depend on OpenCV and Qt as external libraries; the core additionally depends on Boost. Consequently, tracking modules have to depend on Qt and OpenCV as well. The code conforms to C++11, targets the major platforms Windows, Linux, and macOS, and follows the Model-View-Controller pattern.

Graphical User Interface

Fig. 1: Main view of the BioTracker. Here, 11 sulfur mollies (Poecilia sulphuraria) are being tracked using the Background Subtraction Tracker. The green circles indicate the locations of the tracked entities, while the black lines indicate their orientations. Tracks and annotations are handled using the control bar on the left. Video playback is handled using the bottom control bar. The top control bar allows data handling and tracker selection. Finally, the right-hand side presents visualization options in the depicted tab, as well as tracking-module-specific parameters in another tab.

The GUI is split into three parts.

The first part is the tracking area, which displays a generic overlay and a custom overlay on top of the input. The former provides the generic trajectory visualization illustrated in figure 1 and handles mouse events and coordinate rectification. The latter is optionally provided by the tracking component to extend or replace these core visualizations. Mouse events include, e.g., right-clicking the tracking area, which opens a menu providing basic control over the tracking data and visualization: entities can be added, removed, or marked, and their color and transparency can be changed.
The second part is the options panel, presented on the right-hand side. It includes the view control, which modulates the generic visualization in the tracking area. Other tabs are reserved for experiment control, notifications, and the parameters of the tracking component, e.g., binarization thresholds for background subtraction.
The last part comprises the control toolbars, which can be moved around freely. The video controls bar manages all playback and recording aspects, including starting, stopping, seeking, and recording the video with or without the overlay. It also displays the current frame number and playback speed and grants control over the maximum playback speed. The content controls allow, for instance, adding tracked elements or annotations, loading videos and trackers, and saving tracked data.
For generic point-like data, we implemented a simple visualization of tracked objects as points with tails indicating the movement direction, as illustrated in the scenario in figure 1. More generic visualizations, e.g., for polygon-shaped objects, are yet to be implemented.

Input rectification and the coordinate system

Tracking systems are usually required to produce world coordinates (cm or mm) rather than image coordinates (pixels). When a planar setup is recorded at an angle, the image of the arena is perspectively distorted. We have implemented a basic camera calibration procedure that is used to rectify this distortion. Tracking modules can use these calibration and rectification features optionally and thereby handle different camera setups correctly.
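For a planar setup, the rectification step amounts to applying a homography, which OpenCV exposes directly. The following sketch illustrates the idea; the corner coordinates and arena dimensions are made up, and BioTracker’s own calibration procedure is not shown.

    #include <opencv2/opencv.hpp>
    #include <cstdio>
    #include <vector>

    int main() {
        // Arena corners as clicked in the image (pixels; made-up values)...
        std::vector<cv::Point2f> imageCorners = {
            {102.f, 74.f}, {913.f, 88.f}, {934.f, 612.f}, {88.f, 598.f}};
        // ...and their known world positions for a 100 x 60 cm arena.
        std::vector<cv::Point2f> worldCorners = {
            {0.f, 0.f}, {100.f, 0.f}, {100.f, 60.f}, {0.f, 60.f}};
        cv::Mat H = cv::getPerspectiveTransform(imageCorners, worldCorners);

        // Rectify one detected position from pixels to centimeters.
        std::vector<cv::Point2f> detection = {{500.f, 300.f}}, world;
        cv::perspectiveTransform(detection, world, H);
        std::printf("world position: %.1f cm, %.1f cm\n", world[0].x, world[0].y);
        return 0;
    }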

Output

Detections can be saved for further analysis in generic formats: comma-separated values (.csv), JSON, and a binary format, with the option to load and edit previous tracking results. Saving is done by serializing Qt properties one after another; this way, developers can extend existing types and annotate which additional data needs to be stored. It is also possible to record the video, optionally zoomed in or including the overlaid tracking information. As BioTracker can handle live video streams from various USB cameras, it may also serve as a free-to-use video recorder.
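A minimal sketch of such property-based serialization is shown below, assuming a hypothetical TrackPoint type (BioTracker’s actual data classes differ). Iterating over an object’s Qt meta-properties automatically picks up any additional properties declared by derived types.

    #include <QMetaProperty>
    #include <QObject>
    #include <QTextStream>

    // Hypothetical data type; BioTracker's actual trackpoint classes differ.
    class TrackPoint : public QObject {
        Q_OBJECT
        Q_PROPERTY(double x MEMBER x_)
        Q_PROPERTY(double y MEMBER y_)
    public:
        double x_ = 0.0;
        double y_ = 0.0;
    };

    // Write all Qt properties of an object as one CSV row. A derived type
    // declaring additional Q_PROPERTYs is exported without further code.
    void writeCsvRow(const QObject &obj, QTextStream &out) {
        const QMetaObject *meta = obj.metaObject();
        for (int i = 1; i < meta->propertyCount(); ++i) {  // index 0 is objectName
            if (i > 1)
                out << ',';
            out << meta->property(i).read(&obj).toString();
        }
        out << '\n';
    }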

Available tracking modules

The BioTracker includes two different example trackers:
(1) The “Background Subtraction Tracker” implements a well-known algorithm best suited for tracking high-contrast objects on a uniform background. This method covers standard scenarios, such as animals on a static, white background, and has been used successfully in other tracking software such as ToxTrac [Rodriguez et al., 2017] and in publications such as [KaewTraKulPong and Bowden, 2001]. Every image is integrated into the previous background according to a weight variable. For detection, new images are subtracted from the background, binarized, and filtered. Then, ellipses are fitted to the blobs found in the foreground image via image moments [Bradski, 1998]. The algorithm can be susceptible to sudden changes in lighting (such as large shadows or sudden spotlights) and to camera movement. It works stably with a slowly changing background (e.g., slow changes in ambient lighting) and single, static objects in the background (e.g., dirt, maze structures). This tracker analyzes successive images separately and merges the detected ellipses from adjacent frames into a track (a condensed sketch follows after this list).
(2) The second example is the “Lucas-Kanade Tracker”. The algorithm computes image derivatives of designated points across time, as introduced by [Lucas and Kanade, 1981] and implemented according to [Bouguet, 2000]. The motion of an object is estimated via the optical flow in a region of interest around it. This approach is susceptible to fast movement but does not require a static background (see the second sketch below).
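The following sketch condenses the background subtraction pipeline of tracker (1) into plain OpenCV calls. The file name, thresholds, and weight value are made-up examples, cv::fitEllipse stands in for the moments-based ellipse fit, and BioTracker’s implementation adds further filtering and the merging of detections into tracks.

    #include <opencv2/opencv.hpp>
    #include <vector>

    int main() {
        cv::VideoCapture cap("fish.mp4");  // hypothetical input video
        cv::Mat frame, gray, background, diff, mask;
        const double alpha = 0.05;  // background update weight

        while (cap.read(frame)) {
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            gray.convertTo(gray, CV_32F);
            if (background.empty())
                gray.copyTo(background);
            // Integrate the current image into the background model.
            cv::accumulateWeighted(gray, background, alpha);

            // Subtract, binarize, and filter the foreground.
            cv::absdiff(gray, background, diff);
            diff.convertTo(diff, CV_8U);
            cv::threshold(diff, mask, 30, 255, cv::THRESH_BINARY);
            cv::morphologyEx(mask, mask, cv::MORPH_OPEN, cv::Mat());

            // Fit an ellipse (position and orientation) to each blob.
            std::vector<std::vector<cv::Point>> contours;
            cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                             cv::CHAIN_APPROX_SIMPLE);
            for (const auto &c : contours) {
                if (c.size() >= 5 && cv::contourArea(c) > 50.0) {
                    cv::RotatedRect e = cv::fitEllipse(c);
                    (void)e;  // e.center and e.angle form one detection
                }
            }
        }
        return 0;
    }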
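Tracker (2) builds on OpenCV’s pyramidal Lucas-Kanade implementation [Bouguet, 2000]. The sketch below shows the essential loop under assumed inputs; point selection and track bookkeeping are omitted.

    #include <opencv2/opencv.hpp>
    #include <vector>

    int main() {
        cv::VideoCapture cap("fish.mp4");  // hypothetical input video
        cv::Mat prev, next;
        if (!cap.read(prev)) return 1;
        cv::cvtColor(prev, prev, cv::COLOR_BGR2GRAY);

        // Designated points to follow, e.g. selected by the user.
        std::vector<cv::Point2f> pts = {{320.f, 240.f}};

        while (cap.read(next) && !pts.empty()) {
            cv::cvtColor(next, next, cv::COLOR_BGR2GRAY);
            std::vector<cv::Point2f> moved;
            std::vector<uchar> status;
            std::vector<float> err;
            cv::calcOpticalFlowPyrLK(prev, next, pts, moved, status, err);
            // Keep only points whose flow could be estimated.
            std::vector<cv::Point2f> kept;
            for (std::size_t i = 0; i < pts.size(); ++i)
                if (status[i])
                    kept.push_back(moved[i]);
            pts = kept;  // tracked positions in the current frame
            next.copyTo(prev);
        }
        return 0;
    }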

Evaluating the output

Fig. 2: The Python-based “Data Analysis Module” can be used to evaluate data created by the BioTracker and by any other application producing .csv trajectories. The “File Selection” section allows loading files and defining agents and their respective data columns. The user can also select time and location slices to gain more in-depth insight into the data. Applicable metrics can be plotted ad hoc or written to files, along with .csv data including all selected metrics.

Along with the BioTracker comes a Python tool named “Data Analysis Module” for evaluating the CSV output produced by the tracking modules. It has a simple user interface and provides an easy way to calculate a number of metrics for arbitrary CSV files using user-annotated columns. Metrics such as speed, inter-individual distance, transfer entropy, and cross-correlation are calculated for all individuals, pairwise where applicable, and written back to new CSV files; metrics and agent trajectories can also be plotted ad hoc. There is also the option to filter arbitrary columns and plot those metrics (see figure 2). The Data Analysis Module is a stand-alone tool and can handle and analyze CSV files from sources other than the BioTracker.

Conclusion

BioTracker is a novel framework that streamlines the implementation of tracking and image processing applications while allowing developers to focus on the task at hand: the tracking algorithms. On the one hand, the software design is simple enough to serve minimalistic trackers, as showcased by the Lucas-Kanade tracker. On the other hand, it is rich in generic, optional features to accommodate unique and complex tasks. Algorithms profit from the performance of a native C++ implementation, which makes the BioTracker a feasible solution for real-time applications as well as computationally intensive tasks. The GUI implements common features for user interaction and encapsulates data serialization and management. As a crowd-improved open-source platform, the probability of introducing bugs in trivial features is low, while a maximum of flexibility, standardized formats, and consistent usage patterns is retained. Two sample trackers showcase the implementation and usage of the framework; a third minimal tracker serves as a demo. In order to enable users to go the whole way from having an animal on video to its tracked movement parameters, BioTracker also provides a tool to analyze and visually evaluate trajectories.

Acknowledgements

The BioTracker, as part of the RoboFish project, was supported by the DFG (to TL: LA 3534/1-1; to DB: BI 1828/2-1). Furthermore, HM and DD have been supported by the Andrea von Braun Foundation through PhD fellowships. Special thanks go to all testers of the BioTracker software, as well as to the students who implemented the foundations of the Background Subtraction Tracker.

Authors’ contributions

Conceived the idea: TL
Supervised the workflow: TL, HM, DB, BW
Developed the software: HM, AJ, TF, JT, BW, DD, TL
Visualization: JP, TF, JT, AJ, TL
Python evaluation (Data Analysis Module): CW
Wrote the manuscript: HM, DB, TL
Tested the software: BW, DD, DB, TL, JP

References

  • [Bierbach et al., 2018] Bierbach, D., Lukas, J., Bergmann, A., Elsner, K., Höhne, L., Weber, C., Weimar, N., Arias-Rodriguez, L., Mönck, H. J., Nguyen, H., Romanczuk, P., Landgraf, T., and Krause, J. (2018). Insights into the Social Behavior of Surface and Cave-Dwelling Fish (Poecilia mexicana) in Light and Darkness through the Use of a Biomimetic Robot. Frontiers in Robotics and AI, 5.
  • [Bouguet, 2000] Bouguet, J.-Y. (2000). Pyramidal Implementation of the Lucas Kanade Feature Tracker: Description of the Algorithm.
  • [Bradski, 1998] Bradski, G. R. (1998). Computer Vision Face Tracking For Use in a Perceptual User Interface. Intel Technology Journal Q2 ‘98.
  • [Correll et al., 2006] Correll, N., Sempo, G., De Meneses, Y., Halloy, J., Deneubourg, J.-L., and Martinoli, A. (2006). SwisTrack: A Tracking Tool for Multi-Unit Robotic and Biological Systems. pages 2185–2191. IEEE.
  • [Dell et al., 2014] Dell, A. I., Bender, J. A., Branson, K., Couzin, I. D., de Polavieja, G. G., Noldus, L. P., Pérez-Escudero, A., Perona, P., Straw, A. D., Wikelski, M., and Brose, U. (2014). Automated image-based tracking and its application in ecology. Trends in Ecology & Evolution, 29(7):417–428.
  • [Hussaini et al., 2009] Hussaini, S. A., Bogusch, L., Landgraf, T., and Menzel, R. (2009). Sleep deprivation affects extinction but not acquisition memory in honeybees. Learning & memory, 16(11):698–705.
  • [Jin et al., 2014] Jin, N., Landgraf, T., Klein, S., and Menzel, R. (2014). Walking bumblebees memorize panorama and local cues in a laboratory test of navigation. Animal Behaviour, 97:13–23.
  • [KaewTraKulPong and Bowden, 2001] KaewTraKulPong, P. and Bowden, R. (2001). An improved adaptive background mixture model for real-time tracking with shadow detection. Video-based surveillance systems(1):135–144.
  • [Landgraf et al., 2016] Landgraf, T., Bierbach, D., Nguyen, H., Muggelberg, N., Romanczuk, P., and Krause, J. (2016). RoboFish: increased acceptance of interactive robotic fish with realistic eyes and natural motion patterns by live Trinidadian guppies. Bioinspiration & biomimetics, 11(1):015001.
  • [Landgraf et al., 2014] Landgraf, T., Nguyen, H., Schröer, J., Szengel, A., Clément, R. J., Bierbach, D., and Krause, J. (2014). Blending in with the shoal: robotic fish swarms for investigating strategies of group formation in guppies. In Conference on Biomimetic and Biohybrid Systems, pages 178–189. Springer International Publishing.
  • [Landgraf and Rojas, 2007] Landgraf, T. and Rojas, R. (2007). Tracking honey bee dances from sparse optical flow fields. Technical Report B 07-11, Freie Universität Berlin, Institut für Informatik.
  • [Lucas and Kanade, 1981] Lucas, B. D. and Kanade, T. (1981). An Iterative Image Registration Technique with an Application to Stereo Vision. In Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI ’81), pages 674–679.
  • [Rodriguez et al., 2017] Rodriguez, A., Zhang, H., Klaminder, J., Brodin, T., Andersson, P. L., and Andersson, M. (2017). ToxTrac: A fast and robust software for tracking organisms. Methods in Ecology and Evolution.
  • [Schindelin et al., 2015] Schindelin, J., Rueden, C. T., Hiner, M. C., and Eliceiri, K. W. (2015). The ImageJ ecosystem: An open platform for biomedical image analysis. Molecular Reproduction and Development, 82(7-8):518–529.
  • [Wario et al., 2015] Wario, F., Wild, B., Couvillon, M. J., Rojas, R., and Landgraf, T. (2015). Automatic methods for long-term tracking and the detection and decoding of communication dances in honeybees. Frontiers in Ecology and Evolution, 3:103.
  • [Wario et al., 2017] Wario, F., Wild, B., Rojas, R., and Landgraf, T. (2017). Automatic detection and decoding of honey bee waggle dances. PLoS ONE. arXiv: 1708.06590.
  • [Wild et al., 2018] Wild, B., Sixt, L., and Landgraf, T. (2018). Automatic localization and decoding of honeybee markers using deep convolutional neural networks.