Ethomics is the field of study concerned with applying computer vision techniques to characterise animal behaviour. The Ethoscope platform, developed by the Gilestro lab, is designed for high-throughput ethomic experiments with Drosophila melanogaster (the common fruit fly), an important model organism for behavioural studies. The platform consists of a camera connected to a Raspberry Pi microcomputer suspended over a lit behavioural arena containing the flies [1]. The lab has developed an offline multiple-object tracking system to record the locations of fruit flies in video footage captured on the Ethoscope platform. This system is built on the Viola-Jones object detection framework, which is computationally efficient but comparatively inaccurate [2].
This report has two main aims:
1) To improve the performance of the previously developed Viola-Jones object detection system, extending it to a real-time (online) tracking system.
2) To develop a more accurate tracking pipeline using more contemporary Convolutional Neural Networks (CNNs) for object detection and tracking.
Object detection concerns locating instances of an object within an image. Given an input image, an object detection system should return the number of instances in the image and their locations, typically as bounding boxes: the coordinates of the rectangle that encloses each object, often given as the top-left and bottom-right corners.
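Detections are commonly compared using Intersection-over-Union (IoU), the ratio of the overlap between two bounding boxes to their combined area; it is the standard measure for scoring a predicted box against a ground-truth box. A minimal sketch, assuming boxes in (x1, y1, x2, y2) corner form:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes.

    Each box is a tuple (x1, y1, x2, y2): top-left and bottom-right corners.
    Returns a value in [0, 1]; 1 means identical boxes, 0 means no overlap.
    """
    # Corners of the intersection rectangle (empty if boxes are disjoint).
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])

    # Clamp to zero so disjoint boxes give zero intersection area.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

For example, two 2x2 boxes offset by one pixel in each direction overlap in a 1x1 region, giving an IoU of 1/7.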
In this project, the Viola-Jones algorithm and two CNN-based detectors, Faster R-CNN and YOLOv4, were employed for object detection [3][4].
Object tracking is the process by which detected object instances are assigned persistent IDs, linking detections across frames into individual tracks. In this study, SORT (Simple Online and Realtime Tracking) and its deep learning extension, DeepSORT, are employed [5][6].
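The core of frame-to-frame tracking is data association: deciding which new detection continues which existing track. SORT does this by scoring track-detection pairs with IoU (against Kalman-predicted track positions) and solving the assignment with the Hungarian algorithm; the sketch below simplifies this to a greedy best-first match on raw boxes, purely for illustration:

```python
def iou(box_a, box_b):
    """IoU of two boxes in (x1, y1, x2, y2) corner form."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def match_detections(tracks, detections, iou_threshold=0.3):
    """Greedy IoU association of detections to existing tracks.

    tracks:     dict mapping track ID -> last known bounding box
    detections: list of bounding boxes from the current frame
    Returns a dict mapping detection index -> matched track ID.
    (SORT proper uses the Hungarian algorithm on Kalman-predicted
    boxes; this greedy variant is a simplified illustration.)
    """
    # Score every track-detection pair that clears the IoU threshold.
    pairs = []
    for track_id, track_box in tracks.items():
        for det_idx, det_box in enumerate(detections):
            score = iou(track_box, det_box)
            if score >= iou_threshold:
                pairs.append((score, track_id, det_idx))

    # Accept pairs best-first; each track and detection matches at most once.
    pairs.sort(reverse=True)
    matches, used_tracks, used_dets = {}, set(), set()
    for score, track_id, det_idx in pairs:
        if track_id in used_tracks or det_idx in used_dets:
            continue
        matches[det_idx] = track_id
        used_tracks.add(track_id)
        used_dets.add(det_idx)
    return matches
```

Unmatched detections would spawn new tracks and unmatched tracks would be aged out after a fixed number of missed frames; DeepSORT additionally re-scores pairs with an appearance embedding so identities survive longer occlusions.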