M3T: Multi-class multi-instance multi-view object tracking for embodied AI tasks

Abstract

In this paper, we propose an extended multiple object tracking (MOT) task definition for the embodied AI visual exploration research task: multi-class, multi-instance and multi-view object tracking (M3T). The aim of the proposed M3T task is to identify the unique number of objects in the environment, observed along the agent's path, whether visible from a far or close view, from different angles, or only partially. Classic MOT algorithms are not applicable to the M3T task, as they typically target moving single-class multiple object instances in one video and track objects visible from only one angle or camera viewpoint. Thus, we present the M3T-Round algorithm, designed for a simple scenario in which an agent captures 12 image frames while rotating 360° from its initial position in a scene. We first detect each object in all image frames and then track objects (without any training), using a cosine similarity metric for the association of object tracks. The detector part of our M3T-Round algorithm outperforms the baseline YOLOv4 algorithm [1] in detection accuracy, with a 5.26 point improvement in AP75. The tracker part of our M3T-Round algorithm shows a 4.6 point improvement in HOTA over GMOTv2 [2], a recent, high-performance tracking method. Moreover, we have collected a new, challenging tracking dataset from the AI2-Thor simulator [3] for training and evaluation of the proposed M3T-Round algorithm.
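The abstract describes a training-free association step: detections from the 12 rotated frames are matched to existing object tracks by cosine similarity of their features. The paper's exact feature extractor and matching threshold are not given here, so the following is a minimal sketch under assumed details (a greedy per-detection assignment and a hypothetical `threshold` parameter), not the authors' implementation:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def associate(track_features, detection_features, threshold=0.5):
    """Greedily assign each detection to the most similar existing track;
    detections below the (assumed) threshold start a new track, so the
    number of tracks estimates the number of unique objects."""
    assignments = []  # (track_index, detection_index) pairs
    for j, det in enumerate(detection_features):
        scores = [cosine_similarity(t, det) for t in track_features]
        best = int(np.argmax(scores)) if scores else -1
        if best >= 0 and scores[best] >= threshold:
            assignments.append((best, j))       # matched to an existing track
        else:
            track_features.append(det)          # unseen object: open a new track
            assignments.append((len(track_features) - 1, j))
    return assignments
```

Run over the frames in rotation order, the final track count gives the estimate of unique objects observed during the 360° sweep.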

RAS ID

57991

Document Type

Conference Proceeding

Date of Publication

1-1-2023

Volume

13836 LNCS

School

School of Science / School of Engineering

Copyright

subscription content

Publisher

Springer

Comments

Khan, M., Abu-Khalaf, J., Suter, D., & Rosenhahn, B. (2023, February). M3T: Multi-class multi-instance multi-view object tracking for embodied AI tasks. In Image and Vision Computing: 37th International Conference, IVCNZ 2022, Auckland, New Zealand, November 24–25, 2022. Selected Papers (pp. 246-261). Cham: Springer Nature Switzerland.

https://doi.org/10.1007/978-3-031-25825-1_18


Link to publisher version (DOI)

10.1007/978-3-031-25825-1_18