Activity Forecasting in Routine Tasks by Combining Local Motion Trajectories and High-level Temporal Models

Abstract

Human activity forecasting from videos of routine-based tasks is an open research problem with numerous applications in robotics, visual monitoring and skill assessment. Activity forecasting remains challenging because human actions are not fully observable in continuous recordings. Additionally, a large number of human activities involve fine-grained articulated human motions that are hard to capture with frame-level representations. To overcome these challenges, we propose a method that forecasts human actions by learning the dynamics of local motion patterns extracted from dense trajectories using long short-term memory (LSTM). Experiments on a public dataset validate the effectiveness of our proposed method in activity forecasting and demonstrate large improvements over the baseline two-stream end-to-end model. We also found that human activity forecasting benefits from learning both short-range motion patterns and long-term dependencies between actions.
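To make the core idea concrete, the sketch below shows a minimal LSTM forecaster over per-frame dense-trajectory descriptors. It is an illustrative assumption, not the authors' exact architecture: the feature dimension, hidden size, class count and the names TrajectoryLSTMForecaster, feat_dim, hidden_dim and num_actions are all hypothetical placeholders.

```python
# Minimal sketch (not the paper's exact model) of next-action forecasting from
# a sequence of per-frame dense-trajectory motion descriptors using an LSTM.
import torch
import torch.nn as nn

class TrajectoryLSTMForecaster(nn.Module):
    def __init__(self, feat_dim=426, hidden_dim=256, num_actions=10):
        super().__init__()
        # feat_dim: size of a per-frame motion descriptor (e.g., concatenated
        # trajectory-shape + HOG + HOF + MBH statistics) -- an assumption here.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_actions)

    def forward(self, x):
        # x: (batch, time, feat_dim) observed motion descriptors
        _, (h_n, _) = self.lstm(x)
        # Use the final hidden state to score the label of the *next* action.
        return self.classifier(h_n[-1])

# Usage with random tensors standing in for extracted trajectory features.
model = TrajectoryLSTMForecaster()
observed = torch.randn(8, 30, 426)        # 8 clips, 30 frames of descriptors
logits = model(observed)                  # (8, num_actions) forecast scores
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 10, (8,)))
```

The design choice illustrated here is the one the abstract emphasizes: short-range motion is summarized by the per-frame trajectory descriptors, while the LSTM's recurrent state carries the longer-term dependencies between successive actions.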