Searching Live Meetings: "Show me the Action"


Web-based tools for remote collaboration are quickly becoming an established element of the modern workplace. During live meetings, people share websites, edit presentation slides, and work together in code editors. Participants commonly refer back to previously spoken or shared content in the course of synchronous distributed collaboration. A simple approach is to index the shared video frames, or key frames, with Optical Character Recognition (OCR) and let users retrieve them with text queries. Here we show that a complementary approach is to look at the actions users take inside the live document streams. Based on observations of real meetings, we focus on two important signals: text editing and mouse cursor motion. We describe how text edits and cursor motion are detected, how the detectors are implemented in our WebRTC-based system, and how indexing these actions lets users search live documents more effectively during a meeting.
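As a rough illustration of the key-frame indexing baseline mentioned above, the sketch below builds an inverted index from OCR text to frame timestamps and answers conjunctive text queries. The `(timestamp, ocr_text)` input format stands in for the output of an OCR engine run on each shared key frame; it is an assumption for this sketch, not the system's actual data model.

```python
import re
from collections import defaultdict

def index_frames(frames):
    """Build an inverted index: word -> sorted list of frame timestamps.

    `frames` is an iterable of (timestamp, ocr_text) pairs, where
    ocr_text stands in for text produced by an OCR engine on a shared
    key frame (hypothetical input format for this sketch).
    """
    index = defaultdict(set)
    for ts, text in frames:
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            index[word].add(ts)
    return {word: sorted(stamps) for word, stamps in index.items()}

def search(index, query):
    """Return timestamps of key frames containing every query word."""
    words = query.lower().split()
    if not words:
        return []
    hits = set(index.get(words[0], []))
    for word in words[1:]:
        hits &= set(index.get(word, []))
    return sorted(hits)
```

A meeting client could map the returned timestamps back to playback positions, so a query like "budget slide" jumps to the frames where that text was visible.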
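The cursor-motion signal could be detected, in the simplest case, by thresholding the displacement between successive cursor samples and grouping moving samples into time segments. The sketch below is a minimal illustration of that idea; the `(t, x, y)` sample format and the `min_dist`/`max_gap` thresholds are assumptions chosen for this example, not parameters of the system described above.

```python
import math

def motion_segments(samples, min_dist=5.0, max_gap=1.0):
    """Group cursor samples into motion segments.

    `samples` is a list of (t, x, y) tuples in time order (assumed
    format). A sample extends the current segment when the cursor has
    moved at least `min_dist` pixels since the previous sample and no
    more than `max_gap` seconds have elapsed; otherwise the current
    segment is closed. Returns a list of (start_time, end_time) pairs.
    """
    segments = []
    start = None
    prev = None
    for t, x, y in samples:
        if prev is not None:
            pt, px, py = prev
            moving = (math.hypot(x - px, y - py) >= min_dist
                      and t - pt <= max_gap)
            if moving and start is None:
                start = pt                     # motion begins at the previous sample
            elif not moving and start is not None:
                segments.append((start, pt))   # close the segment at the last moving sample
                start = None
        prev = (t, x, y)
    if start is not None:
        segments.append((start, prev[0]))      # motion continued to the end of the trace
    return segments
```

Each detected segment marks a span of the document stream where the presenter was pointing or gesturing with the cursor, which can then be indexed alongside the OCR text for search.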