MeetingMate and MixMeetWear

At-hand access to meeting content

While many distributed meeting tools let users connect to a meeting from several different devices, they tend to assume relatively symmetric user goals and contexts. For example, in a three-way meeting between a group of three people in a conference room, a remote user at a desk, and a mobile user, the conference-room group might stream a webcam and microphone attached to a desktop, the remote user a laptop webcam and mic, and the mobile user a phone app with the device's camera and mic. Yet with current distributed meeting technologies, each participant would be represented more or less equivalently and given similar controls. Based on prior observations of many different types of meetings, we believe that differences in users' meeting contexts often call for different interface designs, representations, and features.

With MeetingMate and MixMeetWear, we focus on situations in which users need to maintain awareness of a meeting while accomplishing other tasks, including searching for or capturing content related to the meeting topic. MeetingMate supports collocated members of distributed meetings who contribute to the meeting through backchannels rather than as fully fledged participants, while MixMeetWear supports users who are mobile, connected to the meeting via a smartwatch, and need to accomplish some other task.

Related Publications

2016
Publication Details
  • CSCW 2016
  • Feb 27, 2016

Abstract

We present MixMeetWear, a smartwatch application that allows users to maintain awareness of the audio and visual content of a meeting while completing other tasks. Users of the system can listen to the audio of a meeting and also view, zoom, and pan webcam and shared content keyframes of other meeting participants' live streams in real time. Users can also provide input to the meeting via speech-to-text or predefined responses. A study showed that the system is useful for peripheral awareness of some meetings.
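To make the described interactions more concrete, the following is a minimal, hypothetical Kotlin sketch of the kind of client-side state a smartwatch app like this might maintain: the latest keyframe per participant stream, a zoom/pan viewport for each one, and a small set of predefined responses sent back to the meeting. All type and function names (Keyframe, AwarenessModel, sendResponse, and so on) are assumptions for illustration only and are not taken from the MixMeetWear implementation.

```kotlin
// Illustrative sketch only: hypothetical types, not the actual MixMeetWear code.

import java.time.Instant

// Latest keyframe published for one participant's webcam or shared-content stream.
data class Keyframe(
    val participantId: String,
    val streamLabel: String,   // e.g. "webcam" or "shared-content"
    val imageUrl: String,
    val capturedAt: Instant
)

// Viewport state for zooming and panning a keyframe on a small watch display.
data class Viewport(var zoom: Float = 1.0f, var offsetX: Float = 0f, var offsetY: Float = 0f)

// Keeps only the most recent keyframe per stream, so the watch can render
// peripheral awareness without buffering full video.
class AwarenessModel(private val onUpdate: (Keyframe) -> Unit) {
    private val latest = mutableMapOf<String, Keyframe>()
    private val viewports = mutableMapOf<String, Viewport>()

    fun receive(frame: Keyframe) {
        val key = "${frame.participantId}/${frame.streamLabel}"
        latest[key] = frame          // overwrite any older frame for this stream
        onUpdate(frame)              // trigger a redraw of the watch face
    }

    fun viewportFor(participantId: String, streamLabel: String): Viewport =
        viewports.getOrPut("$participantId/$streamLabel") { Viewport() }

    fun pan(participantId: String, streamLabel: String, dx: Float, dy: Float) {
        val vp = viewportFor(participantId, streamLabel)
        vp.offsetX += dx
        vp.offsetY += dy
    }

    fun zoom(participantId: String, streamLabel: String, factor: Float) {
        viewportFor(participantId, streamLabel).zoom *= factor
    }
}

// Lightweight input back to the meeting: a canned reply or dictated text.
val predefinedResponses = listOf("Yes", "No", "Be right there", "Please continue without me")

fun sendResponse(text: String, transmit: (String) -> Unit) {
    transmit(text)   // e.g. forwarded to the meeting's backchannel
}

fun main() {
    val model = AwarenessModel { frame ->
        println("New keyframe from ${frame.participantId} (${frame.streamLabel}) at ${frame.capturedAt}")
    }
    model.receive(Keyframe("alice", "webcam", "https://example.invalid/frame1.jpg", Instant.now()))
    model.zoom("alice", "webcam", 2.0f)
    sendResponse(predefinedResponses[2]) { println("Sent to meeting: $it") }
}
```

The sketch reflects the design trade-off described in the abstract: the watch shows periodically updated keyframes rather than continuous video, and user input is limited to short text or canned replies suited to glanceable, one-handed interaction.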