Feature Articles: Platform Technologies for Services that Utilize Lifelogs

Lifelog Remote Control for Collecting Operation Logs Needed for Lifelog-based Services

Tomoki Watanabe, Youichi Takashima, Minoru Kobayashi, and Masanobu Abe

Abstract

In this article, we describe our concept of using a personal remote control to collect data for later use in lifelog-based services, together with the design of the remote control's user interface, focusing on the technology that makes it possible, for the first time, to collect truly useful lifelogs.

NTT Cyber Solutions Laboratories
Yokosuka-shi, 239-0847 Japan

1. Introduction

To provide lifelog-based services, we need to accurately estimate the user’s preferences and behavioral patterns in daily life from various activities. The key to accurate estimation is gathering sufficient data to identify each user’s likes and preferences. In recent years, various services that provide recommendations have been developed. They process data about the user’s Internet activities, such as online shopping and visited websites. Other services process information about the user’s location obtained through a GPS (global positioning system) function. To use these services, however, users still need to register their likes and preferences in advance, which is so inconvenient that their preferences and behavioral patterns can never be fully captured.

We believe that lifelogs must be received and stored continuously over long periods of time, which calls for a structure that can collect data easily without imposing a heavy load on the user.

Our approach is centered on utilizing a personal remote control that each user carries and uses to operate all home devices. The operation data is saved within it and sent to service providers only when the user permits. In this way, each user’s personal information can be protected.

In developing our lifelog collecting approach, we are focusing on two major points: (1) a remote control that allows users to control various devices and saves operation logs within itself and (2) an attractive interface that encourages users to use the remote control all the time. In this article, we describe the technologies underlying the lifelog collecting system and its user interface.

2. Information acquired from operation logs

The operation of home devices is assumed to reflect choices made of the user’s own free will. That is, those actions reflect the user’s likes or preferences. For example, if the user turns on a television (TV) and selects a station that offers music programs, we can assume that the user is interested in music, whereas if it is a movie channel, we can assume that the user likes movies. If the user sets the air conditioner to 17°C every day, we can assume that the user tends to feel comfortable at that temperature. When the user turns room lights on and off, we can determine what time the user comes home and goes to sleep. Such operation data is, we believe, a strong indicator of the user’s likes, preferences, and behavioral patterns.
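As a simple illustration of this kind of inference, the following Python sketch maps a few hypothetical log entries to the likes and habits mentioned above. The log fields, values, and rules are our own assumptions for illustration and are not the format used by the actual system.

    # Hypothetical operation-log entries and inference rules (assumptions for
    # illustration only; not the actual log format of the system).
    from datetime import datetime

    log = [
        {"time": datetime(2011, 5, 10, 19, 2),  "device": "tv",     "button": "music_channel"},
        {"time": datetime(2011, 5, 10, 23, 45), "device": "light",  "button": "off"},
        {"time": datetime(2011, 5, 11, 7, 30),  "device": "aircon", "button": "set_17C"},
    ]

    def infer_hints(entries):
        """Derive simple preference/behavior hints from operation-log entries."""
        hints = []
        for e in entries:
            if e["device"] == "tv" and "music" in e["button"]:
                hints.append("interested in music")
            elif e["device"] == "aircon" and e["button"].startswith("set_"):
                hints.append("feels comfortable at " + e["button"][4:])
            elif e["device"] == "light" and e["button"] == "off":
                hints.append("lights off around " + e["time"].strftime("%H:%M"))
        return hints

    print(infer_hints(log))
    # ['interested in music', 'lights off around 23:45', 'feels comfortable at 17C']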

Below, we describe our ongoing work.

3. Lifelog collecting system

NTT Cyber Solutions Laboratories has developed a lifelog collecting system that allows users to operate home devices from a single remote control. We targeted devices with infrared sensors because most modern home devices are equipped with such sensors.

The structure of the lifelog collecting system is shown in Fig. 1. As the remote control, we chose to use a smartphone because its operation screen is easy to customize.


Fig. 1. Structure of the lifelog collecting system.

Command relay devices, wirelessly driven by the smartphone, are set throughout the house so that the intended device can receive its control signal.

We use wireless local area network (WLAN) connections to link the smartphone to the command relay devices. To stabilize each WLAN connection and ensure that the control signals reach the intended device, we mounted the command relay devices on ceiling lights (Fig. 2). This arrangement allows infrared rays to be sent in eight different directions at the same time, which provides coverage over a wide area.


Fig. 2. Infrared repeaters in ceiling lights.

Remote control applications are written for each appliance and loaded into the smartphone (Fig. 3). Since each appliance uses a different remote control, the program creates a virtual remote control on the smartphone. The user selects the appliance and then presses the virtual keys to input the desired command. The corresponding signal is sent to the pre-set command relay device, which converts it into the appropriate control signal and transmits it as infrared. The command relay device nearest the device that the user wants to operate, such as the TV, should be selected because this makes signal reception more reliable. The smartphone automatically makes a list of available command relay devices, and the user can select the desired one from the list.


Fig. 3. Remote control applications are loaded on the smartphone.
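The following sketch illustrates this command path, assuming a hypothetical JSON-over-UDP message sent from the smartphone application to the registered command relay device, which would then emit the corresponding infrared code. The port, address, and message format are assumptions; the article does not specify the actual protocol.

    # Sketch of the smartphone-side command dispatch (protocol details are
    # assumptions for illustration only).
    import json
    import socket

    RELAY_PORT = 5000  # assumed port on which command relay devices listen

    def send_command(relay_address: str, appliance: str, button: str) -> None:
        """Send one virtual button press to the relay nearest the target appliance."""
        message = json.dumps({"appliance": appliance, "button": button}).encode()
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(message, (relay_address, RELAY_PORT))

    # Example: pressing "power" on the virtual TV remote, routed via the relay
    # device registered for the TV (address is an assumption).
    send_command("192.168.0.20", appliance="tv", button="power")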

System implementation proceeds as follows; a registration sketch is shown after the list.

(1) Connect all command relay devices to the home’s WLAN.

(2) Connect the smartphone to the WLAN.

(3) Select each device (manufacturer and product) to be operated with the remote control and install the corresponding program in the smartphone.

(4) For each device, select the nearest command relay device and register it in the smartphone program.

(5) Repeat steps 3 and 4 for each device to be operated.

(6) Activate the application for the device and operate the device via the virtual remote control on the smartphone. To operate another device, activate its application.
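The sketch below illustrates steps (3)–(5) as a simple registry that records, for each appliance to be operated, the installed remote control program and the nearest command relay device. The data structure and names are assumptions for illustration only.

    # Hypothetical registry built during setup steps (3)-(5).
    registry = {}

    def register(appliance: str, maker: str, model: str, relay: str) -> None:
        """Record which command relay device forwards signals for this appliance."""
        registry[appliance] = {"maker": maker, "model": model, "relay": relay}

    register("tv",     maker="MakerA", model="TV-100", relay="relay-livingroom")
    register("aircon", maker="MakerB", model="AC-200", relay="relay-bedroom")
    register("light",  maker="MakerC", model="LT-300", relay="relay-bedroom")

    # When the TV application is activated (step 6), commands are routed to
    # the relay registered for the TV.
    print(registry["tv"]["relay"])  # relay-livingroom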

The user’s remote control application operation data is first saved as logs on the smartphone. Those logs are sent to providers that offer lifelog-based services if the user allows the data to be sent.
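A minimal sketch of this consent-gated handling is shown below, assuming a hypothetical JSON-lines log file on the smartphone and an upload callback supplied by the service; the actual log format and transfer method are not described in this article.

    # Sketch of local logging and consent-gated upload (file name, format,
    # and upload interface are assumptions for illustration only).
    import json
    import time

    LOG_FILE = "operation_log.jsonl"

    def record_operation(appliance: str, button: str) -> None:
        """Append one button press to the local log on the smartphone."""
        entry = {"timestamp": time.time(), "appliance": appliance, "button": button}
        with open(LOG_FILE, "a") as f:
            f.write(json.dumps(entry) + "\n")

    def upload_if_permitted(user_permits: bool, send) -> None:
        """Hand the stored log to a lifelog service provider only with user consent."""
        if not user_permits:
            return
        with open(LOG_FILE) as f:
            send([json.loads(line) for line in f])

    record_operation("tv", "power")
    upload_if_permitted(user_permits=True, send=print)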

4. Device status estimation from operation logs

The operation logs of each user can be collected individually by the lifelog collecting system. Although the operation logs record when each button was pressed and which button it was, it is difficult to know the status of each device accurately. For example, from an operation log entry that says “the user pressed the TV power button” on the smartphone, we cannot tell whether the user turned the TV on or off. If the station 1 button was pressed, we can assume that station 1 was selected, but if the user used the channel up/down button to choose the station, we need to know which station was showing before in order to determine which station the user chose. Furthermore, Japan’s new terrestrial digital broadcasting services make matters more complicated: two or three different programs can be offered under a single channel button (Fig. 4). In such cases, it is very difficult to determine from a single operation log entry which channel the user selected.


Fig. 4. Different TV status transitions corresponding to different buttons being pressed.

We have been developing a method of estimating the device status as accurately as possible. It tracks the TV’s status transitions and checks them against the operation log, using both the buttons pressed and the time lag between presses. For example, if there is no record of any operation immediately after the power button was pressed, we can assume that the user turned the TV off. If the history shows some operations, such as adjusting the volume or changing channels, we can assume that the user turned the TV on. Moreover, from operation data for lights and air conditioners, we can estimate the user’s lifestyle and behavioral patterns by knowing when the user turned lights on or off and what temperature was usually selected.
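The sketch below illustrates this idea with a simple TV state model that replays the operation log and uses the gap before the next button press to decide whether a power press meant on or off. The 60-second threshold and the log format are assumptions for illustration; the actual estimation method is not limited to these rules.

    # Sketch of TV status estimation from an operation log (threshold and log
    # format are assumptions for illustration only).
    FOLLOW_UP_WINDOW = 60  # seconds: an operation soon after "power" implies the TV was turned on

    def estimate_tv_states(log):
        """log: chronological list of (time_in_seconds, button) tuples."""
        power_on, channel = False, None
        states = []
        for i, (t, button) in enumerate(log):
            if button == "power":
                # If another operation follows shortly, assume the TV was turned on;
                # otherwise assume it was turned off.
                followed = i + 1 < len(log) and log[i + 1][0] - t <= FOLLOW_UP_WINDOW
                power_on = followed
                if not power_on:
                    channel = None
            elif button.startswith("station_"):
                channel = int(button.split("_")[1])
            elif button == "channel_up" and channel is not None:
                channel += 1
            elif button == "channel_down" and channel is not None:
                channel -= 1
            states.append((t, power_on, channel))
        return states

    print(estimate_tv_states([(0, "power"), (5, "station_1"), (900, "channel_up"), (3600, "power")]))
    # [(0, True, None), (5, True, 1), (900, True, 2), (3600, False, None)]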

To obtain users’ preferences and behavioral patterns, we have to estimate the status of each device from the information acquired through operation logs. One option would be to obtain the status directly from each device itself, but if we can determine each device’s status from the remote control log alone, the approach can be applied to a much wider range of devices.

5. Attractive user interface that invigorates lifelog-based services

Since a personal remote control must be used for long periods of time to gather sufficient data for lifelog-based services, it must be easy to use.

5.1 “Coool” remote control

For prolonged use, the remote control should be easy to hold and fit comfortably in the user’s hand. Buttons must also be easy to press. We developed the “Coool” remote control to satisfy these requirements (Fig. 5). It is round, and each button is placed for easy access by the user’s thumb. The rocker switch in the center can be pressed up or down and left or right. This allows the user to change the display contents dynamically and access various functions with fewer button presses. The remote control is equipped with a GPS function, temperature and acceleration sensors, a WLAN interface, and infrared and Bluetooth transmitters. It has a stylish design that will attract users while providing all the hardware required for lifelog-based services. We will continue our research and development activities on remote control applications for smartphones and other devices to create various lifelog-based services using the Coool remote control.


Fig. 5. Coool remote control for lifelog services.

5.2 Contents menu for lifelog-based services

Users can display the contents menu on a TV screen. This makes it easier for users to select an appliance and issue detailed commands. As an example, the My Menu sent by the Coool remote control is shown in Fig. 6. My Menu lists commonly accessed appliances, programs that the user likes or has watched recently, and contents recommended by lifelog-based services.


Fig. 6. Example of My Menu.

5.3 Simultaneous display of multiple My Menus

Each user has a different personal remote control, but multiple users can place their My Menus on the same screen at the same time (Fig. 7). Since My Menu may include information that a user does not want to share with others, the Coool remote control automatically alters the My Menu contents when another user is physically nearby.


Fig. 7. Example of multiple My Menus displayed on one screen.
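A minimal sketch of this behavior is shown below, assuming a hypothetical menu item structure with a private flag; how the Coool remote control actually detects nearby users and chooses what to hide is not detailed in this article.

    # Sketch of hiding private My Menu entries when another user is nearby
    # (item structure and detection are assumptions for illustration only).
    def build_shared_my_menu(my_menu, other_user_nearby: bool):
        """Return the menu to show; drop private items if someone else is present."""
        if not other_user_nearby:
            return my_menu
        return [item for item in my_menu if not item.get("private", False)]

    my_menu = [
        {"title": "Evening news",            "private": False},
        {"title": "Recently watched: drama", "private": True},
    ]
    print(build_shared_my_menu(my_menu, other_user_nearby=True))
    # [{'title': 'Evening news', 'private': False}]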

6. Future plans

To create convenient and fun lifelog-based services, we will carry out feasibility studies of lifelog collection and device status estimation in actual home environments as well as validity tests of lifelog-based services as soon as possible.

Tomoki Watanabe
NTT Cyber Solutions Laboratories.
He received the B.E. degree in electrical and computer engineering from Yokohama National University, Kanagawa, in 1992. In 1992, he joined NTT Human Interface Laboratories, where he was engaged in human interface research. From 1995 to 2000, he worked on the development of a data transmission and collection system. His research interests are user interfaces and network protocols.
Youichi Takashima
Senior Research Engineer, Network Appliance and Services Project, NTT Cyber Solutions Laboratories.
He received the B.E., M.E., and Ph.D. degrees in electrical and computer engineering from Yokohama National University, Kanagawa, in 1985, 1987, and 1990, respectively. His research interests include information security, including watermark technology, video coding, and ubiquitous networked identifier standardization. He is a member of the Institute of Electronics, Information and Communication Engineers (IEICE) of Japan.
Minoru Kobayashi
Senior Research Engineer, Network Appliance and Services Project, NTT Cyber Solutions Laboratories.
He received M.S. degrees from Keio University, Kanagawa, and Massachusetts Institute of Technology, USA, and the Ph.D. degree from Keio University. His research interests include human-computer interaction, computer-supported cooperative work, and video-based groupware design. He is a member of the Association for Computing Machinery, IEEE Computer Society, Information Processing Society of Japan, IEICE, and Virtual Reality Society of Japan.
Masanobu Abe
Professor of the Department of Computer Science, Division of Industrial Innovation Sciences, Graduate School of Natural Science and Technology, Okayama University.
He received the B.E., M.E., and Ph.D. degrees in electrical engineering from Waseda University, Tokyo, in 1982, 1984, and 1992, respectively. He joined the Yokosuka Electrical Communications Laboratories of Nippon Telegraph and Telephone Public Corporation (now NTT) in 1984. From 1987 to 1991, he worked at ATR Interpreting Telephony Research Laboratories. In 1989, he joined the Laboratory for Computer Science, MIT, USA, as a visiting researcher. From 2007 to 2010, he served as an Executive Manager, Senior Research Engineer, Supervisor of NTT Cyber Solutions Laboratories. He has been a professor at Okayama University since 2010. His research interests include digital speech processing, home networking, consumer electronic appliances, data mining algorithms for lifelogs, and human interfaces. He is a member of the Acoustical Society of Japan (ASJ), IEICE, IEEE, and the Association for Computing Machinery. He received a Paper Award from ASJ in 1996. He is co-author of “Recent progress in Japanese speech synthesis” (Gordon and Breach Science Publishers, 2000).
