Feature Articles: Media Robotics as the Boundary Connecting Real Space and Cyberspace

Vol. 19, No. 3, pp. 19–21, Mar. 2021. https://doi.org/10.53829/ntr202103fa3

Toward Cyber-physical Interaction for Natural Connection of Real Space and Cyberspace

Shigekuni Kondo, Atsushi Sagata, Kenichi Minami,
and Akihito Akutsu

Abstract

To actualize the concept of IOWN (Innovative Optical and Wireless Network), we need a new “environment” that connects real space and cyberspace and fundamentally changes our lives without requiring users to be literate in information and communication technology. Also essential is a natural means of incorporating that environment into our daily lives, that is, a function for natural cyber-physical interaction. The following Feature Articles in this issue introduce the most recent technological trends concerning cyber-physical interaction at the boundary between real space and cyberspace.

Keywords: perception, body, cybernetics


1. Introduction

By fusing real space and cyberspace using the Innovative Optical and Wireless Network (IOWN) [1], we can expect more precise simulations that enable better predictions, thus expanding the range of human activities. In such an environment, where real space and cyberspace are fused and our lives are fundamentally changed, anyone will be able to benefit from such predictions regardless of their information and communication technology (ICT) literacy. To achieve this, we believe it is essential to have natural cyber-physical interaction, that is, a natural means of information presentation that fuses our daily activities with this environment.

The Feature Articles in this issue introduce the most recent trends and technologies that concern cyber-physical interaction at the boundary between real space and cyberspace.

2. Overview of R&D on cyber-physical interaction

Research and development (R&D) on the fusion of real space and cyberspace has already progressed to the stage of practical application in many areas, and various types of content have been produced. In computer gaming, for example, game characters are displayed on smartphone screens and superimposed on the real space in front of the user, creating the illusion that the characters exist in real space. There are also many games that immerse the user in game worlds. In sports, there are, for example, online bicycle races.

In the future, it will be possible for people to jump into a virtual world (full-dive) and interact with real space through cyberspace. For example, people will be able to share a realistic sense of a place with others even if no one is actually there, or provide realistic support to people at a remote location, amplifying human knowledge and using human abilities to the maximum extent.

We believe that the user interface will play an even more important role than before in a future where real space and cyberspace are tightly coupled. Natural integration of people and the environment through natural information presentation, that is, natural cyber-physical interaction, requires technology for presenting and inputting information in ways that do not interfere with human activities, new interaction technology that uses haptic sensory effects, technology for making the utmost use of human motor functions, and other such technologies. Cyber-physical interaction expands and develops individual environments by connecting several environments and exchanging between them not only objective information, such as efficiency, quality, and cost, but also subjective information, such as well-being. A core technology for such interaction is control of perception and cognition.

NTT laboratories will expand R&D on perception and cognition and focus on R&D in the field of cybernetics grounded in physiology.

3. Current work on cyber-physical interaction

The following Feature Articles in this issue introduce control technology for perception and cognition, which is a core technology for the cyber-physical interaction that is a subject of current R&D by NTT laboratories.

“Improving Depth-map Accuracy by Integrating Depth Estimation with Image Segmentation” [2] introduces a system called HiddenStereo that enables natural three-dimensional (3D) viewing from monocular 2D images. This system is implemented by combining technology for improving the accuracy of depth maps, which represent the distance from the camera to each pixel in an image, with technology for dividing the image into regions (image segmentation).
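
The article itself describes the actual integration method; purely as a rough illustration of how a segmentation map can be used to clean up a per-pixel depth map, the following minimal Python sketch assumes that depth varies little within each segmented region and replaces noisy per-pixel estimates with that region's median. The function name and the median heuristic are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def refine_depth_with_segments(depth: np.ndarray, segments: np.ndarray) -> np.ndarray:
    """Smooth a per-pixel depth map within each segmentation region.

    depth:    HxW array; each value is the estimated camera-to-pixel distance.
    segments: HxW integer array; pixels sharing a label belong to one region
              (e.g., produced by any off-the-shelf image-segmentation method).
    Returns a refined depth map in which each region's depth values are
    replaced by the region's median, suppressing noisy estimates.
    """
    refined = depth.copy()
    for label in np.unique(segments):
        mask = segments == label
        refined[mask] = np.median(depth[mask])
    return refined

if __name__ == "__main__":
    # Toy example: a noisy depth map with two segmented regions.
    rng = np.random.default_rng(0)
    segments = np.zeros((4, 8), dtype=int)
    segments[:, 4:] = 1                                 # left region 0, right region 1
    depth = np.where(segments == 0, 2.0, 5.0) + rng.normal(0, 0.2, (4, 8))
    print(refine_depth_with_segments(depth, segments))
```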

“Affect-perception Control for Enhancing a Sense of Togetherness for Remote Spectators” [3] introduces elemental technology that goes beyond simply transmitting and reproducing a sense of presence at an event venue for remote viewers. This technology also captures the emotional responses (emotional actions) of the remote audience and, through emotional feedback, creates a feeling of shared togetherness, interaction, and excitement.

“Visible-light Planar Lightwave Circuit Technology and Integrated Laser-light-source Module for Smart Glasses” [4] introduces an ultra-compact RGB (red, green, blue) laser-light-source module sized to fit into the temples of smart glasses. The module is implemented with an optical system that bundles the light from the three primary-color (RGB) laser sources using a circuit that is drastically reduced in size.

“Fine-grained Hand-posture Recognition for Natural User-interface Technologies” [5] introduces research on establishing finger-shape recognition technology to enable operation of smart glasses through hand gestures.

“Information-display Method for Reducing Annoyance by Gaze Guidance” [6] introduces an information-display method that both reduces the user’s feeling of annoyance and increases the certainty of information access by using gaze guidance to give the user the feeling that the act of reading the information was their own choice.

“Presenting Material Properties with Mid-air Pseudo-haptics” [7] introduces mid-air pseudo-haptic technology that gives the user the perception of the material properties of virtual objects as the user manipulates them through their own actions.

“Evaluation of Adaptability to Unfamiliar Environments Using Virtual Reality” [8] introduces research on technology for evaluating and improving people’s ability to adapt to their environment, with the aim of helping the elderly exercise appropriately and preventing accidents while they are walking or driving.

4. Conclusion

We described R&D on technology for cyber-physical interaction at the boundary between real space and cyberspace, particularly the most recent research on perception and cognition control. We are also investigating other core technologies for cyber-physical interaction, including technologies for physiological control, control of emotion and desire, communication of the five basic senses and other human senses, communication control, and social-capital infrastructure.

Toward making IOWN a reality, NTT laboratories have been working to bring ongoing work on perception and cognition to maturity and will continue to promote R&D on cyber-physical interaction in the field of cybernetics, as an unprecedented user interface based on human proprioception that seamlessly connects bodies in cyberspace with those in real space.

References

[1] NTT Technology Report for Smart World 2020,
https://www.rd.ntt/e/techreport/
[2] M. Ono, Y. Kikuchi, T. Sano, and S. Fukatsu, “Improving Depth-map Accuracy by Integrating Depth Estimation with Image Segmentation,” NTT Technical Review, Vol. 19, No. 3, pp. 22–26, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa4.html
[3] T. Sano, M. Makiguchi, H. Nagata, and H. Seshimo, “Affect-perception Control for Enhancing a Sense of Togetherness for Remote Spectators,” NTT Technical Review, Vol. 19, No. 3, pp. 27–30, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa5.html
[4] T. Hashimoto and J. Sakamoto, “Visible-light Planar Lightwave Circuit Technology and Integrated Laser-light-source Module for Smart Glasses,” NTT Technical Review, Vol. 19, No. 3, pp. 31–36, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa6.html
[5] Y. Kubo, “Fine-grained Hand-posture Recognition for Natural User-interface Technologies,” NTT Technical Review, Vol. 19, No. 3, pp. 37–39, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa7.html
[6] R. Saijo, T. Sato, S. Eitoku, and M. Watanabe, “Information-display Method for Reducing Annoyance by Gaze Guidance,” NTT Technical Review, Vol. 19, No. 3, pp. 40–44, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa8.html
[7] T. Kawabe, “Presenting Material Properties with Mid-air Pseudo-haptics,” NTT Technical Review, Vol. 19, No. 3, pp. 45–48, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa9.html
[8] T. Isezaki and T. Watanabe, “Evaluation of Adaptability to Unfamiliar Environments Using Virtual Reality,” NTT Technical Review, Vol. 19, No. 3, pp. 49–52, Mar. 2021.
https://ntt-review.jp/archive/ntttechnical.php?contents=ntr202103fa10.html
Shigekuni Kondo
Senior Research Engineer, Planning Section, NTT Service Evolution Laboratories.
He received a B.E. in applied physics and M.S. in integrated design engineering from Keio University, Kanagawa, in 2003 and 2005 and joined NTT in 2005. His research interests include software modularity, home networks, and user interfaces. He is a member of the Institute of Electronics, Information and Communication Engineers (IEICE).
Atsushi Sagata
Executive Research Engineer, Cybernetic Intelligence Research Project, NTT Service Evolution Laboratories.
He received a B.E. in electronic engineering from the University of Tokyo in 1994 and joined NTT the same year. He has been engaged in R&D of video coding systems and in the development of the digital high-definition TV transmission system at NTT Communications. He is a member of IEICE.
Kenichi Minami
Director of Research, Planning Section, NTT Service Evolution Laboratories.
He received a B.E. in electronic engineering and M.S. in biomedical engineering from Keio University, Kanagawa, in 1991 and 1993. He received an MBA from Thunderbird School of Global Management, Arizona, USA, in 2002. He has been engaged in R&D management in the development of “Kirari!” immersive telepresence technology since 2016. From 2012 to 2014, he was responsible for the development of mobile application services at NTT DOCOMO. His research interests include image and audio processing, user interfaces, and telepresence technologies. He is a member of IEICE.
Akihito Akutsu
Vice President, NTT Service Evolution Laboratories.
He received an M.E. in engineering from Chiba University in 1990 and Ph.D. in natural science and technology from Kanazawa University in 2001. Since joining NTT in 1990, he has been engaged in R&D of video-indexing technology based on image/video processing and man-machine interface architecture design. From 2003 to 2006, he was with NTT EAST, where he was involved in managing a joint venture between NTT EAST and Japanese broadcasters. In 2008, he was appointed director of NTT Cyber Solutions Laboratories (now NTT Service Evolution Laboratories), where he worked on an R&D project focused on broadband and broadcast services. In October 2013, he was appointed as executive producer of 4K/8K HEVC (High Efficiency Video Coding) at NTT Media Intelligence Laboratories. He received the Young Engineer Award and Best Paper Award from IEICE in 1993 and 2000, respectively. He is a member of IEICE.