
Feature Articles: NTT DATA Technology Foresight 2017—Examining Future Technology Trends and How They Will Affect Us

Vol. 15, No. 10, pp. 22–24, Oct. 2017. https://doi.org/10.53829/ntr201710fa8

Environment-aware Robotics

Yuji Nomura

Abstract

Advancements in perception technology for images and voice are giving robots enhanced environmental awareness and opening opportunities to apply them in products such as self-driving cars and drones. These higher-level operational capabilities will transform the structure of industry.

Keywords: environment recognition technology, self-driving car, drone


1. Development of technologies for recognizing the external world

It is safe to say that robots now have better eyesight than humans do. One reason for this is that deep learning technology has increased the accuracy of image recognition. This was demonstrated at the 2016 ImageNet Large Scale Visual Recognition Challenge (ILSVRC), in which a robot identified the name of an object in an image with 97.0% accuracy, compared to 94.9% for humans. In addition, SLAM*, a technology that simultaneously estimates a robot's own location and creates a map of its surroundings from camera and sensor information, has enabled highly accurate capture of three-dimensional (3D) spaces. A newer technology even allows a space to be captured using only a smartphone's monocular camera, which will enable the effortless creation of indoor 3D maps. This technology will likely become widespread in places such as commercial facilities, warehouses, and factories.
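
To make the SLAM concept concrete, the following is a minimal Python sketch (purely illustrative, with invented names and values): the robot updates its estimated pose from wheel odometry and, at the same time, converts range-and-bearing observations of landmarks into map coordinates. A real SLAM system would additionally model uncertainty and correct the pose whenever previously mapped landmarks are re-observed, which this toy version omits.

```python
import math

class ToySlam:
    """Toy illustration of the SLAM loop: track the robot's pose from
    odometry while placing observed landmarks into a shared map."""

    def __init__(self):
        self.x, self.y, self.theta = 0.0, 0.0, 0.0  # estimated pose
        self.landmarks = {}                          # landmark id -> (x, y)

    def predict(self, distance, turn):
        """Dead-reckoning pose update from wheel odometry."""
        self.theta += turn
        self.x += distance * math.cos(self.theta)
        self.y += distance * math.sin(self.theta)

    def observe(self, landmark_id, rng, bearing):
        """Convert a range-and-bearing observation into map coordinates."""
        lx = self.x + rng * math.cos(self.theta + bearing)
        ly = self.y + rng * math.sin(self.theta + bearing)
        self.landmarks[landmark_id] = (lx, ly)

slam = ToySlam()
slam.predict(distance=1.0, turn=0.0)               # move 1 m straight ahead
slam.observe("pillar_A", rng=2.0, bearing=math.pi / 4)
print(slam.landmarks)   # pillar_A placed in the map relative to the estimated pose
```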

Voice recognition has also become more accurate, reaching human level and giving robots effective ears as well as eyes. Furthermore, robots can recognize things that humans cannot, such as ultrasonic waves, infrared light, and magnetism; this capability is inherent to machines. Robots that possess the perceptual aptitude of humans in addition to these other capabilities are expected to rapidly expand their range of applications.

As spatial recognition capabilities improve, robotics contests aimed at enhancing the functionality and performance of robots are on the rise. For example, at the Amazon Picking Challenge, robots compete in their ability to place products on shelves and remove them, while at RoboCup, robots play soccer on a soccer field or compete in rescuing humans at a disaster site. In addition, at the inaugural DARPA Grand Challenge, a robotic car race held in 2004, none of the participants was able to reach the finish line. However, five cars completed the race in 2005, laying the foundation for today's self-driving technology.

* SLAM: simultaneous localization and mapping

2. Lower costs and evolution of hardware that supports robots

LIDAR (light detection and ranging, or laser imaging detection and ranging) is a sensor system that uses light to recognize 3D spaces; it is smaller than radio-wave radar systems and can detect even extremely small particles. Because it can also recognize the shape and moving speed of objects, it is receiving special attention as the potential eyes of self-driving cars. Although LIDAR is costly today, manufacturers are targeting a cost of 100 U.S. dollars or less within five years. Efforts are also underway to integrate a LIDAR sensor into a single microchip with a potential price of only 10 dollars. This type of LIDAR could be installed in many devices, and its use would quickly spread to robots and home electronics.
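
How a LIDAR builds up a 3D picture can be sketched in a few lines: each laser return consists of two angles and a measured distance, which are converted into a Cartesian point in the sensor frame. The code below is a simplified illustration with invented data, not an actual sensor driver; real devices also report intensity and require motion compensation.

```python
import math

def lidar_returns_to_points(returns):
    """Convert LIDAR returns (azimuth, elevation in radians, range in meters)
    into Cartesian (x, y, z) points in the sensor frame."""
    points = []
    for azimuth, elevation, rng in returns:
        x = rng * math.cos(elevation) * math.cos(azimuth)
        y = rng * math.cos(elevation) * math.sin(azimuth)
        z = rng * math.sin(elevation)
        points.append((x, y, z))
    return points

# One simulated sweep: three returns at different angles and distances.
scan = [(0.0, 0.0, 10.0),
        (math.radians(90), 0.0, 5.0),
        (0.0, math.radians(10), 8.0)]
print(lidar_returns_to_points(scan))
```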

In parallel, efforts are underway to achieve self-driving functions without sensors such as LIDAR, instead using improved camera performance and artificial intelligence (AI) to recognize objects and measure distances. Either of these evolving technologies may become the optimal choice for robots' eyes.

The growing field of biomimetics models the superior functions and structures that living organisms have attained and applies the results to technological development, and it is helping to further advance robotics. For example, robots equipped with a tactile sensor that models human pain are now able to feel discomfort upon impact and to act to avoid it. This will likely enable the reliable use of robots in situations where they need to work closely with humans.
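
The idea of a pain-like reflex can be illustrated with a toy sketch that maps a single contact-force reading to an avoidance action. The threshold and the action names are invented for illustration; a real tactile-sensing robot would use calibrated sensor arrays and a tuned or learned response.

```python
def pain_reflex(force_newtons, pain_threshold=15.0):
    """Toy 'pain' reflex: map contact force to an avoidance action.
    Threshold and actions are illustrative, not from a real robot."""
    if force_newtons >= pain_threshold:
        return "stop_and_retract"   # strong impact: back away immediately
    if force_newtons >= 0.5 * pain_threshold:
        return "slow_down"          # moderate contact: reduce speed
    return "continue"               # light contact: keep working

for force in (2.0, 9.0, 20.0):
    print(force, "N ->", pain_reflex(force))
```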

3. Popularization of self-driving vehicles, drones, and task-oriented robots

A self-driving car may be considered a robot that operates autonomously while recognizing its surroundings, and it is the kind of robot garnering the most attention today. Information technology companies have entered the self-driving car race alongside automobile manufacturers, accelerating the trend toward mergers and acquisitions. The year 2016 also saw proofs of concept of self-driving buses and experimental self-driving taxi services on public streets. In addition, an autonomous trial run of the world's first self-driving delivery truck was conducted on a 190-km stretch of expressway. Importantly, the arrival of deliveries by self-driving trucks is expected to significantly alleviate the current truck driver shortage, which continues to worsen due to the rapid expansion of e-commerce. It will take time before a completely autonomous car, one that needs no human intervention under any conditions, is introduced, but autonomous driving is already available under certain circumstances.

Additionally, drones are being utilized for a wide variety of business purposes, including surveying, 3D map creation, inspection, security, search and rescue, investigation, and delivery, as well as for entertainment. Drones can fly over difficult terrain and hard-to-reach locations at low cost and capture accurate spatial information about a location. As a result, they hold the potential to increase efficiency and provide new services in ways that conventional approaches cannot match.

Robots are also expanding their working arena to commercial facilities, households, and public spaces. For example, there are now robots that use a camera and sensor to patrol product display shelves and find out-of-stock products, misplaced products, and messy displays, raising the potential to significantly reduce labor. To assist in everyday life, there are now self-driving vacuum cleaners and communication robots, as well as robots that suggest recipes using the ingredients stored in the refrigerator, and even robots that can cook.
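
As a rough illustration of the shelf-patrol idea, the sketch below compares products recognized by the robot's camera against the expected shelf layout (planogram) to flag out-of-stock slots and misplaced items. The slot and product names are invented for this example.

```python
def audit_shelf(planogram, detections):
    """Compare detected products per shelf slot against the planogram.
    Returns a list of (slot, issue, detail) tuples."""
    issues = []
    for slot, expected in planogram.items():
        found = detections.get(slot)
        if found is None:
            issues.append((slot, "out_of_stock", expected))
        elif found != expected:
            issues.append((slot, "wrong_product", found))
    return issues

planogram = {"S1": "green_tea", "S2": "coffee", "S3": "cola"}
detections = {"S1": "green_tea", "S3": "orange_juice"}   # from camera recognition
print(audit_shelf(planogram, detections))
```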

4. Advanced tasks and mass customization

In addition to the automation of simple tasks that humans have performed in the past, even advanced tasks that only experts could perform are now being automated. For example, agricultural applications include drones equipped with a camera and sensor that spray pesticide only over areas inhabited by pests, or that adjust the amount of fertilizer depending on the crop growth conditions in a particular area. Such drones can perform these tasks with far higher precision than humans, and this automation can result in significant savings on pesticides and fertilizers.
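
The per-area decision logic of such a drone could look roughly like the sketch below, which assigns pesticide only to grid cells where pests were detected and scales fertilizer by each cell's growth index. All field names, thresholds, and rates are invented for illustration.

```python
def plan_application(cells):
    """Decide pesticide and fertilizer amounts per field cell.
    Each cell has a pest flag and a growth index in [0, 1]."""
    plan = []
    for cell in cells:
        pesticide = 1.0 if cell["pests_detected"] else 0.0      # liters per cell
        # Weaker growth receives more fertilizer, capped at 2 kg per cell.
        fertilizer = min(2.0, 2.0 * (1.0 - cell["growth_index"]))
        plan.append({"cell": cell["id"],
                     "pesticide_l": pesticide,
                     "fertilizer_kg": round(fertilizer, 2)})
    return plan

field = [
    {"id": "A1", "pests_detected": True,  "growth_index": 0.9},
    {"id": "A2", "pests_detected": False, "growth_index": 0.4},
]
print(plan_application(field))
```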

Individual customization of products has been difficult to bring to fruition due to cost-related problems. However, in the future, autonomous factories may emerge where robots acquire data from manufacturing machines and sensors, as well as from sales and material procurement departments, and use that data to autonomously determine the necessary materials, most efficient manufacturing process, and methods for coordinating with other machines, thus automatically changing production lines. As a result, mass customization may become a reality.

5. Economic impact of robot popularization

The global robot-related market is predicted to more than double, from 91.5 billion U.S. dollars in 2016 to 188 billion dollars in 2020 [1], and competition over robot functionality and pricing is expected to intensify.

In particular, the automobile industry will very likely reach a significant tipping point. During the development phase of self-driving technology, the car’s driving performance has been the major focus. However, once a fully automatic self-driving car is introduced, proficient driving performance will be assumed, and the transportation experience itself will become the determining factor. This means that the car industry will likely shift from the traditional form of selling things to that of selling experiences and services. The definition of customers will also change from people who wish to own cars to all people who have transportation needs.

6. Discussions on required reforms of the legal system

High-performance robots will also bring higher risks of injuring humans and infringing on privacy. Accordingly, the future development of robots will require these problems to be resolved, including through new legislation.

Various changes are inevitable over the mid- to long-term. Discussions are currently underway to impose a robot tax on owners under the assumption that robots are electronic persons. In addition, the introduction of a basic income granted to all citizens in order to maintain a minimum standard of living has been discussed a great deal, with experiments starting in Finland and San Francisco. These discussions are based on the assumption that social structures will change significantly as robots and AI replace people in the workforce. However, as happened with computers, new professions will emerge, though they will require different skills. In addition to systemic adjustments in taxes and social security, the education programs needed to fill this skill gap will become vital in the future.

Reference

[1] IDC, “Worldwide Semiannual Commercial Robotics Spending Guide,” 2017.

Trademark notes

All brand names, product names, and company names that appear in this article are trademarks or registered trademarks of their respective owners.

Yuji Nomura
Deputy Manager, Strategy Development Section, Research and Development Headquarters, NTT DATA Corporation.
He received an M.S. in science and technology from Keio University, Kanagawa, in 2005. Since joining NTT DATA in 2005, he has researched and developed text processing systems centered on information extraction technology. He is a member of the Information Processing Society of Japan.
