
    1. AI Platform

    Hitachi has put together an artificial intelligence (AI) platform that provides an effective way to develop solutions using the latest AI techniques and to satisfy diverse customer requirements.

    The core elements of the AI platform include open source software (OSS) for deep learning and other applications that make up the platform's suite of AI and analytics software, Hitachi's distinctive AI techniques [Hitachi AI Technology/H (AT/H) and machine learning constraint programming (AT/MLCP)], self-growing dialogue AIs, and explainable AIs that can explain their reasoning.

    The platform also hosts data processing software required for preparing data for use in AI, including data cleansing and blending, and a hardware environment for effective computing that includes general-purpose computing on graphics processing units (GPGPU) and complementary-metal-oxide-semiconductor (CMOS) annealing machines. These elements are implemented in a research and development (R&D) cloud where not only can data scientists and researchers further enhance the platform, but people from business divisions can easily create and prototype new solutions.

    The intention for the future is to make the environment available to customers and other partners and to build a solution ecosystem.

    1. AI platform

    2. Multi-AI Control Technology that Uses Competitive Self-play Learning without Relying on Actual Data

    2. Competition between groups of AIs

    In general, AI that uses machine learning techniques such as deep learning predicts and makes decisions based on what it has learned from a large set of actual data prepared by humans. This makes accurate prediction or decision-making difficult in situations where such large amounts of data are not available. Conventional AI has therefore been considered impractical in applications such as supply chain optimization, where data is spread across a number of different business entities and is difficult to bring together in one place.

    In response, Hitachi has developed a multi-AI control technology that utilizes competitive self-play learning with the aim of creating a system that can control an entire supply chain. The technology installs a deep-learning AI agent at each business entity and interlinks these agents into an AI group. Multiple such AI groups are generated and compete with each other on computers, learning how to make optimal decisions while also generating large amounts of data. By embedding procedures for hybridization and evolution of the AIs in the competition process, the agents learn to coordinate with one another and make decisions that are optimal from a broad perspective. When this AI technique was applied to the Beer Game (a supply chain simulation game), costs due to overstock and stockout were reduced to one-quarter of those incurred when decisions were made on the basis of human experience.
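    The competitive self-play idea can be sketched in miniature. The toy below (the policies, cost terms, and demand pattern are all invented for illustration; the actual system uses deep-learning agents) evolves competing groups of supply-chain agents through selection, hybridization, and mutation:

```python
import random

def simulate_chain(policies, demand, rounds=20):
    """Run a toy beer-game chain; each stage orders via its policy.
    Returns total overstock/stockout penalty (lower is better)."""
    stock = [10] * len(policies)
    cost = 0
    for t in range(rounds):
        need = demand(t)
        for i, policy in enumerate(policies):
            stock[i] -= need
            order = policy(stock[i])       # agent decides its order quantity
            stock[i] += order
            cost += abs(stock[i])          # penalize overstock and backlog alike
            need = order                   # upstream stage sees this order as demand
    return cost

def make_policy(target):
    # stand-in for a deep-learning agent: order up to a target stock level
    return lambda stock: max(0, target - stock)

def evolve(groups=8, stages=4, generations=30):
    demand = lambda t: 4 + (t % 3)         # deterministic toy demand pattern
    pop = [[random.randint(0, 20) for _ in range(stages)] for _ in range(groups)]
    for _ in range(generations):
        # AI groups compete: rank by simulated supply-chain cost
        scored = sorted(pop, key=lambda g: simulate_chain(
            [make_policy(x) for x in g], demand))
        survivors = scored[: groups // 2]
        children = []
        for g in survivors:
            mate = random.choice(survivors)
            child = [random.choice(pair) for pair in zip(g, mate)]        # hybridization
            child = [max(0, x + random.randint(-2, 2)) for x in child]    # mutation
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda g: simulate_chain([make_policy(x) for x in g], demand))
    return best, simulate_chain([make_policy(x) for x in best], demand)
```

    Because the agents are only ever evaluated against the simulator and each other, the competition itself generates the training data, which is the point of self-play.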

    In the future, Hitachi intends to incorporate the new AI technology into services and products and utilize it in its Social Innovation Business in a wide range of fields including electric power, logistics, and finance.

    3. Data Scientist Training to Expand Digital Solutions

    3. Data scientist training to expand digital solutions

    Human resource development is an important management issue for expanding the business of digital services.

    Hitachi is taking steps to train its data scientists to help satisfy diverse customer requirements for utilizing data. In practice, this means devising a new training program that defines the business knowledge and data analysis skills required not only in the IT sector but also in operational technology (OT) sectors such as railways and industry, as well as the skills required of top-class researchers who work with advanced AI technologies.

    Various issues that confront data scientists are addressed through a professional community that encourages joint discussion of solutions, case studies of AI and other advanced technologies, and the sharing of know-how. The specific objective is a community of experts capable of addressing diverse customer problems and needs while keeping pace with the latest AI technologies. Mutual learning and hands-on practice are encouraged alongside the exchange of information and advice among top-class researchers and practitioners from Japan and overseas with experience of digital transformation.

    Hitachi intends to establish ways of assessing data scientist skills in quantitative terms and, in the future, to build an ecosystem for addressing societal problems by making its skills requirements and training programs available to customers and other partners.

    4. Deep Learning Technique for Autonomous Control of Entire Robot Body that Combines Pre-learned Actions

    4. Learning how to open and pass through a door (left) and operation under autonomous control (right)

    In order to make greater use of robots that support human activities as a means of overcoming the severe labor shortages resulting from a low birth rate and aging population, Hitachi has developed a deep learning technique that eliminates the large amount of programming such robots have required, an obstacle that has impeded their wider use. The technique works by executing pre-learned actions at appropriate times, having first determined whether the situation the robot finds itself in is one it has encountered in training. A human operator only has to operate the robot remotely once to teach it a specific action, such as grasping an object or moving. Having done so, the robot is then able to combine these learned actions under autonomous control.

    To test this, an actual robot was first trained to learn several different actions (approaching a door, opening the door, and moving through the door). The test demonstrated that the ability to open and pass through a door, a sequence that requires the upper body (hand actions) and lower body (movement) to work together, could be acquired in a few days by first learning the individual actions and then performing them in combination.
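    The selection step, executing a pre-learned action when the current situation is recognized as one seen in training, can be illustrated with a minimal sketch. The skills, feature vectors, and threshold below are invented; the actual technique uses deep-learning models of whole-body motion:

```python
import math

# Toy stand-in: each pre-learned action is stored with a prototype of the
# situation (here a 2-D feature vector) in which it was taught.
SKILLS = {
    "approach_door": ((5.0, 0.0), "move toward door"),
    "open_door":     ((1.0, 0.0), "grasp handle and pull"),
    "pass_through":  ((0.5, 1.0), "drive through opening"),
}

def select_action(observation, threshold=2.0):
    """Pick the pre-learned action whose taught situation is closest to the
    current observation; report 'unknown' if nothing learned is close enough."""
    name, (proto, _) = min(SKILLS.items(),
                           key=lambda kv: math.dist(kv[1][0], observation))
    if math.dist(proto, observation) > threshold:
        return "unknown"          # situation not recognized: defer to operator
    return name
```

    Chaining such selections, observe, pick the nearest learned skill, execute, observe again, is what lets individually taught actions combine into a longer autonomous sequence such as opening and passing through a door.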

    As the technique is suitable for use in a wide range of robot applications such as daily-life support and assembly work, Hitachi intends to continue developing it for commercialization, including work on reliability and functional improvement.

    Note that the work described in this article was undertaken as joint research with Professor Tetsuya Ogata's Laboratory at Waseda University.

    5. Decentralized Security Operation

    5. Decentralized security operation

    The speed of cyber-attacks is rising every year, with a growing risk of multiple sites coming under attack over a short period of time. This is putting a limit on how much organizations can do to defend against such attacks on their own and making it more important than ever that they share information and work together.

    It is against this background that Hitachi has developed technology for decentralized security operations that simulates incidents to assess their impact, sharing incident information across organizations. The incident simulation draws on Hitachi's IT and OT knowledge to establish an environment that models that of the actual systems and involves simulating cyber-attacks in this sandbox environment to assess whether they will impact actual systems. Organizations can use the technology both to prevent incidents at their own sites and to share the simulation results with other organizations so that they too can pre-emptively put measures in place to defend themselves.
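    The impact-assessment part of incident simulation can be pictured as a walk over a model of system dependencies: starting from the compromised entry point, which systems could the attack reach? This is a minimal sketch with an invented dependency map, not Hitachi's simulation environment:

```python
from collections import deque

# Hypothetical dependency map of in-house systems: an edge A -> B means a
# compromise of A can propagate to B (network reachability, shared credentials).
DEPENDS = {
    "mail-gw":      ["file-server", "ad-server"],
    "ad-server":    ["hr-db", "ot-historian"],
    "file-server":  [],
    "hr-db":        [],
    "ot-historian": ["plant-scada"],
    "plant-scada":  [],
}

def impacted_systems(entry_point):
    """Breadth-first walk of the dependency graph: every system an attack
    starting at entry_point could reach."""
    seen = {entry_point}
    queue = deque([entry_point])
    while queue:
        node = queue.popleft()
        for nxt in DEPENDS.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    seen.discard(entry_point)     # report only downstream impact
    return sorted(seen)
```

    In the actual technology, this kind of model is built from IT and OT knowledge and the attack is replayed in a sandbox, but the output serves the same purpose: a list of systems to defend pre-emptively, which can be shared with other organizations.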

    To test how well the technology works, Hitachi's Open Laboratory Yokohama received monitoring data used for incident analysis from the Information Technology Center at Keio University and undertook testing in a sandbox environment it established for the incident simulation. The results indicated that incident simulation would take only 15 minutes to identify which in-house systems would be affected by an attack.

    In the future, Hitachi intends to use the technology to help improve the security of social infrastructure by extending its use to the electricity and other industries.

    6. Cybersecurity for Automobiles

    The spread of connected cars and autonomous vehicles means that cyber-attacks on automobiles are becoming a serious problem. Keeping vehicles secure requires both ways of improving their ability to defend against cyber-attack and of minimizing the consequences of any such incidents that do occur.

    With regard to defensive measures, while secure communications of the sort used by IT systems are essential to guard against spoofing and data tampering, the limited capacity of the microcomputers and communication links in automotive systems means that lightweight techniques are also required. To this end, Hitachi has developed a lightweight authentication technique that reduces the number of communication steps required by 33%, and a lightweight encryption technique that reduces the computational and memory requirements of encrypted communications. These techniques have been incorporated into over-the-air (OTA) remote updating of automotive software and into onboard control communications using a controller area network (CAN).
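    As one illustration of what lightweight message authentication on a control network might look like, the sketch below computes a truncated message authentication code with a freshness counter over a CAN frame. The key, field sizes, and truncation length are assumptions for illustration, not Hitachi's actual scheme:

```python
import hmac
import hashlib

KEY = b"shared-ecu-key"   # hypothetical pre-shared key between ECUs

def tag_frame(can_id, payload, counter, mac_bytes=4):
    """Truncated MAC over CAN ID, payload, and a freshness counter.
    Truncation keeps the tag small enough for CAN's tight payload budget;
    the counter defeats replay of previously captured frames."""
    msg = can_id.to_bytes(4, "big") + counter.to_bytes(4, "big") + payload
    return hmac.new(KEY, msg, hashlib.sha256).digest()[:mac_bytes]

def verify_frame(can_id, payload, counter, tag):
    expected = tag_frame(can_id, payload, counter, len(tag))
    return hmac.compare_digest(expected, tag)   # constant-time comparison
```

    The trade-off a production scheme must balance is exactly the one named in the text: a shorter tag and fewer exchanged messages cost some cryptographic margin but fit the microcomputers and bus bandwidth available in a vehicle.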

    Meanwhile, reducing the negative consequences of cyber-attacks requires rapid responses to reduce the risk to human life and to shorten the length of time the vehicle is out of operation when an incident occurs. To do this, Hitachi has developed technology for hierarchical security operation in which anomaly detection and response techniques running on the in-vehicle secure gateway work in tandem with incident analysis performed at an operations center. The former continuously monitors the vehicle network for problems and, if one occurs, puts the vehicle into a safe mode in real time, while the latter analyzes the logs sent from the vehicle to the cloud after such a problem and quickly determines how to restore the vehicle.

    Hitachi aims to implement these techniques on vehicles to make them safe and secure.

    6. Defensive measures for vehicles and technology for security operation

    7. Automatic Identification of Harmless Items in X-ray Baggage Inspection Using AI

    7. Identification of harmless items by X-ray baggage inspection system

    X-ray machines are used for baggage inspection at facilities such as airports and event venues that require high levels of security.

    While it has become standard practice to use systems that notify guards when the shape or material of a dangerous item such as a knife or explosive device is detected, all baggage, including bags containing nothing dangerous, still needs to be inspected visually. This makes it impossible to process large amounts of baggage quickly.

    In response, Hitachi has developed a technique that uses AI to identify the individual items in baggage and automatically determine which ones are threats based on factors such as the type and density of materials. This proprietary technique identifies which items are harmless, with consideration for detection reliability, by using deep learning to determine their shapes and by estimating their weight from the extent to which X-rays pass through them. This allows guards to simplify their visual inspection of baggage deemed to contain nothing threatening and, in other baggage, to examine only those items not identified as harmless.
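    The decision logic, clearing an item only when both the shape classification and the transmission-based density estimate agree it is harmless, can be sketched as follows. The classes, density ranges, and confidence threshold are all invented for illustration:

```python
# Expected density ranges (g/cm3) for item classes considered harmless.
# Values and classes are hypothetical.
HARMLESS_CLASSES = {"book": (0.5, 1.5), "bottle": (0.8, 1.2)}

def classify_item(label, confidence, density, min_conf=0.9):
    """Clear an item only when the shape classifier is confident AND the
    X-ray-transmission density estimate matches the claimed class."""
    if label in HARMLESS_CLASSES and confidence >= min_conf:
        lo, hi = HARMLESS_CLASSES[label]
        if lo <= density <= hi:
            return "harmless"        # skip detailed visual check
    return "inspect"                 # leave for the guard to examine

def screen_bag(items):
    """A bag qualifies for simplified inspection only if every item clears."""
    return all(classify_item(*item) == "harmless" for item in items)
```

    Requiring both signals to agree is what gives the "consideration for detection reliability" mentioned above: a confidently classified shape with an implausible density is still routed to a human.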

    In-house testing found that guards were able to inspect approximately 40% more baggage in a given period of time than when all baggage was inspected visually.

    Hitachi intends to use this technology to contribute to making society safer and more secure.

    8. Explosives Detection System for Railways

    8. Artist’s impression of explosives detection system installed on railway station platform

    Hitachi has built a prototype explosives detection system intended as an anti-terrorism measure for public transportation services such as railways or buses.

    Recipes for powerful improvised explosives made from everyday materials are in widespread circulation, with explosives being used in roughly half of all terrorist attacks. Japan, too, has seen many instances of home-made improvised explosives. While this makes explosives detection an important anti-terrorism measure, it is impractical in the case of public transportation to screen all passengers and bags.

    To deal with this, Hitachi has developed an on-site system with two functions: monitoring of the air and the identification of chemical residues present in the vicinity of suspicious objects. The system uses an internal mass spectrometer to monitor the air in the vicinity of the installation location so as to be ready to detect the dispersal of poisons or other harmful substances. In the event of an unattended bag or other suspicious object being found, the system can also use a swab sample taken from the exterior of the bag, for example, to identify any chemical residues that are present. Travelers can be evacuated from the site if the result indicates the presence of an explosive. If not, the object can be treated as lost property.

    As well as continuing with the development of security equipment for public transportation services, Hitachi also intends to proceed with commercialization through collaborative creation with customers.

    9. Integration and Analysis of Infrastructure Maintenance Data to Achieve Urban Resiliency

    9. Integration and analysis of infrastructure maintenance data to achieve urban resiliency

    To implement a platform for integrated infrastructure maintenance that can improve the resilience of social infrastructure to disasters and make maintenance more efficient, Hitachi has developed the following technologies. They address the challenges of integrating sensor data with wide-area data, such as weather data, that does not align with it in timing or location, and of collecting large amounts of sensing data in compressed form.

    1. A technology for the integrated processing of spatiotemporal data that generates information by analyzing a combination of different types of data.
    2. An Internet of Things (IoT) technology for the compression and wireless transmission of data from sensors fitted to infrastructure.

    Trials have been conducted to assess the usefulness of these technologies in readiness for practical deployment. These have included detecting leaks from water pipes and displaying repair priorities based on the spatiotemporal integration of leak-sensor information with data on the year of pipe installation; displaying which locations should be restored first during a disaster, based on the spatiotemporal integration of information such as leak-sensor data and the locations of critical infrastructure; and compressing sensor data by a factor of a thousand to facilitate transmission.
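    The repair-priority trial, combining leak-sensor events with pipe installation year, can be sketched as a simple spatiotemporal join and scoring rule. All records and weights below are hypothetical:

```python
from datetime import date

# Hypothetical joined records: leak-sensor events per pipe and the year
# each pipe was installed.
LEAKS = {"pipe-A": 3, "pipe-B": 1, "pipe-C": 0}       # leak events this month
INSTALLED = {"pipe-A": 1975, "pipe-B": 2010, "pipe-C": 1980}

def repair_priority(pipe, today=date(2019, 1, 1)):
    """Combine leak frequency with pipe age into one priority score;
    the 10x weighting of leaks over age-in-years is an invented choice."""
    age = today.year - INSTALLED[pipe]
    return LEAKS[pipe] * 10 + age

def ranked_pipes():
    """Pipes ordered from most to least urgent for repair work."""
    return sorted(LEAKS, key=repair_priority, reverse=True)
```

    Note that an old pipe with no current leaks (pipe-C) can still outrank a newer pipe with one leak (pipe-B), which is the kind of insight that comes only from integrating the two data sources.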

    In the future, Hitachi intends to run demonstration projects that use these technologies, and to help enable more efficient maintenance and make cities resilient to disaster.

    10. Demonstration of Edge Computing Business Case with Multi-vendor Coordination

    10. Testbed for edge computing with multi-vendor coordination

    Amid interest in edge computing for the processing and analysis of the large amounts of data generated by the IoT close to its point of origin, Hitachi is participating in a multi-vendor consortium for building a testbed that aims to diversify the associated functions.

    In practice, this involves building a testbed for connecting to and exchanging data with:

    1. Three-dimensional light detection and ranging (3D LiDAR) sensors from Hitachi-LG Data Storage, Inc., used as the control edge that generates data on people, objects, and machines
    2. Gateways from Cisco Systems G.K. and Dell Japan Inc., used as the OT edge that collects data from the field
    3. Hitachi's OT data collection platform, used as the IT edge that processes and analyzes data at the on-ramp to the cloud
    4. The enterprise cloud of NTT Communications Corporation, which handles data analysis and archiving

    The findings from this work on the use of edge computing for data compression and privacy protection were presented as a demonstration at the Fog World Congress 2018 held in the USA. Future plans include using the testbed for retailer, factory, and elevator monitoring use cases to verify the business case.

    11. Hardware Accelerator for Faster Big Data Analytics

    11. Runtime environment for SQL on Hadoop accelerator

    Big data analytics is becoming increasingly important, involving the interactive analysis of large amounts of data under different conditions and from a variety of perspectives, and the use of the results in services and other activities. Recent years have also seen the growing availability of OSS such as structured query language (SQL) on Hadoop*, which enables data in Hadoop to be analyzed directly using SQL, the de facto standard for relational databases. On the other hand, large computing clusters are needed to achieve interactive execution speeds, something that is difficult when space or equipment is limited.

    Hitachi has developed an accelerator for SQL on Hadoop that speeds up execution by reducing the amount of in-memory pre-processing performed by the central processing unit (CPU) and by offloading the data selection and statistical operations available in SQL to a field-programmable gate array (FPGA), which can handle complex conditional decisions while performing pipeline processing. By providing speeds faster than CPU execution, the accelerator enables high-speed analysis systems to be implemented even on small computing clusters.
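    Conceptually, the offloaded stages evaluate selection and aggregation in one streaming pass rather than materializing the table in CPU memory first. The sketch below emulates that pipeline in software with an invented table and query; in the real accelerator these stages run on the FPGA:

```python
def pipeline_scan(rows, predicate, key_col, val_col):
    """One streaming pass: filter each row (the SQL WHERE selection stage)
    and keep running per-key sums (the aggregation stage), row by row,
    as an FPGA pipeline would."""
    sums = {}
    for row in rows:
        if predicate(row):                           # selection stage
            key = row[key_col]
            sums[key] = sums.get(key, 0) + row[val_col]   # aggregation stage
    return sums

# Hypothetical table and query:
#   SELECT region, SUM(sales) FROM t WHERE sales >= 100 GROUP BY region
rows = [
    {"region": "east", "sales": 100},
    {"region": "west", "sales": 250},
    {"region": "east", "sales": 50},
]
result = pipeline_scan(rows, lambda r: r["sales"] >= 100, "region", "sales")
```

    Because each row is consumed as it arrives, only the small aggregate state needs memory, which is why offloading these stages relieves the CPU's in-memory processing burden.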

    The intention for the future is to expand the scope of application of the technology by adding to the range of supported OSS.

    *
    See the list of “Trademarks.”

    12. Expanding Use of OSS on Lumada

    12. Node generator tool for assisting with the development of connectors

    A wide variety of companies in recent years have actively pursued an approach to function development that is based on open innovation, working cooperatively in the OSS development community.

    Hitachi has adopted the open source Node-RED* software, an industry standard for the IoT, as an application development environment for Lumada. In the development community, meanwhile, it has led developments that include Git integration, data persistence, multi-language support, and Node generator. Node generator in particular is a tool that assists with the development of connectors, providing users with a quick way to develop connectors for connecting Node-RED to external systems. Hitachi has gathered considerable outside opinion going right back to when the function was first proposed to the community and has been quick to undertake function development in accordance with user needs, leading to greater use both inside and outside Hitachi.

    In the future, Hitachi intends to expand its open technical development and respond quickly to user needs by also working with the development communities for standard OSS, such as containers and service meshes, that supports implementing applications as microservices.

    *
    See the list of “Trademarks.”

    13. System Operations Spanning Multiple Companies for Blockchain Systems

    Blockchain has attracted interest in recent years as a technology for enabling reliable business transactions and data sharing among companies. Blockchains provide "smart contracts": programmable, user-defined business logic for contracts and transactions that is executed on the blockchain while consensus is established among the participating companies. A blockchain-based system consists of computers owned by individual companies, which creates a problem for system-wide operations (e.g., updating smart contracts) because operational procedures and schedules must be unified or coordinated across all of the companies on the blockchain. To overcome this problem, Hitachi has developed a system operations technology for blockchain-based systems.

    This technology builds on the observation that blockchain-native features such as smart contracts and distributed consensus can also be applied to systems operation and management. It defines system operations themselves as a smart contract, so that unified and synchronized operations across companies can be executed based on that contract, with consensus established among the computers that constitute the blockchain-based system.
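    The idea of expressing a system operation as a smart contract can be sketched as a contract that executes an update only once a quorum of member companies has endorsed it. The member names and the two-thirds quorum are illustrative, and a real deployment would run this logic as contract code on the blockchain itself:

```python
class OperationContract:
    """Toy model: a system-wide operation (e.g., a smart-contract update)
    is itself a contract that fires only after enough members endorse it."""

    def __init__(self, members, action, quorum=2 / 3):
        self.members = set(members)
        self.action = action          # the operation to run once agreed
        self.quorum = quorum
        self.endorsements = set()
        self.executed = False

    def endorse(self, member):
        if member in self.members:
            self.endorsements.add(member)
        # Consensus check: run the operation exactly once, for everyone,
        # the moment the quorum is reached.
        if not self.executed and len(self.endorsements) >= self.quorum * len(self.members):
            self.executed = True
            self.action()

log = []
op = OperationContract(["companyA", "companyB", "companyC"],
                       action=lambda: log.append("chaincode updated"))
op.endorse("companyA")    # 1 of 3: below quorum, nothing happens
op.endorse("companyB")    # 2 of 3: quorum reached, operation executes
```

    Encoding the operation this way removes the need to manually align maintenance schedules across companies: the blockchain's own consensus mechanism decides when the operation runs everywhere.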

    In the future, Hitachi intends to continue with research and development on blockchain, including this technology, thereby contributing to its aim of using blockchains for cross-industry coordination to create a super-smart society.

    13. System operations for blockchain systems

    14. CMOS Annealing Machine

    14. World’s largest 100-kbit CMOS annealing machine

    Hitachi's Social Innovation Business will in the future have a need for more efficient ways of solving combinatorial optimization problems to enable the optimization of infrastructure systems. As a means of achieving this, it has developed a CMOS annealing machine that works on a new principle.

    There are a number of ways of implementing an annealing machine. CMOS annealing machines use semiconductors and can achieve large scale by connecting a number of chips together. Hitachi has built an annealing machine by connecting 25 FPGAs (a type of reconfigurable semiconductor chip) together and has demonstrated its ability to run combinatorial optimization problems. At 100 kbit, the new machine is currently the largest ever built*.

    In the future, the practical realization of new computers will require the development not only of computer hardware but also of software for putting the hardware to use and applications that can run on the computers. The 100-kbit machine will be made available on the cloud to accelerate application development through collaborative creation with partners.

    *
    As of December 2018, based on research by Hitachi.

    15. Flash Storage

    15. Automatic selection of data quantity reduction technique

    Companies are actively pursuing the use of data to create new businesses and business value. To handle the high-capacity and high-frequency data input and output that accompanies this, flash storage capable of high-speed data access is being installed at customer data centers.

    As the unit price of the flash memory used to store data in flash storage remains high, it is commonplace these days to seek to improve storage efficiency by measures such as compression and deduplication.

    For the latest Hitachi Virtual Storage Platform, Hitachi has developed two compression and deduplication methods: the first prioritizes quick response by performing deduplication checking after the data is written, and the second prioritizes throughput by checking while the data is being written. The storage system also monitors access patterns (such as data size and access locality) and automatically switches between the two methods.
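    The automatic switching can be pictured as a simple policy over the monitored access patterns: latency-sensitive small, localized I/O gets the post-process method so writes are acknowledged quickly, while large streaming I/O gets the inline method to sustain throughput. The thresholds and workload figures below are invented for illustration:

```python
def choose_method(avg_io_kb, locality):
    """Pick a deduplication method from monitored access patterns.
    locality: fraction of writes hitting recently used addresses (0..1).
    The 16-KB and 0.5 cut-offs are hypothetical tuning values."""
    if avg_io_kb <= 16 and locality >= 0.5:
        return "post-process"    # acknowledge the write first, deduplicate later
    return "inline"              # deduplicate while the data is being written

# Hypothetical workloads sharing one storage device:
workloads = {
    "online-transactions": (8, 0.8),     # small, localized, latency-sensitive
    "analytics-ingest":    (256, 0.1),   # large, streaming, throughput-bound
}
chosen = {name: choose_method(*w) for name, w in workloads.items()}
```

    A per-workload choice like this is what allows response-critical online applications and throughput-hungry analytics to be consolidated on the same device, as described below.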

    This technology makes it possible to consolidate data from different types of applications onto one storage device, including online applications that require quick response and data analytics that require high throughput.
