

1. Sustainable Finance Platform

[01]Sustainable finance platform

The sustainable finance market has expanded over recent years in an effort to address societal challenges such as climate change. However, reporting on the environmental and social impacts of their projects imposes a considerable workload on the companies that issue securities. Investors, for their part, are calling for impact indicators to have a high degree of transparency to facilitate accurate comparisons between potential investments.

The sustainable finance platform extracts data from the green design of investments and uses blockchain technology to calculate a variety of impacts in a way that cannot be forged. The platform was chosen for Japan’s first Digitally Tracked Green Bond, issued by Japan Exchange Group, Inc. in June 2022, and is currently being trialed as a means both of improving the efficiency of impact reporting and of providing investors with a transparent way to assess impacts whenever they need to.

In the future, Hitachi intends to help resolve the challenges of the sustainable finance market by expanding the functionality of the platform, including broadening the types of equipment covered and adding support for third-party verification.

2. Drug Selection Support AI for Diabetes Patients Requiring Combination Therapy

Diabetes is a chronic condition that affects about one in every 10 people worldwide. While the control of blood sugar levels is vital to slowing the progression of the disease, combination therapy involving two or more drugs has only been used in a limited number of cases, and it is difficult for anyone other than an experienced specialist to select an appropriate mix of drugs.

In response, Hitachi has partnered with the University of Utah Health and the Regenstrief Institute, Inc. to develop a technique that supports drug selection for type 2 diabetes patients requiring combination therapy based on medical data.

The technique works by modeling the relationship between drugs and their efficacy for groups of patients with similar disease states. It takes data from a number of different locations and facilities covering a wide range of patient characteristics and drug choices and then integrates it on the basis of disease state. By doing so, it is able to assist approximately 83% of patients by providing efficacy predictions, even for combination therapies that have only been used in a small number of cases. The value of the technique for optimizing the treatment of diabetes is currently being assessed at two hospital groups in Utah and Indiana.
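
The grouping approach can be illustrated with a minimal sketch: pool records from multiple facilities, group them by disease state, and estimate the observed response rate of each drug combination within a group. The disease-state labels, drug names, and data layout below are hypothetical, not the actual model.

```python
from collections import defaultdict

def build_efficacy_model(records):
    """Pool patient records from multiple facilities and estimate the
    observed efficacy rate of each drug combination per disease-state
    group. Each record is (disease_state, drug_combo, responded)."""
    totals = defaultdict(lambda: [0, 0])  # (state, combo) -> [responders, n]
    for state, combo, responded in records:
        key = (state, frozenset(combo))   # order of drugs is irrelevant
        totals[key][0] += int(responded)
        totals[key][1] += 1
    return {k: responders / n for k, (responders, n) in totals.items()}

def predict_efficacy(model, state, combo):
    """Predicted response rate for a combination, based on patients in the
    same disease-state group (None if no matching data exists)."""
    return model.get((state, frozenset(combo)))

# Hypothetical records pooled from two facilities with different patients.
records = [
    ("high_hba1c", ("metformin", "sglt2"), True),
    ("high_hba1c", ("metformin", "sglt2"), True),
    ("high_hba1c", ("metformin", "sglt2"), False),
    ("high_hba1c", ("metformin", "dpp4"), True),
]
model = build_efficacy_model(records)
rate = predict_efficacy(model, "high_hba1c", ("sglt2", "metformin"))
```

Because combinations are stored as sets, a query with the drugs in a different order still finds the pooled estimate.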

In the future, Hitachi intends to utilize the technique to continue creating healthcare services that improve patient quality of life (QoL).

[02]Efficacy prediction model with integrated analysis of medical data from two different regions grouped by patient condition

3. Cross-domain Data Exchange Platform: CADDE

In anticipation of the transition to Society 5.0 creating new businesses that utilize a variety of different data, domain-specific data platforms have been established in fields like civil defense and transportation. However, little progress has been made in the use of data across different fields.

Through an eight-company partnership, Hitachi has been involved in the development of a platform for the cross-domain interoperability of data, called the Connector Architecture for Decentralized Data Exchange (CADDE)*, and has released it as open-source software (OSS). The platform works by means of software modules called connectors that perform the various operations, including data discovery, contracting, exchange, and history tracking. These modules interconnect with external services that support tasks such as data search, authentication and certification, and history tracking. The plan for the future is for the platform to be further developed as part of DATA-EX, a scheme for creating a nationwide data space in Japan.

* This work was supported by the Council for Science, Technology and Innovation, “Cross-ministerial Strategic Innovation Promotion Program (SIP), Big-data and AI-enabled Cyberspace Technologies” (funding agency: NEDO).
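
As a rough illustration of the connector idea, the sketch below shows a connector delegating discovery, contract checking, and history tracking to pluggable external services. The method names and service interfaces are invented for illustration and are not the actual CADDE module APIs.

```python
class Connector:
    """Minimal sketch of a CADDE-style connector: each operation delegates
    to a pluggable external service."""
    def __init__(self, search_svc, auth_svc, history_svc):
        self.search_svc = search_svc
        self.auth_svc = auth_svc
        self.history_svc = history_svc

    def discover(self, query):
        """Data discovery via the external search service."""
        return self.search_svc(query)

    def exchange(self, consumer, dataset):
        """Data exchange: check the usage contract, record provenance,
        then hand over the data."""
        if not self.auth_svc(consumer, dataset):
            raise PermissionError("no usage contract for this dataset")
        self.history_svc(consumer, dataset)   # history tracking
        return {"dataset": dataset, "payload": "..."}

# Wire the connector to toy stand-ins for the external services.
log = []
c = Connector(
    search_svc=lambda q: [d for d in ("traffic", "weather") if q in d],
    auth_svc=lambda consumer, ds: ds == "weather",
    history_svc=lambda consumer, ds: log.append((consumer, ds)),
)
found = c.discover("wea")
receipt = c.exchange("org-a", "weather")
```

The design choice sketched here mirrors the article's description: the connector itself stays thin, while search, authentication, and history tracking remain replaceable external services.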

[03]Block diagram of CADDE cross-domain data exchange platform

4. Water Leakage Detection Service for Innovation in Social Infrastructure Maintenance

While social infrastructure built during the era of rapid economic growth is increasingly showing its age, societies around the world are having to deal with a decline in the quality of social infrastructure maintenance due to an aging and shrinking maintenance workforce. This is also true for water pipes, where regular manual inspection is a feature of maintenance work. As delays in the identification of leaks can lead to sinkholes and other similar incidents, the sooner such leaks can be detected, the better.

Hitachi has commercialized a water leakage detection service that can remotely assess leakage in pipes using ultra-sensitive wireless vibration sensors with a seven-year battery life, developed by Hitachi using micro-electro-mechanical systems (MEMS) technology. The sensors incorporate a detection algorithm developed using more than 400,000 water leak data points, allowing even minor underground leaks to be detected and notifications sent to the cloud. The service achieves early detection of leaks by providing continuous monitoring over a wide area. It officially commenced operation in FY2021 and has been adopted by a growing number of utilities, including the Kumamoto City Waterworks and Sewerage Bureau.
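
A minimal sketch of how persistent-vibration screening might work is shown below: score each measurement window by its RMS amplitude and raise an alarm only when the score stays high across consecutive windows, so that transient noise is ignored. The threshold and persistence values are illustrative, not those of Hitachi's actual algorithm.

```python
import math

def leak_score(samples):
    """Root-mean-square vibration amplitude of one measurement window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def detect_leak(windows, threshold=0.5, persistence=3):
    """Flag a leak when the RMS score exceeds `threshold` in `persistence`
    consecutive windows; isolated spikes (e.g. passing traffic) reset the
    run and are ignored."""
    run = 0
    for w in windows:
        run = run + 1 if leak_score(w) > threshold else 0
        if run >= persistence:
            return True
    return False

# Five quiet windows followed by three windows of sustained vibration.
windows = [[0.1, -0.1, 0.05]] * 5 + [[0.8, -0.7, 0.9]] * 3
alarm = detect_leak(windows)
```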

By continuing to develop and deploy the technology, Hitachi intends to continue contributing to innovation in water supply management with consideration for the future maintenance of social infrastructure.

[04]Overview of water leakage detection service

5. MVS v1.4 Solution for Enhanced Security and Surveillance

Multifeature video search (MVS) is a solution for the rapid identification of specific individuals in surveillance camera video based on their whole-body characteristics. It can run searches on the basis of more than 100 different personal attributes, with speed of search being a key feature. Rising concern about maintaining public order over recent years has brought demand from business operators wanting to prevent incidents in public places before they happen. In response, Hitachi has released MVS v1.4 featuring two new functions based on technology that has been recognized at international conferences for its world-leading accuracy*1: (1) a wide-area monitoring function for detecting behaviors indicative of an incident about to happen*2, and (2) a lost-property and suspicious-object function that can detect items that have been abandoned or taken.

These functions provide earlier warning of potential incidents in public places. With studies already underway looking at commercial deployment of the solution at airports or in industries such as electric power, Hitachi plans to proceed with more detailed trials in readiness for product launch.

*1 The associated technology has been recognized at leading industry conferences on image analysis, including the Computer Vision and Pattern Recognition Conference (CVPR) 2021 and the International Conference on Image Processing (ICIP) 2022.
*2 The function is able to identify nine different actions, such as looking around, and can be trained to recognize additional actions.

[05]Performance of new image recognition algorithms added to MVS v1.4

6. Collaborative Creation Using CMOS Annealing

Around the world, people’s lives are being disrupted by the pandemic and by abnormal weather events and other natural disasters. Situations like these call for timely action to ensure the efficient operation and rescheduling of social infrastructure and systems.

Having developed complementary metal-oxide semiconductor (CMOS) annealing technology that can obtain practical solutions to combinatorial optimization problems at high speed, Hitachi is now engaging in collaborative creation (co-creation) with a variety of partners aimed at putting the technology to use to address societal challenges.

One example is the work Hitachi is doing with the National Institute of Maritime, Port and Aviation Technology (MPAT) to optimize the planning of relief supply delivery during disasters. This involves the efficient delivery of a large volume of goods to multiple evacuation sites without favoring one site over another. Trials have been conducted to verify that rapid calculations using CMOS annealing can produce delivery schedules that fulfill these requirements, and that the solutions it comes up with are realistic.
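
Delivery planning of this kind is a combinatorial optimization problem, which can be sketched in software with classical simulated annealing as a stand-in for the CMOS annealing hardware: shipments are assigned to evacuation sites so as to minimize the spread between the best- and worst-supplied sites. The problem size and cost function below are illustrative only.

```python
import math
import random

def fairness_cost(assignment, n_sites):
    """Spread of delivered quantities across sites (lower = fairer)."""
    totals = [0] * n_sites
    for site, qty in assignment:
        totals[site] += qty
    return max(totals) - min(totals)

def anneal(quantities, n_sites, steps=5000, t0=5.0, seed=0):
    """Assign each shipment (a quantity of goods) to an evacuation site so
    that no site is favored, via simulated annealing: random reassignment
    moves, accepted when they improve fairness or, with falling
    probability, when they worsen it."""
    rng = random.Random(seed)
    assign = [(rng.randrange(n_sites), q) for q in quantities]
    cost = fairness_cost(assign, n_sites)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9     # cooling schedule
        i = rng.randrange(len(assign))
        old_site, q = assign[i]
        assign[i] = (rng.randrange(n_sites), q)
        new_cost = fairness_cost(assign, n_sites)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            assign[i] = (old_site, q)          # revert the move
    return assign, cost

# Six shipments (12 units of goods total) across three evacuation sites.
assignment, cost = anneal([3, 3, 2, 2, 1, 1], n_sites=3)
```

With 12 units over three sites, a perfectly fair schedule (four units each) exists, and the annealer reaches or approaches it quickly on a problem this small.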

To make the technology more widely available for co-creation, Hitachi has been progressively upgrading the Annealing Cloud Web content that provides web access to CMOS annealing. By doing so, it is accelerating its use for the resolution of societal challenges.

Part of this work is based on results obtained from a project, JPNP16007, commissioned by the New Energy and Industrial Technology Development Organization (NEDO).

[06]Co-creation with MPAT (left) and Annealing Cloud Web (right)

7. Digital Maintenance Platform Service for Power Distribution Using Digital Twins

Power distribution faces the challenges of maintaining security of supply while also reducing maintenance costs and coping with workforce shortages. To overcome these, the industry is pursuing digital transformation (DX), making use of AI to improve work efficiency and utilizing unmanned aircraft systems (UASs) to automate inspection work. In turn, making a success of DX calls for integration of the enterprise asset management (EAM) systems that manage the capital equipment being maintained, with a wide variety of different systems including newly adopted technologies like AI platforms, IoT sensors, robots, and three-dimensional (3D) city models.

To enable this, Hitachi builds digital twins for electrical distribution systems that serve as a central repository for a wide variety of operational information. By replicating the current and past state of the distribution system in digital space, these digital twins facilitate the seamless integration of EAM with these diverse systems. With the digital twin of the city playing a core role, this digital maintenance platform service for electricity distribution interoperates with operational systems from other industries that are involved in the functioning of the city, such as commercial precincts and mobility. In doing so, it contributes to the economic development of the city and the resolution of societal challenges.

[07]Block diagram of digital maintenance platform service for power distribution

8. Technology for Zero-emission Data Centers

Amid an acceleration in decarbonization efforts aimed at addressing global climate change, achieving zero emissions has also become a requirement for data centers, as the digitalization of society drives their power consumption ever higher. These background factors have prompted research and development aimed at realizing zero-emission data centers based on the idea of using information and operational technologies (IT and OT) to reduce costs through sophisticated prediction and control techniques for data center power consumption and by optimizing supply and demand for renewable energy. Specifically, the following technologies are being developed in recognition of the particular challenges posed by the monitoring of power use at data centers and the use of electric power derived from intermittent renewable energy sources.

  1. Data-driven analysis and prediction of data center power consumption
    By automatically determining the characteristics of each server room using only easily obtained data, this analyzes and predicts their power consumption. Doing so facilitates improvements in the efficiency of electric power use and planning for the adoption of renewable energy.
  2. Green IT workload control
    This optimizes when and where IT workloads are run across a number of geographically distributed data centers to control the pattern of electric power consumption in a way that utilizes green energy.
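
The workload-placement idea in (2) can be sketched as a simple greedy scheduler that sends each workload to the data center with the most remaining renewable headroom. The data center names, capacities, and workload sizes below are invented for illustration.

```python
def schedule_workloads(workloads, centers):
    """Greedy sketch: place each IT workload (kW) at the data center with
    the most remaining renewable headroom, so that power consumption
    tracks green supply. `centers` maps name -> available renewable
    power (kW); largest workloads are placed first."""
    headroom = dict(centers)
    placement = {}
    for name, kw in sorted(workloads.items(), key=lambda x: -x[1]):
        best = max(headroom, key=headroom.get)
        placement[name] = best
        headroom[best] -= kw
    return placement

placement = schedule_workloads(
    {"batch-a": 40, "batch-b": 30, "batch-c": 30},
    {"tokyo": 50, "osaka": 60},
)
```

A production scheduler would also weigh latency, data locality, and forecast renewable output over time, but the core decision, matching consumption to green supply, has this shape.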

In the future, Hitachi plans to contribute to the decarbonization of digital society through the commercial deployment of these technologies.

[08]Technology for zero-emission data centers

9. Auto-scaling for Event-driven Distributed Applications

Driven by the need to satisfy service level agreements (SLAs) while also delivering reduced operating costs and interoperation with other systems, the SI market for social infrastructure systems, such as those for communications and electric power, is seeing rising demand for migration to the cloud. Unfortunately, the use of cloud services such as managed containers can bring high costs due to the need to maintain a safety margin of additional resources over long periods of time. To address this issue, Hitachi has developed an auto-scaling mechanism that monitors application workloads (events) and passes this information on to an orchestration service.

This auto-scaling works for container applications running on Hitachi Application Framework/Event Driven Computing (HAF/EDC), a platform for event-driven distributed applications that handles the control, notification, and distribution of events across multiple resources. It determines processing workloads by collecting data at regular intervals on the number and type (processing time) of events queued awaiting action by HAF/EDC container applications and passes this information on to an orchestration service to scale resource availability up or down. This keeps the cost of container resources to a minimum while still satisfying the SLA.
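
The scaling decision can be sketched as follows: estimate the total pending work from the queue (event count times typical processing time per event type) and convert it into a replica count bounded by minimum and maximum limits. The function and parameter names are illustrative, not the HAF/EDC API.

```python
import math

def required_replicas(queued_events, capacity_per_replica, min_r=1, max_r=20):
    """Convert queued work into a replica count for the orchestrator.
    `queued_events` maps event type -> (count, processing time in seconds);
    `capacity_per_replica` is how many seconds of backlog one replica can
    work off per scaling interval (illustrative parameters)."""
    pending_seconds = sum(count * proc_time
                          for count, proc_time in queued_events.values())
    replicas = math.ceil(pending_seconds / capacity_per_replica)
    return max(min_r, min(max_r, replicas))

# 100 fast events (0.1 s each) and 5 slow events (4 s each) are queued;
# one replica can clear 10 seconds of backlog per interval.
n = required_replicas({"fast": (100, 0.1), "slow": (5, 4.0)},
                      capacity_per_replica=10.0)
```

Weighting by processing time is what distinguishes this from naive queue-length scaling: a few slow events can demand more capacity than many fast ones.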

By further developing this technology into a technique for reducing bills in cloud environments, Hitachi intends to help minimize resource use as systems migrate to the cloud.

[09]Block diagram of auto-scaling for event-driven distributed applications

10. Wireless Digital Twin Utilizing Ultra-high-speed Radio Transmission Simulator

When installing and operating wireless systems such as fifth-generation (5G) mobile networks and wireless local-area networks (LANs), the performance of radio communications can be degraded by changes in conditions on the ground, such as the rearrangement of the site layout or the movement of people and goods.

To address these changes in environment that disrupt communications, Hitachi has developed an ultra-high-speed radio transmission simulator that can be used to evaluate a site prior to installation, quickly assessing a wide range of conditions in a 3D digital space.

In the past, assessing radio signal quality for the digital space at a site required an analysis that considered hundreds or thousands of receiving points and took hours to perform. In contrast, the new simulator is able to complete an analysis in just a few seconds (around 10,000 times faster than before) thanks to the development of a radio environment analysis algorithm optimized for execution on a graphics processing unit (GPU) that can run analyses for multiple receiving points in parallel. By providing the ability to perform rapid pre-installation site evaluations that consider a wide variety of potential changes, this enables wireless systems to be installed and operated reliably.
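
The key property exploited by the GPU implementation is that each receiving point can be evaluated independently. The sketch below illustrates this with a simple free-space path-loss model evaluated over a batch of points; the actual simulator uses a far more detailed ray-based analysis, and the transmit power and frequency here are arbitrary examples.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    return (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

def received_power(tx_dbm, tx_pos, points, freq_hz=28e9):
    """Evaluate received power (dBm) at every receiving point in one
    batch. Each point is independent of the others, which is exactly what
    a GPU kernel can exploit by running the points in parallel."""
    x0, y0 = tx_pos
    return [tx_dbm - fspl_db(math.hypot(x - x0, y - y0), freq_hz)
            for x, y in points]

# 30 dBm transmitter at the origin, receivers at 10 m and 100 m.
levels = received_power(30.0, (0.0, 0.0), [(10.0, 0.0), (100.0, 0.0)])
```

Doubling the point count doubles the work but not the wall-clock time on a parallel device, which is how analyses over thousands of receiving points drop from hours to seconds.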

[10]Use of ultra-high-speed radio transmission simulator to build wireless digital twin

11. AI Interview Service for Assessing Skills of Diverse Human Capital

[11]Code-free generation of AI model replicating expertise of experienced interviewers

An increasing number of companies have, over recent years, adopted the practice of conducting interviews with employees aimed at facilitating their career advancement. Unfortunately, a reliance on subjective assessment by interviewers has made it difficult to achieve fairness and consistency in the outcomes of these interviews. Furthermore, because the interview candidates heavily outnumber the small pool of available interviewers, the actual number of interviews that can be completed is small.

In response, Hitachi has used code-free techniques to generate an AI model that replicates the expertise of experienced interviewers and, when applied to videos of interviews, is able to predict their conclusions with a high degree of accuracy. This can be used to minimize the variability between the conclusions of different interviewers. Moreover, the development of a self-interviewing function using an avatar in place of a live interviewer means that the interviews no longer need to take place in person and can be completed at any time.

Hitachi Solutions, Ltd. has launched the AI Interview Service, which utilizes this technology. As the service is able to support a wide range of online activities, it is hoped that it will contribute to overcoming the challenges of online communication in such diverse industries as temporary staffing, education, and healthcare.

12. Compliance Verification Using VCP Model

[12]Compliance verification using VCP model

While the post-COVID-19 new normal comes with expectations that retail and other facilities will take the initiative in providing information about their hygiene management, the inadequate provision of such information in the past has compromised the safety and security of facility users. In response, a compliance verification technique was developed as part of the SIP Phase 2 program, Cyber Physical Security for IoT Society. The technique can determine whether conditions at such sites have deviated from the rules and regulations covering people, systems, components, data, and processes. It uses a value creation process (VCP) model that can express these hygiene requirements in machine-readable form, and works by collecting hygiene management data from onsite sensors (covering CO2 concentration, noise level, temperature, humidity, and so on) and then checking these against the VCP model to verify compliance. It helps retail and other facilities to attract customers and ensure their safety and security by presenting real-time information on hygiene management conditions at the site.
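
Conceptually, the verification step reduces to checking live sensor readings against machine-readable rules. The sketch below uses invented thresholds and a plain dictionary of predicates rather than the actual VCP model format.

```python
# Machine-readable hygiene rules (illustrative thresholds, not the actual
# VCP model), each mapping a sensor name to a compliance predicate.
RULES = {
    "co2_ppm":      lambda v: v <= 1000,      # adequate ventilation
    "humidity_pct": lambda v: 40 <= v <= 70,  # comfortable humidity band
    "noise_db":     lambda v: v <= 85,        # acceptable noise level
}

def verify_compliance(readings, rules=RULES):
    """Return the names of every rule the site currently violates, given a
    dict of live sensor readings."""
    return [name for name, ok in rules.items()
            if name in readings and not ok(readings[name])]

# Example: CO2 is too high, everything else is within limits.
violations = verify_compliance(
    {"co2_ppm": 1200, "humidity_pct": 55, "noise_db": 60})
```

Because the rules are data rather than code paths, they can be updated, audited, and displayed to facility users without redeploying the checker, which is the point of a machine-readable model.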

The intention for the future is to deploy the technology at sites such as shopping centers, office buildings, and public facilities, extending its use to a wide range of applications such as at restaurant chains, real estate companies, local government, and educational institutions.

Part of the work described in this article was undertaken by the Cyber Physical Security for IoT Society project run by NEDO under the Cabinet Office’s SIP program.

13. Solution for Demand Prediction and Automatic Reordering Based on Demand Characteristics

Supply and demand have become even more erratic amid the rapid changes in the post-COVID era, resulting in frequent instances of overstocking and sudden changes in production levels. Unfortunately, the requirement for more fine-grained corporate management cannot be satisfied when tasks such as decision making and the revision of inventory and procurement planning are dependent on the empirical knowledge of individuals. This means that utilizing technologies like big data and AI to expedite DX in supply chains has a major impact on corporate competitiveness.

This solution focuses on inventory and procurement planning and is intended to make procurement management more efficient and to reduce stocking levels while still satisfying demand. Numerous solutions already exist in this space that use demand prediction as a basis for inventory optimization. However, when the product range is very large, AI demand prediction may work well for some items while being less reliable for others. This means that adequate performance cannot be achieved by relying on a single methodology to reorder goods based on demand prediction.

In response, the solution achieves its targets by performing a multi-dimensional analysis of the characteristics of demand for goods and materials and then using this to automatically determine the prediction method and reordering practices that best suit these characteristics. The system has been supplied to a large brewing company in China where it has reduced inventory by 16% by value.
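
One common way to characterize demand, used here purely as an illustration of the method-selection idea, is the coefficient of variation: stable items get a smoothing forecast, erratic ones a more conservative rule. The threshold and the two candidate methods are stand-ins, not the solution's actual classification.

```python
import statistics

def choose_method(history, cv_threshold=0.5):
    """Classify an item's demand by its coefficient of variation and pick
    a matching forecast method (illustrative): stable demand -> moving
    average; erratic demand -> recent maximum as a conservative reorder
    level. Returns (method name, forecast)."""
    mean = statistics.fmean(history)
    cv = statistics.pstdev(history) / mean if mean else float("inf")
    if cv < cv_threshold:
        return "moving_average", statistics.fmean(history[-3:])
    return "recent_max", max(history[-3:])

stable = [100, 98, 103, 101, 99, 102]    # steady demand item
erratic = [10, 200, 0, 150, 5, 180]      # intermittent demand item
method_s, forecast_s = choose_method(stable)
method_e, forecast_e = choose_method(erratic)
```

Routing each item to the method that suits its demand pattern is what lets a single system cover a very large product range where no one forecasting model works everywhere.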

[Hitachi (China) Ltd.]

[13]Model for automatically matching reordering practices to demand characteristics

14. VOSP Development and Business Collaboration Approaches

The acceleration of DX in China has created a need for the visual analysis of onsite situations and the optimization of business operations for purposes such as safety, efficiency, and regularity. However, the conventional case-by-case approach to the various customization requirements makes it difficult to solve diverse customer issues quickly in the early stages. Under these circumstances, Hitachi developed an integrated common platform, the video operation service platform (VOSP), to promptly provide systems that meet various customer needs.

VOSP contains frameworks with modular blocks holding common functions abstracted from customer cases. It also contains data interfaces and tuning methods as tools to quickly adapt these common functions to various customer scenarios. Its state-of-the-art technologies, including video analysis, machine learning, and data simulation, have achieved recognition at conferences and in journals. Using these technologies, VOSP has been successfully applied to the development of a manufacturing safety video solution, reducing solution construction time by two-thirds. Through these applications, VOSP has helped to create win-win relationships with local customers and effectively scaled the local DX business in China.

[Hitachi (China) Ltd.]

[14]Structure and advantages of VOSP platform

15. Building Energy Management Solution for China

Prompted by environmental regulations such as the Chinese government’s National Standard for Building Carbon Emission Calculation, building owners and real estate companies want to manage their properties in ways that combine regulatory compliance with economic activity. Achieving this requires the accurate prediction of electricity usage and the fine-grained control of air conditioning parameter settings (chillers, fan coil units, etc.), photovoltaic power usage, and purchases of renewable energy.

The research and development (R&D) division of Hitachi (China) Ltd. has drawn on its expertise in the highly accurate prediction of air conditioning power demand to develop a building energy management solution. The solution can be trained using both full data sets and partial data that contains monthly, weekly, and hourly characteristics, and the results can be combined to further improve prediction accuracy.
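
The combination step can be sketched as a weighted blend of the forecasts produced by the differently trained models; in practice the weights would be fitted on validation error rather than chosen by hand, and the numbers below are purely illustrative.

```python
def blend_forecasts(predictions, weights):
    """Combine hourly forecasts from models trained on different feature
    sets (e.g. the full dataset versus monthly/weekly/hourly slices) into
    a single prediction per hour via a convex weighted average."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    horizon = len(predictions[0])
    return [sum(w * p[h] for w, p in zip(weights, predictions))
            for h in range(horizon)]

combined = blend_forecasts(
    [[100, 120, 110],   # model trained on the full dataset
     [90, 130, 100]],   # model trained on weekly/hourly characteristics
    weights=[0.6, 0.4],
)
```

Blending lets the partial-data models correct systematic errors of the full-data model in hours where their features are more informative, which is the mechanism behind the accuracy improvement described above.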

At the Global AI Challenge for Building E&M Facilities, where teams from more than 90 countries or regions participated in the world’s largest AI competition for smart buildings and cities, the solution was one of five grand prize winners in the open group, achieving an accuracy of 75% in its prediction of hourly cooling load demand three months ahead while keeping the various error rates low.

[Hitachi (China) Ltd.]

[15]Building energy management solution

16. Dataset Quality Assessment for AI Software

AI software differs from conventional software in that it is developed inductively from training data. In other words, obtaining high-quality datasets is a prerequisite for the development of high-quality AI software. For a detailed assessment of the quality of a dataset, it is important to look not only at single attributes, such as the tagging of objects that appear in video data, but also at multiple attributes, such as object shape or background color, and to determine whether the data for these are adequate.

Accordingly, Hitachi has extended variational autoencoder technology to develop a way of extracting and analyzing multiple feature values from a single set of data, implementing it in the form of a tool. The tool was tested by applying it to a dataset containing forms with handwritten text. The results demonstrated that it was able to identify groups of data with characteristics similar to those identified by the person who evaluated the quality of the dataset, covering not only the characters in the text but also multiple attributes such as stroke thickness, the extent of divergence from the standard shape, and the level of noise.
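
The idea of judging a dataset on multiple attributes at once can be illustrated without a VAE: extract a few simple attributes per sample and split the dataset on each one to check coverage. The attribute definitions below (ink density as a thickness proxy, isolated pixels as a noise proxy) are hand-made stand-ins for the learned feature values.

```python
def attributes(image):
    """Extract simple attributes from a binary image (a 2D list of 0/1):
    ink density stands in for stroke thickness, and the fraction of
    isolated ink pixels stands in for noise."""
    h, w = len(image), len(image[0])
    ink = sum(sum(row) for row in image)
    isolated = sum(
        1
        for y in range(h) for x in range(w)
        if image[y][x] and not any(
            image[y + dy][x + dx]
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy or dx) and 0 <= y + dy < h and 0 <= x + dx < w)
    )
    return {"thickness": ink / (h * w), "noise": isolated / (h * w)}

def group_by_attribute(images, attr, threshold):
    """Split a dataset into low/high groups on one attribute so that the
    coverage of each group can be inspected for adequacy."""
    groups = {"low": [], "high": []}
    for i, img in enumerate(images):
        key = "high" if attributes(img)[attr] > threshold else "low"
        groups[key].append(i)
    return groups

thick = [[1, 1], [1, 1]]   # dense strokes
thin = [[1, 0], [0, 0]]    # sparse ink, one isolated pixel
groups = group_by_attribute([thick, thin], "thickness", 0.5)
```

A dataset whose "high-noise" or "thick-stroke" group turns out to be nearly empty is inadequate for training on those conditions, which is the kind of gap the multi-attribute analysis is meant to expose.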

In the future, Hitachi intends to use this technique to assess adequacy and other quality considerations for a wide variety of datasets, including using it as a basis for quantifying the adequacy of a dataset.

[16]Evaluation of dataset quality by multiple attribute analyses

17. Security Digital Twin to Assure Business Continuity when Implementing Security Measures

The rise in cyberattacks targeting OT systems over recent years means that security measures are urgently required at a level similar to those found in IT systems. Unfortunately, OT systems are often more difficult to halt than IT systems, and it is likewise difficult to decide whether to install software patches or make changes to firewall rules given the risk that those changes may cause unintended system outages. As a result, vulnerabilities end up being left unresolved for long periods.

In response, Hitachi has developed a way of using digital twins for security, utilizing them to formulate security measures for reducing the damage caused by cyberattacks that exploit these vulnerabilities. This Security Digital Twin is able to assess each security measure in cyberspace prior to actual deployment, to determine the impact of system outages or other side effects on business continuity.

In the future, Hitachi intends to use this to help ensure the correct and efficient implementation of security measures at customers’ plants, electric utilities, and other social infrastructure as well as in connected cars and medical devices and systems.

[17]Technology for using digital twins in security

18. High-speed Data Retrieval to Improve Product Safety and Reliability

[18]Data-driven quality management

Data-driven quality management is becoming increasingly crucial as safety awareness grows, and inspecting all products before shipment requires the high-speed retrieval of production data in manufacturing. High-speed retrieval, in turn, requires a complete set of production data covering every process, from raw materials and parts through to finished items, associated with each product at the time of the shipment decision.

Hitachi Advanced Database (HADB)*1 achieves inspection of all products by retrieving data for all processes concurrently, using the principle of out-of-order execution*2. HADB reduces data retrieval time by a factor of 20*3, enabling all products to be checked.

As a result, HADB significantly improves product quality and reliability in many manufacturing industries, such as automotive, pharmaceuticals, materials, and foods, by executing quality inspections on all processes.

*1 Using results from the study “Development of the Fastest Database Engine for the Era of Very Large Database and Experiment and Evaluation of Strategic Social Services Enabled by the Database Engine” (core researcher: Masaru Kitsuregawa, University Professor at the University of Tokyo and Director-General of the National Institute of Informatics), which was supported by the Funding Program for World-Leading Innovative R&D on Science and Technology (Cabinet Office, Japan).
*2 An execution principle devised by Professor Kitsuregawa (the University of Tokyo and Director-General of the National Institute of Informatics) and Associate Professor Goda (the University of Tokyo).
*3 Based on comparative testing by Hitachi, Ltd.
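
The benefit of overlapping per-process retrievals can be illustrated with ordinary threads: issue the query for every process step at once instead of one after another. This is only an analogy for out-of-order execution inside a database engine; the tables and query function below are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

# One table of production records per process step (illustrative data).
PROCESS_TABLES = {
    "raw_material": [{"product": p, "lot": f"RM-{p}"} for p in range(1000)],
    "assembly":     [{"product": p, "station": p % 7} for p in range(1000)],
    "inspection":   [{"product": p, "passed": p % 97 != 0} for p in range(1000)],
}

def query(step, product_id):
    """Stand-in for one per-process retrieval; a real engine would issue
    these as independent I/O requests, which is what out-of-order
    execution overlaps instead of serializing."""
    return [r for r in PROCESS_TABLES[step] if r["product"] == product_id]

def trace_product(product_id):
    """Retrieve one product's records from every process step
    concurrently, assembling the complete trace needed for a shipment
    decision."""
    with ThreadPoolExecutor() as pool:
        futures = {step: pool.submit(query, step, product_id)
                   for step in PROCESS_TABLES}
        return {step: f.result() for step, f in futures.items()}

trace = trace_product(97)
```

When each retrieval waits mostly on storage, issuing them all up front lets total latency approach that of the slowest single step rather than the sum of all steps.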

19. Simplification Technology for Accelerating AI Deployment in Mission-critical Applications

Hitachi has developed an AI simplification technology that can convert a black-box AI into a predictive formula that provides clear reasons for why it generates specific results. An issue with conventional black-box AIs is a lack of clarity regarding the basis of their decisions, a consequence of their use of complex formulas to improve prediction accuracy. This brings a risk of their producing unexpected predictions when supplied with unknown data. To address the issue, this technology creates an AI (predictive formula) that is simple enough for people to understand, meaning that the reasons why it generates the results it does for any given input are readily apparent. As the technology also allows formulas to be modified to reflect customer experience or expertise, its accuracy can be maintained and improved without loss of confidence.
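
A generic way to obtain such a transparent formula, shown here only as an illustration of the surrogate-model idea rather than Hitachi's actual procedure, is to fit a simple model to the black box's own predictions:

```python
def fit_surrogate(black_box, xs):
    """Distill a black-box predictor into a transparent linear formula
    y = a*x + b by ordinary least squares over sampled inputs. The
    resulting coefficients are human-readable and can be adjusted by hand
    to reflect domain expertise."""
    ys = [black_box(x) for x in xs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# An opaque model that happens to be linear on the sampled range, so the
# surrogate can recover it exactly.
opaque = lambda x: 2.0 * x + 1.0
a, b = fit_surrogate(opaque, [float(i) for i in range(10)])
```

The trade-off is explicit: the surrogate is only as good as its fit to the black box on relevant inputs, but its behavior on any input, including unknown data, can be read directly off the formula.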

The technology has been partially adopted within Hitachi on an automated line for inspecting products prior to delivery. While the use of conventional AI had been impeded by a lack of confidence in its results, the simplification provided by this technology enabled its use in this case by making the results more convincing, without compromising accuracy. When deployed in practice, the technology delivered benefits including higher inspection accuracy as well as helping to alleviate the shortage of skilled staff. In the future, Hitachi will continue to accelerate DX across all areas of society and deploy AIs that can be trusted in applications such as manufacturing, finance, and infrastructure control.

[19]Block diagram of AI simplification technology

20. Service Mashup Platform for Rapid Application Development

The increasingly uncertain and complex business environment calls for companies to be agile and flexible. When building a digital solution, it is rare to have an overall picture of the service and its requirements from the very beginning. What is needed, rather, is to adopt an approach based on co-creation in which this is worked out in consultation with customers.

As a solution to this issue, Hitachi has developed a service mashup platform that enables the rapid development of prototype applications. The platform incorporates methods for service chaining in which the logic is assembled by combining software-as-a-service (SaaS), microservices, and other loosely coupled service components in a composable architecture. This allows prototype development to proceed quickly with a minimum of coding, providing an environment in which an overall picture of the service and its requirements can be established quickly through a process of hypothesis testing and trial and error.
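
Service chaining can be sketched as function composition over loosely coupled components: the output of each service becomes the input of the next. The services below are trivial stand-ins for SaaS or microservice calls.

```python
def chain(*services):
    """Compose loosely coupled service components into one pipeline: the
    output of each service becomes the input of the next. Swapping,
    inserting, or removing a component changes the prototype without
    touching the others."""
    def pipeline(payload):
        for svc in services:
            payload = svc(payload)
        return payload
    return pipeline

# Hypothetical prototype: normalize the input, enrich it, format a reply.
normalize = lambda req: {**req, "name": req["name"].strip().title()}
enrich    = lambda req: {**req, "greeting": f"Hello, {req['name']}"}
respond   = lambda req: req["greeting"]

prototype = chain(normalize, enrich, respond)
result = prototype({"name": "  ada lovelace "})
```

Because each stage is independent, a hypothesis can be tested by rechaining existing components rather than writing new code, which is what makes the trial-and-error loop fast.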

[20]Identification of requirements through rapid application development and hypothesis testing

21. Enhancing Security of Microservices-based Systems with Service Mesh

Due to the increasing emphasis on improving business agility and securing competitive resources, there has been growing interest in microservices-based systems, which interconnect multiple small-scale services to enhance scalability and reusability. However, such systems have a large number of access points, and ensuring security is a significant challenge. In response, Hitachi is utilizing service mesh technology to apply a consistent security policy across the entire system via communication control using a proxy. The following approaches are being taken to enhance the security of microservices-based systems:

  1. Separation of security responsibilities
    Communication security responsibilities are separated from each system component and managed centrally on the infrastructure side by the service mesh.
  2. System status awareness
    Observability is achieved by visualizing communication status and abnormal access based on the communication metrics and logs exchanged between components.
  3. Access control
    Unauthorized access is blocked through comprehensive authentication and authorization using a web application firewall (WAF) together with the service mesh.
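The three approaches above can be sketched with a minimal sidecar-proxy model. This is a hypothetical illustration of the service-mesh pattern, not the open-sourced implementation: the proxy, rather than the component, enforces authentication and authorization and records metrics, so the business logic stays free of security code.

```python
# Hypothetical sidecar-proxy sketch of the service-mesh pattern:
# security and observability live in the proxy, not the component.

metrics = []                           # 2. observability: per-request outcomes
ALLOWED = {("frontend", "orders")}     # policy: which callers may reach which service

def business_logic(payload):
    # 1. separation of responsibilities: the component has no security code.
    return {"status": "ok", "echo": payload}

def sidecar_proxy(caller, target, token, payload):
    authenticated = token == "valid-token"      # 3. access control: authentication
    authorized = (caller, target) in ALLOWED    #    and authorization at the proxy
    outcome = "allowed" if (authenticated and authorized) else "denied"
    metrics.append({"caller": caller, "target": target, "outcome": outcome})
    if outcome == "denied":
        return {"status": "forbidden"}
    return business_logic(payload)

ok = sidecar_proxy("frontend", "orders", "valid-token", {"item": 1})
blocked = sidecar_proxy("attacker", "orders", "bad-token", {})
```

Because every request passes through the proxy, one policy table governs the whole system and every access attempt, allowed or denied, appears in the metrics log.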

To date, Hitachi has constructed a reference implementation based on its security strategy, developed functions for monitoring, authentication, and authorization, and released the result as open-source software*. By expanding the scope of security to also cover code and infrastructure, Hitachi intends to deliver both agility and security in microservices-based systems.

* Istio Bench

[21]Security enhancement for microservice systems

22. DX Consulting Technology for Accumulating and Using Domain Knowledge

The Lumada Data Science Laboratory (LDSL) was established in 2020 to bring together researchers and engineers with a depth of knowledge in techniques from the fields of AI and data science as well as the OT essential for real-world deployment. By putting this knowledge and skill to work in co-creation projects with customers, the laboratory is expanding Hitachi’s Lumada business through the rapid implementation of data-driven and value-creating solutions.

While the need for DX has been rising in recent years, it is important to adopt it in a staged manner, identifying the challenges at each phase of the customer's business and choosing the best techniques from AI and data analytics so that deployments satisfy practical needs. Hitachi has built up a portfolio of knowledge, skills, and use cases by understanding the issues facing its diverse customers, who operate a wide range of businesses, and devising specific measures for achieving their goals. This has involved developing a framework and data platform that formalize this domain knowledge in graph data structures and manage it as a digital asset that can be put to use across different industries. The aim is to use this technology to speed up the work of devising scenarios for customer business growth and solutions that support that growth.
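A minimal sketch of how domain knowledge might be held in a graph data structure follows. The node labels and relation names are invented for illustration; the actual framework's schema is not disclosed in the article.

```python
# Hedged sketch: domain knowledge (issues, measures, use cases) stored as
# a graph so it can be queried across industries. Node and relation names
# are assumptions, not the framework's actual schema.
from collections import defaultdict

class KnowledgeGraph:
    def __init__(self):
        self.edges = defaultdict(list)   # node -> [(relation, node)]

    def add(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def related(self, src, relation):
        return [dst for rel, dst in self.edges[src] if rel == relation]

kg = KnowledgeGraph()
kg.add("high defect rate", "addressed_by", "image-based inspection AI")
kg.add("high defect rate", "seen_in", "electronics manufacturing")
kg.add("image-based inspection AI", "used_in", "factory-line use case")

# A consultant can retrieve candidate measures for a customer issue:
measures = kg.related("high defect rate", "addressed_by")
```

Linking issues to measures and measures to past use cases is what lets knowledge gained in one industry be reused when a similar issue appears in another.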

[22]Overview of framework for accumulating domain knowledge

23. O&M Simulation to Help Manage for Resilience

Business planning for operation and maintenance (O&M) needs to factor in a range of value considerations, including safety, reliability, the environment, resilience, and work practices. This new technique supports rapid decision making in areas such as organizational change and investment by giving business planners the ability to assess the risks and benefits of a variety of strategies quantitatively using agent-based simulation. Agents are provided to represent factors such as plant, workforce, and IT, and reliability analysis is used as the basis for the plant agents so that they can replicate equipment with complex configurations and characteristics. The simulations can be used to investigate performance targets as well as to select maintenance practices and plan the installation of condition monitoring, fault diagnosis, and other IT solutions.

The simulations are applicable to a wide range of fields, from the supply of workers for O&M at wind farms or retail outlets, to large facilities such as factories or power plants. By combining agents in ways that replicate actual management scenarios, the simulations provide an objective and quantitative means of reviewing plans without the need for additional work such as building mathematical models.
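The agent setup described above can be sketched in a few lines. The failure model, repair rule, and parameter values below are invented for illustration; Hitachi's simulator uses reliability analysis and far richer agents, but the comparison-of-strategies workflow is the same.

```python
# Illustrative agent simulation (parameters and model are assumptions,
# not Hitachi's simulator): plant agents fail stochastically, workforce
# agents repair them, and total downtime lets two staffing strategies
# be compared quantitatively.
import random

def simulate(num_plants, num_workers, fail_prob, days, seed=0):
    rng = random.Random(seed)
    broken = [False] * num_plants
    downtime = 0
    for _ in range(days):
        for i in range(num_plants):              # plant agents
            if not broken[i] and rng.random() < fail_prob:
                broken[i] = True
        repairs = num_workers                    # workforce agents
        for i in range(num_plants):
            if broken[i] and repairs > 0:
                broken[i] = False
                repairs -= 1
        downtime += sum(broken)                  # plant-days out of service
    return downtime

# Compare two staffing strategies on the same scenario:
lean = simulate(num_plants=10, num_workers=1, fail_prob=0.2, days=365)
ample = simulate(num_plants=10, num_workers=3, fail_prob=0.2, days=365)
```

Running both strategies against the same scenario yields a quantitative downtime figure for each, which is the kind of objective comparison the text describes, without building a separate mathematical model.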

[23]Use of O&M simulator to review plans and create value

24. Neuromorphic Computing that Expands the Scope of Edge AI with Ultra-low Power Consumption

AI techniques such as image recognition are now finding applications in edge devices. However, if AI is to be widely adopted in the diverse environments at the network edge, power consumption needs to be as low as possible.

The human brain is known to consume a mere 20 W of power. By using neuromorphic algorithms and innovative devices to mimic its operation, the energy required to perform image recognition by techniques such as deep learning can be reduced more than 100-fold compared to a GPU. Hitachi has developed recognition algorithms suitable for use in video surveillance. Because this allows advanced recognition tasks such as identifying the attributes or actions of people to run on less than 1 W of power, it can be incorporated into security cameras, drones, or other edge devices to perform real-time, on-the-spot incident detection without having to send back video for processing on a server or the cloud.
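A standard primitive behind many neuromorphic algorithms is the leaky integrate-and-fire (LIF) neuron, sketched below. The article does not disclose Hitachi's specific algorithm, so this is a generic illustration; its event-driven nature, in which the neuron emits a spike only when its membrane potential crosses a threshold, is what lets neuromorphic hardware sit largely idle between events and so consume very little power.

```python
# Generic leaky integrate-and-fire (LIF) neuron sketch: a common
# neuromorphic building block, not Hitachi's disclosed algorithm.
# Parameters (leak, threshold) are illustrative.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Return the output spike train (0/1) for a sequence of input currents."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = leak * v + current        # integrate input with leaky decay
        if v >= threshold:            # fire when the threshold is crossed...
            spikes.append(1)
            v = 0.0                   # ...then reset the membrane potential
        else:
            spikes.append(0)
    return spikes

train = lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.3])
```

Because computation happens only on spikes rather than on every clock cycle for every unit, networks built from such neurons can run recognition workloads within very tight power budgets.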

In the future, Hitachi intends to help make people safer and more secure by investigating the use of this technology in real-time monitoring and surveillance solutions for public transportation and workplaces as well as other indoor and outdoor locations.

[24]Comparison of computing environments for deep learning and potential applications
