Keynote: Shunpei Yamazaki Ph.D., President, Semiconductor Energy Laboratory Co., Ltd.
There are three roadblocks ahead for AI technology: data volume, energy consumption, and the difficulty of real-time processing. Among the three, we believe energy consumption is the issue that most urgently needs to be addressed and solved.
We have seen rapid development of AI technology in recent years, supported by advances in deep learning. Deep learning requires an enormous amount of compute resources for both training and inference. The graphics processing unit (GPU) is seen as a component that can process deep-learning operations at high speed, and we can expect even more pervasive use of GPUs as AI technologies develop.
However, AI frameworks implemented on GPUs consume a large amount of power. This causes an exponential increase in energy cost and environmental burden, which could impede the future evolution of AI. According to a report from the Energy Information Administration, the world’s net electricity generation is projected to increase 69%, from 21.6 trillion kilowatt-hours (kWh) in 2012 to 36.5 trillion kWh in 2040. Imagine the amount of power an explosive spread of GPU-based deep learning would require. Unless we solve the issue of power consumption, AI development will not be environmentally sustainable.
In this keynote session, the power-consumption issues of AI and their potential solutions will be discussed. In addition, an AI chip that utilizes hardware enabled by SEL’s crystalline oxide semiconductor technology will be introduced as a breakthrough addressing these issues.