Central Government's AI Strategy Council Discusses Legal Regulations for AI Safety

The recent meeting of the central government's AI strategy council, chaired by Yutaka Matsuo of the University of Tokyo, focused on establishing legal frameworks to ensure the safety of artificial intelligence. A primary topic was the potential risks associated with AI technology, including the creation of AI weapons, violations of privacy and human rights, and possible criminal applications.

The council emphasized the need for legal regulations specifically targeting companies that develop high-risk AI systems with substantial societal impacts, such as OpenAI, the developer of ChatGPT. Members called for effective measures against companies that breach these regulations, including penalties for non-compliance. At the same time, the council noted that the legislation would not aim to micromanage every minor aspect, but would instead encourage companies to take on voluntary responsibilities and promote the active involvement of industry associations.

While Japan has previously focused on fostering AI advancement and allowed companies to self-regulate, recent concerns over the risks posed by generative AI have prompted discussions on the need for legal regulation. Following the European Union's enactment of the world's first Artificial Intelligence Act and President Joe Biden's executive order in the United States requiring AI developers to disclose information, many countries are actively considering regulatory measures for AI. The council intends to study the regulations implemented in Europe and the U.S. closely in order to tailor appropriate laws for Japan, with the goal of presenting a bill during the next ordinary Diet session.