Priorities for the Japanese Economy in 2016 (January 2016)

What is Needed for AI Literacy?

Senior Fellow, RIETI

2015 was the year when AI became more familiar

In the midst of the third artificial intelligence (AI) boom that began in 2013, the term "AI" came into wide use across society in 2015. RIETI organized a number of AI-related events, such as BBL Seminars and Highlight Seminars, attracting people interested in AI from a variety of fields. The latest issue of RIETI Highlight (1) covers their contents in detail. AI involves making computers learn concepts in order to establish human-like intelligence. The core technologies supporting the third AI boom are data mining, for exploratory analysis of big data, and deep learning, one of the machine learning approaches. To understand AI, it is essential to know about machine learning. Yet the word "machine" does not generally fit the cutting-edge image of AI until it is explained that the machine is a computer, or, more precisely, the algorithms and software that allow a computer to perform specific tasks. Algorithms and software are developed for specific tasks and specific media (computers, robots, automobiles, etc.). Because AI makes machines learn from precedents and similar examples so that they can make judgments about future events, the quality and quantity of the data used play a crucial role in advancing AI technology. At its present stage, AI excels at learning and executing categorization, repetition, exploration, organization, and optimization.
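The idea that a "machine" is simply an algorithm that learns from precedents and judges a new case by its similarity to them can be sketched in a few lines. The following is a hypothetical toy example (a nearest-neighbor categorizer), not a method described in the column; the feature values and labels are invented purely for illustration.

```python
# Toy illustration: a "machine" here is just an algorithm that stores
# labelled precedents and judges a new case by similarity to them.
# (Hypothetical example; not from the column.)

def nearest_neighbor(precedents, new_case):
    """Return the label of the stored precedent closest to new_case."""
    best_label, best_dist = None, float("inf")
    for features, label in precedents:
        # Squared Euclidean distance between the precedent and the new case.
        dist = sum((a - b) ** 2 for a, b in zip(features, new_case))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Precedents: (features, label) pairs, e.g. invented economic signals.
precedents = [((1.0, 0.9), "expansion"), ((0.2, 0.1), "recession")]
print(nearest_neighbor(precedents, (0.8, 0.7)))  # prints "expansion"
```

Even this trivial categorizer shows why the quality and quantity of precedents matter: with too few or unrepresentative examples, the judgment on a new case is unreliable.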

Linking AI to economic analysis

AI at its current level does not perform tasks fully autonomously. Humans must still provide the fundamental elements: setting a hypothesis, selecting a suitable algorithm, and assessing and interpreting the content, the extent of learning, and the results. Because big data is involved, it is also necessary to understand its characteristics. Konishi (2014) (2) listed three respects in which big data differs from data obtainable through existing public- and private-sector surveys; for example, big data contains no information on inaction. AI improves its accuracy in proportion to the amount of training it has received, and it is poor at making judgments on unknown or rare events. Economic analysis can therefore contribute to AI by (1) utilizing qualitative or prior information on rare events, (2) creating theoretical models based on hypothesis formulation and problem setting, and (3) introducing the concept of causality into discrimination and classification. To understand the current situation fully, a seminar titled "Application of AI Technology to Big Data for Humans and Society" was held on December 1, 2015, with Yoichi Motomura, vice director of the Artificial Intelligence Research Center at the National Institute of Advanced Industrial Science and Technology, and Yuki Yamamoto of the Nomura Securities Financial & Economic Research Center as invited speakers. Below, I summarize their presentations, citing specific examples of recent trends and analysis in the AI field, and highlight the potential for collaboration with economic analysis.

AI technology in the near future

Under the title "Next-Generation Artificial Intelligence Technology and the Use of Big Data" (3), Motomura introduced initiatives by the National Institute of Advanced Industrial Science and Technology for advancing deep learning, one of AI's core technologies, with emphasis on "mutual understanding with humans" and "the capability to describe a phenomenon, in addition to enhanced predictive accuracy." Although many cases of AI learning were introduced, Motomura said that the key to advancing AI technology lies in increasing the quality and volume of available data, and he initiated a future-oriented debate on how to gather and share information. Goods are increasingly fitted with sensors and data-collection functionality based on the Internet of Things (IoT), allowing massive amounts of data to be collected automatically from producers, service providers, and consumers. He presented a methodology for anonymizing such information and securing it for privacy protection without significantly undermining its quality, so that it can be shared among as many companies and researchers as possible. Advanced discussion and methodology development of this kind, anticipating the revision of the Personal Information Protection Act, should contribute positively to the safe and secure utilization of personal data and ultra-micro data.
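As a concrete, hypothetical illustration of one common anonymization step (the column does not specify the actual methodology Motomura presented), direct identifiers can be replaced with salted hashes so that records remain linkable across datasets without revealing whom they belong to:

```python
# Minimal pseudonymization sketch, assuming salted hashing is acceptable
# for the use case. This is an illustrative example only, not the AIST
# methodology described in the seminar.
import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a direct identifier with a short salted SHA-256 digest."""
    digest = hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()
    return digest[:16]

# Hypothetical consumer record collected via an IoT device.
record = {"customer_id": "taro@example.com", "purchase": "sensor kit"}
safe = {**record,
        "customer_id": pseudonymize(record["customer_id"], salt="2015-12")}
```

The same identifier always maps to the same pseudonym under a given salt, so analyses that link records can still run, while changing the salt between data releases prevents linkage across them.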

Applying AI technology to empirical economic/financial analysis

For securities companies, it is fundamentally important to aggregate and process public information about the countless enterprises in the world and to provide it to customers swiftly, accurately, and at low cost. Yamamoto used the Cabinet Office's Economy Watchers Survey as training data and applied deep learning to develop a system that can accurately determine the sentiment (positive or negative) of a passage about the economy. The AI system was then fed text from the Bank of Japan's Monthly Report of Recent Economic and Financial Developments and the Cabinet Office's Monthly Economic Report to index each report's economic outlook. The resulting Nomura AI Economic Sentiment Index was then analyzed for its correlation with macroeconomic indicators. The findings are presented in Suimon, Yamamoto, and Kinoshita (2015) (4). Incidentally, the AI system trained on the Economy Watchers Survey data has now reached an intelligence level equivalent to that of an upper-elementary-school student. It continues to read, around the clock, volumes of reference material that would take a person far longer to cover, and it can now determine the sentiment of text on macroeconomic subjects with an accuracy of around 90%. The AI field is characterized by the use of personified expressions such as "learn" and "grow" with respect to algorithms and computer programs.
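The pipeline described above (label passages, learn from them, then score unseen text) can be illustrated schematically. Nomura's actual system applies deep learning to the Economy Watchers Survey; the bag-of-words weighting below is only a hypothetical stand-in, with invented example sentences.

```python
# Schematic stand-in for a supervised sentiment pipeline: learn word-level
# weights from labelled passages, then score unseen text. A real system
# (such as the deep learning model described in the column) is far more
# sophisticated; this only illustrates the train-then-score idea.
from collections import Counter

def train(labelled_passages):
    """labelled_passages: list of (text, +1 or -1). Returns word weights."""
    weights = Counter()
    for text, label in labelled_passages:
        for word in text.lower().split():
            weights[word] += label
    return weights

def sentiment(weights, text):
    """Classify text as positive or negative by summing learned weights."""
    score = sum(weights[w] for w in text.lower().split())
    return "positive" if score > 0 else "negative"

# Invented training sentences standing in for labelled survey responses.
training = [("exports are recovering steadily", +1),
            ("consumption remains weak and declining", -1)]
weights = train(training)
print(sentiment(weights, "consumption is declining"))  # prints "negative"
```

Averaging such scores over all passages in a monthly report would then yield a single index value per report, which is the step that produces a time series comparable with macroeconomic indicators.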

Data literacy, statistical literacy, and AI literacy

Data literacy is required for data mining, in which data are analyzed exploratorily to examine a phenomenon, and statistical literacy is required for machine learning on big data. Given that data mining and machine learning are the main AI technologies, data literacy and statistical literacy remain key elements. What, then, is AI literacy? In daily life, few of us will ever write a deep learning algorithm, nor are we expected to have that ability. AI literacy means being conscious of whether there are any standardized or formatted tasks that have not yet been handed over to AI despite their large data volumes, and whether excessive labor, money, or time is being spent on "categorization, repetition, exploration, organization, and optimization." We should make it a habit to think about what can and cannot be delegated to AI. AI can learn much more than quantitative data; recording a range of information, including text, sound, and images, is valuable. For example, record your mother's recipes as digitized text. When cooking robots become a reality, having the data for even one recipe could lead to a new business of reproducing home-cooked meals. We should constantly examine whether the information we possess has the potential to generate high added value, be emotionally prepared to let go of tasks that AI can perform in our place, and start learning and investing in the strengths that it cannot replicate.

December 25, 2015
  1. ^ RIETI (2015) "Artificial Intelligence and the Future of the Economic Society," RIETI Highlight, vol.57 (in Japanese).
  2. ^ Konishi, Yoko (2014) "What is Needed to Ensure that the Boom in Big Data Lasts?" RIETI Column.
  3. ^ Motomura, Yoichi (2015) "Next-Generation Artificial Intelligence Technology and the Use of Big Data," reference material distributed at the RIETI research presentation session "Application of AI Technology to Big Data for People and Society."
  4. ^ Suimon, Yoshiyuki, Yamamoto, Hiroki and Kinoshita, Toshio (2015) "Indexation of Government and BoJ's Economic Sentiments with AI," Nomura Macro Economic Insight, November 30 issue.

January 13, 2016