
New challenge for bionics—brain-inspired computing

Shan YU

Zoological Research, 2016, Issue 5

By definition, bionics is the application of biological mechanisms found in nature to artificial systems in order to achieve specific functional goals. Successful examples range from Velcro, the touch fastener inspired by the hooks of burrs, to self-cleaning material, inspired by the surface of the lotus leaf. Recently, a new trend in bionics—Brain-Inspired Computing (BIC)—has captured increasing attention. Instead of learning from burrs and leaves, BIC aims to understand the brain and then utilize its operating principles to achieve powerful and efficient information processing.

In the past few decades, we have witnessed dramatic progress in information technology. Moore's law, which states that transistor density in processors doubles every two years, has held true for the last 50 years. As a result, we now have miniature processors in small devices (e.g., phones) that, in terms of numerical calculation and memory storage, easily dwarf the brightest human mind. Given this situation, which aspects of the brain can still enlighten us?
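
To make the scale of that sustained doubling concrete, here is a back-of-the-envelope calculation, sketched in Python. It uses nothing beyond the two-year doubling period stated above; the figures are illustrative, not measurements.

```python
# Back-of-the-envelope arithmetic for Moore's law (illustrative only):
# one doubling of transistor density every two years, sustained for
# 50 years, amounts to 2 ** (50 / 2) = 2 ** 25 doublings of growth.
years = 50
growth = 2 ** (years / 2)
print(f"Density growth over {years} years: about {growth:,.0f}x")
# -> about 33,554,432x, i.e., tens of millions of times denser
```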

First, we need more energy-efficient processors. Nowadays, supercomputers and large data centers contain thousands of cores/processors, with energy consumption rates at the megawatt scale. This severely limits the use of computing power in embedded (e.g., small, smart devices) and long-distance (e.g., Mars rover) applications. In addition, with further extrapolation of Moore's law, the energy density of a microprocessor will become so high that it will start to melt. In fact, this is an important reason why the trend described by Moore's law is believed to be coming to an end, and probably soon. In contrast, the brain is extremely energy-efficient. Despite having many capabilities that are still far beyond modern computers, an adult brain consumes only about 20 watts. Therefore, learning from the brain how to be “greener” is a major goal of BIC. With the knowledge obtained in neuroscience, we now know that the secret of the brain's energy efficiency involves various factors, including the co-localization of data processing and storage, highly distributed processing, and sparse activity. Neuromorphic computing aims to implement these features in microprocessors, with electronic elements mimicking the activities of individual neurons and millions of artificial neurons interacting with each other to process information (Merolla et al., 2014). In the most recent advance in this direction, IBM reported satisfactory performance in complex pattern recognition tasks with a neuromorphic chip that, compared with conventional chips, reduced the energy consumption rate by many orders of magnitude (Esser et al., 2016). It is reasonable to expect that the knowledge learned from the brain will eventually enable us to combine super computing power with extremely low energy demand in the not-so-far-away future.
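
To make the notion of sparse, event-driven activity concrete, below is a minimal sketch of a leaky integrate-and-fire neuron, the kind of unit that neuromorphic hardware implements in silicon. All parameter values are illustrative, not those of any particular chip; the point is simply that spikes are rare events, and hardware that works only when a spike occurs does correspondingly little work.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron.

    Returns the membrane potential trace and the spike times.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_t in enumerate(input_current):
        # Leaky integration: the potential decays toward rest
        # and is driven upward by the input current.
        v += dt / tau * (v_rest - v) + i_t * dt
        if v >= v_threshold:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset           # reset after spiking
        trace.append(v)
    return np.array(trace), spikes

# Drive the neuron with weak noisy input: output spikes are sparse,
# which is one reason event-driven (spike-based) hardware saves energy.
rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.12, size=1000)
trace, spikes = simulate_lif(current)
print(f"{len(spikes)} spikes in 1000 time steps "
      f"({len(spikes) / 1000:.1%} activity)")
```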

The second aspect that the brain can teach us is how to achieve better performance in so-called cognitive tasks. Conventional computers, no matter how powerful, know nothing beyond what has been written by their programmers. In addition, although they are superfast at crunching large datasets, they are incapable of solving many tasks that a normal person handles with little effort, such as using language, understanding a movie, or driving a car in complex environments. The reason behind this capability gap lies in the different ways that knowledge and rules are learned and represented in each system. In the brain, the huge amount of knowledge accumulated by our countless ancestors during evolution is stored in the genome and expressed in the neural network structure during development. Later, through these well-tuned, highly sophisticated networks, more knowledge is gained through an individual's interaction with the environment, represented by hundreds of billions of synapses in the brain (Nikolić, 2015). In this sense, compared with the hand-coded programs that modern computers rely upon, the brain has a much greater capacity to learn and utilize complex rules (Baum, 2003). Equipped with a design similar to that of the brain and trained by algorithms that allow for highly distributed knowledge representation, also like that of the brain, deep neural networks—artificial neural networks with many layers of processing—have turned out to be very powerful in a variety of cognitive tasks, ranging from practical image and speech recognition to difficult game play (LeCun et al., 2015; Silver et al., 2016). The enthusiasm evoked by such exciting advances is enormous across the academic community, industry, and even the general population. With more interaction between neuroscience and machine learning, we can be optimistic that the distance to general artificial intelligence, at the human level or even beyond, will become increasingly short.
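
As a toy illustration of how a learned rule lives in distributed connection weights rather than in hand-written instructions, here is a minimal sketch of a small multi-layer network trained by gradient descent. The task (XOR), the architecture (one hidden sigmoid layer), and all hyperparameters are chosen purely for brevity; nothing in the rule's final form is coded by hand — it emerges in the weights.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1 = rng.normal(0, 1, (2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # Forward pass through the two layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the mean squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out) / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h) / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(np.round(out.ravel(), 2))   # typically approaches [0, 1, 1, 0]
```

After training, no single weight encodes "XOR"; the rule is spread across all of them — a miniature version of the distributed representation described above.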

The brain has been the object of modern neuroscience research for more than a century, and artificial neural networks were suggested as a tool for information processing as early as the 1940s. So, why is BIC attracting so much attention now? On the one hand, experimental brain research is on the verge of revealing the core principles of the brain. Powerful techniques to monitor and manipulate neuronal activities are being rapidly applied to both human subjects (noninvasively) and novel animal models, including various nonhuman primates and genetically modified organisms. These studies have begun to uncover 1) the detailed architecture of brain networks and circuits (e.g., Fan et al., 2016), 2) the dynamic rules governing network operation (e.g., Yu et al., 2013), and 3) how network and circuit activities give rise to motion, perception, and cognition (e.g., Janak & Tye, 2015). Such studies provide a solid foundation for BIC. On the other hand, more efficient algorithms for training artificial neural networks have been strengthened by powerful computers, making large, complex networks useful for practical purposes (Hinton & Salakhutdinov, 2006; LeCun et al., 2015). Thus, exciting advances in neuroscience and machine learning, as well as rapid improvement in computing power and the availability of “big data”, have emerged almost at the same time, increasing the appeal and value of BIC like never before. Progress in these individual areas, and in their synergy, will no doubt be the perpetual driving force behind BIC.

The design of organisms has provided inspiration for many ingenious and elegant solutions in engineering. Now is the time to turn our eyes to the pinnacle of biological evolution—the brain. Today, BIC is the new challenge for bionics and, in many ways, probably the ultimate challenge.

Baum EB. 2003. What is Thought? Cambridge: MIT Press.

Esser SK, Merolla PA, Arthur JV, Cassidy AS, Appuswamy R, Andreopoulos A, Berg DJ, McKinstry JL, Melano T, Barch DR, di Nolfo C, Datta P, Amir A, Taba B, Flickner MD, Modha DS. 2016. Convolutional networks for fast, energy-efficient neuromorphic computing. Proceedings of the National Academy of Sciences of the United States of America, doi: 10.1073/pnas.1604850113.

Fan LZ, Li H, Zhuo JJ, Zhang Y, Wang JJ, Chen LF, Yang ZY, Chu CY, Xie SM, Laird AR, Fox PT, Eickhoff SB, Yu CS, Jiang TZ. 2016. The human brainnetome atlas: a new brain atlas based on connectional architecture. Cerebral Cortex, 26(8): 3508-3526.

Hinton GE, Salakhutdinov RR. 2006. Reducing the dimensionality of data with neural networks. Science, 313(5786): 504-507.

Janak PH, Tye KM. 2015. From circuits to behaviour in the amygdala. Nature, 517(7534): 284-292.

LeCun Y, Bengio Y, Hinton G. 2015. Deep learning. Nature, 521(7553): 436-444.

Merolla PA, Arthur JV, Alvarez-Icaza R, Cassidy AS, Sawada J, Akopyan F, Jackson BL, Imam N, Guo C, Nakamura Y, Brezzo B, Vo I, Esser SK, Appuswamy R, Taba B, Amir A, Flickner MD, Risk WP, Manohar R, Modha DS. 2014. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science, 345(6197): 668-673.

Nikolić D. 2015. Practopoiesis: or how life fosters a mind. Journal of Theoretical Biology, 373: 40-61.

Silver D, Huang A, Maddison CJ, Guez A, Sifre L, van den Driessche G, Schrittwieser J, Antonoglou I, Panneershelvam V, Lanctot M, Dieleman S, Grewe D, Nham J, Kalchbrenner N, Sutskever I, Lillicrap T, Leach M, Kavukcuoglu K, Graepel T, Hassabis D. 2016. Mastering the game of Go with deep neural networks and tree search. Nature, 529(7587): 484-489.

Yu S, Yang H, Shriki O, Plenz D. 2013. Universal organization of resting brain activity at the thermodynamic critical point. Frontiers in Systems Neuroscience, 7: 42.

Dr Shan YU is at the Brainnetome Center, Institute of Automation & Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences. E-mail: shan.yu@nlpr.ia.ac.cn

DOI: 10.13918/j.issn.2095-8137.2016.5.261