Ap Cam


Unveiling the Processing Power of the Human Brain

When we discuss computers, we are referring to meticulously designed machines built on logic, reproducibility, predictability, and mathematics. The human brain, on the other hand, is a tangled, seemingly chaotic mess of neurons that do not behave in a predictable manner. Although it is impossible to measure precisely, the human brain is estimated to operate at around 1 exaFLOP, which is a billion billion (10^18) calculations per second, an order of magnitude beyond today's fastest machines.
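
To put "a billion billion" in perspective, here is a quick back-of-the-envelope sketch. The 1 exaFLOP figure is an estimate, not a measurement, so treat this as illustrative arithmetic only:

```python
# Back-of-the-envelope: how large is 1 exaFLOP?
brain_flops = 1e18  # ~1 exaFLOP, the estimated figure cited above

# "A billion billion" is 10^9 * 10^9 = 10^18
assert brain_flops == 1e9 * 1e9

# If you performed one operation per second by hand, matching a single
# second of the brain's estimated throughput would take this long:
seconds_per_year = 60 * 60 * 24 * 365
years = brain_flops / seconds_per_year
print(f"1 exaFLOP of one-per-second steps ~= {years:.1e} years")
# ~3.2e10 years, i.e. over 30 billion years
```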

The human brain is a complex network of neurons and glial cells.

Brains vs. Computers: A Comparative Analysis

The brain is both hardware and software, whereas computers have an inherent separation between the two. The same interconnected areas, linked by billions of neurons and perhaps trillions of glial cells, can perceive, interpret, store, analyze, and redistribute information at the same time. Computers, by their very definition and fundamental design, dedicate some parts to processing and others to memory; the brain makes no such separation, which makes it hugely efficient.

The same calculations and processes that might take a computer a few million steps can be achieved by a few hundred neuron transmissions, requiring far less energy and achieving far greater efficiency. As much as computing has progressed, a biological brain still vastly outperforms the fastest calculators in many ways, and with a fraction of the energy consumption. Energy consumption is one of the main problems facing modern computing. In contrast to power-hungry computers, brains have evolved to be energy-efficient. It is estimated that a human brain uses roughly 20 watts, equivalent to the energy consumption of your computer monitor alone, in sleep mode.

To illustrate the difference, consider supercomputers. A supercomputer is a computer with a high level of performance compared to a general-purpose computer; its performance is commonly measured in floating-point operations per second (FLOPS) rather than millions of instructions per second (MIPS). Since 2017, there have been supercomputers that can perform nearly a hundred quadrillion FLOPS, and since November 2017, all of the world's 500 fastest supercomputers have run Linux-based operating systems. Research is underway in China, the United States, the European Union, Taiwan and Japan to build even faster, more powerful and more technologically advanced exascale supercomputers.

Supercomputers play an important role in computational science and are used for a wide range of computationally intensive tasks, including quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), and physical simulations (such as simulations of the early moments of the universe, airplane and spacecraft aerodynamics, the detonation of nuclear weapons, and nuclear fusion). In high-end systems, nodes are connected in a non-blocking fat-tree topology using a dual-rail Mellanox EDR InfiniBand interconnect for both storage and inter-process communications traffic, delivering 200 Gb/s of bandwidth between nodes along with in-network computing acceleration for communication frameworks such as MPI and SHMEM/PGAS.

All of this comes at a cost: the energy required to power the world's fastest supercomputer would be enough to power a building, while the human brain achieves comparable processing speeds on the energy needed to power a dim light bulb.
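
The efficiency gap can be made concrete with a rough FLOPS-per-watt comparison. Both performance figures are the estimates quoted above, and the ~10 MW supercomputer power draw is an assumed ballpark for 2017-era leading systems, so treat the ratio as an order-of-magnitude sketch:

```python
# Rough efficiency comparison using the figures quoted in the text.
brain_flops = 1e18   # ~1 exaFLOP (estimate)
brain_watts = 20     # ~20 W

super_flops = 1e17   # ~100 quadrillion FLOPS (2017-era leaders)
super_watts = 10e6   # ASSUMPTION: ~10 MW draw, a ballpark for such systems

brain_eff = brain_flops / brain_watts   # FLOPS per watt
super_eff = super_flops / super_watts   # FLOPS per watt

print(f"Brain:          {brain_eff:.0e} FLOPS/W")
print(f"Supercomputer:  {super_eff:.0e} FLOPS/W")
print(f"Efficiency gap: ~{brain_eff / super_eff:,.0f}x")
# Under these assumptions, the brain comes out millions of times
# more energy-efficient per operation.
```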

A modern supercomputer requires significant power and cooling.

Key Differences Summarized

Feature            | Human Brain                | Supercomputer
-------------------|----------------------------|------------------------------------------
Processing Speed   | ~1 exaFLOP (estimated)     | Up to ~100 quadrillion FLOPS (as of 2017)
Energy Consumption | ~20 watts                  | Enough to power a building
Structure          | Tangled network of neurons | Meticulously designed machine
Adaptability       | High (neuroplasticity)     | Limited

Neuroplasticity: The Brain's Unique Advantage

One of the things that truly sets brains apart, aside from their clear advantage in raw computing power, is the flexibility they display. Essentially, the human brain can rewire itself, a capability more formally known as neuroplasticity.

Neuromorphic Computing: Mimicking the Brain

Neuromorphic technologies transfer insights about the brain to optimise AI, deep learning, robotics and automation. Computing systems using this approach have become increasingly refined and are in development worldwide. In the Human Brain Project, teams of engineers and theoretical neuroscientists are focused on the engineering and development of neuromorphic devices, which use spiking artificial neurons to train neural networks to perform calculations, and generally take inspiration from the way human brains function.

The first system, BrainScaleS, is an experimental hardware platform that emulates the behaviour of neurons using analog electrical circuits, omitting energy-hungry digital calculations. It relies on individual events, called "spikes", instead of the stream of continuous values used in most computer simulations. Sparse electrical impulses sent between neurons are the basis of efficient signaling in the brain. Mimicking the way neurons calculate and transmit information allows the BrainScaleS chips, now in their second iteration, to perform very fast calculations while also reducing data redundancy and energy consumption.
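
BrainScaleS itself is analog hardware, but the spiking principle it exploits can be sketched digitally with a minimal leaky integrate-and-fire (LIF) neuron. This is a standard textbook model, not the BrainScaleS circuit, and all the parameter names here are illustrative:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a textbook sketch of
# spike-based signaling, NOT the actual BrainScaleS analog circuit.
def simulate_lif(input_current, steps=100, dt=1.0,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return the time steps at which the neuron spikes."""
    v = v_rest
    spikes = []
    for t in range(steps):
        # Membrane potential leaks toward rest and integrates the input.
        dv = (-(v - v_rest) + input_current) * dt / tau
        v += dv
        if v >= v_thresh:    # threshold crossed: emit a discrete spike...
            spikes.append(t)
            v = v_reset      # ...and reset the potential
    return spikes

# A constant supra-threshold input produces sparse, regular spikes:
# information is carried by discrete events, not a continuous stream.
spikes = simulate_lif(input_current=1.5, steps=100)
print(f"{len(spikes)} spikes, first at steps {spikes[:3]}")
```

A sub-threshold input (here, anything that lets the potential settle below 1.0) produces no spikes at all, which is exactly the sparseness that makes spike-based signaling cheap.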

The second system, SpiNNaker, is a massively parallel digital computer designed to support large-scale models of brain regions in biological real time. The SpiNNaker neuromorphic computer is based at the University of Manchester. It runs spiking neural network algorithms on its 1,000,000 processing cores, which mimic the way the brain encodes information, and can be accessed as a testing station for new brain-derived AI algorithms (Furber & Bogdan 2022). At the same time, SpiNNaker has shown promise for developing small low-energy chips that can be used in robots and edge devices. In 2018, the German state of Saxony pledged 8 million euros of support for the next generation of SpiNNaker, SpiNNaker2, which has been developed in a collaboration between the University of Manchester and TU Dresden within the HBP. A SpiNNaker2 computer system with 70,000 chips and 10 million processing cores will be based at TU Dresden. SpiNNaker2 has been chosen as one of the pilot projects of Germany's Federal Agency for Disruptive Innovation, SPRIN-D.

With the hardware advancing, software is learning from the brain as well. Brain research and AI have always shared connections: the earliest versions of artificial neural networks in the 1950s were already based on rudimentary knowledge about our nerve cells. Using new insights into biological brain networks, software modelers in the HBP have developed the next generation of brain-derived algorithms. After a series of high-level breakthroughs by several HBP teams (Cramer et al. 2022, Göltz et al. 2021, Bellec et al. 2020), in 2022 a collaboration of HBP researchers at TU Graz together with Intel tested the power of these algorithms to bring down energy demand using Intel's Loihi chip. The result was an up to 16-fold decrease in energy demand compared to non-neuromorphic hardware (Rao et al.).

Importantly for the HBP and neuroscience in general, more powerful and efficient computing also accelerates brain research, creating a positive feedback loop between neuro-inspired computers and detailed brain simulations. In this way, mechanisms that have evolved in biological brains to make them adaptable and capable of learning can be mimicked in a neuromorphic computer so that they can be studied and better understood. This is what a team of HBP researchers at the University of Bern have achieved with so-called "evolutionary algorithms" (Jordan et al. 2021). The programmes they have developed search for solutions to given problems by mimicking biological evolution through natural selection, promoting the candidates best able to adapt. Traditional programming is a top-down affair; with evolutionary algorithms, solutions instead emerge from the process on their own.
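
The select-and-mutate loop at the heart of this approach can be illustrated with a toy example. This is a generic evolutionary search over bit strings, not the Jordan et al. algorithm; every name and parameter below is made up for illustration:

```python
import random

# Toy evolutionary search: evolve a bit string toward all ones.
# A generic illustration of selection + mutation, NOT the HBP method.
TARGET_LEN = 20

def fitness(genome):
    """Number of correct (one) bits; higher is fitter."""
    return sum(genome)

def mutate(genome, rate=0.05):
    """Flip each bit with a small probability."""
    return [b ^ 1 if random.random() < rate else b for b in genome]

def evolve(pop_size=30, generations=200, seed=42):
    random.seed(seed)
    population = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half, refill with mutated copies.
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
        if fitness(population[0]) == TARGET_LEN:
            break
    return population[0]

best = evolve()
print(f"Best genome fitness: {fitness(best)}/{TARGET_LEN}")
```

No candidate is ever told how to solve the problem; selection pressure alone pushes the population toward better solutions, which is the bottom-up character the paragraph above contrasts with traditional top-down programming.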

In the last few years, impressive neuromorphic breakthroughs have made tangible advantages that were previously only theorised for the technology.