Median Group

The Brain and Computation, Part 1

Measuring Computation

The computational performance of microprocessors can be quantified by measuring the number of floating-point arithmetic operations the processor can perform per second (FLOPS). This number is very useful for comparing different hardware used for numerically intensive applications like scientific computing or mining fake internet points, and some have also attempted to quantify the computation done by the human brain in these terms, to reason about how difficult it would be to run a human-level intelligence on modern computing hardware.
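To make the metric concrete, here is a toy benchmark in the spirit of how FLOPS figures are produced: count arithmetic operations, divide by wall-clock time. The function name and parameters are illustrative, and pure Python runs orders of magnitude below the hardware's peak, but the measurement method is the same.

```python
import time

def estimate_flops(n_ops: int = 2_000_000) -> float:
    """Time a loop of floating-point multiply-adds and report ops/sec.

    A toy sketch: real FLOPS benchmarks (e.g. dense linear algebra) use
    vectorized kernels, but the idea is identical -- count floating-point
    operations performed, divide by elapsed wall-clock time.
    """
    x = 1.0000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n_ops):
        acc = acc * x + 1.0  # one multiply + one add = 2 FLOPs
    elapsed = time.perf_counter() - start
    return (2 * n_ops) / elapsed

print(f"~{estimate_flops():.2e} FLOPS (interpreter overhead included)")
```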

This post will discuss a few of the issues associated with measuring the computational performance of the brain with FLOPS, and a follow-up post will consider specific estimates.

Does it make sense to think about the computational capacity of the brain in terms of FLOPS?

There is a line of thinking that goes something like:

Neurons generate action potentials. Action potentials are stereotyped signals, so the computation that happens in the brain is essentially digital; it therefore makes sense to compare brains to digital computers, with synaptic operations playing the role of arithmetic operations.

This may or may not be a good enough approximation, but it’s definitely a lossy approximation.
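Taken at face value, the approximation above yields a simple back-of-envelope estimate. The numbers below (roughly 8.6×10^10 neurons, 10^3 to 10^4 synapses per neuron, average firing rates well under 10 Hz) are commonly cited round figures, not claims from this post, and the result spans orders of magnitude depending on which you pick.

```python
# Back-of-envelope: treat each synaptic transmission as one "operation".
# All figures are rough, commonly cited round numbers.
NEURONS = 8.6e10                   # approximate human neuron count
SYNAPSES_PER_NEURON = (1e3, 1e4)   # rough low/high range
AVG_FIRING_HZ = (0.1, 2.0)         # average rates are low; peak rates are much higher

low = NEURONS * SYNAPSES_PER_NEURON[0] * AVG_FIRING_HZ[0]
high = NEURONS * SYNAPSES_PER_NEURON[1] * AVG_FIRING_HZ[1]
print(f"synaptic ops/sec: {low:.1e} to {high:.1e}")
```

The two-orders-of-magnitude spread is itself part of the point: even granting the digital approximation, the answer depends heavily on contested parameters.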

Brains probably aren’t bottlenecked on arithmetic

A common objection to measuring the performance of the brain in FLOPS is that computation in the brain isn't bottlenecked by arithmetic capacity, but rather by information flow, so the capacity of the brain should be measured in traversed edges per second (TEPS) rather than FLOPS. Synaptic connections between neurons tend to be sparse and axons tend to be long, which seems to suggest that a lot of neural tissue is dedicated to pushing signals around rather than performing arithmetic on them1.
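TEPS is the metric used by the Graph500 supercomputer benchmark: performance is the number of graph edges traversed per second during a breadth-first search, which stresses memory bandwidth and latency rather than arithmetic throughput. A minimal sketch of the idea (the graph and function names here are illustrative):

```python
import time
from collections import deque

def bfs_teps(adj: dict, root) -> float:
    """Breadth-first search that counts traversed edges, TEPS-style.

    Mirrors the Graph500 idea: performance = edges traversed / elapsed
    time. The work is dominated by chasing pointers through memory,
    not by arithmetic.
    """
    seen = {root}
    queue = deque([root])
    edges = 0
    start = time.perf_counter()
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            edges += 1  # every edge examined counts as a traversal
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    elapsed = time.perf_counter() - start
    return edges / elapsed

# A small ring graph: each node connected to its two neighbours.
n = 10_000
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(f"~{bfs_teps(ring, 0):.1e} traversed edges per second")
```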

Brains are asynchronous

Microprocessors are clocked circuits. When a computation unfolds on a microprocessor, it proceeds in discrete, well-delineated steps, one per processor cycle. This method of computation is fundamentally synchronous.

Brains don’t have a clock: neurons fire when they fire, which usually isn’t very often (one to ten times a second), but is sometimes much faster (up to around 1000 Hz)2. And the phase of neural spike trains also seems to be important3, which further complicates the comparison.
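The contrast with clocked computation can be sketched with an event-driven simulation: instead of stepping a global clock, events are pulled off a priority queue in time order, and each neuron fires on its own schedule. This toy model (regular inter-spike intervals, no interactions between neurons) is an illustration of asynchrony, not a biophysical model.

```python
import heapq

def simulate_async(rates: dict, t_end: float) -> list:
    """Event-driven simulation: each neuron fires on its own schedule.

    There is no global clock; spike events are processed in time order
    from a priority queue. `rates` maps a neuron id to its firing rate
    in Hz, with regular inter-spike intervals for simplicity.
    """
    events = [(1.0 / r, nid) for nid, r in rates.items()]  # first spikes
    heapq.heapify(events)
    spikes = []
    while events and events[0][0] <= t_end:
        t, nid = heapq.heappop(events)
        spikes.append((t, nid))
        heapq.heappush(events, (t + 1.0 / rates[nid], nid))  # schedule next
    return spikes

# One neuron at a typical low rate, one bursting much faster.
spikes = simulate_async({"slow": 2.0, "fast": 100.0}, t_end=1.0)
print(len(spikes), "spikes in 1 s")
```

Note how the "fast" and "slow" neurons interleave in the event stream with no shared cycle boundary, which is exactly what a clocked FLOPS-style accounting glosses over.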

Non-spiking neurons

Many neurons don’t even spike; instead they have graded, non-stereotyped potentials. The best-studied are the photoreceptor neurons in the retina, but non-spiking neurons occur throughout the brain, and it’s unclear how to integrate them into the larger computational picture of the brain.
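The distinction can be made concrete with a caricature: a spiking neuron's output is all-or-none once a threshold is crossed, while a graded neuron's output varies continuously with its membrane potential. The logistic curve below is purely illustrative; real photoreceptor responses are far more complex.

```python
import math

def spiking_output(v: float, threshold: float = 1.0) -> float:
    """All-or-none: output is a stereotyped spike (1.0) or nothing."""
    return 1.0 if v >= threshold else 0.0

def graded_output(v: float) -> float:
    """Graded: output varies continuously with membrane potential.

    A logistic curve stands in for the smooth, non-stereotyped
    input-output relation of a non-spiking neuron; this is a
    caricature, not a biophysical model.
    """
    return 1.0 / (1.0 + math.exp(-4.0 * (v - 1.0)))

for v in (0.5, 0.9, 1.0, 1.1, 1.5):
    print(f"v={v:.1f}  spiking={spiking_output(v):.0f}  graded={graded_output(v):.3f}")
```

The digital-computer analogy leans on the left column; the right column is what it leaves out.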


This post was not meant to be comprehensive; it is merely meant to highlight the strangeness and the limitations of thinking about neural computation in terms of FLOPS.

  1. Limitations in the ability of evolution to modify the basic vertebrate developmental plan can lead to bizarre inefficiencies, like the optic nerve needing to carry signals from the retina to the back of the head before they are processed in the visual cortex, or, in the case of giraffes, the laryngeal nerve needing to take a >4 meter detour.

  2. See sparse coding

  3. See phase coding