I was surprised at first, because thermodynamically the brain is incredibly efficient: it produces much less waste heat per computation than a computer. But if you're talking chemicals, then sure.
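
For concreteness, a hedged sketch of the energy arithmetic behind that claim: the ~20 W figure is the commonly cited resting power draw of the brain, the ~1e15 FLOP/s credit comes from the estimate linked later in this thread, and the GPU numbers are rough figures for the GTX 680 that the same article mentions. All four inputs are order-of-magnitude assumptions, not measurements.

```python
# Rough energy-per-operation comparison, brain vs. GPU.
# All four inputs are order-of-magnitude assumptions, not measurements.

BRAIN_WATTS = 20.0   # commonly cited resting power draw of the human brain
BRAIN_FLOPS = 1e15   # petaFLOP/s-equivalent credit, per the article linked below
GPU_WATTS = 195.0    # approximate GTX 680 board power (assumption)
GPU_FLOPS = 3e12     # 3 teraFLOP/s, the figure quoted in that article

joules_per_op_brain = BRAIN_WATTS / BRAIN_FLOPS  # ~2e-14 J per operation
joules_per_op_gpu = GPU_WATTS / GPU_FLOPS        # ~6.5e-11 J per operation

print(f"brain: {joules_per_op_brain:.1e} J/op")
print(f"GPU:   {joules_per_op_gpu:.1e} J/op")
print(f"GPU/brain ratio: {joules_per_op_gpu / joules_per_op_brain:.0f}x")
```

On these numbers the brain does a few thousand times more operations per joule, which is the sense in which "thermodynamically efficient" holds up.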

This is very exciting, and a nice addition to prior work. There is Nobel potential for the glymphatic research. One day, perhaps, we will have a medication that supports the synchronized (sleep) wave waste disposal for everyone over 50?

How does one count the computations performed by the brain?

I was afraid this would get too complicated for a simple physicist like me to understand. But I always thought a FLOP was an arithmetic operation done to 5 or 10 decimal digits of precision, and my brain can't do even one of those per second.

The conscious mind is not very good at calculations, but there is a lot of processing going on underneath it.

Of course. But how do we quantify "a lot"? What's the unit of measure?

Action potentials transmitted per second, for example. lips.cs.princeton.edu/what-is-the-...

What is the Computational Capacity of the Brain? (lips.cs.princeton.edu)

One big recent piece of news from across the Atlantic is that the European Commission is funding a brain simulation project to the tune of a billion euros. Roughly, the objective is to simulate the entire human brain using a supercomputer. Needless to say, many people are skeptical, and there are lots of reasons one might think this project is unlikely to yield useful results. One criticism centers on whether even a supercomputer can simulate the complexity of the brain. A first step towards simulating a brain is thinking about how many FLOP/s (floating point operations per second) would be necessary to implement similar functionality in conventional computer hardware. Here I will discuss two back-of-the-envelope estimates of this computational capacity. (Don't take these too seriously; they're a little crazy and just shooting for orders of magnitude.)

Take One: Count the Integrating Neurons

People who know about such things seem to think that the human brain has about 1e9 neurons. On average, let's assume that a neuron has incoming connections from 1e3 other neurons. Furthermore, let's assume that these neurons are firing at about 100Hz and that the post-synaptic neuron does the equivalent of 10 floating point operations each time it receives a spike. I base this sketchy guess on what would be necessary to perform a cumulative sum for an integrate-and-fire model. Put these numbers together and you need about 1e13 floating point operations one hundred times per second, or one petaFLOP/s -- 1e15 FLOP/s.

Take Two: Multiply Up the Retina

This computation is due to Hans Moravec, discussed in this article. The basic idea is to take the retina, which is a piece of the brain that we understand pretty well, and assume that the rest of the brain is similar. The starting point is to imagine that the retina is processing a 1e6 "pixel" image about ten times a second. If you figure that each of these images takes about 1e8 floating point operations to process, then you're looking at a gigaFLOP/s to match the processing power of the retina. The brain is about 1e5 times larger than the retina, which gets you to 1e14 FLOP/s. I think this is kind of fun, because it's a different way to think about the problem, yet it isn't impossibly far away from the previous estimate.

It's also interesting because a petaFLOP/s is well within our current computational capabilities. An off-the-shelf gaming GPU such as the NVIDIA GeForce GTX 680 can do 3 teraFLOP/s. A warehouse full of these is a lot, but not impossibly many. Indeed, Oak Ridge National Laboratory has a machine that has hit a benchmark of 18 petaFLOP/s. (Edit: This post originally incorrectly said that the GTX 680 had 3 gigaFLOP/s.)
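
A minimal sketch that just redoes the card's arithmetic in code; every constant below is the article's own round-number assumption, not a measured value.

```python
# Reproduce the article's two back-of-the-envelope FLOP/s estimates.
# Every constant is the article's round-number assumption, not data.

# Take One: count the integrating neurons.
neurons = 1e9            # article's neuron count
inputs_per_neuron = 1e3  # average incoming connections per neuron
firing_rate_hz = 100     # assumed firing rate
flops_per_spike = 10     # integrate-and-fire bookkeeping per received spike

take_one = neurons * inputs_per_neuron * firing_rate_hz * flops_per_spike
print(f"Take One: {take_one:.0e} FLOP/s")  # 1e+15 -- one petaFLOP/s

# Take Two: multiply up the retina (Moravec-style).
# The retina processes a 1e6-"pixel" image about ten times a second,
# at an assumed 1e8 floating point operations per image.
frames_per_second = 10
flops_per_frame = 1e8
brain_to_retina_ratio = 1e5  # brain is ~1e5 times larger than the retina

retina_flops = frames_per_second * flops_per_frame  # 1e9 -- a gigaFLOP/s
take_two = retina_flops * brain_to_retina_ratio
print(f"Take Two: {take_two:.0e} FLOP/s")  # 1e+14
```

The two estimates land within an order of magnitude of each other, which is all the article is claiming.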