To state it up front: the brain is an entity that organizes itself around its goals. The computer is a commodity created by humans to provide computing power.
The brain's task is to ensure the survival of the human being. The computer's task is to carry out computing tasks delegated to it by humans, using algorithms: prescriptions written in the computer's language, consisting of an exact sequence of instructions for completing a given task in a given time.
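The notion of an algorithm used here, an exact, finite sequence of instructions that completes a task in a given time, can be made concrete with a minimal sketch. Euclid's algorithm for the greatest common divisor is a classic example (chosen here purely as an illustration, not mentioned in the text):

```python
# A minimal illustration of an algorithm in the sense described above:
# an exact sequence of instructions, executed step by step, that is
# guaranteed to finish. Euclid's greatest-common-divisor algorithm.

def gcd(a: int, b: int) -> int:
    """Return the greatest common divisor of a and b."""
    while b != 0:          # repeat one exact, unambiguous instruction...
        a, b = b, a % b    # ...until the stopping condition is reached
    return a

print(gcd(48, 18))  # -> 6
```

Unlike the brain, the machine contributes nothing of its own here: every step, including the stopping condition, was specified in advance by a human.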
Computing speed plays a secondary role for the brain; priority belongs to the connections of its neural networks.
The brain is a tissue that works according to organic laws; the computer is a machine made of inorganic parts that needs precise specifications and cannot be creative on its own. Although it can calculate much faster than a brain, it lacks the creative element that near-limitless connectivity makes possible.
An essential control mechanism of human beings is emotion, which is generated, among other things, by the brain.
The brain exists to form structures suited to the goal at hand. Some of these goals have been in the brain since time immemorial; others are continually created anew.
In the brain, everything can influence everything. This must not happen in a computer, because otherwise it could not carry out its calculations; it could no longer work through its task step by step. It needs clear commands about what is to be included.
The central goal of the brain is that the living being survives, adapting as well as possible to its environment. To this end, various networks of neurons are always active in the brain, and they can constantly and flexibly change their weighting.
The central goal of the computer is to calculate, each time according to its programming.
The brain carries out countless tasks at the same time, and its priorities can change constantly; here the computer is clearly overwhelmed. It can process only a fraction of what the brain can, and even then only with far more energy than the brain needs.
In the brain, many processes always run in parallel, and new ones are constantly added. New synapses (which are responsible for learning) are continually formed, strengthened, or pruned back.
As a rule, the processing steps in the computer run one after the other.
The computer needs clearly defined conditions; the brain taps into facts and (partly diffuse) clouds of information from outside and inside.
This makes it possible for the brain to be creative. A computer cannot do that - for the reasons mentioned.
The brain is able to learn for a lifetime because it has to adapt to changing circumstances over and over again. This does not always succeed, but it is usually better for survival, because the brain can flexibly change yet again.
The brain is always active; the computer only when input arrives. The brain never sleeps, because it must stay alert to whether life is threatened.
What can a person think of on hearing the word red? Countless associations: love experiences, traffic accidents, the setting sun, blood, the colours of autumn, the cloth in the bullring, the colours of flags, blast furnaces, and so on.
And these immediately create new associations, and those create more, and so on. In other words, the word activates neural networks, which in turn activate other neural networks, and so on.
What does the computer show when you enter red? A limited number of answers that humans once programmed. There are only references (links) but no automatic associations (unless a human creates those links). Feelings, which the word red can activate very strongly in a person, mean nothing to the computer, because it feels nothing. For humans, feelings are a huge advantage, because emotions are fine-tuning elements that are important for reacting and acting.
With this example, and it does not matter which word or term one chooses, the difference between brain and computer becomes very clear.
The brain has formed organically in the course of the development of the individual organism, or of the species, with the aim of survival. It usually does not calculate; instead it uses existing and newly forming networks that share similarities or signal temporal or spatial proximity.
The computer is a man-made machine whose goal is to calculate. It does not use components that have nothing to do with each other, that do not fit into its calculation, which is subject to strict logic.
A captcha, for example (randomly arranged numbers and letters), is relatively easy for humans to read thanks to their similarity-based approach, but for computers it is virtually impossible, because the letters and numbers are so distorted that their systems cannot recognize them.
A computer needs commands in order to work step by step. The brain acts on its own and creates new goals time and again, driven only by the survival and well-being of the living being.
A computer almost never makes mistakes, because it calculates purely logically. But it lacks creativity.
The brain can make many mistakes, because it is creative and can draw wrong conclusions. The advantage, however, is the tremendous flexibility that makes it possible to use all of its components and networks for specific goals. The downside is that it wants to explain everything and often does not look closely. The upside is that from a nebulous situation and a few clues it can produce results that often make sense.
The computer always makes rational decisions; the brain can make irrational decisions that can nevertheless make sense for a given goal.
So, as you can see on closer inspection, they are two very different systems.
If one reads general comparisons of brain and computer, it is striking that speed is mentioned time and again. For the brain, however, speed does not play the central role; the connectivity does. That is how humans live. And no computer achieves this in that form and to that extent, not even approximately.
People are always learning, forming new synapses and neural networks. Whenever someone has learned something, new synapses have formed or been amplified in his brain.
Conversely, synapses are weakened or deleted when they are no longer used.
The neurons, present in the billions, can each form thousands of synaptic connections with other neurons.
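The scale this implies can be sketched with a back-of-the-envelope calculation. The figures below are rough literature estimates (roughly 86 billion neurons, each with on the order of 1,000 to 10,000 synapses), not values given in the text:

```python
# Rough estimate of the total connectivity described above.
# Assumed figures: ~86 billion neurons, each with ~1,000-10,000 synapses.

neurons = 86e9
synapses_low = 1e3    # lower bound of synapses per neuron
synapses_high = 1e4   # upper bound of synapses per neuron

total_low = neurons * synapses_low
total_high = neurons * synapses_high

print(f"estimated total synapses: {total_low:.1e} to {total_high:.1e}")
# i.e. on the order of 10^14 to 10^15 connections
```

Even the lower bound dwarfs the transistor count of any single processor, which is the point the text is making about connectivity.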
To reproduce all this in a computer, and have it act just like the brain, is hardly possible.
And to illustrate the difference in energy consumption:
A comparison with modern computers shows the power of the human brain. While the brain manages about 10^13 analog arithmetic operations per second while consuming about 15 to 20 watts, IBM's BlueGene/L supercomputer delivers up to 3.6×10^14 double-precision floating-point operations per second but requires about 1.2 megawatts. Intel's first teraflop prototype, "Terascale", with 80 processor cores, delivers about 10^12 single-precision floating-point operations per second at 85 watts (or 2×10^12 floating-point operations at 190 watts and 6.26 GHz), which still amounts to 50 to 5,000 times the brain's energy requirement. Although modern 3D graphics cards achieve comparable values with lower electrical power requirements, graphics chips are more specialized in certain computing processes.
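The comparison becomes clearer when the figures quoted above are reduced to a common measure, operations per second per watt. A quick calculation using only the numbers from the text:

```python
# Energy efficiency as operations per second per watt,
# using the figures quoted in the text above.

systems = [
    ("brain",      1e13,   20.0),   # ~10^13 ops/s at ~20 W
    ("BlueGene/L", 3.6e14, 1.2e6),  # 3.6e14 ops/s at 1.2 MW
    ("Terascale",  1e12,   85.0),   # ~10^12 ops/s at 85 W
]

for name, ops_per_sec, watts in systems:
    efficiency = ops_per_sec / watts
    print(f"{name:>10}: {efficiency:.2e} ops/s per watt")
```

By this measure the brain comes out around 5×10^11 operations per second per watt, orders of magnitude ahead of both machines, which is exactly the asymmetry the text describes.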
However, it should be noted that the high computing power of the brain is mainly achieved by its many parallel connections (connectivity) and not by a high speed in the individual calculations (clock frequency). Artificial neurons work 10^5 times faster than neurons of the human brain.
From: https://de.wikipedia.org/wiki/Gehirn (section on computing power and power consumption)
The brain is an organic tissue whose aim is the survival of the human being, and it makes use of all the similarities within it. It creates its own interpretations, images and "truths". It constantly sets itself new goals.
The computer runs according to exactly predetermined steps.
Both are subject to precise but different laws. These are two completely different systems, each of which can only be described individually.
© It is permitted to use or reproduce this content without restriction on the condition of naming my website www.karlheinzhermsch.de and without changing or shortening the texts. (Please inquire about exceptions via my imprint.)
Here is a study:
Electrical properties of dendrites help explain our brain’s unique computing power
Neurons in the human brain receive electrical signals from thousands of other cells, and long neural extensions called dendrites play a critical role in incorporating all of that information so the cells can respond appropriately.
Using hard-to-obtain samples of human brain tissue, MIT neuroscientists have now discovered that human dendrites have different electrical properties from those of other species. Their studies reveal that electrical signals weaken more as they flow along human dendrites, resulting in a higher degree of electrical compartmentalization, meaning that small sections of dendrites can behave independently from the rest of the neuron.
These differences may contribute to the enhanced computing power of the human brain, the researchers say.
“It’s not just that humans are smart because we have more neurons and a larger cortex. From the bottom up, neurons behave differently,” says Mark Harnett, the Fred and Carole Middleton Career Development Assistant Professor of Brain and Cognitive Sciences. “In human neurons, there is more electrical compartmentalization, and that allows these units to be a little bit more independent, potentially leading to increased computational capabilities of single neurons.”
Harnett, who is also a member of MIT’s McGovern Institute for Brain Research, and Sydney Cash, an assistant professor of neurology at Harvard Medical School and Massachusetts General Hospital, are the senior authors of the study, which appears in the Oct. 18 issue of Cell. The paper’s lead author is Lou Beaulieu-Laroche, a graduate student in MIT’s Department of Brain and Cognitive Sciences.
Dendrites can be thought of as analogous to transistors in a computer, performing simple operations using electrical signals. Dendrites receive input from many other neurons and carry those signals to the cell body. If stimulated enough, a neuron fires an action potential — an electrical impulse that then stimulates other neurons. Large networks of these neurons communicate with each other to generate thoughts and behavior.
The structure of a single neuron often resembles a tree, with many branches bringing in information that arrives far from the cell body. Previous research has found that the strength of electrical signals arriving at the cell body depends, in part, on how far they travel along the dendrite to get there. As the signals propagate, they become weaker, so a signal that arrives far from the cell body has less of an impact than one that arrives near the cell body.
Dendrites in the cortex of the human brain are much longer than those in rats and most other species, because the human cortex has evolved to be much thicker than that of other species. In humans, the cortex makes up about 75 percent of the total brain volume, compared to about 30 percent in the rat brain.
Although the human cortex is two to three times thicker than that of rats, it maintains the same overall organization, consisting of six distinctive layers of neurons. Neurons from layer 5 have dendrites long enough to reach all the way to layer 1, meaning that human dendrites have had to elongate as the human brain has evolved, and electrical signals have to travel that much farther.
In the new study, the MIT team wanted to investigate how these length differences might affect dendrites’ electrical properties. They were able to compare electrical activity in rat and human dendrites, using small pieces of brain tissue removed from epilepsy patients undergoing surgical removal of part of the temporal lobe. In order to reach the diseased part of the brain, surgeons also have to take out a small chunk of the anterior temporal lobe.
With the help of MGH collaborators Cash, Matthew Frosch, Ziv Williams, and Emad Eskandar, Harnett’s lab was able to obtain samples of the anterior temporal lobe, each about the size of a fingernail.
Evidence suggests that the anterior temporal lobe is not affected by epilepsy, and the tissue appears normal when examined with neuropathological techniques, Harnett says. This part of the brain appears to be involved in a variety of functions, including language and visual processing, but is not critical to any one function; patients are able to function normally after it is removed.
Once the tissue was removed, the researchers placed it in a solution very similar to cerebrospinal fluid, with oxygen flowing through it. This allowed them to keep the tissue alive for up to 48 hours. During that time, they used a technique known as patch-clamp electrophysiology to measure how electrical signals travel along dendrites of pyramidal neurons, which are the most common type of excitatory neurons in the cortex.
These experiments were performed primarily by Beaulieu-Laroche. Harnett’s lab (and others) have previously done this kind of experiment in rodent dendrites, but his team is the first to analyze electrical properties of human dendrites.
The researchers found that because human dendrites cover longer distances, a signal flowing along a human dendrite from layer 1 to the cell body in layer 5 is much weaker when it arrives than a signal flowing along a rat dendrite from layer 1 to layer 5.
They also showed that human and rat dendrites have the same number of ion channels, which regulate the current flow, but these channels occur at a lower density in human dendrites as a result of the dendrite elongation. They also developed a detailed biophysical model that shows that this density change can account for some of the differences in electrical activity seen between human and rat dendrites, Harnett says.
Nelson Spruston, senior director of scientific programs at the Howard Hughes Medical Institute Janelia Research Campus, described the researchers’ analysis of human dendrites as “a remarkable accomplishment.”
“These are the most carefully detailed measurements to date of the physiological properties of human neurons,” says Spruston, who was not involved in the research. “These kinds of experiments are very technically demanding, even in mice and rats, so from a technical perspective, it’s pretty amazing that they’ve done this in humans.”
The question remains, how do these differences affect human brainpower? Harnett’s hypothesis is that because of these differences, which allow more regions of a dendrite to influence the strength of an incoming signal, individual neurons can perform more complex computations on the information.
“If you have a cortical column that has a chunk of human or rodent cortex, you’re going to be able to accomplish more computations faster with the human architecture versus the rodent architecture,” he says.
There are many other differences between human neurons and those of other species, Harnett adds, making it difficult to tease out the effects of dendritic electrical properties. In future studies, he hopes to explore further the precise impact of these electrical properties, and how they interact with other unique features of human neurons to produce more computing power.
The research was funded by the National Sciences and Engineering Research Council of Canada, the Dana Foundation David Mahoney Neuroimaging Grant Program, and the National Institutes of Health.