Deep learning systems are all the rage these days, and they seem to be making great progress in voice recognition, image classification, and invariant pattern recognition generally. For example, after training, these systems can recognize a dog from many different angles.
But even with this great progress, there seem to be some fatal flaws: you can modify these images ever so slightly and the system completely fails to recognize them. The modified pictures look exactly the same to us, and we would never think they are conceptually different! There are some recent studies trying to understand why these freak misclassifications happen, but so far we have no idea why.
The pictures on the left are classified correctly, while the pictures on the right are classified incorrectly. The middle column shows the difference between the two images.
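To make the "slightly modified image" idea concrete, here is a minimal sketch of the fast gradient sign method (FGSM), one common way such adversarial perturbations are generated. I am using a toy linear classifier in plain numpy rather than a real deep network; the weights, sizes, and epsilon here are all illustrative assumptions, but the mechanics are the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "classifier" over a 784-pixel image: class 1 if w.x + b > 0.
# Real adversarial attacks target deep networks, but the idea is identical.
w = rng.normal(0, 0.05, size=784)
x = rng.uniform(0, 1, size=784)   # our "image", pixels in [0, 1]
b = 0.5 - w @ x                   # rig the bias so the clean score is 0.5
score = w @ x + b                 # 0.5 -> confidently class 1

# FGSM: move every pixel a tiny step eps in the direction that lowers
# the score. For a linear score, the gradient with respect to x is w.
eps = 0.02                        # 2% of the pixel range: imperceptible
x_adv = x - eps * np.sign(w)

adv_score = w @ x_adv + b
# The score drops by eps * sum(|w|): with many pixels, the tiny
# per-pixel changes add up and the classification flips, even though
# no single pixel moved by more than eps.
print(score > 0, adv_score > 0)
```

The interesting part is the dimensionality: each pixel changes by an invisible amount, but thousands of small aligned changes shift the score enough to flip the decision.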
I think it's a fundamental problem with all of these algorithms, and with computation in general. These computer programs cannot feel. They have no emotion. They have no idea what they are looking at; it's just 0's and 1's all the way down. A fancy can opener doing sophisticated math.
The question I keep thinking about is: will it ever be possible for us to emulate emotions in a computer? If we were able to emulate emotions and combine them with our algorithms, I could see machine intelligence exploding. Machine learning algorithms could have a feeling for the data they are looking at. We wouldn't need a full emulation of emotion, just a basic model that captures the essence of emotion relevant to machine intelligence.
Will computers ever be able to categorize these concepts as the same thing? They definitely cannot do it now. You cannot get there just by comparing pixels the way a computer does now.
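The pixel-comparison point is easy to demonstrate: shift an image by a single pixel and it is, pixel for pixel, a very different array, even though it obviously depicts the same thing. A toy illustration (the 5x5 "image" here is my own made-up example):

```python
import numpy as np

# A 5x5 "image" of a vertical bar in column 2.
img = np.zeros((5, 5))
img[:, 2] = 1.0

# The same bar, shifted one pixel to the right.
shifted = np.roll(img, 1, axis=1)

# Pixel-for-pixel, the images disagree wherever either bar sits:
# all ten pixels in columns 2 and 3 differ.
print(np.array_equal(img, shifted))   # False
print(np.sum(img != shifted))         # 10 differing pixels
```

Any human would call these the same bar; raw pixel comparison calls them different. This is exactly the kind of invariance (here, to translation) that convolutional networks are built to learn rather than getting for free.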
As I write this, I feel the irony of calling current machine learning algorithms fancy can openers while suggesting that sprinkling on a little more code to simulate emotion will fix everything. Even with an emotional processing unit, the computer is still just an even fancier can opener, but this new can opener could take us to new levels of computing and technology.
Douglas Hofstadter, a famous “super scientist” who has studied physics, mathematics, philosophy, computer science, and more, believes that intelligence is just analogy-making.
For example, compare the concepts of a “pencil”, “focus”, and a “door”. They are completely different concepts, but do they have something in common? You could say they all embody straight lines and unwavering movement.
I believe Hofstadter is right, and that this is a key insight for building smarter computational systems. We have just not found algorithms that mimic this ability at the right level. We are missing something fundamental from our algorithms.
I believe that the missing piece is some kind of emotion processing unit. We are fairly certain that human-level intelligence comes from the neocortex. It is the top layer of the brain, about the size of a washcloth, and all mammals have one. We believe that this is where complex thought happens: logic, large math problems, and other sophisticated thinking. This is where most algorithm development focuses today; deep neural networks are trying to emulate the processes that happen in the neocortex.
But that is only half of the battle. When brains process sensory information, that data is handled concurrently in different parts of the brain, including the areas where emotion happens, like the amygdala and the periaqueductal gray (PAG). These regions also process sensory information and influence all of our thinking. Both computers and humans can process raw data, but only humans can feel and have emotion. We can ask questions like “does this feel right?” or “is something wrong with this image?” The computer, on the other hand, has no idea; it's just executing algorithms.
All living things seem to have some kind of emotional system. All living things get hungry, and when they do, they seek to satiate that hunger. I believe the same happens with pain. When an organism feels physical pain, it tries to avoid it by moving away from the source. Watch what happens when any creature gets poked with a needle.
Studying emotion is hard, though; we can't study it the way we study other systems. Only the individual who feels an emotion can know what it is; we can only observe externally what happens to an organism. Emotion has also had a negative connotation in science: scientists have believed that studying it is a waste of time because there is no clear definition and it's not “real science”. I think it is a mistake not to put much more focus into the study of emotion and computation. There are some great scientists studying emotion; I love the books by Joseph LeDoux and Antonio Damasio.
I believe that if we continue to study the great model organism C. elegans, we can make progress in understanding emotion. Hunger is an innate emotion that we all have. Maybe understanding the neural and chemical underpinnings of hunger will lead us to the breakthroughs we need. How exactly does hunger get triggered? What causes the nervous system to drive movement when the organism is hungry? What are its neurons doing? How and why is the organism attracted to food when it is hungry?
Does hunger originate in one place in the body, or does it start throughout the whole body? Is hunger fundamentally the same kind of emotion as fear and pain?
A lot of what I'm saying is not new. We must continue to study and make breakthroughs in our understanding of emotion so that we can apply it to computation. I'm sure machine learning will continue to make progress, but when computers have the ability to model emotion, that will really accelerate technology.