Wu, Bozhi

Neural Networks


Reading:

  • Rumelhart, D. E. (1989). The architecture of mind: A connectionist approach

 

As Rumelhart notes in his introduction, it is natural for people of a given generation to draw insights from state-of-the-art technologies and apply them to scientific investigation and theory construction. He points out that "Aristotle had a wax tablet theory of memory," that "Leibniz saw the universe as clockworks," and that "Freud used a hydraulic model of libido flowing through the system"; it is therefore reasonable for today's scientists to draw insights from the modern digital computer and model intelligence in a similar fashion.


It is a genuinely innovative idea to construct brain-style computation that mimics how information is processed in the brain across billions of neurons. Most importantly, it is an approach that has advanced both cognitive science research and the solution of hard computer science problems. On the one hand, by simulating the way neurons connect with one another in a hierarchical manner and carry out complex computation, we can better understand the program- and functional-level operations underlying biological intelligence and mental life. On the other hand, because it embraces parallel computing rather than traditional serial computing, it has opened up new approaches to difficult computational problems such as "constraint satisfaction," "interactive processing," and "best-match search."
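The flavor of these problems can be illustrated with a small sketch (my own illustration, not code from Rumelhart's article): a classic Hopfield-style associative memory, in which every unit updates in parallel to satisfy the constraints its neighbors impose through the connection weights, so the network settles on the stored pattern that best matches a noisy input.

```python
import numpy as np

# A minimal Hopfield-style associative memory: one concrete instance of
# parallel constraint satisfaction solving a "best-match search" problem.
# (Illustrative sketch; the patterns and sizes here are arbitrary.)

patterns = np.array([
    [ 1,  1,  1, -1, -1, -1],
    [ 1, -1,  1, -1,  1, -1],
])

# Hebbian-style storage: the weight matrix, not any single unit,
# holds the memories (sum of outer products, no self-connections).
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Let all units update together until the network settles."""
    state = state.copy()
    for _ in range(steps):
        # Each unit takes the sign of its total weighted input: a
        # parallel attempt to satisfy the constraints from its neighbors.
        state = np.where(W @ state >= 0, 1, -1)
    return state

# A corrupted cue (last unit flipped) is completed to the
# best-matching stored pattern.
noisy = np.array([1, 1, 1, -1, -1, 1])
print(recall(noisy))  # → [ 1  1  1 -1 -1 -1]
```

Note that no unit searches the pattern list explicitly; the "search" happens implicitly as the units jointly relax into a stable state.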



Among all the detailed configurations and parameters involved in this computational model, the pattern of connectivity among the processing units (or abstract neurons) is what has especially caught my attention. In the article, Rumelhart writes that "[it] is this pattern of connectivity that constitutes what the system knows and determines how it will respond to any arbitrary input." There is an interesting connection between this model and the synaptic model of memory, which views memory as stored not inside neurons or in any specific region, but in the pattern and strength of the synaptic connections between neurons (Hebbian learning; long-term potentiation). The latter component is likewise modeled in brain-style computation as the "weight," or strength, of each connection.


Although the detailed computational rules and the other elements are beyond the scope of this essay, I think it is rather amazing to see how our knowledge of neuroscience and computer science can be so intimately connected as to form such a promising computational model, one that can assist in our further understanding of intelligent systems.


 

