The observable examined

Google and the brain


Google’s page ranking algorithm solves two fundamental issues in search: how to represent the universe of web pages so that searches yield meaningful results, and, equally important, how to rapidly retrieve this information in response to a query.

Google achieves the first by assigning a number (importance) to each page based on its incoming and outgoing links. Each link is assigned a “weight”, and a link’s contribution is modulated by the importance of the page it comes from. The importance of each page can then be expressed as a simple summation over all pages linking to it, where each term is the product of the link weight and the importance of the linking page.

A column vector, representing the importance of every page in the universe of web pages, can now be derived as the product of a giant NxN matrix (the weights of all links between every pair of pages) and the column vector itself; in other words, the importance vector is an eigenvector of this matrix. This giant matrix then captures all of the relationships amongst the universe of web pages.
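The self-referential definition above (the importance vector is the matrix times itself) can be computed by repeatedly applying the matrix until the vector stops changing. Here is a minimal sketch with a hypothetical four-page web; the link matrix and iteration count are illustrative assumptions, not Google’s actual data or method.

```python
import numpy as np

# Toy universe of 4 pages. M[i, j] is the weight of the link from
# page j to page i; each column sums to 1, so a page splits its
# "vote" evenly among the pages it links to.
M = np.array([
    [0.0, 0.5, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.5],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0, 0.0],
])

def importance(M, iterations=100):
    """Power iteration: repeatedly apply M until v settles at the
    fixed point v = M v, the importance vector described above."""
    n = M.shape[0]
    v = np.full(n, 1.0 / n)   # start with uniform importance
    for _ in range(iterations):
        v = M @ v
        v /= v.sum()          # keep it a probability distribution
    return v

print(importance(M).round(3))
```

The loop is the whole idea: each pass redistributes importance along the links, and the vector it converges to simultaneously defines and is defined by the matrix of relationships.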

In the general case, the discussion above extends to a high-dimensional space. Rapid retrieval in such a space requires further mathematical techniques, such as finding a projection that reduces the dimensionality of the space while still retaining the interrelationships. This is important for preserving the accuracy of the retrieved results.
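One standard way to find such a projection is a truncated singular value decomposition: keep only the top few directions of variation and project every item onto them. The data, sizes, and choice of SVD here are illustrative assumptions, one common technique rather than the specific method Google uses.

```python
import numpy as np

# Hypothetical data: 100 items described in a 50-dimensional
# feature space.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))

# SVD factors X into directions ordered by how much structure
# each one captures.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 10                     # target dimensionality (an assumption)
X_reduced = X @ Vt[:k].T   # project each item onto the top-k directions

print(X.shape, "->", X_reduced.shape)   # (100, 50) -> (100, 10)
```

Comparisons between items can now be done in 10 dimensions instead of 50, which is what makes retrieval fast while the dominant interrelationships are retained.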

A key idea embedded in this formulation is that representation and retrieval are not separate mechanisms but fundamentally intertwined processes that have much in common.

For mathematical convenience, as well as to handle corner cases, there are numerous modifications to this basic formulation, but they are not germane to the current discussion, so we can ignore them here.

It turns out that evolution needed to solve the same two fundamental issues, representation and retrieval, to support the rich panoply of cognitive processes evident in our brain. If we recast cognition as a search problem, the mathematics behind Google’s page ranking algorithm seems to offer a simple and elegant solution.

In brain science, numerous theories have been put forth to address these issues, but most fall short mechanistically. We can think of the universe of neurons as akin to the universe of web pages. Neurons have incoming and outgoing links (remarkably dense ones). The importance of each neuron is then a function of its “tuning/content” (the things it responds to in the environment) and its “context” (its incoming and outgoing links). The “search terms” are derived from the statistics of the impinging stimuli, which has the benefit of gravitating towards solutions that naturally preserve salient features, thereby reducing dimensional complexity. A second mechanism for reducing the dimensionality of the external environment can potentially be achieved through a divide-and-conquer strategy; in other words, the brain uses modularity. Pruning (strengthening or weakening) of links is accomplished by mechanisms such as Hebbian learning, temporal synchrony, and correlated firing. Such a distributed representation also allows for graceful degradation in case of damage to portions of the neural machinery. Simply put, our brain is a giant matrix!
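The Hebbian pruning mentioned above (“neurons that fire together wire together”) can be sketched in a few lines: the weight of a link grows in proportion to how often its two endpoints are co-active. The patterns, learning rate, and network size here are illustrative assumptions, a caricature rather than a biological model.

```python
import numpy as np

def hebbian_step(W, pre, post, rate=0.1):
    # "Fire together, wire together": each weight W[i, j] grows in
    # proportion to the product of post-synaptic activity post[i]
    # and pre-synaptic activity pre[j].
    return W + rate * np.outer(post, pre)

# Toy activity: units 0 and 1 always fire together, unit 2 stays
# silent, over 100 repeated presentations of the same pattern.
pattern = np.array([1.0, 1.0, 0.0])

W = np.zeros((3, 3))
for _ in range(100):
    W = hebbian_step(W, pre=pattern, post=pattern)

# The link between co-active units 0 and 1 is now strong, while
# every link touching the silent unit 2 remains at zero.
print(W.round(1))
```

This is the matrix view of the blog’s argument in miniature: correlated firing writes the statistics of experience directly into the link weights, just as the web’s link structure is written into Google’s giant matrix.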


Written by asterix98

December 8, 2010 at 7:03 am

Posted in Brain, google
