Friday, January 27, 2023

On engrams


The following note on engrams was inserted Jan. 27, 2023.
Assuming the existence of one or more types of engram, or fundamental memory unit, it is to be expected that the process of association implies some form of energy value attached to engram links. A --> B says that basic memory A instigates basic memory B, but not necessarily the converse. A <--> B means that A instigates B and B instigates A, whichever happens to be triggered first.

But under what conditions does A --> B? The link value must exceed that of competing possibilities, such as R --> S. In most situations this means that A and B are themselves sets -- subsets of engrams. So A --> B requires that A ⊆ X and B ⊆ Y, with X ∩ Y = AB. The strength of the association between X and Y is determined by this subset of common links.
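
By way of illustration, here is a toy sketch in Python of the subset condition (the engram labels and sets are invented for the example):

    # Hypothetical engram labels; A and B are themselves sets of engrams
    # embedded in larger context sets X and Y, as described above.
    A = {"a1", "a2"}
    B = {"b1"}
    X = A | {"x1", "b1"}        # A is a subset of X
    Y = B | {"a1", "a2", "y1"}  # B is a subset of Y

    AB = X & Y                  # the common subset; here it contains A and B
    # Take the association strength to be the fraction of shared links.
    strength = len(AB) / len(X | Y)
    print(AB, strength)         # {'a1', 'a2', 'b1'} and 0.6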

I suppose a way to represent engram associations and sets of associations would be with a weighted graph.
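
A minimal sketch of such a weighted graph, as a plain dictionary of directed links (the names, weights, and threshold are all illustrative assumptions):

    # Directed, weighted engram links; weights run from 0 to 1,
    # with 1 the strongest possible link (see the discussion below).
    links = {
        ("A", "B"): 0.9,   # A --> B, strong
        ("A", "C"): 0.3,   # A --> C, weak
        ("B", "A"): 0.9,   # with A --> B this gives A <--> B
    }

    def instigates(source, threshold=0.5):
        """Return the engrams that `source` triggers above a threshold."""
        return [dst for (src, dst), w in links.items()
                if src == source and w >= threshold]

    print(instigates("A"))  # ['B'] -- the weak AC link loses out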

(A --> B) <--> C is a possible way to write such compound associations. Obviously, association sets quickly become very complicated.

In any case, suppose we have A --> B and A --> C, but the AB link is far stronger than the AC link. So we have AB >> AC, whereas if the link magnitudes are roughly equal, we have AB ~ AC.
                             A
                           // \   
                          B    C

Here AB > AC. We also have (CB --> A) > (BC --> A); that is, the engram order CB triggers engram A more readily than does the order BC.
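
One way to model this order effect (the recency decay is my assumption, not part of the note): let the most recently activated engram contribute at full weight, and earlier ones at a decayed weight.

    # Toy model of order-dependent triggering with a recency decay.
    links = {("B", "A"): 0.9, ("C", "A"): 0.3}
    DECAY = 0.5

    def trigger_strength(sequence, target):
        """Activation of `target` after presenting `sequence` in order."""
        total = 0.0
        for age, engram in enumerate(reversed(sequence)):
            total += links.get((engram, target), 0.0) * DECAY ** age
        return total

    print(trigger_strength("CB", "A"))  # 0.9 + 0.3 * 0.5 = 1.05
    print(trigger_strength("BC", "A"))  # 0.3 + 0.9 * 0.5 = 0.75, so CB > BC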

It's possible of course to assign numerical values to links. If we make 1 the strongest possible value and 0 the lowest, we may approximate intermediate values using the real number continuum, or we may assign some lowest possible finite value to an engram link. It's conceivable that in some sets the engrams are "coherent" in such a manner that their values can simply be added, while in other cases they behave as if under destructive interference, with the set value going to 0. In most cases we would have sets of mixed value -- the engram links are "out of phase" and yield an imperfect memory, blurred in places and lacking certain links that would be regarded as important components of the memory set.
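
To make the coherence idea concrete, one could give each link a phase angle and sum the links as if they were waves. The sketch below does this; the phase mechanism is an assumption of mine, since the note supplies only the analogy.

    import math

    # Each link carries a strength in [0, 1] and a phase angle.
    # In-phase links reinforce; opposed phases cancel, as with waves.
    def set_value(link_list):
        """Combine (strength, phase) pairs into a single memory-set value."""
        real = sum(s * math.cos(p) for s, p in link_list)
        imag = sum(s * math.sin(p) for s, p in link_list)
        return math.hypot(real, imag)

    coherent    = [(0.5, 0.0), (0.5, 0.0)]          # aligned: value 1.0
    destructive = [(0.5, 0.0), (0.5, math.pi)]      # opposed: value ~ 0.0
    mixed       = [(0.5, 0.0), (0.5, math.pi / 2)]  # "blurred": ~ 0.71

    for case in (coherent, destructive, mixed):
        print(round(set_value(case), 3))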

Both fatigue and competing mental activity can affect the cohesion of memory sets.

How does this model fit with machine learning? Machine learning essentially requires that success be rewarded, which here means the reliable attainment of target sets of numbers. The machine is programmed to filter results through negative feedback control. In the case of human memory, the system behaves analogously. The primary engrams may not specifically be memories. They might be the instincts, the axioms of human cognition. (The archetypes of Jung we see not as axioms, but as cultural artifacts that arise in the manner of the parallel evolution of species.)
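
As a final sketch, here is the negative-feedback analogy reduced to a single link; the error-correcting rule and the learning rate are assumptions for illustration only.

    # Minimal negative-feedback update for a single engram link.
    weight = 0.5    # current A --> B link value, in [0, 1]
    RATE = 0.1      # learning rate

    def update(weight, rewarded):
        """Nudge the link toward 1 when triggering was rewarded, else toward 0."""
        target = 1.0 if rewarded else 0.0
        return weight + RATE * (target - weight)  # error-correcting step

    for outcome in (True, True, False, True):
        weight = update(weight, outcome)
        print(round(weight, 3))  # strengthens with reward, weakens without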
