
Some unfinished notes on Intelligence

Evolved biological structures which match the environment. Recognition of recurring patterns (outcomes of processes). Storing language-encoded “knowledge” within a shared culture. Random fluctuations (Brownian motion) within a current social context.

The main point is that we have to always “zoom back” from abstractions to the actual patterns observed in Reality, from which a particular abstraction has been generalized.

The mind easily slides into producing generalizations, but has a hard time unwinding them back to their sources.

The mind of an observer “naturally” captured generalities and produced the universal notions of “and also”, “either or”, of an “arrow” and of a “composition” of two “arrows”.

The generalized notion of a Set is orthogonal to these - it captures the grouping and classification aspect of the mind.

Notice that classifications “exist” and are valid because the physical constraints are the same “everywhere” we observe, and because “everything” has evolved by a process of gradual improvement (so the individual “paths” form a tree).

Generalizations and specializations

Our brains have evolved to recognize commonalities and to generalize over them.

Commonalities are out there because the process of Evolution, which produced everything, has the shape of an acyclic graph, in which everything has its “parent nodes”.

Thus a generalization over what we call species is valid not because someone has said so, but because species are indeed branches of the same parent nodes.

Animals do not possess any kind of concepts whatsoever, but their brain structures generalize over species just by pattern-recognition and grouping. Animals are aware of their own kind and of other kinds, and know which are harmless and which are dangerous.

There are no abstractions whatsoever, just pure pattern-recognition. This ability happened to evolve because different kinds or species are out there. Again, What Is comes prior to any valid concepts, before mathematics and logic. Both are made of generalized patterns and generalized rules. The fundamental notions of AND and OR are both such generalized patterns. So are addition, multiplication (as repeated addition) and exponentiation (as repeated multiplication) of Whole or Natural Numbers.
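
As a minimal sketch (in Haskell, with illustrative names not taken from the text above), this “repeated operation” pattern over the Natural Numbers can be written down directly:

    -- Natural numbers as an inductive structure.
    data Nat = Zero | Succ Nat

    -- Addition: peel off one constructor at a time.
    add :: Nat -> Nat -> Nat
    add Zero     n = n
    add (Succ m) n = Succ (add m n)

    -- Multiplication as repeated addition.
    mul :: Nat -> Nat -> Nat
    mul Zero     _ = Zero
    mul (Succ m) n = add n (mul m n)

    -- Exponentiation as repeated multiplication.
    pow :: Nat -> Nat -> Nat
    pow _ Zero     = Succ Zero          -- n^0 = 1
    pow n (Succ m) = mul n (pow n m)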

Sets are also generalizations of observed recurring patterns, and they can be thought of as an embedded DSL - embedded in what we call logic.

Acyclic graphs, by the way, are literally everywhere, because they mimic (or represent) the resulting path of the evolution of any process which unfolds in distinct steps (transitions, or changes).

This is, of course, why logic, sets and graphs are our universal modeling tools - they are “taken” from What Is.

The most interesting part is that math and logic have been discovered and refined by endless individual trial-and-error processes, which eventually converge on What Is, while pruning out the abstract bullshit.

There is no intelligence aside from systematic trial-and-error. Science emerged as a methodology (a set of explicit rules) to do this trial-and-error systematically - with proper back-tracking, pruning, etc.

Systematic search and exploration have been studied by the AI pioneers.
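
A minimal sketch of such systematic trial-and-error, assuming a depth-first search with explicit pruning and back-tracking (all names here are illustrative, not a reference to any particular system):

    -- Depth-first search with back-tracking and pruning.
    solve :: (a -> Bool)    -- goal test
          -> (a -> Bool)    -- prune: abandon this branch early
          -> (a -> [a])     -- candidate next states (the trials)
          -> a              -- starting state
          -> [a]            -- all solutions found by exhaustive back-tracking
    solve isGoal prune children = go
      where
        go s
          | prune s   = []                         -- dead end: back-track
          | isGoal s  = [s]                         -- a stable result worth keeping
          | otherwise = concatMap go (children s)   -- try each alternative in turn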

It is important to understand that the observed results are very different from the process which produced them. The process by which intermediate results have been selected and incorporated into a language-based, shared human culture as “knowledge” was one of trial-and-error - pseudo-random fluctuations, bound by individual social contexts, which are vastly complex (like Brownian Motion).

The operational definition of intelligence is the ability to maintain an over-simplified, generalized (abstracting away as many details as possible) inner “map” of the environment (the territory) and to use this inner “map” for selecting actions; for humans alone there is the additional ability to use the stable intermediate results available in the shared linguistic environment, which we call “knowledge”.

The ability to learn to use language-based methods (knowledge encoded in terms of a language) is what distinguishes humans from other species. It is language, and only language, which makes us special.

Weighted sums

The sums of scaled factors are everywhere.

Most of the meaningful operations are “scaling” of one “object” by another. A matrix by another matrix, a function by another function, a polynomial by another polynomial.

The general theme is scaling of the terms of the same kind and adding them up. This generalization too can be traced back to What Is.
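
As a minimal Haskell sketch (the name weightedSum is an illustrative assumption), the theme is just “scale each term, then add them up”:

    -- A weighted sum: scale each term by its factor, then sum.
    weightedSum :: Num a => [a] -> [a] -> a
    weightedSum weights terms = sum (zipWith (*) weights terms)

    -- e.g. weightedSum [0.2, 0.3, 0.5] [10, 20, 30]  ==  23.0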

A Sum of Products

ORs of ANDs (sum-types of product types)
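
A minimal Haskell sketch of an OR of ANDs - a sum type whose alternatives are product types (the Shape type and its constructors are illustrative):

    data Shape
      = Circle Double          -- a product of one component (the radius)
      | Rect   Double Double   -- a product of two components (width AND height)
      -- the type itself is Circle OR Rect

    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Rect w h) = w * h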

Lambda Calculus is Nesting

Abstraction by parameterisation, references, nesting and the rule-governed substitution model are good enough for everything. This is a fundamental theoretical result, and it is a good exercise to trace everything back to What Is.

The ability to evaluate long nested expressions by reducing one distinct piece (a redex) at a time is, again, a generalized pattern - generalized from how we think rigorously using the words of a language.

Nesting establishes an implicit order. Inner “nodes” have to be reduced eventually (not necessarily right now), prior to the outer ones.

Nesting of expressions is the way to structure or to shape the resulting processes. Nothing in Lambda Calculus is arbitrary, incidental or redundant.
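
A minimal sketch of this substitution model in Haskell - nested expressions reduced one redex at a time, in leftmost-outermost order. The substitution here is deliberately naive (it assumes no variable capture or shadowing); it only shows the shape of the process:

    data Term
      = Var String
      | Lam String Term        -- abstraction by parameterisation
      | App Term Term          -- nesting: applying one term to another
      deriving Show

    -- Naive substitution: replace free occurrences of x by s.
    subst :: String -> Term -> Term -> Term
    subst x s (Var y)   = if x == y then s else Var y
    subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
    subst x s (App f a) = App (subst x s f) (subst x s a)

    -- One step: contract the leftmost-outermost redex, if any.
    step :: Term -> Maybe Term
    step (App (Lam x b) a) = Just (subst x a b)            -- the redex itself
    step (App f a)         = case step f of
                               Just f' -> Just (App f' a)
                               Nothing -> App f <$> step a
    step (Lam x b)         = Lam x <$> step b
    step (Var _)           = Nothing

    -- Keep reducing until no redex is left (may diverge, as in the calculus itself).
    normalize :: Term -> Term
    normalize t = maybe t normalize (step t)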

Sums and Products

The meaning captured in the generalized notion of an isomorphism, and in the apparent duality of the Category Theory constructions for a sum (a | b) and a product (a, b), is that two “atoms”, let’s say, could potentially be observed either together (in a compound structure) or one or the other independently. This, of course, is a universal fact (ignoring all the actual electro-chemical constraints).

Another important generalization is that for a product-type we will always have “selectors”, which can be derived mechanically, and for sum-types we will always have a pattern-matching expression, with a distinct clause for each possibility.

Another way to say this is that the set of all selectors for a product-type and the set of all clauses in a pattern-matching expression are the same kind of “arrows” (mappings).

A sum-type will always have multiple data-constructors, while there can be a single data-constructor for a product type, but it can be partially applied.
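
A minimal Haskell sketch of both situations (the type and field names are illustrative assumptions): a product type with a single constructor and mechanically derived selectors, and a sum type with multiple constructors consumed by a pattern-matching expression:

    -- A product type: one data-constructor, selectors derived mechanically
    -- (here as record fields px and py).
    data Point = Point { px :: Double, py :: Double }

    -- The single product constructor can be partially applied:
    onAxis :: Double -> Point
    onAxis = Point 0    -- still waiting for the remaining component

    -- A sum type: multiple data-constructors, one clause per possibility.
    data Direction = North | South | East | West

    describe :: Direction -> String
    describe d = case d of
      North -> "up"
      South -> "down"
      East  -> "right"
      West  -> "left"

    -- Both selectors and clauses are ordinary "arrows":
    --   px, py :: Point -> Double
    --   each clause above plays the role of Direction -> String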

Arrows, Arrows Everywhere

When we start to think “visually” in terms of arrows, it is quite easy to mess everything up by not being able to distinguish between different kinds of arrows or by connecting unrelated dots from “orthogonal dimensions”.

Just like the original Lambda Calculus, “untyped” arrows are too general.

So, we need to partition the universe of discourse, just like introducing abstraction barriers, to separate different kinds of arrows from each other.

The flow-charts of our forefathers were also full of arrows, but those are in a different dimension from the arrows in type-signatures.

Our arrows may form pipelines or paths, potential (which can be taken) or actual (which have been taken).

So, when we see “arrows” in a pattern-matching expression these are all the potential paths out there, but only one can be taken at a time.

When we see “arrows” as a set of selectors, some or all of them can be taken in any order.

Last but not least, calling all the selectors will result in the same value of a product type, while only one clause can be used to get the value of a sum-type - which, of course, again corresponds to the notions of AND and OR.

Currying is a serious thing.

Notice that a -> b -> c is actually a tree: the arrow is right-associative, so it reads a -> (b -> c), with the parentheses omitted.

It captures the notion that “the next step” is not the transformation, but taking in one more “required component”.
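
A minimal Haskell sketch of this reading of currying (the names are illustrative): each application just takes in one more required component.

    -- scaleAndShift k b x computes k * x + b; each arrow "waits" for one more component.
    scaleAndShift :: Double -> Double -> Double -> Double
    scaleAndShift k b x = k * x + b

    -- Partial application: supply the components one at a time.
    scaleByTwo :: Double -> Double -> Double
    scaleByTwo = scaleAndShift 2          -- still waiting for the shift and the input

    doubleThenAddOne :: Double -> Double
    doubleThenAddOne = scaleAndShift 2 1  -- only the "required component" x is missing

    -- The standard curry / uncurry move between the nested-arrow and the product form:
    --   curry   :: ((a, b) -> c) -> a -> b -> c
    --   uncurry :: (a -> b -> c) -> (a, b) -> c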

By the way, in Molecular Biology (for an enzyme) that would literally be “waiting until a particular structure appears in my locality”, which shows which abstract notions are valid and which are ephemeral products of the mind of an observer. Waiting in time is ephemeral, locality isn’t. “At the same time” means “at the same locality”, and from there the proper notion of being “unrelated” arises.

Notice also that when referenced by name (not by their actual position), arguments form a Product (not a sequence or a list).

Sums and Products are defined to be unordered (or orderless), just like Sets.

This leads to the possibility of expressing everything in terms of just “arrows and dots”, and this is what Category Theory does.

Interpreter as a universal abstract machine

Dispatch on a distinct pattern is at the core of any interpreter. This is not an arbitrary result; it is another generalization.
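
A minimal Haskell sketch of an interpreter whose core is dispatch on a distinct pattern - one clause per possible form of an expression (Expr and eval are illustrative names):

    data Expr
      = Lit Int
      | Add Expr Expr
      | Mul Expr Expr

    eval :: Expr -> Int
    eval e = case e of          -- the dispatch: structural pattern-matching
      Lit n   -> n
      Add a b -> eval a + eval b
      Mul a b -> eval a * eval b

    -- e.g. eval (Add (Lit 1) (Mul (Lit 2) (Lit 3)))  ==  7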

The process of transcription of DNA or RNA involves a “mechanical” interpreter, which does structural pattern-matching, literally.

Sequences, Stacks and Queues

Sequences, too, are a generalization of a fundamental and universal pattern.

A Stack is a specialization of a sequence - an ultimate ADT.
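
A minimal Haskell sketch of a Stack as an abstract data type specializing a sequence - access restricted to one end (the names are illustrative):

    -- A stack: a sequence where only one end is accessible.
    newtype Stack a = Stack [a]

    empty :: Stack a
    empty = Stack []

    push :: a -> Stack a -> Stack a
    push x (Stack xs) = Stack (x : xs)

    pop :: Stack a -> Maybe (a, Stack a)
    pop (Stack [])       = Nothing
    pop (Stack (x : xs)) = Just (x, Stack xs)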

Author: <schiptsov@gmail.com>

Email: lngnmn2@yahoo.com

Created: 2023-08-08 Tue 18:37
