Artificial General Intelligence

Concept and Learning

A traditional criticism of term logic concerns its limited expressive power. NAL solves this problem by introducing multiple copulas and compound terms, layer by layer. The ideas covered in this chapter mainly come from set theory.

1. Derivative copulas

Similarity is defined as the symmetric variant of inheritance. In psychology, these two notions are sometimes called "symmetric similarity" and "asymmetric similarity", respectively.

In IL-2, similarity means perfect substitutability.
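The substitutability reading can be sketched in code. This is a minimal illustration, not the NARS implementation: beliefs are represented here as (subject, copula, predicate) triples, with "-->" for inheritance and "<->" for similarity, and truth-values are omitted.

```python
# Sketch of IL-2 similarity as symmetric inheritance.
# Assumed representation: beliefs are (subject, copula, predicate) triples.

def derive_similarity(beliefs):
    """Add "S <-> P" whenever both "S --> P" and "P --> S" are believed."""
    derived = set(beliefs)
    for (s, cop, p) in beliefs:
        if cop == "-->" and (p, "-->", s) in beliefs:
            derived.add((s, "<->", p))
            derived.add((p, "<->", s))
    return derived

beliefs = {("swan", "-->", "swimmer"),
           ("swimmer", "-->", "swan"),
           ("swan", "-->", "bird")}
print(("swan", "<->", "swimmer") in derive_similarity(beliefs))  # True
print(("swan", "<->", "bird") in derive_similarity(beliefs))     # False
```

Since "swan" and "swimmer" inherit from each other, each can be substituted for the other in any statement, which is exactly what the derived similarity expresses.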

The instance, property, and instance-property copulas mark the end of transitivity in one or both directions of an inheritance chain, and represent an "individual", an "attribute", or both, respectively.

The instance and property copulas correspond to two ways to specify a set. A term can be a set, but not necessarily so.

Valid syllogistic rules of IL-2 are variants of the transitivity of inheritance, including resemblance, analogy, and comparison.
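The term patterns of these syllogisms can be shown as rewrite rules. The sketch below uses the same assumed triple representation as above and omits truth-values; deduction is included as the NAL-1 base case for contrast.

```python
# Sketch of the IL-2 syllogisms as pattern rules over
# (subject, copula, predicate) triples; truth-values omitted.

def syllogize(p1, p2):
    """Return the conclusion of two premises sharing a middle term."""
    (s1, c1, t1), (s2, c2, t2) = p1, p2
    if c1 == c2 == "-->" and t1 == s2:
        return (s1, "-->", t2)   # deduction:   S->M,  M->P  |- S->P
    if c1 == c2 == "<->" and t1 == s2:
        return (s1, "<->", t2)   # resemblance: S<->M, M<->P |- S<->P
    if c1 == "-->" and c2 == "<->" and t1 == s2:
        return (s1, "-->", t2)   # analogy:     S->M,  M<->P |- S->P
    if c1 == c2 == "-->" and s1 == s2:
        return (t1, "<->", t2)   # comparison:  M->S,  M->P  |- S<->P
    return None

print(syllogize(("swan", "-->", "bird"), ("bird", "<->", "fowl")))
# ('swan', '-->', 'fowl')
```

Each rule is a variant of inheritance transitivity: similarity premises allow the chain to run in either direction, and comparison composes two inheritances from a shared subject into a symmetric conclusion.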

2. Intersection and Difference

A compound term can be formed by taking the intersection or difference of the extensions or intensions of two existing terms.

The inference rules of IL-3 come from the definitions of the related compound terms, and may take one or two premises. A conclusion may contain new terms not previously in the system's vocabulary.

In a term logic, the compositional and structural rules can be seen as variants of the syllogistic rules.

Intersection and union are dual operators, as in set theory, with respect to the extension and intension of a term.
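The duality can be made concrete with plain Python sets. This is illustrative only: IL terms are not in general extensionally defined sets, but the set picture shows how one operator on extensions pairs with the dual operator on intensions.

```python
# Illustrative sketch: extensional intersection narrows the extension
# and widens the intension; extensional union (its dual) does the opposite.
# Terms here are dicts of plain sets, an assumption of this sketch.

def ext_intersection(t1, t2):
    return {"ext": t1["ext"] & t2["ext"], "int": t1["int"] | t2["int"]}

def ext_union(t1, t2):
    return {"ext": t1["ext"] | t2["ext"], "int": t1["int"] & t2["int"]}

bird    = {"ext": {"swan", "robin"}, "int": {"feathered", "animal"}}
swimmer = {"ext": {"swan", "fish"},  "int": {"swims", "animal"}}

print(ext_intersection(bird, swimmer)["ext"])  # {'swan'}
print(ext_union(bird, swimmer)["int"])         # {'animal'}
```

Anything in the intersection's extension must carry the properties of both components, so the intension grows as the extension shrinks, and vice versa for union.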

The inference rules of NAL-3 include compositional, decompositional, and structural rules, as well as a choice rule that takes simplicity into account. The rules defined in lower layers remain valid when a compound is used as a whole.
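The expectation function e = c * (f - 0.5) + 0.5 is standard in NAL; how simplicity enters the choice rule can be sketched as follows, where dividing expectation by syntactic complexity is a simplifying assumption of this sketch rather than the exact NAL formula.

```python
# Sketch of a choice rule balancing expectation against simplicity.
# expectation() is the standard NAL function; the complexity discount
# used in choose() is an assumed illustration.

def expectation(f, c):
    """Map a (frequency, confidence) truth-value to an expectation."""
    return c * (f - 0.5) + 0.5

def choose(candidates):
    """candidates: (term, complexity, frequency, confidence) tuples."""
    return max(candidates, key=lambda t: expectation(t[2], t[3]) / t[1])

answers = [("bird",             1, 1.0, 0.80),  # simple, well supported
           ("(bird & swimmer)", 2, 1.0, 0.80)]  # same evidence, more complex
print(choose(answers)[0])  # bird
```

With equal evidential support, the simpler term wins, which is the intuition behind letting simplicity break ties among candidate answers.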

3. Product and Image

A logic can have both "built-in" and "acquired" relations between terms. In IL/NAL, the former is either syntactic ("composed of", indicated by term connectors) or semantic ("used as", indicated by copulas), and the latter is represented by a term with a meaning learned from experience.

The idea is the same as in set theory, except that here it is not limited to extensionally defined sets.

The inference rules of IL-4 come from the definitions of the related compound terms. Each of them takes only one premise.

The inference rules of NAL-4 only take one premise, and produce conclusions with the same truth-value, since the premise and the conclusion express the same content, though in different form.
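The equivalence between product and image statements can be sketched as a term rewrite. The notation below is a simplified encoding (an assumption of this sketch) of the Narsese forms: <(*,S,P) --> R> is equivalent to <S --> (/,R,_,P)> and to <P --> (/,R,S,_)>, with the truth-value carried over unchanged.

```python
# Sketch of a NAL-4 structural transformation: a product statement is
# rewritten into its two image forms, truth-value unchanged.
# Encoding (assumed): ("*", S, P) for product, ("/", R, x, y) for image,
# with "_" marking the extracted position.

def product_to_images(stmt, truth):
    subj, pred = stmt
    op, s, p = subj
    assert op == "*", "expected a product subject"
    return [((s, ("/", pred, "_", p)), truth),
            ((p, ("/", pred, s, "_")), truth)]

premise = ((("*", "acid", "base"), "reaction"), (0.9, 0.8))
for concl, tv in product_to_images(*premise):
    print(concl, tv)
# ('acid', ('/', 'reaction', '_', 'base')) (0.9, 0.8)
# ('base', ('/', 'reaction', 'acid', '_')) (0.9, 0.8)
```

Because premise and conclusion express the same content in different forms, no evidence is gained or lost, so the (frequency, confidence) pair is simply copied.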

4. Meaning of concept

Since in NARS each concept is named by a term, the meaning of a concept is basically defined in the same way as the meaning of a term.

The meaning of a concept is determined by its relations with the other concepts. Some relations are syntactic (component-compound), and the others are semantic (subject-predicate).

A semantic relation can be extensional, intensional, or both. The extension and intension of a concept mutually determine each other in IL, and their sizes change in opposite directions. In NARS, they are defined differently from the conventional definitions (which presume model-theoretic semantics), while still keeping the same intuitive meaning.
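The mutual determination can be sketched directly: in IL, ext(T) = {x : x --> T} and int(T) = {x : T --> x}, so both are read off the same inheritance relation. Beliefs below are (subject, predicate) inheritance pairs, an assumed representation for this sketch.

```python
# Sketch: extension and intension of a term, both defined by the
# inheritance relation itself, so each determines the other.

def extension(term, beliefs):
    """All terms known to inherit from `term`."""
    return {s for (s, p) in beliefs if p == term}

def intension(term, beliefs):
    """All terms that `term` is known to inherit from."""
    return {p for (s, p) in beliefs if s == term}

beliefs = {("swan", "bird"), ("robin", "bird"),
           ("bird", "feathered"), ("bird", "animal")}
print(extension("bird", beliefs))  # e.g. {'swan', 'robin'}
print(intension("bird", beliefs))  # e.g. {'feathered', 'animal'}
```

Since "S --> P" is equivalent to ext(S) being included in ext(P) and int(P) being included in int(S), moving up an inheritance chain enlarges the extension while shrinking the intension, which is the opposite-direction size change mentioned above.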

Recognition, perception, and categorization: answering "T → ?" for a given term T. There are often multiple answers that are not mutually exclusive, but form an inheritance hierarchy. The choice rule in NAL: expectation and simplicity. Control factors: familiarity and relevance.

Degree of membership in an inheritance hierarchy. Two opposite tendencies: specificity (representativeness) and probability. The "Conjunction Fallacy". A compromise: basic level categories.

The meaning of a compound term is semi-compositional: it is determined partly by the syntactic relations to its components, and partly by the semantic relations of the compound as a whole, which usually cannot be fully derived from the former. Initially the meaning is fully determined by the syntactic relations, but over time it is increasingly determined by the acquired semantic relations.

Restricted by available resources, when processing a given task the system normally uses each involved concept with only part of its meaning. Which part is used is influenced by the priority distribution among beliefs, which depends on experience and context. Essence and definition. Analogy and metaphor.

A useful concept usually has a relatively sharp and balanced extension and intension, as in basic-level categories and natural kinds.

The NARS categorization model subsumes the existing categorization models: the latter are special cases of the former.

5. Learning as reasoning

The empirical knowledge of NARS consists of explicit knowledge (as Narsese sentences and concepts) and implicit knowledge (as priority distributions).

In NARS, all forms of empirical knowledge are producible and modifiable by experience (though they can also be implanted). At this level, learning is complete.

On the other hand, the grammar rules, inference rules, and control mechanisms are defined at the meta-level; they are built-in rather than acquired.

In NARS, learning and reasoning are basically two aspects of the same process. Learning is an open-ended process that does not follow any predetermined algorithm, unlike the processes studied in "machine learning".

New concepts appear in the system in three ways: accepted, composed, altered. The "original meaning" of a concept is not necessarily its "current" meaning. In general, there is no "correct", "true", or "ultimate" meaning for a concept, though concepts with stable and clear meaning are preferred.

The learning process selects useful concepts based on repeatedly experienced patterns, so as to summarize experience and process tasks efficiently.


Reading