Why is teaching machines how to interact with the world the key to building truly intelligent machines?
To build truly intelligent machines, teach them how to interact with the world, training deep neural networks to generate and understand causal graph networks.
The language of machine intelligence and learning is digitized natural language, not Python or Rust.
The model of machine intelligence and learning is a causal model of the world, not ML models or LLMs.
The master algorithm of machine intelligence and learning is causal learning and inference, not regression/classification or ANN algorithms.
Real causality is key to our common-sense and scientific understanding of the world, just as causal representation learning, inference, and algorithms are critical to building real-world AI systems that operate sustainably in a changing world.
Machine learning often disregards information that animals use heavily: interventions in the world, domain shifts, temporal structure — by and large, we consider these factors a nuisance and try to engineer them away. In accordance with this, the majority of current successes of machine learning boil down to large scale pattern recognition on suitably collected independent and identically distributed (i.i.d.) data.
Generalizing well outside the i.i.d. setting requires learning not mere statistical associations between variables, but an underlying causal model.
Correlation-based AI and causal AI have essential differences, including the following:
Correlation-based AI relies on statistics to make assumptions about what's happening.
Causal AI can clearly trace and explain exactly what’s happening at every step based on specific contextual data.
Correlation-based AI is probabilistic and requires humans to verify the accuracy of results.
Causal AI is fact-based and thus can do automated analyses.
Correlation-based AI can only make predictions, with a limited ability to explain an event.
Causal AI, on the other hand, provides details on how it arrived at a conclusion.
Correlation-based AI needs to be checked for bias due to the limitations of various data, algorithms, or sampling.
Causal AI, however, relies on actual contextual data rather than only training data and is therefore less prone to bias issues.
Correlation-based AI may be completely off base in novel situations.
Causal AI can adapt to new situations and find unknown unknowns.
An acausal, statistical and probabilistic AI, as implemented by machine learning models, deep learning algorithms, and artificial neural networks, has nothing in common with human intelligence or human consciousness.
AI that is able to handle cause and effect, with all its complexities, scales, scopes, and levels, is dubbed Causal AI.
A framework for linear causal inference is a three-level hierarchy of causal reasoning, answering three kinds of questions (see the sketch after the three questions):
Associational:
If I see X, what is the probability that I will see Y?
Interventional:
If I do X, what is the probability that I will see Y?
Counterfactual:
If I had done X, what would Y have been?
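Below is a minimal sketch, in Python, contrasting the first two rungs on a toy structural causal model with a hidden common cause Z of X and Y. The variable names, probabilities, and the sample() helper are illustrative assumptions, not part of the post; the counterfactual rung would additionally require abduction over the model's noise terms.

```python
# Toy SCM: Z -> X, Z -> Y, X -> Y. All probabilities below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

def sample(do_x=None):
    """Sample from the SCM; do_x forces X, severing the Z -> X edge (an intervention)."""
    z = rng.random(N) < 0.5                                          # hidden common cause
    x = (rng.random(N) < np.where(z, 0.9, 0.1)) if do_x is None else np.full(N, do_x)
    y = rng.random(N) < 0.3 + 0.2 * x + 0.4 * z                      # Y depends on X and Z
    return x, y

# Rung 1 (association): P(Y | X=1), estimated from observational data.
x_obs, y_obs = sample()
p_see = y_obs[x_obs == 1].mean()

# Rung 2 (intervention): P(Y | do(X=1)), estimated from the mutilated model.
_, y_do = sample(do_x=True)
p_do = y_do.mean()

print(f"P(Y | X=1)     ≈ {p_see:.3f}")   # inflated by the confounder Z
print(f"P(Y | do(X=1)) ≈ {p_do:.3f}")    # the causal effect of setting X=1
```

The gap between the two numbers is what separates seeing from doing: the associational rung answers the first question, the interventional rung the second.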
Real causation is interaction, a reciprocal action or influence, productive correlations and causative associations: X causes Y if and only if Y causes X.
X affects, produces, influences, or changes Y by a transfer of matter, energy, force, or information, just as Y affects, produces, influences, or changes X by a transfer of matter, energy, force, or information.
Types of causal relationships underlying causal representation learning, inference, and algorithms:
Linear Direct Causality (causation, or cause and effect) occurs when you believe that X causes a change in Y by a transfer of force. Or, if A causes B, then A must transmit a force (or causal power) to B which results in the effect. Causal relationships suggest change over time; cause and effect are temporally related, and the cause precedes the outcome. Or, one event, process, state, or object (a cause) contributes to the production of another event, process, state, or object (an effect), where the cause is partly responsible for the effect, and the effect is partly dependent on the cause.
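A minimal sketch of direct causality as a structural equation, assuming an illustrative linear model Y := 2X + noise; the coefficient and the variable names are assumptions made only for the example.

```python
# Direct causality X -> Y: the cause is generated first (temporal precedence) and
# transmits its influence to the effect through a structural equation.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

x = rng.normal(size=n)                  # cause
y = 2.0 * x + 0.5 * rng.normal(size=n)  # effect: Y := 2X + noise

# The direct effect is recovered by regressing Y on X.
beta = np.cov(x, y)[0, 1] / np.var(x)
print(f"estimated effect of X on Y ≈ {beta:.2f}")  # ≈ 2.0
```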
In common-cause relationships, a single cause has several effects:
A single data input variable is the cause of several effects (as with food poisoning: a pathogen causes fever, headache, and nausea).
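A minimal sketch of this common-cause "fork", using the pathogen example with illustrative probabilities: the effects are correlated with each other only through the shared cause, and conditioning on the cause removes that correlation.

```python
# Common cause (fork): pathogen -> fever, pathogen -> nausea.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

pathogen = rng.random(n) < 0.2
fever    = rng.random(n) < np.where(pathogen, 0.8, 0.05)
nausea   = rng.random(n) < np.where(pathogen, 0.7, 0.05)

print("corr(fever, nausea)            =", round(np.corrcoef(fever, nausea)[0, 1], 3))
print("corr(fever, nausea | pathogen) =",
      round(np.corrcoef(fever[pathogen], nausea[pathogen])[0, 1], 3))  # ≈ 0
```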
In common-effect relationships, several input causal variables converge on one output effect variable:
Warfare is an example of one effect with several causes.
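A minimal sketch of a common-effect "collider", with abstract causes A and B as illustrative stand-ins: the causes are independent overall, yet become spuriously correlated once we condition on (select for) the shared effect.

```python
# Common effect (collider): A -> effect <- B.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

cause_a = rng.random(n) < 0.3
cause_b = rng.random(n) < 0.3
effect  = rng.random(n) < 0.1 + 0.4 * cause_a + 0.4 * cause_b  # either cause raises the effect

print("corr(A, B)          =", round(np.corrcoef(cause_a, cause_b)[0, 1], 3))  # ≈ 0
print("corr(A, B | effect) =",
      round(np.corrcoef(cause_a[effect], cause_b[effect])[0, 1], 3))           # < 0
```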
In causal chains, one causal variable triggers another variable, which triggers another effect: A causes B and B causes C, so A causes C, as in transitive reasoning.
Examples: digital illiteracy leads to disinformation and miscommunication, which lead to mental confusion; sanding causes dust, dust causes sneezing, so sanding causes sneezing.
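A minimal sketch of the sanding-dust-sneezing chain with illustrative probabilities: the influence of the first link is transmitted to the last, and the middle variable screens it off.

```python
# Causal chain: sanding -> dust -> sneezing.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

sanding  = rng.random(n) < 0.3
dust     = rng.random(n) < np.where(sanding, 0.9, 0.1)
sneezing = rng.random(n) < np.where(dust, 0.7, 0.05)

print("P(sneezing | sanding)    =", round(sneezing[sanding].mean(), 3))
print("P(sneezing | no sanding) =", round(sneezing[~sanding].mean(), 3))
print("corr(sanding, sneezing | dust) =",
      round(np.corrcoef(sanding[dust], sneezing[dust])[0, 1], 3))  # ≈ 0: dust screens it off
```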
Reverse Causality: in reverse causality, a causal relationship reverses its action, and the presumed effect Y causes the presumed cause X; that is, Y is causing changes in X.
In epidemiology, this occurs when the exposure-disease process is reversed: the disease influences the exposure or risk factor rather than the other way around.
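A minimal sketch of reverse causality on simulated data (coefficients and names are illustrative assumptions): the data are generated with the outcome Y driving the exposure X, yet regression reports a strong association in both directions, so association alone cannot reveal which variable causes which.

```python
# Reverse causality: the data-generating process is Y -> X.
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

y = rng.normal(size=n)                  # outcome, generated first
x = 1.5 * y + 0.5 * rng.normal(size=n)  # "exposure", actually caused by Y

slope_y_on_x = np.cov(x, y)[0, 1] / np.var(x)
slope_x_on_y = np.cov(x, y)[0, 1] / np.var(y)
print(f"regressing Y on X: slope ≈ {slope_y_on_x:.2f}")  # looks as if X affected Y
print(f"regressing X on Y: slope ≈ {slope_x_on_y:.2f}")  # the true generating direction
```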
In causal cycles, causal relationships form a stable cycle or reinforcing mechanism:
The emergence of flight is explained by reinforcing causal interactions: feathers, hollow bones, a high metabolic rate, and flight reinforce each other in birds.
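A minimal sketch of such a reinforcing loop, reduced to two mutually reinforcing traits with a saturating response; the update rule and coefficients are assumptions chosen only to show the mechanism settling into a stable, self-sustaining state.

```python
# Causal cycle: two traits push each other up until the loop stabilizes.
import math

flight_ability = 0.1   # weak initial trait values
metabolic_rate = 0.1

for step in range(50):
    new_flight = 1 / (1 + math.exp(-4 * (metabolic_rate - 0.3)))  # saturating response
    new_metab  = 1 / (1 + math.exp(-4 * (flight_ability - 0.3)))
    flight_ability, metabolic_rate = new_flight, new_metab

print(f"equilibrium: flight ≈ {flight_ability:.2f}, metabolism ≈ {metabolic_rate:.2f}")
```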
A full causal system (a real physical or anticipative system) is a system whose outputs and internal states depend on past, current, and future input values. It invalidates the conceptual misassumptions of causal, acausal, and anti-causal systems that stem from a defective linear notion of causation.
Full causation as interaction is formally encoded by full causal undirected cyclic graph networks embracing all known and unknown artificial neural network architectures.
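A minimal sketch of reciprocal, cyclic causation encoded as a simultaneous system rather than a one-way chain: X and Y influence each other at once, so the whole system is solved jointly as z = (I - B)^-1 e. The coefficients and exogenous inputs are illustrative assumptions.

```python
# Mutual causation X <-> Y solved as a simultaneous (cyclic) system.
import numpy as np

B = np.array([[0.0, 0.4],    # X <- 0.4 * Y
              [0.6, 0.0]])   # Y <- 0.6 * X
e = np.array([1.0, 2.0])     # exogenous inputs entering the loop

z = np.linalg.solve(np.eye(2) - B, e)  # equilibrium of the reciprocal influences
print(f"X ≈ {z[0]:.2f}, Y ≈ {z[1]:.2f}")
```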