
Opinion – Marcelo Viana: Inverse probability seeks causes from effects

by Marcelo Viana

The Presbyterian minister and English mathematician Thomas Bayes (1701–1761) was interested in the intellectual problems of his time. In his lifetime he published only “Divine Benevolence”, a religious study arguing that the aim of the divinity is the happiness of its creatures, and “An Introduction to the Doctrine of Fluxions”, a vehement defense of Isaac Newton’s ideas. But it was “An Essay Towards Solving a Problem in the Doctrine of Chances”, published two years after his death, that secured his fame for posterity.

In probability theory, it is usual to seek information about effects from causes. For example, if we know that a box contains B white balls and P black balls, the probability that a randomly drawn ball is black is P/(B+P). The problem in the title of Bayes’ essay is that of inverse probability: seeking information about causes from effects. If we don’t know the numbers B and P, what can we infer about them by drawing some balls from the box? Another example: if a Covid test comes back positive, what is the probability that the patient is actually infected?
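
In the usual notation of conditional probability (the notation is standard, though the column states the idea only in words), the two directions of the Covid example can be contrasted as follows:

```latex
\text{direct:}\quad \Pr(\text{positive test}\mid \text{infected})
\qquad\qquad
\text{inverse:}\quad \Pr(\text{infected}\mid \text{positive test})
```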

Bayes proved a theorem that explains how to update the probability estimate of a random event in light of new information. Let’s apply it to the regional derby between the Alguidares and Bem-Bom football teams. Of the ten matches played previously, Alguidares won three and lost seven. So Bem-Bom looks like the favorite for the next match: a priori, its chance of winning is 7/10.
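
In its usual modern form (the column states it only in words), the theorem reads:

```latex
\Pr(A \mid B) \;=\; \frac{\Pr(B \mid A)\,\Pr(A)}{\Pr(B)},
```

where A is the event whose probability we want to update and B is the new information.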

But we also know that it rained in four of those matches, and that Alguidares won three of them. In light of this information, Bayes’ theorem says that the a posteriori chance that Bem-Bom wins equals the probability of rain given that Bem-Bom wins (1/7), times the a priori chance that Bem-Bom wins (7/10), divided by the probability of rain (4/10). Thus Bem-Bom’s chances drop to just 1/4.
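
Writing the same calculation in symbols:

```latex
\Pr(\text{Bem-Bom wins}\mid \text{rain})
= \frac{\Pr(\text{rain}\mid \text{Bem-Bom wins})\,\Pr(\text{Bem-Bom wins})}{\Pr(\text{rain})}
= \frac{\frac{1}{7}\cdot\frac{7}{10}}{\frac{4}{10}}
= \frac{1}{4}.
```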

Practical applications of inverse probability are everywhere, and Bayes’ theorem is a fundamental tool. It has proved particularly useful in artificial intelligence, in the design of machine learning methods. And it is the basis of the so-called Bayesian interpretation, according to which the probability of an event reflects a subjective degree of belief in its occurrence, which must be updated with each new piece of information.
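
As a concrete illustration of this kind of updating, here is a minimal Python sketch of the Covid-test question raised earlier; the prevalence, sensitivity and false-positive rate are invented for the example, not taken from the column.

```python
def posterior(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' theorem: update the probability of a hypothesis
    after observing one piece of evidence."""
    evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
    return likelihood_if_true * prior / evidence

# Illustrative numbers only (assumptions, not data from the column):
prior_infected = 0.01        # 1% of the population is infected
sensitivity = 0.90           # Pr(positive test | infected)
false_positive_rate = 0.05   # Pr(positive test | not infected)

p = posterior(prior_infected, sensitivity, false_positive_rate)
print(f"Pr(infected | positive test) = {p:.3f}")  # about 0.154

# The football example above, with the same function:
# Pr(rain | Bem-Bom wins) = 1/7, Pr(rain | Bem-Bom loses) = 3/3 = 1
print(posterior(7 / 10, 1 / 7, 1.0))  # 0.25, matching the calculation above
```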

