Machine Learning: Bayes' theorem, a simple explanation with a GeoGebra visualization

I have met the concept of Bayes' theorem and would very much like to visualize it here. As this blog is a studying diary, making visualizations of the ideas is exactly what gives me a broader understanding of them.

Not to mention that keeping notes publicly does wonders for keeping them neat and thoughtful, so let's begin.

Suppose we hear the description of a man: “He is strong, with a loud but assuring voice”.

Who is this man? Let's guess: is he a librarian or a military drill sergeant?



Is it a 50/50 chance that, given this description, the man is either a librarian or a sergeant?

We would more likely guess that he is a sergeant, because the description is far more stereotypical of a sergeant than of a librarian; at least intuitively one can make such a guess. Which is, of course, wrong.

To make a correct guess, one has to take into account the actual ratio of librarians to sergeants in our set of people.

It is important to observe that, even though the description sounds much more like a sergeant, if there is 1 sergeant in our bunch and 99 librarians, then even given the description it is still much more probable that we are talking about a strong and loud librarian than about anyone else.
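As a quick sanity check of that claim, here is a minimal sketch; the two likelihood numbers in it are invented purely for this illustration and are not the ones used later in the post:

```python
# Quick sanity check of the "1 sergeant vs 99 librarians" claim.
# The two likelihoods below are invented purely for this illustration.
p_sergeant = 1 / 100              # prior: 1 sergeant among 100 people
p_librarian = 99 / 100

p_desc_given_sergeant = 0.85      # assumed: most sergeants fit the description
p_desc_given_librarian = 0.10     # assumed: few librarians fit the description

# Expected number of people fitting the description in a group of 100:
sergeants_fitting = 100 * p_sergeant * p_desc_given_sergeant      # ~0.85 people
librarians_fitting = 100 * p_librarian * p_desc_given_librarian   # ~9.9 people

p_sergeant_given_desc = sergeants_fitting / (sergeants_fitting + librarians_fitting)
print(f"P(sergeant | description) = {p_sergeant_given_desc:.2f}")  # ~0.08
```

Even though each individual sergeant is far more likely to fit the description, the 99 librarians produce roughly ten times as many matching people, so a matching guy is most likely a librarian.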

So, let's say that the probability of meeting a sergeant in our bunch is P(C) = 1/5, where C stands for the event “the guy is a sergeant”. In other words, for each sergeant we have four librarians.

[Figure: any 5 guys from our people]

Excellent, we've got our prior P(C); now let's invent some values for the likelihood of sergeants and of librarians having the above-mentioned properties.

Since those values are synthetic, let's simply pick some values for P(x|C) and P(x|¬C), where x is the event that a guy fits the description (strong and loud): P(x|C) is the chance that a sergeant fits it, and P(x|¬C) the chance that a librarian does.

We are naturally interested here in C = sergeant, and we require one more value, P(x) = 0.29, which means that for any random guy we meet, the chance of him fitting the description is 29%. In the GeoGebra app below this value is calculated from the values mentioned above via the law of total probability: P(x) = P(x|C)·P(C) + P(x|¬C)·P(¬C).
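As a sketch of how such a value comes out of the law of total probability, here is a small Python snippet; the two likelihoods in it are assumptions chosen only so that the result lands on 0.29, they are not the values behind the app:

```python
# Law of total probability: P(x) = P(x|C)*P(C) + P(x|not C)*P(not C)
# The two likelihoods below are assumptions picked so that P(x) comes out to 0.29.
p_c = 0.20              # prior: one sergeant for every four librarians
p_x_given_c = 0.85      # assumed chance that a sergeant fits the description
p_x_given_not_c = 0.15  # assumed chance that a librarian fits the description

p_x = p_x_given_c * p_c + p_x_given_not_c * (1 - p_c)
print(p_x)  # 0.29
```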

Now we have everything required; let's cook up the answer to the question: “We've met a guy from our bunch, he is strong and loud. What is the chance that he is an army sergeant?”, or mathematically speaking, what is P(C|x)?

The estimate can be found with Bayes' formula:

P(C|x) = P(x|C) · P(C) / P(x)

Or, in our case, with P(C) = 0.2 and P(x) = 0.29:

P(C|x) = P(x|C) · 0.2 / 0.29

where P(x|C) is whatever likelihood we picked for a sergeant fitting the description.
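To make that concrete, here is a minimal sketch that plugs in the same assumed likelihood as in the snippet above; with a different P(x|C) the answer would of course change:

```python
# Bayes' formula: P(C|x) = P(x|C) * P(C) / P(x)
p_c = 0.20          # prior: one sergeant for every four librarians
p_x_given_c = 0.85  # assumed chance that a sergeant fits the description
p_x = 0.29          # overall chance of fitting the description (total probability)

p_c_given_x = p_x_given_c * p_c / p_x
print(f"P(C|x) = {p_c_given_x:.2f}")  # ~0.59: a strong and loud guy is a sergeant about 59% of the time
```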

The application below tries to visualize the behavior of this event, i.e. meeting C given that x is true.

The application has three handles to operate; they are the dots on the top, left and right of the square.

The top one divides our set into C and not C, the dot on the left chooses the chance of meeting x within C, and the dot on the right chooses the chance of meeting x within the not-C bunch.

The horizontal dotted line is our P(x), the chance of a strong and loud member in our group; it is generated automatically from the handle positions.

And most important, the vertical dotted line, which gives P(C|x), the chance of a loud and strong guy being a sergeant.
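For reference, here is a minimal sketch of how the two dotted lines can be computed from the three handle positions, under my reading of the layout described above (this is not the app's actual code):

```python
def dotted_lines(p_c, p_x_given_c, p_x_given_not_c):
    """Positions of the two dotted lines in the unit square, assuming:
      - the top handle splits the square into a C part of width p_c and a not-C part,
      - the left handle is the chance of x inside C,
      - the right handle is the chance of x inside not C.
    """
    # Horizontal line: overall chance of x (law of total probability).
    p_x = p_x_given_c * p_c + p_x_given_not_c * (1 - p_c)
    # Vertical line: chance that an x-guy comes from C (Bayes' formula).
    p_c_given_x = p_x_given_c * p_c / p_x
    return p_x, p_c_given_x

# Example with the numbers used in this post (likelihoods assumed as above):
print(dotted_lines(0.20, 0.85, 0.15))  # (0.29, ~0.586)
```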