Mind Matters News and Analysis on Natural and Artificial Intelligence

Tag: Empirical Generalized Information (EGI)


What Vehicle Would Bob Buy?

Both empirical generalized information (EGI) and the Gini metric can generate useful information

Unlike in traditional Fisherian hypothesis testing, it is possible to create a model after viewing the data and still quantify how well that model generalizes.


Machine learning: Harnessing the Power of Empirical Generalized Information (EGI)

We can calculate a probability bound from entropy. Entropy also happens to be an upper bound on the binomial distribution

We want our calculation to capture the intuition that if we have high accuracy and a small model, then we have high confidence of generalizing. Intuitively, then, we add the model size to the accuracy term and subtract this quantity from the entropy of having absolutely no information about the problem.
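That intuition can be sketched numerically. The function below is an illustrative, assumed form of an EGI-style quantity (not the article's exact formula): it starts from the n bits of entropy of knowing nothing about n binary labels, then subtracts the entropy consistent with the observed accuracy plus the model's description length in bits.

```python
import math

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) variable, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def egi_bound(n_samples, n_correct, model_bits):
    """Assumed EGI-style sketch: bits of no-information entropy
    minus (accuracy entropy + model description length).
    A large positive value suggests the model's accuracy is
    unlikely to be luck; a chance bound would be 2**(-egi)."""
    accuracy_entropy = n_samples * binary_entropy(n_correct / n_samples)
    return n_samples - (accuracy_entropy + model_bits)

# Hypothetical example: 100 binary test points, 95 classified
# correctly, by a model that takes 20 bits to describe.
egi = egi_bound(100, 95, 20)
```

With these assumed numbers the quantity is large and positive, matching the intuition above: high accuracy from a small model leaves many bits of "surprise" unexplained by chance, whereas a huge model or near-coin-flip accuracy drives the quantity toward zero or below.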
