Mind Matters News and Analysis on Natural and Artificial Intelligence

Tag: Occam's razor


Machine Learning: Using Occam’s Razor to Generalize

A simple thought experiment shows us how

This approach is contrary to Fisher's method, in which we formulate all theories before encountering the data: here, we came up with the model only after we saw the cards we drew.

Read More ›

Occam’s Razor Can Shave Away Data Snooping

The greater an object's information content, the lower its probability.

One technique to avoid data snooping is based on the intersection of information theory and probability: an object's probability is related to its information content, and the greater the information content, the lower the probability. We measure a model's information content as the difference between the negative log probability that the data occurred by chance and the number of bits required to store the model. The negative exponential of that difference is the model's probability of occurring by chance. If the data cannot be compressed, the two quantities are equal, the model carries zero information, and we cannot tell whether the data was generated by chance or not. For a dataset that is incompressible and uninformative, swirl some tea …

Read More ›


Successful Generalization Is a Key to Learning

In machine learning, Solomonoff induction helps us decide how successful a generalization is

In the model of generalization set out in the paper, imperfect models can still score well, but they are discounted in proportion to their error.

Read More ›
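A Solomonoff-style trade-off between model size and error can be illustrated with a toy scoring rule. The exact discount below (prior weight 2^(-description length) multiplied by accuracy) is an assumption for illustration, not the formula from the paper.

```python
def solomonoff_style_score(model_bits: int, errors: int, n: int) -> float:
    """Toy score in the spirit of Solomonoff induction: a model's prior
    weight falls off as 2^(-description length), and imperfect models
    are further discounted by their error rate. The exact discount is
    an illustrative assumption, not the paper's formula."""
    prior = 2.0 ** (-model_bits)          # shorter models get more prior weight
    accuracy = (n - errors) / n           # discount for misfit on n examples
    return prior * accuracy

# A short model with a few errors can outscore a long, perfect one:
short_imperfect = solomonoff_style_score(model_bits=10, errors=5, n=100)
long_perfect = solomonoff_style_score(model_bits=40, errors=0, n=100)
assert short_imperfect > long_perfect
```

This captures the Occam's razor intuition running through all three pieces: simplicity buys so much prior weight that a modest amount of error is often worth tolerating.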