Can simple probabilities outperform deep learning? One musician found that they can.
Haebichan Jung tells us that he built an original pop music-making machine “that could rival deep learning but with simpler solutions.” Deep learning “is a subfield of machine learning concerned with algorithms inspired by the structure and function of the brain called artificial neural networks.” (Jason Brownlee, Machine Learning Mastery)
Jung tells us that he went to considerable trouble to develop deep learning methods for generating machine pop music but in the end…
I made a simple probabilistic model that generates pop music. And with an objective metric, I can say that the model generates music that sounds more like pop than some of the ones made by deep learning techniques. How did I do this? I did this partly by focusing on what I thought was at the heart of pop music: the statistical relationship between the harmony and the melody. Haebichan Jung, “Making Music: When Simple Probabilities Outperform Deep Learning” at Towards Data Science
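The core idea, sampling melody notes from probabilities conditioned on the current chord rather than generating them independently, can be sketched in a few lines. The chords, notes, and probability tables below are purely illustrative stand-ins, not Jung's actual data or model:

```python
import random

# Hypothetical conditional tables: for each chord, the probability of each
# melody note. Capturing this harmony-to-melody dependence is the point.
melody_given_chord = {
    "C": {"C": 0.4, "E": 0.35, "G": 0.25},
    "F": {"F": 0.4, "A": 0.35, "C": 0.25},
    "G": {"G": 0.4, "B": 0.35, "D": 0.25},
}

def sample_note(chord, rng):
    """Pick a melody note with probability conditioned on the chord."""
    notes, weights = zip(*melody_given_chord[chord].items())
    return rng.choices(notes, weights=weights, k=1)[0]

def generate_melody(chord_progression, seed=0):
    """Walk a chord progression, sampling one melody note per chord."""
    rng = random.Random(seed)
    return [sample_note(chord, rng) for chord in chord_progression]

melody = generate_melody(["C", "F", "G", "C"])
print(melody)  # one melody note per chord, drawn from that chord's table
```

A deep learning model that treats melody as an unconditional sequence has to rediscover this dependence from data; the table above simply builds it in.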
Eric Holloway does not find the result surprising. He notes,
Jung explains how his neural networks started with the assumption that harmony and melody are independent in pop music, but then failed to make interesting music. He now realizes that harmony and melody are dependent, and thus created a much better method of generating pop music.
This is an expected outcome, given that computers cannot generate mutual information, the measure of how dependent two variables are on each other.
See also: Can machines really learn?