Bayes, Ockham, and Shannon in machine learning...

Mark S Abeln

This article may be of interest:

 
Thanks for the link. Quite relevant to what is going on where I work. Just giving it a quick look, I suspect the article will give me enough things to talk about that I'll sound a lot more "with it" than I actually am.
 
“What’s Occam’s razor?”

“It’s probably a razor belonging to someone called Occam.”

“Well, that’s simple enough...”
"Simple enough" is correct in the absence of any context. Especially if whoever answers the question has no knowledge whatsoever of any facts about Occam.

The intelligent answer would be, "I don't know, because I've never heard of Occam."

For someone who has never heard of Occam, when forced to guess, "It's probably a razor belonging to someone called Occam" would be the best guess.

Prior information (i.e. facts – in this case historical facts based on Occam's writing) affects the implications of the word "probably".

There's nothing wrong with updating the best guess when new facts become available.
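That updating process is exactly Bayes' rule: a prior belief is revised in proportion to how well the new facts fit the hypothesis. A minimal sketch, with made-up illustrative numbers (the razor question from the joke above, not anything from the linked article):

```python
# Minimal sketch of Bayesian updating. Hypothesis H: "the razor belongs
# to someone called Occam." All probabilities below are illustrative
# assumptions, chosen only to show the mechanics of the update.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | E) via Bayes' rule."""
    # Total probability of the evidence under both hypotheses.
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

prior = 0.5  # no context at all: a coin-flip guess is "simple enough"
# New fact arrives (say, historical writing mentioning Occam), which is
# more likely if H is true than if it is false:
posterior = bayes_update(prior, p_evidence_given_h=0.9,
                         p_evidence_given_not_h=0.3)
print(round(posterior, 3))  # belief in H rises once the facts support it
```

The word "probably" carries different weight before and after the update, which is the point about prior information above.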

--

"The belief that ‘randomness’ is some kind of real property existing in Nature is a form of the mind projection fallacy which says, in effect, ‘I don’t know the detailed causes – therefore – Nature does not know them.’"
E. T. Jaynes, Probability Theory: The Logic of Science
 
In a similar vein, a story my philosophy professor told us:

Our professor got on an airplane, and to his astonishment he found he was seated next to René Descartes. Of course our professor was terribly excited and trying to think of a deep philosophical question he could ask. But before he could, the stewardess came by and asked, "Would you like coffee or tea?" René Descartes said "I think not" and poof! He disappeared.