Uncertainty & Probability Theory: The Logic of Science
Link to all Classes. Jaynes’s book (first part):
Video
Links:
Bitchute (often a day or so behind, for whatever reason)
HOMEWORK: Read Jaynes!
Lecture
We are reading Chapter 3.8 of Jaynes.
Watch this video of an audience-participation lecture featuring a true random event. My first ever job was on Random Lane. True.
We are still using the same information as last time. B = “urn/object/device with M red/success/1 balls/states, and N total”. From which we deduce there are N−M white/failure/0. We already know how to compute Pr(R_1|B) and the like.
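For concreteness, here is a minimal sketch of B as nothing but counts; the numbers M = 3 and N = 10 are made up for illustration, not taken from the lecture.

```python
from fractions import Fraction

# B: all we are given is M red (success/1) states out of N total; N - M are white (failure/0).
M, N = 3, 10  # hypothetical counts, for illustration only

def pr_red_first_draw(M, N):
    """Pr(R_1 | B): chance the first draw/measurement is red, given only the counts."""
    return Fraction(M, N)

print(pr_red_first_draw(M, N))  # 3/10
```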
But let’s add to B this: “We take a ball out and put it back in/take a reading and restore the object or device.” This gives us B’. In old urn language we’d say we’re “sampling with replacement.” Take a ball out, put it back in, and draw out another, etc.
So what is Pr(R_1R_2|B’)? We can use Bayes (the product rule):

Pr(R_1R_2|B’) = Pr(R_1|B’) × Pr(R_2|R_1B’).
Pr(R_1|B’) we already know, since on our first draw/measurement nothing has yet been replaced. But what is Pr(R_2|R_1B’)? Now you might guess the right answer here, but that is not the point, because you can’t always guess. You must read Jaynes on this. There is no point in me repeating what he has said so eloquently. The link to his book is above. Below, I will assume you have read it.
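If you want a purely mechanical check to compare against once you have read Jaynes, here is a minimal Monte Carlo sketch under the usual physical reading of “sampling with replacement” (the counts M = 3, N = 10 are again hypothetical); it is a simulation of the procedure, not Jaynes’s derivation.

```python
import random

M, N = 3, 10          # hypothetical toy counts, same as above
TRIALS = 100_000

def both_red_with_replacement():
    """One run under B': draw a ball, note it, replace it, draw again."""
    urn = ["red"] * M + ["white"] * (N - M)
    first = random.choice(urn)    # the ball goes back, so the urn is unchanged
    second = random.choice(urn)
    return first == "red" and second == "red"

hits = sum(both_red_with_replacement() for _ in range(TRIALS))
print(hits / TRIALS)  # an estimate of Pr(R_1 R_2 | B') under this physical model
```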
What does random mean? Only one thing: unpredictable on the information you assume. Here, on the B or B’ or whatever you choose. By unpredictable I mean a probability in (0,1), and not in {0,1}. Which is to say, any probability that is not 0 or 1. Not a local or necessary falsity or truth. This applies to the proposition of interest, say A, given the information you assume, say B. Thus Pr(A|B) ∈ (0,1).
Some use the word randomization. What could that mean? Making the proposition of interest unpredictable. That and nothing more.
It may be that for some C, Pr(A|C) = 1 (or 0), but Pr(A|B) is somewhere in (0,1). Indeed, some C always exists such that the proposition of interest is known; this is when we know the full cause and conditions of A. So another way to say random is unknown cause.
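As an illustration, with an invented arrangement (nothing in B tells us this): if C specifies the exact order of the balls and which position the hand will take, the draw is no longer random; on B alone, which carries only the counts, it stays at M/N.

```python
from fractions import Fraction

# C: full knowledge of a (hypothetical) arrangement, and that position 0 is the ball drawn.
arrangement = ["white", "red", "red", "white", "white",
               "white", "red", "white", "white", "white"]

pr_given_C = Fraction(int(arrangement[0] == "red"), 1)
print(pr_given_C)  # 0 or 1: with the full cause and conditions known, nothing is random

# B: only the counts M out of N; the order (the efficient cause) is unknown.
M, N = arrangement.count("red"), len(arrangement)
print(Fraction(M, N))  # 3/10
```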
Take an urn as B says. What order are the balls inside? You do not know. There is no information about order in B. None. Zero. Zilch. There is some causal information about A in B, like the material cause, i.e., the number and colors of the balls. There is no information about the efficient cause. That means randomizing, say by shaking the urn, provides no additional or new information to B.

But let’s think about what you might do when picking a ball out. You do so, note it, and put it back in. If you’re very careful, you could place it so it’s on top, for instance, so that when you go back in to grab another you might well grab it again. But that is not B. That is B + “Careful placement”. Pr(R_1|B) remains forevermore M/N.
If you are making your first draw, you only have B. You still do not know the order or the cause. There is no need for “randomization”, unless you have outside information about the order, which you would then add to B.
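To see why shaking buys nothing on B alone, here is a small enumeration sketch (toy counts M = 2, N = 4, chosen only to keep the listing short): over every distinct ordering of the balls consistent with B, the fraction in which the first ball out is red is exactly M/N, so ignorance of the order already gives M/N and re-ordering adds no information.

```python
from fractions import Fraction
from itertools import permutations

# A toy urn consistent with B: M red, N - M white (hypothetical counts).
M, N = 2, 4
balls = ["red"] * M + ["white"] * (N - M)

# Every distinct order the urn could be in; B says nothing about which one actually holds.
orders = set(permutations(balls))
red_first = sum(order[0] == "red" for order in orders)

print(Fraction(red_first, len(orders)))  # 1/2 = M/N: shuffling the order changes nothing
```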
This is only our first time discussing “random”.