Uncertainty & Probability Theory: The Logic of Science
Permanent class page (Link to all Classes): https://www.wmbriggs.com/class/
Jaynes’s book (first part): https://bayes.wustl.edu/etj/prob/book.pdf
Video
Links:
Bitchute (often a day or so behind, for whatever reason)
HOMEWORK: READ JAYNES CHAPTER 4.4 and especially 4.5
Lecture
Curse of the Continuum!
Like last week, everything I did on the board is straight out of Jaynes, section 4.5. So read it and learn. There’s a bit more math than last time; a calculus background is assumed. If you can remember what an integral and derivative are, you’ll be fine.
I want to emphasize, as strongly as I can, that whether or not we live in a world in which the continuum is real, i.e. perhaps space is infinitely divisible (to the level of the continuum, and not higher infinities?), our knowledge when measurement is involved is discrete and finite. We do know something of infinity, in, for example, universals. But when we’re talking of devices, like say microscopes or widget-producing machines, we are dealing with the discrete and finite. Always.
The widget-making machine in the example will not last forever. No machine will. That itself is a judgement, and knowledge of a kind of infinity. That knowledge is an extreme probability (i.e. equal to 1). It is a deduction. Given that deduction, we know that there will be only N widgets, even if N is “large”. As I said in the video, “large” is always purely relative. It depends on the decisions made. “Large” for you might be “small” for me. There is no singular large. This is just another way of stating that nothing has a probability.
Because N is always infinitely far from infinity, any fraction f of bad widgets will always be a ratio of two integers. There will only be a finite set of possible values of f, from 0 to 1, namely the set {0/N, 1/N, 2/N, …, (N-1)/N, N/N}.
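To make this concrete, here is a minimal sketch (my own, with a hypothetical N; nothing here comes from the lecture) enumerating every fraction a finite machine could ever produce:

```python
# Minimal sketch (hypothetical N): a machine that will only ever make N widgets
# can only ever yield N + 1 possible fractions of bad widgets.
N = 20  # lifetime output of the machine; "large" or "small" is relative

possible_f = [k / N for k in range(N + 1)]  # 0/N, 1/N, ..., N/N

print(len(possible_f))  # 21 possible values of f, a finite set
print(possible_f[:5])   # [0.0, 0.05, 0.1, 0.15, 0.2]
```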
But, it must be admitted, discrete probability can be a pain in the ass to calculate. Combinatorics, believe it or not, is harder than calculus. At least I think so. Thus, if we can create reasonable approximations to our “large” problems using the continuum, then God bless us. We will be fine. As long as we keep in mind that they are nothing but approximations.
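As an illustration of that trade, here is a rough sketch (my own numbers, not Jaynes’s) putting an exact discrete calculation next to its continuum stand-in, the normal approximation to the binomial:

```python
# Rough sketch (hypothetical numbers): exact binomial sum vs. normal approximation.
from math import comb, erf, sqrt

n, p = 1000, 0.02   # suppose 1000 widgets inspected, 2% assumed bad
k = 25              # chance of seeing at most 25 bad ones?

# Exact and discrete: a finite sum of combinatorial terms.
exact = sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

# Continuum approximation: normal CDF with a continuity correction.
mu, sigma = n * p, sqrt(n * p * (1 - p))
approx = 0.5 * (1 + erf((k + 0.5 - mu) / (sigma * sqrt(2))))

print(round(exact, 4), round(approx, 4))  # close, but the second is only an approximation
```

The two results land close to each other, which is all an approximation ever promises.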
Alas, alas, it is not always kept in mind. People forget what they were about, and suddenly things like f take on a life of their own. They “have” probabilities, and therefore have “true” probabilities.
All false. As in not true.
We’ll meet the consequences of this soon.
William,
Thank you for such a compelling lecture! Your perspective on the probability rabbit hole is absolutely fascinating. Honestly, I’ve never encountered this take on probability before. I’d wager—just a guess—that very few universities are teaching your book Uncertainty or Jaynes’ Probability in their courses. They’re just too far off the beaten path, and that’s a real shame.
When practicing statistics, it often feels like we’re missing the mark on the underlying theory, no matter how rigorously we apply it. It always feels off somehow. But your lectures, your book, and Jaynes’ work are really filling that gap! I can’t thank you enough. Turns out I’m not crazy after all.
In Brazil since 2016, the number and percentage of people who died on their birthday are as follows:

Year    Died NOT on birthday    Died ON birthday    Percentage on birthday
2016    1265596                 3840                0.30%
2017    1268945                 3898                0.31%
2018    1273651                 3963                0.31%
2019    1307487                 4101                0.31%
2020    1513788                 4583                0.30%
2021    1723603                 5309                0.31%
2022    1443127                 4247                0.29%
2023    1421034                 4381                0.31%
Shouldn’t the percentage be 0.27%?
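For what it’s worth, a quick sketch of the arithmetic (assuming the 0.27% figure comes from 1/365.25):

```python
# Quick check of the comment's arithmetic (assumes 0.27% means 1/365.25).
rows = [  # (year, died NOT on birthday, died ON birthday)
    (2016, 1265596, 3840), (2017, 1268945, 3898), (2018, 1273651, 3963),
    (2019, 1307487, 4101), (2020, 1513788, 4583), (2021, 1723603, 5309),
    (2022, 1443127, 4247), (2023, 1421034, 4381),
]

print(f"naive expectation 1/365.25 = {1 / 365.25:.2%}")
for year, not_bday, on_bday in rows:
    share = on_bday / (not_bday + on_bday)  # fraction of all deaths falling on a birthday
    print(year, f"{share:.2%}")
```

The printed shares reproduce the table’s last column.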