14 Comments

All good, prof. Thanks for all you do.

So isn't this interesting: Borel's law of large numbers. By "large" he means infinite. He then claims

P{ lim (n → ∞) A/n = p } = 1. The proof is lame. It amounts to Sn/n = A/n going to p almost surely. Which is what I said! Looks like zero to me. Or unattainable, if you like.
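Whatever one makes of the infinite limit, the finite-n behavior is easy to watch. A throwaway simulation sketch (the p = 0.3 and the variable names are my own choices, not Borel's):

```python
import random

random.seed(42)
p = 0.3          # true success probability of each Bernoulli trial
n = 100_000      # finite n only; the theorem itself is about n -> infinity
successes = 0
for _ in range(n):
    if random.random() < p:
        successes += 1

freq = successes / n   # S_n / n, the relative frequency
print(freq)            # hovers near 0.3, but is never exactly p at any finite n
```

The point of contention in the thread survives the demo: at every finite n the frequency merely approximates p; the equality only holds in the limit.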

After brushing up on some old theory, I am even more convinced the frequentist model of probability is completely unattainable. It needs to be changed.

"Mathematician Augustus De Morgan wrote on June 23, 1866:[5] "The first experiment already illustrates a truth of the theory, well confirmed by practice, whatever can happen will happen if we make trials enough." In later publications "whatever can happen will happen" occasionally is termed "Murphy's law""...

In EE, "given time, what is the probability that this power transformer will fail?" N=1

Solution: Always build with redundancy, by station, line, feeder, everything. (If it's an electronic device critical to operation, use one each from two different manufacturers.)

Therefore we have these two axioms:

"whatever can happen will happen" and "don't put your eggs in one basket".

QED

RATS RATS RATS RATS RATS. I ACCIDENTALLY PUT THE WRONG DATE ON THIS AND IT’S RUNNING TODAY INSTEAD OF MONDAY THE 16TH. I SCREWED UP. BUT IT’S TOO LATE. IT’S ALREADY EMAILED SO I HAVE TO LEAVE IT. NONE OF THE LINKS TO OTHER VIDEO SITES WILL WORK UNTIL MONDAY. MY APOLOGIES.

But we loved it anyway. Thanks for your efforts!

Your enemies continue to nip at your heels.

So, William, let's say I have a standard deck of cards; that's my population. There are 52 cards in total. Now I want to know the probability of drawing an Ace. Are you seriously telling me I need to make an infinite number of draws just to say that probability is 1/13? And what would 'B' be? There could be an infinite number of conditions as well. If we adhered to this idea rigidly, we would not be able to do anything statistical at all. Ugh! Too rigid, William, too unattainable.
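For what it's worth, the finite sampling picture settles down quickly. A quick sketch (draws with replacement, names mine) comparing the observed frequency against the classical 4/52 = 1/13:

```python
import random

random.seed(0)
deck = ["ace"] * 4 + ["other"] * 48   # 52-card deck with 4 aces
draws = 200_000                        # large but emphatically finite
aces = sum(1 for _ in range(draws) if random.choice(deck) == "ace")

freq = aces / draws
print(freq, 4 / 52)   # observed relative frequency vs 1/13 ≈ 0.0769
```

No infinite sequence of draws is needed to get the classical answer; the limit definition and the counting definition are doing different jobs here.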

Only if you become...A FREQUENTIST!

Of course, there's always convergence in probability and convergence in distribution we could throw in to really confuse everything. My gripe is that all these definitions flirt with infinity. I think that's a mistake. I've forgotten a lot of this stuff.

Come on, William! That's not an acceptable rebuke! Don’t you see? This definition inevitably yields zero—or at best, an indeterminate form. It has to! So what’s the point of defining probability like that? It’s completely useless.

On a related matter, the measure of the conditional outcome of a future event is not necessarily a probability, though it is assumed to be one by the Bayesians. See the proof in the peer-reviewed article entitled "Unit Measure in Pattern Recognition" by Ronald Christensen and Thomas Reichert. Thus, neither the Bayesians nor the frequentists are correct in their assumption. An application of information theory called "entropy minimax" saves the day but is seldom used by model builders. I am among the exceptions that do use it. For details on how to do so, see the seven-volume treatise on this topic entitled the "Entropy Minimax Sourcebook", written by the theoretical physicist Ronald Christensen.

Terry Oldberg

"...there is an infinite subsequence converging in relative frequency to any value you like". So, we can say with confidence that the probability of any proposition is 50%.
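The subsequence trick in the quote can be demonstrated directly. A sketch with a greedy selection rule of my own devising (any sequence containing infinitely many 0s and 1s admits the same steering; here the source is a fair coin and the target frequency is 0.9):

```python
import random

random.seed(1)
seq = [random.randint(0, 1) for _ in range(1_000_000)]  # fair-coin outcomes

def steered_frequency(seq, target):
    """Greedily pick a subsequence whose running relative
    frequency of 1s is driven toward `target`."""
    ones = picked = 0
    for x in seq:
        current = ones / picked if picked else 0.0
        # take x only when it moves the running frequency toward target
        if (x == 1 and current <= target) or (x == 0 and current >= target):
            picked += 1
            ones += x
    return ones / picked

f = steered_frequency(seq, 0.9)
print(f)   # close to 0.9, despite the underlying fair coin
```

Which is exactly the complaint: the limiting relative frequency depends on which subsequence you look at, so by itself it pins down no unique probability.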

And nobody could prove you wrong, if you were a frequentist.
