All good, prof. Thanks for all you do.
So isn't this interesting. Borel's law of large numbers. By "large" he means infinite. He then claims
$P\{\lim_{n \to \infty} A_n/n = p\} = 1$, where $A_n$ counts the successes in $n$ trials. The proof is lame. It amounts to $S_n/n = A_n/n \to p$ almost surely. So, just what I say! Looks like zero to me. Or, unattainable if you like.
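For concreteness, here is a minimal simulation sketch of what the theorem asserts, in Python, with an assumed success probability p = 1/3 (the value and the seed are arbitrary). A finite run can show A_n/n drifting toward p, but of course it can never exhibit the infinite limit itself, which is the complaint.

```python
import random

# A minimal sketch of Borel's strong law: flip a Bernoulli(p) coin and
# track the relative frequency A_n / n, which the theorem says converges
# to p almost surely as n -> infinity. p and the seed are assumptions.
p = 1 / 3
random.seed(42)

successes = 0                      # A_n: successes among the first n trials
checkpoints = {10, 100, 1_000, 10_000, 100_000}
for n in range(1, 100_001):
    successes += random.random() < p
    if n in checkpoints:
        print(f"n = {n:>6}:  A_n/n = {successes / n:.4f}   (p = {p:.4f})")
```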
After brushing up on some old theory I am even more convinced the frequentist model of probability is completely unattainable. It needs to be changed.
"Mathematician Augustus De Morgan wrote on June 23, 1866:[5] "The first experiment already illustrates a truth of the theory, well confirmed by practice, what-ever can happen will happen if we make trials enough." In later publications "whatever can happen will happen" occasionally is termed "Murphy's law""...
In EE, "given time, what is the probability that this power transformer will fail?" N=1
Solution: Always build with redundancy, by station, line, feeder, everything. (If it's an electronic device critical to operation, use one each from two different manufacturers.)
Therefore we have these two axioms:
"whatever can happen will happen" and "don't put your eggs in one basket".
QED
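Both axioms can be given rough numbers. Under an assumed constant-hazard (exponential) failure model, a single unit has failed by time t with probability 1 - e^(-λt), which climbs to 1 as t grows (whatever can happen will happen); two independent redundant units have both failed with probability (1 - e^(-λt))^2 (the eggs in separate baskets). A sketch, with the failure rate λ purely illustrative:

```python
import math

# Rough sketch of the two axioms under an assumed exponential failure model.
# lam (failures per year) is an illustrative assumption, not field data.
lam = 0.05  # assumed failure rate: one failure per 20 years on average

for years in [1, 10, 50, 100, 500]:
    p_single = 1 - math.exp(-lam * years)   # P(one unit fails by t): -> 1
    p_redundant = p_single ** 2             # P(both independent units fail by t)
    print(f"t = {years:>3} yr: single unit {p_single:.3f}, "
          f"redundant pair {p_redundant:.3f}")
```

The independence assumption is doing real work in that second column, which is exactly the argument for buying the two units from different manufacturers.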
RATS RATS RATS RATS RATS. I ACCIDENTALLY PUT THE WRONG DATE ON THIS AND IT’S RUNNING TODAY INSTEAD OF MONDAY THE 16TH. I SCREWED UP. BUT IT’S TOO LATE. IT’S ALREADY EMAILED SO I HAVE TO LEAVE IT. NONE OF THE LINKS TO OTHER VIDEO SITES WILL WORK UNTIL MONDAY. MY APOLOGIES.
But we loved it anyway. Thanks for your efforts!
Your enemies continue to nip at your heels.
So, William, let's say I have a standard deck of cards — that's my population. There are 52 cards in total. Now, I want to know the probability of drawing an Ace. Are you seriously telling me I need to make an infinite number of draws just to say that probability is 1/13? And what would 'B' be? There could be an infinite number of conditions as well. If we were to adhere to this idea rigidly we would not be able to do anything statistical at all. Ugh! Too rigid, William, too unattainable.
Only if you become...A FREQUENTIST!
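For what it's worth, the deck example is easy to place side by side with the frequentist recipe: the enumerative answer 4/52 = 1/13 ≈ 0.0769 is available with zero draws, while relative frequency can only estimate it. A small Python sketch (draw counts chosen arbitrarily):

```python
import random

# The enumerative answer needs no draws at all: 4 aces among 52 cards.
p_enumerative = 4 / 52
print(f"enumerative: {p_enumerative:.4f}")   # 0.0769...

# The frequentist answer is a limit of relative frequencies; any finite
# number of draws (with replacement) only approximates it.
random.seed(7)
deck = ["A"] * 4 + ["other"] * 48
for n in [100, 10_000, 1_000_000]:
    aces = sum(random.choice(deck) == "A" for _ in range(n))
    print(f"{n:>9} draws: {aces / n:.4f}")
```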
Of course there's always convergence in probability and convergence in distribution we could throw in to really confuse everything. My gripe is all these definitions flirt with infinity. I think that's a mistake. I forgot a lot of this stuff.
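For reference, here are the standard definitions being invoked, every one of which leans on a limit in n, which is exactly the gripe:

```latex
% Standard modes of convergence for random variables X_1, X_2, \dots and X.
% Convergence in probability:
X_n \xrightarrow{P} X \iff \forall \varepsilon > 0:\
    \lim_{n \to \infty} P\{|X_n - X| > \varepsilon\} = 0
% Convergence in distribution (F denotes the CDF):
X_n \xrightarrow{d} X \iff \lim_{n \to \infty} F_{X_n}(x) = F_X(x)
    \text{ at every continuity point } x \text{ of } F_X
% Almost-sure convergence (the mode in Borel's theorem) implies both:
X_n \xrightarrow{\text{a.s.}} X \iff P\{\lim_{n \to \infty} X_n = X\} = 1
```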
Come on, William! That's not an acceptable rebuke! Don’t you see? This definition inevitably yields zero—or at best, an indeterminate form. It has to! So what’s the point of defining probability like that? It’s completely useless.
On a related matter, the measure of the conditional outcome of an event of the future is not necessarily a probability, though it is assumed to be a probability by the Bayesians. See proof of same in the peer-reviewed article entitled "Unit Measure in Pattern Recognition" by Ronald Christensen and Thomas Reichert. Thus, neither the Bayesians nor the Frequentists are correct in their assumption. An application of information theory called "entropy minimax" saves the day but is seldom used by model builders. I am amongst the exceptions that do use it. For details on how to do so, see the seven-volume treatise on this topic entitled the "Entropy Minimax Sourcebook", whose author is the theoretical physicist Ronald Christensen.
Terry Oldberg
"...there is an infinite subsequence converging in relative frequency to any value you like". So, we can say with confidence that the probability of any proposition is 50%.
And nobody could prove you wrong, if you were a frequentist.
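The quoted subsequence point can be made constructive: from any binary sequence with infinitely many 0s and 1s, a greedy selection rule yields a subsequence whose relative frequency of 1s tends to any target in [0, 1]. A toy Python sketch, with a target of 0.9 chosen arbitrarily and an alternating source sequence whose own limiting frequency is 1/2:

```python
# Toy sketch of the subsequence trick behind the quote: from any binary
# sequence containing infinitely many 0s and 1s, greedily select a
# subsequence whose relative frequency of 1s tends to a chosen target.

def select_subsequence(bits, target, k):
    """Greedily pick k elements of `bits`, steering the running mean to `target`."""
    picked = ones = 0
    freq = 0.0
    for b in bits:
        # Take a 1 while the running frequency is at or below the target,
        # take a 0 while it is above, and skip everything else.
        if (b == 1 and freq <= target) or (b == 0 and freq > target):
            picked += 1
            ones += b
            freq = ones / picked
            if picked == k:
                break
    return freq

# The source alternates 0, 1, 0, 1, ... so its own limiting frequency is
# 1/2 -- yet a subsequence of it converges to 0.9.
alternating = (i % 2 for i in range(1_000_000))
print(select_subsequence(alternating, target=0.9, k=1_000))  # ~0.9
```

This is the classic objection to limiting relative frequency as a definition, and it is why von Mises had to restrict attention to subsequences chosen by admissible place-selection rules, which is where the trouble starts.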