#11
SOPHIE FOREVER
Join Date: Aug 2011
Location: East of the Southern North American West
Posts: 35,541
Quote:
If a set of numbers only goes as high as 1 (for example, a set of five numbers being {1,1,1,1,1}), the number 1 will be 100 percent of the numbers in the set. If that set goes as high as 2 (e.g. {1,1,1,2,2,2}), the number 1 will be 50 percent of the numbers in the set. If that set goes up to 9, then the number 1 will be a smaller piece of that pie. Once a set reaches double digits, the distance from 10 to 20 is 100 percent of what it took to get from 1 to 10, so the process repeats and the likelihood of 1 being the leading digit only falls off again as the highest number in the set approaches 100. The same thing happens with hundreds, thousands, etc., and blending all of that into an average makes 1 the overall most likely leading digit.

Probability doesn't work like that, because Benford's law assumes linearity and requires a framework based on how our base-10 scale operates. So Benford's law looks at the way we construct numbers, while probability looks at how the chips fall and the likelihood of them falling that way. It's not a useless concept, but I don't think it's useful in all applications, because number systems are a language and language has flaws (quirks might be a nicer word for it). It's probably more useful for things like coding, where they're connecting a binary system to a base-10 system, I'm probably incorrectly assuming. Right?
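If it helps to see that back-and-forth concretely, here's a minimal sketch (Python is just my choice for the example, and the helper names `leading_digit` and `share_leading_one` are made up for illustration). It counts how often 1 is the leading digit among the integers from 1 up to various cutoffs and compares that share with Benford's prediction of log10(1 + 1/1) ≈ 30.1 percent:

Code:

    import math

    def leading_digit(n: int) -> int:
        """Return the most significant decimal digit of n."""
        while n >= 10:
            n //= 10
        return n

    def share_leading_one(upper: int) -> float:
        """Fraction of the integers 1..upper whose leading digit is 1."""
        count = sum(1 for n in range(1, upper + 1) if leading_digit(n) == 1)
        return count / upper

    # Benford's law: P(leading digit = 1) = log10(1 + 1/1) ~= 0.301
    benford_one = math.log10(1 + 1 / 1)

    for upper in (1, 2, 9, 19, 99, 199, 999, 1999, 9999):
        print(f"1..{upper:>5}: share of leading 1s = {share_leading_one(upper):.3f} "
              f"(Benford predicts {benford_one:.3f})")

The printed share swings exactly the way the quote describes: it's 100 percent at a cutoff of 1, 50 percent at 2, drops to about 11 percent by 9 or 99 or 999, and climbs back above 50 percent just after each new digit is added (19, 199, 1999). Benford's 30.1 percent is roughly what you get when you average over all of those scales instead of stopping at one cutoff.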
__________________
Studies show that when a given norm is changed in the face of the unchanging, the remaining contradictions will parallel the truth.