momothefiddler wrote:If going from 1 person in agreement to 0 people in agreement makes you X amount less confident in the truth of a statement, then going from 0 people in agreement to 1 person in agreement should make you X amount more confident in the truth of that statement, or your belief becomes decoupled from the actual situation. It's a smaller effect, but the same seems to apply from 2->1 and 1->2, and so on.
I'm certainly not saying this is any sort of certainty either direction. I'm not arguing that it should be your only source of confidence in such a belief, and you should be more confident that a cubic centimeter of lead outweighs a cubic centimeter of feathers than that a pound of lead outweighs a pound of feathers, even if the same number of people agree with you, but if 500 people agree with your claim and that doesn't make you more confident, and yet at the same time if those 500 disagree, that makes you less confident, I don't see how that leads to sensible results.
There are three problems with this.
First, the change from 1 to 2 is not the same move as the change from 0 to 1, only smaller. It is not that more people agreeing with you makes you more likely to be right, or fewer the opposite; often, whether a large or small number of people agree with you is irrelevant. Unbiased, qualified expert consensus can matter, depending on the question, but agreement from people in general often doesn't. The difference between 0 and 1 is that at zero, the evidence is consistent with you actually speaking gibberish rather than any language at all, while at one (or 100) it is clear that you are at least speaking a language and forming coherent sentences.
The second problem is that you again framed it as 500 people agreeing or 500 people disagreeing. That misses, yet again, that we are not talking about polling 500 people. The issue is that there are literally billions of people who are genetically and environmentally primed to take sides on issues. If you are talking about ancient history and you say Atlantis existed, people will side with you (not unbiased, qualified experts, which is partly why they matter and laymen don't). If you say that the ancient Atlanteans had super-advanced tech far beyond anything we now have, people will agree with you. If you say that they built the pyramids with that tech, people will agree with you. You can get people to agree with almost any crazy damn thing you can think of. So how fucking crazy would an idea have to be before absolutely no one agreed with you?
The third problem is that you are still not properly applying Bayesian priors. "If going from 1 person in agreement to 0 people in agreement makes you X amount less confident in the truth of a statement, then going from 0 people in agreement to 1 person in agreement should make you X amount more confident in the truth of that statement,"
Yes, but that doesn't mean that 0 people is evidence of anything. You have to pick an initial prior. No matter where that prior is, it has to be in one place, not two. Using the Pizza problem:
1 out of every 5 nights you eat pizza, so our prior is .2. If it turns out, based on previous analysis, that every single person throws out the pizza box the same night they eat it, every single time, then if we find a pizza box, the chance that you ate pizza that night is nearly 100% (I haven't actually stated the probability of false positives, like the trash collectors not coming, but let us assume it is negligible). If, on the other hand, people throw out the box the night after they eat, every single time, then seeing a pizza box tells us you ate pizza not this night but the night before. So seeing a pizza box would not be evidence in either direction with regard to tonight.
Now, if you plug these numbers into Bayes' theorem, you get basically those numbers (you need additional information, but I am assuming some factors are negligible to avoid complication).
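To make that concrete, here is a small sketch of the two pure cases plugged into Bayes' theorem. The prior of .2 is from the example above; the independence of nights and the absence of false positives are my simplifying assumptions, matching the ones waved away in the text.

```python
# Bayes' theorem for the pizza-box example: P(you ate pizza tonight | box in trash).
# Assumptions (mine, for illustration): prior of 0.2 (1 night in 5),
# nights independent, no false positives (trash always collected on schedule).

def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | E) via Bayes' theorem."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

PRIOR = 0.2  # you eat pizza 1 night in 5

# Case 1: everyone always throws the box out the same night they eat.
# A box appears tonight if and only if you ate tonight.
same_night = posterior(PRIOR, p_evidence_given_h=1.0, p_evidence_given_not_h=0.0)
print(same_night)  # 1.0 -- the box is (nearly) conclusive evidence

# Case 2: everyone always throws the box out the night AFTER they eat.
# Whether a box shows up tonight depends only on last night, which is
# independent of tonight, so the likelihood is 0.2 either way.
next_night = posterior(PRIOR, p_evidence_given_h=0.2, p_evidence_given_not_h=0.2)
print(next_night)  # 0.2 -- same as the prior, so the box tells you nothing about tonight
```

In case 1 the box moves you from .2 to nearly 1; in case 2 it leaves you exactly at your prior, which is what "not any kind of evidence" means in Bayesian terms.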
However, what if people do not always act 100% the same? If there is a 50% chance that people throw out the pizza box the night they eat and a 50% chance they throw it out the next night, then the prior is still .2. But when you see a pizza box, it is not 100% evidence of anything about tonight, because it could be evidence about last night. Likewise, it is not 100% evidence about last night, because it could have been tonight. So by varying that number, you can find the specific number that forces exactly the following situation:
You start with a prior of .2.
If you see a pizza box, your new number based on this evidence is .4. Since it is now more likely, you say that the pizza box is evidence for P (the proposition that you ate pizza tonight).
If you don't see a pizza box, your new number based on this evidence is .2. Since this number is the same as your prior, the lack of box is not evidence of anything.
It is true that the lack of a pizza box makes it .2 less likely that you ate pizza than if the box were there, but it changes the prior you already had by 0. So it is not evidence of anything, because you are in the same place you were before you looked in the trash.
Likewise, our experience tells us that positions literally no one supports are shitty positions with a low probability of being true, but that when someone, somewhere, agrees with us, that probability suddenly increases a lot. It does not follow, however, that it increases to above .5, which would be more probable than not. Further, you can define your prior to be .5 before anyone has had a chance to weigh in. Then, when people get a chance to weigh in and no one agrees with your position, you have gone from .5 to nearly 0. But when you get some people to agree with you, you do not go above .5, because some people agreeing does not in any way push the .5 prior upward.