# The Paradox of Consistency - Part One

**Background**: The government is testing citizens for telepathic potential.

**Procedure**: Each citizen is brought into a room containing an experimenter, a face-down deck of 100 Zener cards (5 possible symbols), and 5 buttons, each marked with one of the Zener symbols. The experimenter draws and looks at a card without showing it to the citizen, and the citizen tries to press the matching button; this repeats until the entire deck is depleted.

**Question**: At what threshold of correct guesses should the government consider it extremely likely that the citizen has telepathic potential (less than a 1% chance of reaching that threshold by random guessing)? Feel free to approximate.

**Assumptions**: The deck is perfectly random and symbols may repeat without limit, so each card is independently one of the 5 symbols with equal probability. No cheating whatsoever occurs. No information leaks (except through telepathy).

**Hint**: You only need to try a few values in [this tool](emathhelp.net/calculators/prob) to find the answer. The pre-loaded values show there is a 44.05384% chance of guessing more than 20 cards correctly without telepathic ability.
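The same numbers can be checked offline. The snippet below is my own sketch, not part of the original puzzle: it models a session as a Binomial(100, 1/5) count of correct guesses, reproduces the 44% figure from the hint, and scans for the smallest score whose upper-tail probability drops below 1% (it assumes SciPy is available).

```python
# Minimal sketch: correct guesses ~ Binomial(n=100, p=1/5).
from scipy.stats import binom

n, p = 100, 1 / 5            # 100 cards, 5 equally likely symbols
dist = binom(n, p)

# Reproduce the hint: probability of more than 20 correct by pure guessing.
print("P(X > 20) =", dist.sf(20))            # sf(k) = P(X > k)

# Find the smallest k with P(X >= k) < 0.01.
for k in range(n + 1):
    if dist.sf(k - 1) < 0.01:                 # sf(k - 1) = P(X >= k)
        print("threshold:", k, "correct, tail prob", dist.sf(k - 1))
        break
```

With mean 20 and standard deviation √(100 · 0.2 · 0.8) = 4, the cutoff lands in the high twenties to low thirties.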


@category_mirrory The general rule of thumb is that anything more than 2 standard deviations from the mean is considered significant.

I can calculate that specific number for you if you'd like.

It is interesting to note that getting an unusually low number correct would be just as significant, and just as deserving of further investigation, as getting an unusually high number. If someone got 0 correct, that might strongly indicate the person is psychic too.
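To put numbers on both remarks (again my own sketch, assuming the Binomial(100, 1/5) model above): the mean score is 20 with standard deviation 4, so the two-standard-deviation rule flags scores well outside roughly 12 to 28, and a score of exactly 0 is astronomically unlikely by chance.

```python
# Two-sided view of the test (assumes the Binomial(100, 1/5) model above).
from scipy.stats import binom

n, p = 100, 1 / 5
dist = binom(n, p)
mean, sd = n * p, (n * p * (1 - p)) ** 0.5    # 20 and 4

# "More than 2 standard deviations" rule of thumb, in either direction.
print("2-sigma band:", mean - 2 * sd, "to", mean + 2 * sd)

# How suspicious is a score of zero correct?
print("P(X = 0) =", dist.pmf(0))              # (4/5)**100, about 2e-10
```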

# The Paradox of Consistency - Part Two

> If someone got 0 correct that might strongly indicate the person is psychic too.

That is the correct answer, and thank you for hosting qoto. The scenario is from a book in which a telepathic child underwent such a test and tried to hide their ability, and was caught precisely because they gave no correct answers (something a non-telepath could essentially never do by chance).

[Too Good To Be True: when Bayes transforms abundant success to abject failure](youtube.com/watch?v=Uz6xUjJHTI) - 20 minutes at 2x speed
* If too many judges agree
* If measurements of a noisy signal are too consistent
* If you find the same DNA in too many places
* If there are too many fines belonging to the same ID
* If your chem-trace results show too many positives
* If too many witnesses agree (a toy Bayes calculation after this list sketches why)
* If you experience a single gamma-ray bit error
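The witness case is the easiest one to reproduce. The toy model below is my own sketch of the video's argument, with made-up parameters: each witness independently identifies a guilty suspect 90% of the time and an innocent one 10% of the time, but there is a small (1%) chance of a systemic failure (say, a biased lineup) that produces unanimous agreement regardless of guilt. As the number of unanimously agreeing witnesses grows, unanimity increasingly points to the failure mode rather than to guilt.

```python
# Toy "too many witnesses agree" model (my own parameters, not from the video).
prior_guilt = 0.5    # prior probability the suspect is guilty
bias = 0.01          # chance the procedure is broken and everyone agrees anyway
acc = 0.9            # P(witness IDs suspect | guilty, no bias)
false_id = 0.1       # P(witness IDs suspect | innocent, no bias)

for n in (1, 3, 5, 10, 30, 50):
    # Likelihood of n unanimous identifications under each hypothesis.
    like_guilty = (1 - bias) * acc ** n + bias
    like_innocent = (1 - bias) * false_id ** n + bias
    posterior = (like_guilty * prior_guilt) / (
        like_guilty * prior_guilt + like_innocent * (1 - prior_guilt)
    )
    print(f"{n:2d} unanimous witnesses -> P(guilty) = {posterior:.3f}")
```

With these made-up numbers the posterior peaks at a handful of witnesses and then declines back toward the prior, which is the "too good to be true" effect the video describes.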

I found the theme of this video interesting because it reminded me of a characteristic of certain political theories often discussed today: once you grant their assumptions, they almost seem "too good" at predicting or explaining every event that happens. One can posit a heuristic, then: if a complicated-sounding theory or mode of analysis (T/MA) seems to produce "too consistent" a result, it is likely that the complicated-sounding T/MA is effectively isomorphic to a far simpler (and less persuasive) T/MA for all values tested. It is as though the complicated-sounding T/MA were a complex electronic device with a flaw that short-circuits it, causing it to behave as a much simpler and probably unintended device. It also resembles a complex mathematical function, intended to trace out an intricate shape, that contains a hidden factor reducing it to a far more degenerate shape over the values tested.
