Troglodyte thinking:
Opinion - Something you believe to be true based on no or limited evidence.
Fact - A thing you believe to be true that is objectively true.
Enlightened thinking:
Opinion - Anything I believe to be true, whether it is actually true or not, and no matter what level of evidence I believe I have.
Fact - A thing which an oracle, if one existed, would know to be objectively true. But since oracles don't exist, we can never assert with certainty that something is or is not a fact. It is at best an abstract concept.
Are we talking purely mathematically? In that case it is only true by definition, and it is still your opinion how you understand that definition. For example, in computing (not pure math) we have a saying, "2+2 = 5 for sufficiently large values of 2", which is a play on some of the particulars of computing.
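The joke can be made concrete with a small sketch (assuming the "2"s are really rounded floating-point values, which is the usual reading of the joke):

```python
# "2 + 2 = 5 for sufficiently large values of 2": a value that displays
# as 2 after rounding may really be something like 2.4, and the rounded
# sum of two such values can come out as 5.
a = 2.4  # shown as 2 once rounded
b = 2.4  # shown as 2 once rounded
print(f"{round(a)} + {round(b)} = {round(a + b)}")  # -> 2 + 2 = 5
```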
But something being true by definition is only true due to circular logic, and is therefore true, but not in the sense of being a world fact (a true idea).
By contrast, if you mean the real-world manifestation of that.. so "if I have one rock, and pick up another, then I have two rocks".. we still wind up with the same problem of true-by-definition; it is just a harder concept to see. In this case, what a "rock" is and how we count rocks are ultimately at play, and it is still only true by virtue of the fact that we define it as such. Again, the definition still has room for interpretation... if I pick up a pile of sand, how many rocks do I have? One could argue each grain of sand counts as a rock.
@freemo Right, but in that example, it's "rock" that becomes ambiguous, not "1", which is an abstract concept we can reliably apply to rocks, apples, bananas, whatever.
I believe that somewhere between the theory and the individual applications, there lies an objective truth that remains factual regardless of our perceptions or ability to observe it.
@LouisIngenthron If you are referring to the one and not the rock, then you are talking about the first case, not the second. In the first case there are no rocks, just "1", which you have defined and circularly used.
In the second case I'm talking about the idea without the numbers, which English isn't equipped to express. I am talking about the fact that rocks are a comparable quantity irrespective of any numbers used to describe them. That is, the quantity is preserved from the individual components into the larger collection; the actual real-world scenario that the math represents, before math or numbers were ever defined as concepts.
In other words, imagine a person with no linguistic or mathematical understanding adding quantities of rocks using purely abstract understanding (no internal dialog). There is no 1 in that situation; there is an idea of oneness in a very abstract sense, rooted not in the concept of 1 but in the concept of where the boundary of a rock is, beyond which lies a "different rock".
To put it yet another way, without the numbers you are left needing to define where a thing begins and ends. Once you throw the numbers back in, you realize that didn't really change anything. You just have two things you need to define; it's all still circular and by definition.
As a counter-argument, we actually have systems of math where 1+1 is either undefined or equals something other than 2; it just depends on the system and definitions you're using. In math we formalize such systems as "rings", and not all such algebraic structures even allow for an addition operation. In other words, addition is a nonsensical operation under some systems of math, so how can 1+1=2 be universally true if it is only true under specific mathematical systems and false or undefined under others?
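As one concrete sketch of this, take the tropical (min-plus) semiring, a real algebraic system where the role of "+" is played by the minimum operation; "adding" 1 and 1 under it does not give 2:

```python
# In the tropical (min-plus) semiring, the role of "addition" is played
# by min (and the role of "multiplication" by ordinary +). Under that
# system's addition, 1 "plus" 1 is 1, not 2.
def tropical_add(a: float, b: float) -> float:
    return min(a, b)

print(tropical_add(1, 1))  # -> 1
```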
@LouisIngenthron Also, to address your other point.. my argument is not that there isn't an objective truth, only that no oracle exists as to what is and is not an objective truth. So even if something is, in reality, an objective truth, the fact that it is so is still just your opinion of it.
@freemo Idk about that. If you met someone from 100 years ago, you'd sure seem like an oracle to them. I imagine that's largely true going back through human history. As a species, we seek to define and understand the world around us, and the only way we can truly do that is by finding these universal truths and using them as lenses. I don't think we've found many yet, but we're working hard on finding more.
I don't need confirmation from an omniscient oracle to confirm a fact as true. I just need to be reasonably certain that even totally foreign beings who experience life in a way that's unfathomable to me would still inevitably and independently come to the same conclusion, and I believe that to be true of "1+1=2".
> Idk about that. If you met someone from 100 years ago, you'd sure seem like an oracle to them.
So if someone 100 years ago thought of a random number and asked this seeming oracle to tell them what number they were thinking of, would the oracle be right 100% of the time? No. Therefore, even to a critical thinker 100 years ago, it would be trivial to prove you are not an oracle (a person who can determine what is fact without any chance of being wrong or not knowing).
> As a species, we seek to define and understand the world around us, and the only way we can truly do that is by finding these universal truths and using them as lenses. I don't think we've found many yet, but we're working hard on finding more.
Nothing wrong with trying to refine your opinion, and your certainty of it, based on the evidence and your own logic. Also nothing wrong with communally sharing that, so we all have a collection of opinions that are educated and well thought out.
But no matter how much you explore objective truth, you can never state a thing to be absolute truth beyond it being your **opinion** that it is an absolute truth. Sure, it may or may not actually be an objective truth, but with no oracle capable of determining that, it will always be your opinion. The only thing that changes with evidence (and should) is the confidence you have in that opinion. It will never stop being an opinion.
@LouisIngenthron Oh and to your other point...
> I don't need confirmation from an omniscient oracle to confirm a fact as true. I just need to be reasonably certain that....
That isn't a confirmation. If you are reasonably certain, then you are reasonably certain; that is not a confirmation of truth, it is just a measure of doubt that makes it reasonable for you to adopt the **opinion** that it is an absolute truth. Someone else who has a higher standard of evidence may set a higher threshold... but since, as you point out, you have in no way confirmed it to be true in an absolute sense (only reasonably certain), it still remains an opinion.
We as humans just assume a certain level of confidence is the same as something being objectively and absolutely true, as if it's somehow special. It's only a matter of degree in the confidence of your opinion, nothing more. It remains an opinion.
@LouisIngenthron Also, it bears mentioning that as much as you may believe 1+1=2 is a universally and objectively true fact, it is not. It is only true under an explicit definition of it being so, and depends on every definition therein.
For example, 1+1 does not equal 2 in the following systems:
In base-2 notation there is no symbol "2", so "1+1=2" is patently false; however, "1+1=10" is true under that system. This is because our definition of the numbers, and how they are expressed, is different.
In Brauer groups, addition is defined as a tensor product over algebras. Without getting too technical, under this structure "1+1 does not equal 2"; in fact, the very assertion is nonsensical.
In rings with noncommutative addition, "1+2" is not equal to "2+1", and in many such rings 1+1 equals something other than 2 as a consequence (though in some it does equal 2).
Similarly, in some mathematical rings the very set of numbers that exist may be finite, and the number 2 may not exist at all. In fact, you can have rings where 0 and 1 are your only numbers, and as such "1+1=2" is not true.
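Two of these cases are easy to sketch concretely, taking arithmetic mod 2 (the two-element ring Z/2Z) as the finite ring with only the numbers 0 and 1:

```python
# Binary notation: ordinary 1+1 is written "10" in base 2, so the
# string "2" never appears in that notation.
print(bin(1 + 1))  # -> 0b10

# The two-element ring Z/2Z: the only numbers are 0 and 1, and
# addition wraps around, so 1 + 1 = 0.
def add_mod2(a: int, b: int) -> int:
    return (a + b) % 2

print(add_mod2(1, 1))  # -> 0
```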
@freemo I feel like you're unnecessarily equating labels with their underlying concepts here.
The concept of two is still two whether we write it as "2" in base-10 or as "10" in base-2.
Likewise, the concept of integer addition is distinct from the addition symbol "+" which is also used to denote many similar-yet-distinct concepts (such as the ones you describe). While those may be referred to as "addition", they aren't the concept of integer addition that I'm specifically referring to.
These are, as you initially described them, effectively "definitions" that prove themselves circularly, but a tiny subset of such definitions describe concepts that would seem to be universally constant, that are independently and repeatably verifiable regardless of perspective. If we don't call those "facts", what do we call them?
> I feel like you're unnecessarily equating labels with their underlying concepts here.
> The concept of two is still two whether we write it as "2" in base-10 or as "10" in base-2.
No, this is exactly what I said at the outset: there are two ways to discuss this, both very different, but both agreeing with what I'm saying here.
What I just expressed was the "by mathematical definition" case, where I showed "1+1=2" is **not** a universal truth; it is only true when defined to be true, and not in any sense in reality.
> Likewise, the concept of integer addition is distinct from the addition symbol "+" which is also used to denote many similar-yet-distinct concepts (such as the ones you describe). While those may be referred to as "addition", they aren't the concept of integer addition that I'm specifically referring to.
So let's use the other half of the coin, since that is what you are asserting you mean here: not true by definition, but true due to the real-world concepts represented.
> These are, as you initially described them, effectively "definitions" that prove themselves circularly, but a tiny subset of such definitions describe concepts that would seem to be universally constant, that are independently and repeatably verifiable regardless of perspective. If we don't call those "facts", what do we call them?
So even in the real world, not by definition, the real-world concept behind "1+1=2" is not a universal fact either.
I mean sure, if I have one duck, and add to that one apple, I now have two things... but again, that is only because we define what a duck is, and where one thing ends and another starts. It's still all by linguistic definition, and it ONLY works for some things even if we accept their definitions...
Here are some counterexamples where "1+1" does not equal 2:
One electron added to one positron results in 0 physical things. So in this scenario "1+1=0".
One blob of water added to another blob of water results in a single blob of water, therefore "1+1=1".
If you put two humans who are attracted to each other in a room and wait 9 months, you get an extra human. Therefore, in some cases "1+1=3".
@LouisIngenthron Oh to answer your last question...
> such definitions describe concepts that would seem to be universally constant, that are independently and repeatably verifiable regardless of perspective. If we don't call those "facts", what do we call them?
Such a thing doesn't exist. Perspective will change what is true. A person tripping on acid, or said to be hallucinating, will see a very different reality from their perspective, and from that perspective they think their "facts" are objectively and repeatably true and you look like the crazy person (usually).
So you are really asking "What if something I and most people I talk to all agree gives the same results almost every time we do the experiment?"... We would call that "an opinion of fact that we hold with high confidence"... For the sake of linguistic simplicity we simply call that a "fact", but we must be aware that what we are always saying when we say "fact" is just an opinion held in high confidence.
@freemo In none of those examples are you doing integer addition.
The first two examples are collision, not addition. The third is procreation, not addition.
One electron added to one positron is two physical things... until they collide, which kicks off annihilation (which is also not addition).
For water, your quantity is not "1", but rather "blob". So, yes, blob+blob=blob, but none of those are integers. When they combine, it's not the math that changes, but rather the means by which you measure it. In different units, such as one gallon plus one gallon, the math holds true.
These universal truths I describe are often incredibly narrowly defined, by necessity. I believe very few things are truly absolute, but there are a few.
> In none of those examples are you doing integer addition.
> The first two examples are collision, not addition. The third is procreation, not addition.
I never said addition was collision. Addition is "the combination of two things into one"... 2 is not one and one separately; it is only when you take one and **combine** it with another 1 that it becomes 2.
The nature of how you "combine" them is again open to definition. You can put them in the same basket or within some bounded space.
In the case of antimatter, I picked electrons because they are point-like when manifest, so two electrons can **never** collide. It is only when they are in close proximity (within the same energy level and region) that they combine and annihilate.
But again, it's all definitions. It is important to note, though, that addition is the combining of things, and is **not** the same as counting, which is iterating through an ordered set.
As for integers, again just a play on definitions. If I am adding ducks, is a conjoined-twin duck two ducks or one? What if it has two heads, vs four legs and one head? Or do we count the atoms that make up a duck? Why is a glass of water some real-number thing (the volume) and not a single integer, "one blob of water"? After all, ducks can be of various sizes and configurations and made up of constituent parts, so why can't we treat water the same?
Again, the point here is that all of this relies on arbitrary definitions to make any of it work. There is no "reality" to it, and it is very much a debatable "fact".
But again, whether "1+1=2" is true or not isn't relevant. Because even if it is a concept that is absolutely and objectively true, since we can (and do) disagree on that, it is only your **opinion** that it is fact, and it is my opinion that it is not. One of us might be right, but there is no way to prove which of us in an infallible way; ergo, regardless of its underlying truth, it is still an opinion that it is true.
> For water, your quantity is not "1", but rather "blob". So, yes, blob+blob=blob, but none of those are integers. When they combine, it's not the math that changes, but rather the means by which you measure it. In different units, such as one gallon plus one gallon, the math holds true.
A blob is not a quantity, it's a thing... You are acting like the answer to "How many blobs do I have?" can't be "1".
The problem is that "quantity" depends on "the quantity of what, exactly".
Blobs are measured in integers, and yes, 1 is a valid quantity of blobs. Blobs (like ducks) can also be measured in volume, in which case 1 m^3 + 1 m^3 = 2 m^3. But I wasn't measuring by volume, just as I didn't measure ducks by volume; I was measuring by unit, as we do with ducks.
If I make a duck into ground meat and turn it into sausage, I can say "I have 1 m^3 of duck meat" or I can say "I have a pile of meat here that is 2 ducks of meat". Both are just as valid as the other.
> These universal truths I describe are often incredibly narrowly defined, by necessity. I believe very few things are truly absolute, but there are a few.
The fact that you have to define them at all, let alone narrowly, makes them non-absolute truths. They are only true under a definition, and thus circular... At best you are creating circular arguments with extra steps to obscure the circular nature of the argument. But it's absolutely no different than 1+1=2 being true by definition; you are just using looser linguistic definitions rather than the mathematical ones.
@freemo @LouisIngenthron "True by definition".
<insert sad Bertrand Russell and Alfred North Whitehead on their 370-page proof of 1+1=2>
> I feel like there's got to be a limit.
> Is "1+1=2" opinion?
If you mean that "1+1" has "2" as its only possible and calculable result, then yes, it is an opinion 🙂
We cannot prove that mathematics is consistent. So, if we were to find an error in the current axiomatic systems, we could then prove that "1+1=3", too.
We believe that mathematics, as formalized today, is consistent, and that if there are errors in its foundations, they can be fixed without affecting the majority of the useful theorems and results we have now.
@freemo This is the Platonist vs Nominalist controversy. Do categories like "horse" exist in reality? Or are there only individual organisms we arbitrarily categorize as "horse"/"equine"?
In Medieval times, this debate focused on theology: Is God objectively "good", or do we just arbitrarily declare whatever God does as "good"?
In Modern times, Nominalism is getting ridiculous as it dismisses categories like "male", "female", and even "human".
I'm going to agree with the Medievalists, and declare Nominalism a "heresy".
> This is the Platonist vs Nominalist controversy. Do categories like "horse" exist in reality? Or are there only individual organisms we arbitrarily categorize as "horse"/"equine"?
I can see why you might confuse these two arguments but they are in fact subtly and importantly different.
I am **not** arguing that objective truths don't exist in reality (or that they do, for that matter). Only that, if they exist, you can never be 100% certain of something being an objective truth. Therefore, even if we accept that objective truths exist, anyone stating that something is an objective truth is still stating an opinion that it is among the objective truths that exist, and they could still be wrong. So regardless of whether objective truths exist or not, every utterance is always an opinion.
@freemo You can never be 100% certain that what you see is a physical reality. It might be an optical illusion. But if you run around refusing to believe the evidence of your own lyin' eyes, you might be a leftist.
It is quite possible to make a mistake in math too. That doesn't change the underlying (abstract in the case of math) reality. It just means having the humility to admit when you are wrong.
@freemo While I agree with this in general, I feel like there's got to be a limit. Is "1+1=2" opinion?