Philosophy and Truth
The world is full of information, and most of it conflicts. Different people hold different philosophies, opinions, and beliefs. Most of these beliefs are simply tradition, built on layers of confusion and combined with flawed reasoning or biased paradigms. Worse yet, many people claim that different philosophies are merely different ways of looking at the world and that we have no basis for judging one belief as better than another. The result is that most information simply can't be trusted. So how do we separate truth from falsehood? Science has developed techniques (the scientific method) for dealing with the objective, but not the subjective. There are, however, two techniques that deal with both the objective and the subjective world: efficacy and internal consistency. These two tools of discernment can be used to evaluate different concepts and beliefs. They apply to both the objective and subjective cosmos and are useful in science, mathematics, and mysticism.
The first tool of discernment is efficacy, which Webster defines as the power to produce an effect. There is a value judgment being made here: I consider "better" to mean "more effective." And by "more effective," I mean more effective at reaching your desired goals, which is something everyone wants. If you want to land and walk on the moon, then science is far more effective at reaching that goal than the religious or philosophical beliefs of a tribe that holds that the moon is the back of a big turtle crawling across the sky.
Most people, however, do not use efficacy to measure the validity of a philosophy; they rely on the opinions of others. Because of this, philosophies tend to be based on faith, tradition, or even simplicity. This is especially true concerning the nature of reality. Most people acquire their basic assumptions about the nature of reality as children. As they grow older, new information is filtered through these basic assumptions, forming a worldview about what is going on around them. This builds an edifice of beliefs, and the longer they live, the larger the edifice and the harder it becomes to make a major change in their beliefs.
What happens is that any new information that disagrees with the core of their belief system simply produces some strained explanation that seems perfectly rational to them. Information that agrees with them is used to affirm their beliefs, while information that disagrees is ignored or poorly explained away. The more entrenched a belief becomes, the more likely it is that new information or reasoning that counters it will be rejected.
An experiment first performed by Peter C. Wason demonstrates an underlying bias in most people's perception. Take four cards bearing the symbols "A", "D", "4", and "7", one letter or number per card. Now write down the following rule on a piece of paper: "If a card has a vowel on one side, then it has an even number on the other side." Then ask which cards MUST be turned over to see whether the rule is true or false. Most people will reply that only the "A" and the "4" cards must be turned over; they are looking only for proof, not disproof, of the rule. Only 5 of 128 subjects gave the correct answer: the "A" and "7" cards must be turned over. The "A" card must be turned over to affirm the rule, but the "7" card must also be turned over to make sure there is no vowel on its other side, which would disprove the rule. The "4" card is irrelevant: even a consonant on its other side would not disprove the rule.
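The logic of the card test can be sketched in code. A minimal version of the idea: a card is worth turning over only if some possible hidden face could falsify the rule. (The function names here are my own illustration, not part of Wason's experiment.)

```python
# Wason selection task: a card must be flipped only if some possible
# hidden face could FALSIFY the rule "vowel on one side implies even
# number on the other side".

VOWELS = set("AEIOU")

def rule_holds(letter, number):
    """The rule is violated only by a vowel paired with an odd number."""
    return letter not in VOWELS or number % 2 == 0

def must_turn(visible):
    """True if some hidden face could make the rule false for this card."""
    if visible.isalpha():
        # Hidden face is a number; try both parities.
        return any(not rule_holds(visible, n) for n in (1, 2))
    # Hidden face is a letter; try a vowel and a consonant.
    return any(not rule_holds(l, int(visible)) for l in ("A", "B"))

cards = ["A", "D", "4", "7"]
print([c for c in cards if must_turn(c)])  # → ['A', '7']
```

Running this confirms the point of the experiment: only "A" (an odd number behind it breaks the rule) and "7" (a vowel behind it breaks the rule) can disprove anything; "D" and "4" can't falsify the rule no matter what is on their hidden faces.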
Now suppose I create two sets of these cards: one set true to the rule and the other false to it (an "E" on the back of the "7" card). The people who turn over the "A" and "7" cards can figure out which set is which. The people who turn over the "A" and "4" cards cannot identify the invalid set. Their thinking clearly lacks efficacy.
Still, most of the ideas that people believe (or at least practice) have a small amount of efficacy even when they are mostly wrong; these ideas wouldn't survive if they were too far off. For instance, if a false set of cards was wrong on only one randomly chosen card, then turning over the "A" and "4" cards would catch the false sets with an odd number on the back of the "A" card, which is 50% of false sets like the one above. And if the sets were randomly right or wrong, then half the time the checker would still be correct even when missing the vowel on the back of the "7" card. Valid sets (half of all sets) are always judged correctly, and half of the invalid sets (a quarter of all sets) are caught, so this gives a 75% chance of correctly answering whether the rule holds in a random situation. And if everyone around them is just as ignorant, there is even some advantage in being part of the crowd. But what we want is a philosophy that has an extremely high level of efficacy. It should accurately predict the future and be useful for getting desired results.
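That 75% figure can be checked with a quick simulation. This is a sketch under my own assumptions about the setup: sets are valid or invalid with equal probability, an invalid set has its single flaw behind either the "A" or the "7" with equal probability, and the naive checker looks only at the "A" and "4" cards (so it sees only the number behind "A").

```python
import random

def make_set(valid):
    """Hidden faces: the number behind 'A' and the letter behind '7'."""
    if valid:
        return {"A": random.choice([2, 4]), "7": random.choice(["B", "D"])}
    if random.random() < 0.5:
        # Flaw behind "A": odd number paired with a vowel.
        return {"A": random.choice([1, 3]), "7": random.choice(["B", "D"])}
    # Flaw behind "7": vowel paired with an odd number.
    return {"A": random.choice([2, 4]), "7": random.choice(["E", "I"])}

def naive_verdict(s):
    """The 'A'-and-'4' checker only ever sees the number behind 'A'."""
    return s["A"] % 2 == 0

random.seed(0)
trials = 100_000
correct = sum(
    naive_verdict(make_set(valid)) == valid
    for valid in (random.random() < 0.5 for _ in range(trials))
)
print(correct / trials)  # ≈ 0.75
```

The naive checker is right on all valid sets and on the half of invalid sets whose flaw happens to sit behind the "A" card, which is where the 75% comes from.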
The next powerful tool of discernment is the law of internal consistency. A set of ideas is internally consistent when it produces no contradictions. If an idea or set of ideas creates contradictions, it is not internally consistent.
Internal consistency has to do with how well things fit together. Mathematics generally has a very high level of internal consistency: if A+B=C, then C-B=A and C-A=B. The human brain does have a tendency to organize information so that it is internally consistent. A person who believes that the earth is flat generally won't believe that we have satellites in orbit or that we have landed on the moon. To maintain internal consistency, they will have to believe that NASA is a fraud and that the televised moon landing was just studio trickery. Internal consistency can often be checked using thought experiments. Take the hypothesis (or conjecture, for you mathematicians) that there is a finite number of prime numbers. This can easily be disproven using a thought experiment. Multiply all the prime numbers together to create a new number that we will call X, then add one to get X+1. Since X+1 is larger than the last prime number, we should be able to factor it. But X divides evenly by every prime, since it is the product of all of them, so X+1 divided by any of those primes will have a remainder of one. This means that either X+1 is itself a new prime number, or it has a prime factor missing from our supposedly complete list. Either way, the idea of a finite number of primes lacks internal consistency. The great thing about internal consistency is that it allows us to quickly discard a lot of bad ideas, although most people will not expend the mental effort this test requires.
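The prime-number thought experiment (Euclid's classic argument) can be acted out concretely. As an illustration, pretend the first six primes are "all" of them and watch the construction expose a new one; `smallest_factor` is just a helper written for this sketch.

```python
from math import prod

primes = [2, 3, 5, 7, 11, 13]   # pretend this list is complete
x = prod(primes)                # X = 2*3*5*7*11*13 = 30030
candidate = x + 1               # X + 1 = 30031

# X + 1 leaves remainder 1 when divided by every "known" prime...
assert all(candidate % p == 1 for p in primes)

def smallest_factor(n):
    """Return the smallest prime factor of n by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

# ...so its prime factors must be new. Here 30031 = 59 * 509.
print(smallest_factor(candidate))  # → 59, absent from the "complete" list
```

Note that X+1 need not itself be prime (30031 isn't); the contradiction is that its prime factors cannot appear in the supposedly complete list, which is exactly what the thought experiment requires.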
As a programmer, I also tend to test the internal consistency of different metaphysical theories with a thought experiment. Imagine that you had a computer with unlimited memory and speed, and try to create a universe like our own using the underlying theory being considered. The simpler the program, the better. A program that describes all possible universes is actually much shorter than a program that defines one specific universe. Occam's Razor says that we should generally take the simplest explanation, as long as it is complete. And although we can argue over what counts as simplest (the question can become extremely complex), it is a good rule to combine with internal consistency.
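The claim that "all possible universes" can be shorter to specify than one particular universe has a toy illustration. Treating a "universe" as a bit string (my simplification, not the author's formal model): a few lines enumerate every n-bit universe, while naming one specific universe requires spelling out its bits in full.

```python
from itertools import product

def all_universes(n):
    """Generate every possible n-bit 'universe' -- a short program."""
    for bits in product("01", repeat=n):
        yield "".join(bits)

# A specific universe must be written out bit by bit; its description
# grows with its size, while all_universes stays the same length.
one_specific_universe = "0110100110010110"

print(one_specific_universe in all_universes(16))  # → True
```

The same asymmetry is what algorithmic information theory formalizes: an enumerator of all strings is constant-sized, while most individual strings admit no description shorter than themselves.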
The Tools of Discernment
It is these tools of discernment that will allow you to filter out the bad ideas and find the good ones. The more you practice with these tools of discernment, the better grip you will have on reality. The stronger your grip on reality, the stronger your mind will become.