Successful swing trading requires you to consider and balance many elements to decide which trade to put on and when to get out. With 70% of stocks following the direction of the major indexes each day, you also need to consider the direction of the overall market when trading individual stocks.

In this process you will probably look to and listen to the opinions of experts. Because experts often disagree with each other, you should listen for ideas based on sound logic that resonate strongly with you. When weighing expert opinion or other market news you need to avoid confirmation bias, the tendency to favor information that confirms your existing beliefs. You also need to be able to spot someone who is putting out poorly constructed opinions.

Confirmation bias often leads people to read ambiguous evidence as supporting an existing position. You may even form an early opinion based on preliminary information that resonates with your own market theories and then ignore later evidence that conflicts with your views (known as Belief Perseverance and the Irrational Primacy Effect: greater reliance on information encountered early in a series). Confirmation bias often occurs because people weigh the cost of being wrong rather than investigating in a neutral, scientific way. These tendencies are natural, but they must be avoided in trading to prevent unnecessary losses.

Confirmation bias can even maintain or strengthen beliefs in the face of contrary evidence, as people seek out additional sources that agree with their views even as the evidence mounts against them. In trading I see this when traders fail to follow proper exit criteria and give various reasons why they are still in a bad trade. To a relatively disinterested observer looking at the trade and the evidence the trader presents, the judgment seems obviously flawed. These situations usually end badly.

An experiment gave 45 subjects a complex rule-discovery task involving moving objects simulated by a computer. Objects on the screen followed specific laws, which the subjects had to figure out, and they could test their hypotheses to see whether they worked. Despite making many attempts over a ten-hour session, none of the subjects solved the problem or even came close. They typically sought to confirm rather than falsify their hypotheses, and they showed a strong tendency to disregard disconfirming evidence rather than consider alternatives. Even after seeing evidence that objectively refuted their working hypothesis, they were reluctant to try something else. Some subjects returned time and time again to the same wrong hypothesis, seemingly unable or unwilling to give it up. Ideas that were proven wrong were abandoned only 30% of the time. The intent of the study was to assess the Instructional Effect: half of the subjects were instructed in proper hypothesis testing, which would have made them successful, but these instructions were almost entirely ignored and had no effect.

In another experiment the subjects displayed a surprising and strong tendency to seek diagnostically worthless information to confirm their beliefs. Those with a strong math background tried to use math to explain phenomena that did not lend themselves to mathematical analysis, and they then strengthened their conclusions based on that irrelevant information. I have seen swing traders use ideas irrelevant to swing trading to justify a bad trade.

Traders with a strong science or engineering background tend to over-apply math in trading decisions. When you over-apply math you run the risk of falling into the False Positive Paradox, a statistical result in which false positives outnumber true positives. When the condition you are testing for is rare, even a test with a very low chance of giving a false positive in any individual case will produce more false than true positives overall (more losing trades than winning trades). This effect is more likely when the overall positive sample size is small (when the number of winning trades in the sample is small). Also, combining ideas using math can produce Simpson’s Paradox, in which a trend that appears in different groups of data disappears or even reverses when those groups are combined.
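To make these two pitfalls concrete, here is a short Python sketch. The numbers are hypothetical and chosen only for illustration: the first part shows a signal that is rarely wrong on any individual setup yet flags mostly losers because genuine setups are rare, and the second shows two strategies where one has the better win rate in both bull and bear markets but the worse win rate when the regimes are combined.

# A minimal sketch with made-up numbers illustrating the two paradoxes above.

def false_positive_share(base_rate, sensitivity, false_positive_rate, n=10_000):
    """Share of flagged setups that are actually false positives."""
    genuine = n * base_rate
    not_genuine = n - genuine
    true_positives = genuine * sensitivity
    false_positives = not_genuine * false_positive_rate
    return false_positives / (true_positives + false_positives)

# Hypothetical: 2% of setups are genuinely good, the signal catches 90% of them
# and wrongly flags only 5% of the rest -- yet most flagged trades are losers.
share = false_positive_share(base_rate=0.02, sensitivity=0.90, false_positive_rate=0.05)
print(f"Flagged setups that are false positives: {share:.0%}")  # about 73%

# Simpson's Paradox with made-up win/loss counts: Strategy A has the higher
# win rate in both bull and bear markets, but the lower win rate overall.
records = {
    "A": {"bull": (80, 90),   "bear": (55, 210)},   # (wins, trades)
    "B": {"bull": (234, 270), "bear": (7, 30)},
}
for name, regimes in records.items():
    wins = sum(w for w, _ in regimes.values())
    trades = sum(t for _, t in regimes.values())
    by_regime = ", ".join(f"{r} {w/t:.0%}" for r, (w, t) in regimes.items())
    print(f"Strategy {name}: {by_regime}, combined {wins/trades:.0%}")

The reversal in the second part happens because each strategy traded mostly in one regime, so the combined figure is dominated by where the trades were taken rather than by how good the strategy is.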

My experience is that a trading idea proven by math is only valid if it is backed by a sample of at least 1,000 trades, yet I constantly see math people putting out trading ideas based on far smaller samples. Several years back an article noted that eight out of eight times when the weather was unusually warm in Africa, the coffee crop in South America was unusually good. This was theorized to be caused by warmer African ocean currents, or perhaps the jet stream, bringing good coffee weather to South America. After that eighth occurrence, however, the correlation never held again.
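For perspective on sample size, here is another short Python sketch with numbers I picked for illustration. It shows how easily a "perfect" eight-for-eight record appears by pure chance once you scan enough unrelated signals, and how wide the error band on a measured win rate remains until the trade count gets large.

# A minimal sketch with hypothetical numbers on small-sample luck.
import math

p_single = 0.5 ** 8                    # one random signal going 8-for-8: ~0.4%
p_any = 1 - (1 - p_single) ** 500      # at least one of 500 unrelated signals: ~86%
print(f"One random signal going 8/8: {p_single:.2%}")
print(f"At least one of 500 random signals going 8/8: {p_any:.0%}")

# Rough 95% margin of error on an observed 55% win rate at several sample sizes.
for n in (30, 100, 1_000):
    margin = 1.96 * math.sqrt(0.55 * 0.45 / n)
    print(f"n={n:>5}: 55% win rate +/- {margin:.1%}")

At 30 trades the uncertainty spans almost 36 percentage points; only near the 1,000-trade mark does it tighten enough to separate a real edge from luck.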

A key element in avoiding confirmation bias is to deliberately test whether you can effectively refute your own opinion. Disconfirmation is so important that it has led to the development of Red Teams* in business and government to check an idea before it is implemented. IBM, SAIC, and the CIA have long used Red Teams to improve decision making. The 2003 Defense Science Board review recommended Red Teams in security organizations to help prevent the shortcomings that led up to the 9/11 attacks. You need to periodically turn yourself into a Red Team and check your own ideas if you want to make good decisions.

The logic for entering a trade often involves little more than a hunch, but I would encourage you to have at least a confirming chart pattern and one other good reason before putting on a trade. Before you enter the trade and become emotionally involved with it, you need to set clear profit and exit criteria. Once you are in the trade, you follow your rules without exception. Rules can be changed only for future trades, never the current trade, and then only after careful consideration and Red Teaming the logic. When deciding on the best profit and exit criteria you should use support and resistance levels (more on this later).
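One way to hold yourself to that standard is to write the plan down in a form you cannot bend once the trade is on. The Python sketch below is my own illustration; the fields and price levels are hypothetical and simply stand in for the support and resistance levels discussed later.

# A minimal sketch: a frozen trade plan whose exit rules are fixed before entry.
from dataclasses import dataclass

@dataclass(frozen=True)        # frozen: the plan cannot be edited mid-trade
class TradePlan:
    symbol: str
    entry: float
    stop: float                # exit if price falls here (e.g., below support)
    target: float              # take profit here (e.g., near resistance)
    reason: str                # the confirming pattern plus one other good reason

    def action(self, price: float) -> str:
        if price <= self.stop:
            return "EXIT: stop hit, no exceptions"
        if price >= self.target:
            return "EXIT: target hit, take the profit"
        return "HOLD: follow the plan"

plan = TradePlan(symbol="XYZ", entry=50.00, stop=47.50, target=55.00,
                 reason="breakout above resistance on rising volume")
print(plan.action(47.25))      # EXIT: stop hit, no exceptions
print(plan.action(52.10))      # HOLD: follow the plan

Changing a rule then means writing a new plan for a future trade, not editing the one you are in.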

*Red Team vs. Blue Team is a military simulation convention where the Blue Team represents the friendly forces and the Red Team represents the enemy forces.