Nobody understands statistics. Not that everyone should, but doctors should. Dr. Johnny Benjamin stated:
Could this all be due to lab error? Possible, but not likely. The lab in question is the UCLA's Olympic Analytic Lab used by the U.S. and World Anti-Doping agencies. They test both an A and B sample just to check themselves. They happen to be more than 99 percent accurate. I'm no statistician, but I know that to be better than 99 percent accurate twice is no small feat. This is no similar home test or one that you purchased at your local health-food store. This lab is the best of the best.
The accuracy of the test is meaningless if you don’t look at the base rate. To examine this, let’s put the question in another context. Say you have an illness, call it MMA Pox, that is very rare, infecting only 1% of the population. Now a scientist has created a great test for MMA Pox that is 99% accurate.
If a doctor told you that you tested positive for MMA Pox, what is the likelihood that you actually have the disease? Most people would say 99%. Most people are wrong. If we had 100,000 people, then 1,000 would have the disease and 99,000 would not. The test is 99% accurate, so of the 1,000 with MMA Pox it would catch 990 and miss 10. Of the 99,000 without the disease, it would correctly clear 98,010, but it would also falsely tell 990 people that they have the disease. That leaves 990 people who correctly tested positive and 990 people who falsely tested positive. So if a doctor told you that you had MMA Pox, there would be only a 50% chance that you actually had the disease.
This is with a great test for MMA Pox, yet because of the low base rate a positive result is only as meaningful as a coin flip. To put this in perspective, if we simply declared that nobody had the disease, we would be right 99% of the time. The higher the base rate, the easier it is for a positive test result to be correct.
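The MMA Pox arithmetic above can be sketched in a few lines of Python. The function name is mine, and it assumes, as the example does, that the test is 99% accurate both for people who have the disease and for people who do not:

```python
def positive_predictive_value(base_rate, accuracy, population=100_000):
    """Chance that a positive result means you actually have the disease."""
    sick = population * base_rate              # 1,000 people at a 1% base rate
    healthy = population - sick                # the other 99,000
    true_positives = sick * accuracy           # sick people the test catches: 990
    false_positives = healthy * (1 - accuracy) # healthy people flagged anyway: 990
    return true_positives / (true_positives + false_positives)

print(positive_predictive_value(0.01, 0.99))  # ≈ 0.5, a coin flip
```

Changing `base_rate` or `accuracy` shows how quickly the coin-flip result appears or disappears.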
I couldn’t find statistics on the actual accuracy of steroid tests, but let’s accept the 99% number that has been proffered. To get a better idea of the likelihood that Chael Sonnen used steroids, we need to know the base rate. When MLB did league-wide testing, it found a usage rate of 7%.
At a 7% base rate, out of the same 100,000 people we end up with 7,000 users, 6,930 of whom would test positive. We would also have 930 false positives among the 93,000 non-users. That leaves an 88.2% likelihood that a positive test is correct.
Now I am sure many of you think that there are more people using steroids in MMA than in MLB. So let’s double the base rate to 14%, which would put a steroid user on almost every fight card. That would give us 14,000 steroid users, of whom 13,860 would test positive, along with 860 false positives among the 86,000 non-users. That leaves a 94.2% likelihood that a positive steroid test is a true positive.
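Running the same arithmetic at each of the base rates discussed gives the numbers above. This is a sketch under the article's 99%-accuracy assumption, with a function name of my own choosing:

```python
def positive_predictive_value(base_rate, accuracy=0.99, population=100_000):
    """Chance that a positive test is a true positive, at a given base rate."""
    sick = population * base_rate
    true_pos = sick * accuracy                       # users caught by the test
    false_pos = (population - sick) * (1 - accuracy) # non-users flagged anyway
    return true_pos / (true_pos + false_pos)

for rate in (0.01, 0.07, 0.14):
    print(f"base rate {rate:.0%}: test correct {positive_predictive_value(rate):.1%} of the time")
```

At 1%, 7%, and 14% this prints roughly 50%, 88.2%, and 94.2%, matching the hand calculations.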
Dr. Benjamin also implied in his statement that because Sonnen had two separate samples tested, the result was more likely to be accurate. That may or may not be true; it depends on the test. Some tests reduce error when repeated, while for other tests the same error is likely simply to be repeated.
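The difference matters a great deal, and it can be sketched with the same style of arithmetic. This is a hypothetical illustration (the function and parameter names are mine), again assuming 99% accuracy and a 1% base rate:

```python
def ppv(base_rate, false_pos_rate, catch_rate, population=100_000):
    """Chance a positive is real, with separate catch and false-positive rates."""
    sick = population * base_rate
    true_pos = sick * catch_rate
    false_pos = (population - sick) * false_pos_rate
    return true_pos / (true_pos + false_pos)

# Correlated errors: the B sample just repeats the A sample's mistake,
# so the numbers are unchanged from a single 99% test.
print(ppv(0.01, false_pos_rate=0.01, catch_rate=0.99))        # ≈ 0.50

# Independent errors: both samples must err to produce a false positive,
# so the false-positive rate drops to 0.01 * 0.01.
print(ppv(0.01, false_pos_rate=0.01**2, catch_rate=0.99**2))  # ≈ 0.99
```

If the two samples fail in the same way, the second test adds nothing; if they fail independently, the coin flip becomes near-certainty.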
Does this mean that Chael Sonnen is innocent? I have no idea, but the statistics being bandied about are meaningless without understanding their context.
For more on statistics check out Peter Donnelly's TED lecture, the information directly on point starts at the 11:00 mark.
* I intentionally ignored the difference between specificity and sensitivity to simplify this discussion.