Bad Evidence

A presentation by Jayne Mast at CodeFest 2017 in Novosibirsk, Russia

Criminal investigators know a phenomenon called 'bad evidence', a form of confirmation bias: once they have a theory about a case, they tend to avoid evidence that goes against it, sometimes unconsciously, sometimes consciously.

We have the same problem when testing product hypotheses. We tend to ignore data that go against our theory, and when we run into "bad data" we can't explain away, we simply test a bit longer until it disappears.
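As a minimal sketch (not from the talk itself), the "test a bit longer" habit can be simulated: run A/A experiments where both variants are truly identical, but peek at the result after every batch of visitors and stop as soon as the test looks significant. The simulation below uses only the standard library and an assumed two-proportion z-test at the conventional 5% level; all names and parameters are illustrative.

```python
import math
import random

def peeking_false_positive_rate(trials=2000, batches=10, batch_size=100, seed=1):
    """Simulate A/A tests (no real difference between variants) where we
    'peek' after every batch and stop as soon as p < 0.05.

    Returns the fraction of trials that wrongly declared a winner."""
    rng = random.Random(seed)
    false_positives = 0
    for _ in range(trials):
        a_succ = b_succ = n = 0
        for _ in range(batches):
            for _ in range(batch_size):
                # Both variants have the same true conversion rate: 10%.
                a_succ += rng.random() < 0.10
                b_succ += rng.random() < 0.10
            n += batch_size
            # Two-proportion z-test on the data seen so far.
            pa, pb = a_succ / n, b_succ / n
            p_pool = (a_succ + b_succ) / (2 * n)
            se = math.sqrt(2 * p_pool * (1 - p_pool) / n)
            if se > 0 and abs(pa - pb) / se > 1.96:  # 'significant' at 5%
                false_positives += 1  # but there is no real effect!
                break
    return false_positives / trials

rate = peeking_false_positive_rate()
print(f"False-positive rate with peeking: {rate:.1%}")
```

With repeated peeking, the rate of false "winners" climbs well above the nominal 5% the test promises, which is exactly why waiting for the bad data to disappear is so dangerous.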

Why do we do this, and how can we deal with it? What other common pitfalls are there? And is hypothesis testing really worth the time?