In his seminal work on user experience (UX) design, The Inmates Are Running The Asylum, Alan Cooper introduces the term "cognitive dissonance", which in psychology means the stress caused by holding two conflicting ideas at the same time. Cooper uses it in the context of a computer system performing an action in a way that differs from what the user expects. Cooper argues, correctly, that computer systems should be designed better to avoid such cognitive dissonance.
But perhaps there is room for some cognitive dissonance in data visualization?
A couple of years ago, I heard a senior person at a particular BI vendor discuss the work of Dr. Daniel Kahneman. Dr. Kahneman has written an excellent book called Thinking, Fast and Slow. In this book, we are introduced to the idea of System 1 and System 2 thinking. System 1 thinking is fast, largely subconscious thinking. Imagine answering the question, "what is 2 + 2?" The answer comes immediately. System 2 thinking is more conscious, deliberate, and often logical. Think of answering the question, "what is 17 x 22?" We need to follow a process to come up with the answer, and it is a lot harder. Because it is harder, and uses more energy, our brain much prefers to use System 1 thinking.
The BI vendor in question was starting along a path of suggesting that it would be good for visualization designs to support System 1 thinking - make things more automatic and easier for users. If they don't have to think too hard about decisions, then we make their lives easier, right? Having read Dr. Kahneman's book, I am glad that they didn't pursue this messaging in their global marketing, because it might just be very wrong.
Last Monday, 24th February, the BBC's Horizon series covered the subject, How You Really Make Decisions. I heartily recommend it if you can catch it on the iPlayer or if it is repeated on a local station (Horizon has always been one of my favourite programmes on TV). The programme looked at Dr. Kahneman's work. It appears that System 1 thinking is actually more associated with making mistakes!
You may have seen this video before: https://www.youtube.com/watch?v=vJG698U2Mvo
This is a great example of inattentional blindness. It was created by researchers Chris Chabris (Union College, NY) and Daniel Simons (U. of Illinois). Their work follows the case of Boston police officer Kenny Conley, who was pursuing a murder suspect on foot when he ran past some other police officers beating up a different suspect. Conley completely missed the beating. However, he was convicted of perjury because the jury couldn't believe that there was any way he could have failed to see it. Yet Chabris and Simons regularly run experiments where they have people chase a jogger past some guys staging a fight along the route, and 50% of the participants do not see the fight!
So, how does this apply to UX and Data Visualization? Well, I think that Cooper's UX rule on cognitive dissonance might not hold all the time in the case of a dashboard or analysis piece.
We know that a dashboard that is overly complex, with too many pieces of information, is difficult to use and makes a user work very hard to find the information that they are looking for. There is a danger that they might miss a tree because of the forest of options. However, the idea that we should design a dashboard that a person can just look at and see the answer immediately may not be right either.
The problem is that an easy-to-use dashboard may pander to our System 1 thinking. But System 1 thinking can also be influenced by a whole load of cognitive biases that we as designers may not be aware of - some of which we could even be accidentally building in through our design. We might think that a dashboard is simple and that the answer is obvious to everyone - but that is not always the case.
So, perhaps there is a case for a deliberately conceived cognitive dissonance being built into the dashboard? Force the user to think twice. Make them engage their System 2 thinking to come up with the answer. Don't make it easy, make it deliberately harder. Perhaps that might actually help users make better decisions.
Stephen Redmond is the author of QlikView Server and Publisher and the QlikView for Developer's Cookbook. He is CTO and Qlik Luminary at CapricornVentis, a QlikView Elite Partner.
Follow me on Twitter: @stephencredmond