My agenda is that we are culturally programmed – by received wisdom in accepted world-views – to misrepresent(*) information in our decision-making at all levels. Hat tip to Maria-Ana Neves on LinkedIn for the pointer to this TED Talk by Ruth Chang. More later.
[(*) Two errors in the same direction. Firstly, we objectify what we're dealing with - "it", our subject - so we can refer to it, talk about it, assign properties and values to it. Useful, nay essential, but... Secondly, and here's the rub: having objectified it, we reify it in our world-view, the model we hold in our heads of how the world works. So whilst being carefully experiential, empirical and evidential in our objective considerations about "it", we forget the contingency in our model of "it" - and the more we use it, analyse it, put its values in tables and matrices, the harder it is to question the existence of "it", the more we confuse our model with the world itself, and the more we confuse our own relation to that world(-model). Furthermore, being carefully empirical is culturally ingrained, to the point of being neurotic - a scientistic neurosis. Perversely, the closer our subject is to science itself, and the closer that science is to fundamental science, the deeper the error. I.e. if you know the "it" you're talking about is subjective and qualitative, you can still make the objectification errors, but you are less likely to forget the subjectivity of the subject when alternative questions and doubts arise. Whereas the more you come to believe "it" is an objective reality, the more blind you will be to such questions and doubts arising - a deadly combination in any apparently "scientific" form of decision-making, and indeed in science itself.]