Renos Vakis, eMBeD unit, The World Bank. Behavioral psychology to improve decision-making.
We are human beings and, as such, we are social.
How do we make decisions?
Automatic thinking
Social thinking
Mental models
What decision makers do:
Contextual definition of problems.
Map behaviors.
Solution, evidence, iteration.
Main problems in decision-making:
Confirmation bias: when an individual seeks out or interprets new evidence as confirmation of their pre-existing beliefs or theories.
Overconfidence bias: when someone's subjective confidence in their own judgment is greater than its objective accuracy.
Framing and loss aversion: we tend to take more risks when options are framed as losses than when they are framed as gains; we prefer avoiding a loss to obtaining an equivalent gain (a common formalization is sketched below).
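As a hedged illustration, not part of the talk: loss aversion is commonly formalized with the prospect-theory value function, which is steeper for losses than for gains:

$$
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda\,(-x)^{\alpha} & \text{if } x < 0
\end{cases}
$$

With Tversky and Kahneman's (1992) estimates of $\alpha \approx 0.88$ and $\lambda \approx 2.25$, a loss of 100 weighs roughly twice as much as an equal gain, which helps explain why choices framed as losses push people towards riskier options.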
Case study: paying taxes in Poland
(Some) people do not pay taxes.
Reasons: the tax architecture is complex, understanding how paying taxes works takes mental effort, there is a poor perception of what happens with taxpayers’ money (e.g. corruption), etc.
Possible solutions: improve electronic procedures, etc.
Experiment in Poland: sending letters to “remind” tax evaders that they should pay. Letters work, and they work better the harsher the tone of the letter.
Case study: water saving in South Africa
Water bills included explanations of pricing and the different price thresholds. Poorer people were especially responsive to this information, but richer people much less so. Other information was then added: how one’s consumption compared with that of the average citizen, and public acknowledgement of those most efficient at saving water. Richer people then also responded and saved water.
Mapping behaviors
Diagnosing bottlenecks:
Decisions
Actions
Mindsets
Information
Context
Map a given process, identifying all the behaviors (especially decisions and actions) and how they are conditioned or determined by information, beliefs, procedures and tasks, social norms, etc. This should help us accurately identify potential decision or action bottlenecks: steps where a decision may or may not be made, or an action may or may not be carried out, depending on several factors. If these factors are properly identified and characterized, we can act upon them to increase the likelihood that decisions are made and actions are carried out.
Group decisions and mindsets
Social conformity, along two dimensions: independent vs. interdependent behavior, and empirical vs. normative grounds.
Empirical, independent behavior: customs (“It is what I want to do.”)
Empirical, interdependent behavior: descriptive norms (“Everyone does it.”)
Normative, independent behavior: moral norms (“It is the correct thing to do.”)
Normative, interdependent behavior: social norms (“It is what everybody expects from me.”)
Messages can be shaped so that they refer to different kinds of norms and thus have different effects on people. Moreover, social norms and mental models strongly condition (even determine) behavior, and it is crucial to take them into account when designing and executing public policies.
Fixed mindset (the belief that certain things cannot be changed, or that one is born with skills that cannot be improved) vs. growth mindset (things can be changed, one’s own skills can be improved). We have to foster growth mindsets.
This has been a terrific experience on many levels. The most important one was acknowledging how advanced the field is and, even more important, how strong the sense is that a point of no return has been crossed in terms of open data, open government, transparency, accountability, open development, etc. Some important outcomes will, of course, still take some time to materialize, but the path is being paved and the trend is quickly gaining momentum, adding critical mass at each stage.
The collaboration and excellent attitude of all the actors involved in the project (we interviewed 41 people and read more than 150 working documents and 128 bibliographic references) were another aspect of the work worth highlighting. Special gratitude goes to Fernando Perini, Erika Malich, Katie Clancy and Tricia Wind at IDRC. It is not every day that one finds people so willing to have their work thoroughly scrutinized, to explain things without making excuses, and to treat the evaluation as an opportunity to learn and to improve. The same goes for the team at the World Bank and the Government of Canada (especially Amparo Ballivian and Yohanna Loucheur, respectively).
This impression of people taking their work seriously, including third parties’ evaluations and insights, is confirmed not only by the publication of the evaluation report of the Open Data for Development program, but also by the publication of the program management’s response to our evaluation, which provides both context for and commitment to the recommendations made by the evaluators.
The three documents generated by the evaluation can be downloaded below: the full final report, the executive report and the management’s response.
If I may, I would like to say that both Manuel and I are quite proud of the recommendations we made in the final section of our evaluation. Of course, the recommendations come from the many rich inputs that everyone we talked to or read kindly gave us. These recommendations are as follows.
OD4D: greater emphasis on the right side of the OD4D equation (i.e. “for development”)
Reticulating OD4D: towards an expanded network vision for OD4D
Build capacity for gender-purposeful programming
Invest in strategic partnerships
Greater engagement with the D4D community
Support OD intermediaries
Place knowledge management at the core of OD4D implementation processes
We hope the evaluation and, especially, the recommendations are useful not only for the program but for the whole open data and open data for development community. We remain at the disposal of anyone who needs more information or has questions or suggestions.
Abstract:
The evaluation focuses on both accountability and learning. The primary intention of the evaluation is to provide accountability to the program’s management and organizational governance structures for program results. In addition, it reflects upon OD4D’s implementation in order to inform future programming on open data for development themes. The process was guided by five evaluative questions, on (1) Results, (2) Design, (3) Management, (4) Policy and (5) Gender. The evaluation report addresses these five topics, and also refers to some cross-cutting issues which were identified during the process. The analysis closes with a final section of key recommendations for the upcoming new phase of the program.