Professional error, critical awareness and good science
The history of development offers many examples of beliefs that, though sincerely held by professionals in the social and natural sciences, have later come to be seen as ill-founded or wrong. Nine examples help to explain why questionable and erroneous beliefs and policies prove so resilient. Interactions of power, interests and mindsets, and of behaviour and experiences, play a part in generating and maintaining myth and error. Critical epistemological awareness, to offset and correct the misleading influences of professional, institutional and personal interests and orientations, is proposed for a more prominent role in good science and policy, and for enhancing the impact of impact evaluations. Questions for self-critical reflection are proposed, and the reader is invited to improve on them.