Scientific advice has historically been understood as empirical evidence providing a true, objective representation of the world. In more recent years, however, and in particular after the industrial revolution, there has been a growing recognition that scientific representations of the world and the social contexts embedded within them are intricately interwoven (Hinchliffe, 2001). Here, social context is defined as the interplay between social, economic and political spheres, manifested in culturally and geographically distinct ways in each locality.
Morss et al.’s (2005) end-to-end-to-end approach provides a key argument here, highlighting how social context feeds into the scientific advisory process at different stages and in different ways depending on the issue and the society in question. As such, we come to understand that the co-dependent relationship between scientific advice and its social context is manifested in two ways: at the initial stages of research, and through the later use of that research in everyday life.
Demeritt (1996) insists that scientific facts are the outcomes of ‘social relations between scientific actors with different and conflicting interests’. He notes how climate change could once be adequately portrayed as a ‘statistical phenomenon described by readings from thermometers and rain gauges’, whereas in reality it is a ‘socially constructed product of three-dimensional mathematical models’ and the result of negotiations between various ‘interests’ with regard to funding, for example.
While scientific advice is bound up in its social context during its inception, there is also evidence that this relationship continues once the information is utilised. Rayner (2003) suggests that the last forty years have demonstrated a strengthening of this relationship due to political support for evidence-based policies in decision-making. Donovan and Oppenheimer (2013) add that ‘science and policy are complicated by the social context and ramifications of scientific advice’. The use of scientific advice by social actors, in particular politicians and regulatory agencies, is not only a crucial device for garnering public support for policies, but also a means for these actors to deflect responsibility onto ‘inadequate science’ if their decisions yield negative results (Rayner, 2003).
Aspinall (2011) highlights how social context was fundamental to scientific advice in the litigious risks that scientists faced following the L’Aquila magnitude-6.3 earthquake in central Italy on 6 April 2009. The relatively high death toll arguably could have been prevented, bringing to the fore critical issues of risk miscommunication between scientists, governments and the affected population (Donovan and Oppenheimer, 2013). This in turn amplified the ‘lack of coherence among competing scientific understandings and the various political, cultural, and institutional contexts within which science was carried out’ (Sarewitz, 2004). Together, these factors had legal consequences for the scientists themselves, who were charged with manslaughter over allegedly negligent advice on the risk to public safety (Aspinall, 2011). Such outcomes may make scientific advice even more sensitive to social context, shaping the manner in which scientists select, communicate and present their findings.
The point here is not that all science is flawed. Certainly not. It is, rather, that while scientists can tell us about the probability of events, often with high certainty, their empirical evidence alone cannot tell us, for example, where and when disaster will strike, how to allocate resources between prevention and mitigation, which activities to target first, or whom to hold responsible for the causes and consequences of particular phenomena (Jasanoff, 2007). It is also important to appreciate that, as in any profession, scientists’ social circumstances, including their identities, ideological values and perceptions of risks and uncertainties, will influence the advice they provide.
What can be concluded from this? Given that scientific advice is such a multifaceted concept, I would argue that society should learn to utilize it by creating stronger synergies between natural and social scientists. The process of producing and communicating technical knowledge and understandings must align actors’ motivations in the interest of protecting the safety of societies and scientists alike (Hinchliffe, 2001). Social and scientific understandings must be coupled when tackling natural problems in a human world, since assessing and preparing for risk is necessarily a social and political exercise. As O’Brien (2010) argued, without social context scientific advice has no meaningful value to societies. In sum, the usefulness of scientific advice can only benefit from closer attention to how that advice relates to specific localities.