10 thoughts on “Questions on Scientific Literacy: Do we need to rethink our public engagement approaches?”

  1. A really great read, thanks! Also, critical thinking tools are not just useful for understanding science. In my opinion, everyday life decisions often come down to assessing the relative risk of different choices, so an understanding of critical thinking can be crucial in helping you avoid preconceived bias and reach a reasoned, sensible decision.

    One thing I’m not sure I agree with is the minor point that the questions to scientists “demonstrate that scientists know that it is OK to be wrong.” While I do agree that this is a position any scientist worth his/her salt would adopt, I just don’t think the questions show that. For me it highlights once again the fallibility of questionnaires as a means of investigation.

    Once again, a great read!

    • Thanks for the comment!

      I agree about the usefulness of critical thinking skills, and addressed this same point in my post on Science Communication: What do we do it for?

      I think that, although not conclusive, the way the scientists answered demonstrates that they can admit there might not be one straight answer to a subject, and that every answer comes with several more questions in the process – as demonstrated by the couple of questions I referenced. Although a straight answer might be yes or no, that does not fully cover the breadth of the question, which means a scientist would be more likely to say ‘I don’t know’, accepting that this is OK and not a reflection of their own scientific knowledge. Interestingly, the paper states that many of the scientists did exactly this in the questionnaire.

      As for the use of questionnaires in investigation, I think there are situations where questionnaires are extremely useful, but whenever they are used they need to be carefully conceived and tailored to the specific aims of the overall research question. In this case, as you also highlight, a questionnaire was maybe not the best way to measure scientific literacy, and the questionnaire used for scientists caused more problems than another method of data collection might have. At the same time, this piece of research has still been useful even though it produced negative results: knowing that this is not an accurate measure of scientific literacy is in itself useful.

  2. Favourite quote atm: Nobody was born with knowledge, it’s something we have to acquire. I’m sick of this frame of mind – the number of times I hear people [wrongly] being called ignorant or uninformed, in a negative manner, because they don’t know a particular fact or whatever, is BS! No-one can know everything.

    In relation to your last question, I think it’s a blend of both (the weighting of which I’m not sure of). Critical thinking skills can be developed by educating/communicating to people using contexts that are relevant to them (acting as a hook), which will inevitably require some ‘fact-dumping’ or whatever. For example, with climate change, we have to convey what we know/theorise about how the world is going to change in the future, and how it will affect them. This would, I guess, be the first stepping stone on a pathway towards being able to assess evidence critically. Maybe. I need coffee..

    • I agree – an interesting question to add, as you say, is what the weighting should be between subject-specific education and process/critical thinking. Also, which should come first – or should they be undertaken simultaneously, with a subject taught through a re-analysis of the critical thinking behind the facts being presented? Incidentally, this is what I tried to achieve with a lesson plan I made as part of a geoscience outreach course in my final undergraduate year. By helping pupils understand the process behind the research outcomes, we came to conclusions as part of the lesson, and the pupils then had a firm enough grasp of the subject to provide detailed conjecture on the future of the research topic. I think I will add this little sentence to the article I am writing for Geology Today now…

      • That lesson format sounds pretty sweet. So much better than being drip-fed facts all day long *glares in the general direction of Manchester*. I guess I was trying similar things when demo-ing this semester – I never gave the undergrads answers (which pissed them royally off to begin with!), but instead tried to teach them to be able to be confident in their interpretations of their independent observations (not difficult – just rocks, stratigraphy etc). Seemed to work to a degree, as the number of questions did decrease over the duration of the course, particularly those like ‘is this right?’ or ‘what’s the answer to this?’. 2 months is a bit of a short time to make any kind of rigorous trend interpretation though. Oddly, that was with Peter Doyle’s lectures/practicals, who’s a damn fine educator, and actually did bring your name up once, saying, and this might be paraphrasing, that you were a ‘freaking sweet writer and science communicator’. Small world eh..

  3. Great post – thanks for pointing me to the IJSE article.

    In regards to this:
    “‘The oxygen we breathe comes from plants’. Now, the answer was ‘true’ in the questionnaire, but 27% of scientists who answered got this wrong. Scientists pointed out that oxygen was also present on earth before the rise of plants in compound form…”

    That questionnaire item is seriously flawed, as about half of all photosynthetic activity on earth comes from phytoplankton (bacteria and protists). It seems the item writer had a liberal view of ‘plants’ in mind (photosynthetic life) – but that would not be apparent to respondents, especially scientists.

    • Thanks for the comment!

      I have had a look back at the IJSE article to see who compiled the survey, and found that in this piece of research the scientists’ survey was compiled from questions in the Durant et al. (1989) survey they refer to. They wondered whether the wording of the questions led to wrong answers, rather than there being a problem with scientific literacy (harking back to original commenter Cam):

      “In 1996, we had misgivings that some of the questions in the Durant et al. (1989) questionnaire did not necessarily lead to the ‘correct’ answers that the compilers expected.”

      And they mention earlier the validity of the questions:

      “Thus, versions of the 1989 survey have provided a mechanism for comparing countries, socio-economic groups and gender for over 20 years. This is a profound outcome for a set of questions that, from the first, attracted criticism, particularly from some researchers interested in the public awareness of science.”

      And finally, once they have analysed the scientists’ answers:

      “…these disagreements, therefore, cast further doubt upon the value of such a survey as a tool for measurement of public ‘knowledge’.”

      I think we can see from this that, regardless of whether we should measure scientific literacy in the form of scientific fact-checking (as I discuss in the post), neither a questionnaire in general, nor the particular questionnaires used so far, is the best way of measuring it (and that could take up an entirely new post!).

  4. Pingback: Weekly Round-up (Dec 10-14, 2012) | scicommnetwork

  5. Great piece – just tackling Pre-PGCE essay on “Why Science” and trying to get my head around the evidence base for the societal value of scientific literacy. This has set a number of rabbits running!
