News and events

Engineering Success: Science and Innovation Observatory Evaluation Briefing Making an Impact

Following the publication of the first SIO Briefing on STEM evaluation, Ken Mannion and Mike Coldwell - the Observatory co-directors - were invited to speak at a series of further events on cost-effective STEM evaluation organised by the Royal Academy of Engineering. The seminar series raised many exciting questions, which we hope will lead to new ways of using evidence in the field that the Observatory seeks to engage with. Watch this space...

posted 26th January 2012


Science teachers' careers: a new research project

Researchers from the two centres that make up the Observatory have recently begun work on a new project looking at the impact of continuing professional development (CPD) on science teachers' career development.

The National Science Learning Centre commissioned the team, jointly led by Mike Coldwell, head of CEIR, and Ken Mannion, head of CSE, to examine how the work of Science Learning Centres affects teachers' intentions to stay in the profession, and their career progression. This is the latest in a series of research and evaluation projects in the STEM field that the two centres (sometimes with Mathematics Education Centre colleagues) have conducted, and confirms Sheffield Hallam University's place as one of the country's leading STEM evaluators.

posted 26th January 2012


Evaluation in STEM Education Event

On 5th April 2011 we held our first event, an invitation-only seminar on evaluation in STEM education. We looked critically at the impact, accumulation and usefulness of evaluation work to date, with the aim of informing evaluation in this field in the future. Does evaluation make a difference? Who uses it? Is it just evaluation for its own sake?

We have a wide range of initiatives in STEM education and spend significant sums on evaluating many of them. Yet where evaluations do take place, they are often poorly thought through, and the learning from them is frequently negligible or poorly shared, rarely influencing other interventions. This raises two important questions:

1. Why do we engage in evaluation?
2. How can we make evaluation work better?

We used the outcomes of the day as part of a Briefing Report that is now available through the Science and Innovation Observatory website.

posted 26th January 2012

Researchers discussing strategies