Evaluation of the EPO Program

Approach

The IRIS EPO portfolio is evaluated through a strategic combination of internal and external evaluation practices, applied throughout the lifecycle of a project, with the goal of maximizing the desired programmatic impact. This approach is based on the Collaborative Impact Analysis Method (IAM) (Davis and Scalice, 2015), which combines staff knowledge of programs and products, audiences, and content with the expertise of an outside evaluator. This combination captures the effects on the behaviors, attitudes, skills, interests, and/or knowledge of users and program participants, while achieving efficiencies by having IRIS staff conduct much of the development of assessment instruments, data collection, and preliminary data analysis. To ensure success, an external evaluator provides consultation, review and feedback, and/or more robust analysis of data. Risks are monitored on a project-by-project basis as part of the evaluation process, allowing resources to be reallocated as needed to keep projects on schedule.

Each project in the portfolio is reviewed annually, jointly with the external evaluator. Working with the staff lead for the project, the external evaluator scores the robustness of the project’s current evaluation using a qualitative rubric based on best practices. The outcome of each annual review is a project score and a set of steps to improve the project’s evaluation and impact. In this way, the process delivers the formative and impact data needed to ensure project efficacy and efficiency. Periodically, each project prepares a report on the evaluation’s impact on the project going forward. These reports are used for high-level, cross-program analysis and strategic planning. In addition, program activities are monitored regularly by the EPO Standing Committee and, at longer intervals, by a separate external panel and independent evaluator.

Guiding and Supporting Documents

EPO Program-wide Evaluations

Program Evaluation Reports

Software/Web/Social Media

Meetings

Other

Conference Papers and Presentations

  • AGU poster and abstract on Evaluation (December 2016)
  • PowerPoint presentation on Program Evaluation at the NSF Large Facilities Workshop (May 2016)
  • Hubenthal, M. (2009, April). Wallpaper or instructional aids: A preliminary case study of science teachers’ perceptions and use of wall-posters in the classroom (S3.10.3). Paper presented at the National Association for Research in Science Teaching Annual International Conference, Garden Grove, CA.
  • Smith, M. (2005, July). Opportunities to Make Science Museum Visits More Meaningful: Results from a Real Time Earthquake Exhibition Summative Evaluation. Presented at the Visitor Studies Association Annual Meeting.
  • Hubenthal, M., & Taber, J. (2004, January). Assessing the IRIS Professional Development Model: Impact Beyond the Workshops. Presentation at the Annual International Conference of the Association for Science Teacher Education, Nashville, TN.

Publications

Hennet, C.B., J.J. Taber, G.E. van der Vink, and C.R. Hutt (2003). Earthquakes in Museums, Seismological Research Letters, Vol. 74, No. 5, 628-634.

Hubenthal, M., O’Brien, T., & Taber, J. (2011) Posters that foster cognition in the classroom: multimedia theory applied to educational posters. Educational Media International, 48(3), 193–207. http://doi.org/10.1080/09523987.2011.607322

Smith, M., J. Taber, and M. Hubenthal. (2004) Providing Seismic Data to the Public: Evaluation and Impact of IRIS/USGS Museum Displays, Eos Trans. AGU, 85 (47), Fall Meet. Suppl., Abstract S53B-0215.


References

Davis, H., & Scalice, D. (2015). Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method (Invited). AGU Fall Meeting, Abstract ED53D-0871.