Matt Silberglitt
Senior Research Associate, WestEd
SimScientists Assessments Physical Science Links
http://simscientists.org/

Edys Quellmalz
Director, Technology Enhanced Assessment and Learning Systems, WestEd
https://www.wested.org/personnel/edys-quellmalz/
SimScientists Assessments Physical Science Links
http://simscientists.org
Public Discussion
  • Matt Silberglitt

    Presenter
    May 16, 2016 | 01:05 p.m.

    One of the challenges in evaluating new, Framework- and NGSS-aligned interventions is the lack of outcome measures that are sensitive to the effects of these interventions. Can the summative components of the SimScientists system or other interactive assessments that engage students in doing science fill this need?

  • Matt Silberglitt

    Presenter
    May 16, 2016 | 01:07 p.m.

    What do teachers need to know in order to make the most of resources for formative assessment such as SimScientists and other curriculum-embedded assessments?

  • Teresa Eastburn

    Facilitator
    May 16, 2016 | 07:48 p.m.

    Hi Matt, it’s evident that the project has focused on tools that maximize assessment of STEM learning with middle school students to ascertain retention and transfer, as well as application beyond the classroom environment and scenarios. I particularly like that there is real-time diagnosis, which allows for feedback and coaching when it is most relevant and needed. I’m curious where you are with the research, what research questions you have discovered through the program, and what areas have been challenging. You mention at the start of the video that SimScientists seeks to enhance connected knowledge and systems thinking. How is this done? Also, does the project spark students’ motivation? I didn’t hear much about that essential element.

  • Matt Silberglitt

    Presenter
    May 18, 2016 | 11:29 a.m.

    Thank you for the feedback and questions. This study is underway in the classrooms of ten teachers and will scale up next fall to 40 teachers. Our research questions focus on the validity of the assessments, their use in classrooms, and the policy implications of our findings for developing systems of assessments from the classroom to state levels. Student engagement is included among the research questions and is being studied through classroom observations and think-aloud studies with individuals and pairs of students.

    The assessments are built upon system models that can be used to explain system-level phenomena based on an understanding of the interactions among the components of the system. The system models are used in the development process for the assessments and in the professional development for educators, and they are emphasized in the coaching system, in reports to students and teachers, and in follow-on classroom activities designed for formative use in tandem with the curriculum-embedded assessments.

    Although the study is still underway, past research and data collected to date show that the assessments are feasible for teachers to use in their science classes, engaging to students, and useful for monitoring and adjusting instruction. Preliminary data also point to benefits for student learning, particularly for English learners and students with disabilities. The primary challenges facing the program are adjusting to changes in the education landscape and competing demands on teacher time, both associated with the adoption and implementation of new standards and with the multiple models for distributing Earth, life, and physical science topics across the middle grades.

  • Lauren Allen

    Facilitator
    May 16, 2016 | 09:59 p.m.

    I agree with Teresa; it would be really interesting to hear students’ responses and reactions to this program. I’m also interested in how teachers are trained to use the embedded assessments built into the system, and whether there were any interesting findings during that development in terms of how teachers use the information compared to how learning researchers might use assessment data. Great to see a program like this that’s taking what we understand about the importance of feedback for students and teachers and incorporating it into the design!

  • Matt Silberglitt

    Presenter
    May 18, 2016 | 11:29 a.m.

    Thank you for the feedback and follow-up to the questions above. Teachers’ and students’ responses to the program have been overwhelmingly positive. In interviews and surveys, teachers highlight the benefits of immediate feedback, interactivity, and visuals. In classroom observations and think-alouds, students are focused and on task, and, when given opportunities to work in pairs, they discuss ways to use the simulations to investigate and engage in scientific discourse to make sense of the phenomena.

  • Roger Taylor

    Facilitator
    May 17, 2016 | 10:05 p.m.

    Hi Matt,

    Three minutes just wasn’t enough time to explain such a complicated (and interesting) project. Could you talk a little bit more about the general principles guiding your development of simulation-based assessments?

    I was also curious about the Bayes Net that you included in your flow chart of the benchmark summative unit assessment.

  • Matt Silberglitt

    Presenter
    May 19, 2016 | 03:21 p.m.

    Thank you for the feedback and questions. The SimScientists program is built upon a base of research in learning and cognition. The key principles guiding development are Evidence-Centered Assessment Design (Mislevy, Almond, & Lukas, 2003), Model-Based Learning (Gobert & Buckley, 2000), Universal Design for Learning (Rose & Meyer, 2000), Universal Design for Computer-Based Testing (Harns, Burling, Hanna & Dolan, 2006), and Formative Assessment for Students and Teachers (CCSSO, 2005). More information about our research base is available on our website at simscientists.org.
    Bayes’ Nets are used as part of the evidence model to synthesize evidence across a variety of tasks, student response patterns, and process data from student investigations. The Bayes’ Nets specify how this evidence is used to estimate students’ ability on a variety of science and engineering practices and core ideas in science.

  • Roger Taylor

    Facilitator
    May 20, 2016 | 10:10 a.m.

    Very interesting! Getting good estimates of students’ ability is very challenging – I’ll take a look at your website to read more about how you’re tackling this problem. Thanks.

  • Jennifer Adams

    Facilitator
    May 18, 2016 | 12:29 p.m.

    I agree with Lauren that the feedback to teachers and students is critical. I also agree with Roger and would like to hear a little more about the principles behind the development of the assessments. Is this based on prior research or project work?

  • Matt Silberglitt

    Presenter
    May 19, 2016 | 03:22 p.m.

    Thank you for the feedback and questions. Please see my response above about our design principles. The current project is part of an ongoing line of work in the SimScientists program at WestEd. The SimScientists program has developed suites of instructional modules, curriculum-embedded assessments for formative use, end-of-unit benchmark assessments, and end-of-year assessments for 11 topics in middle school science and one topic in high school biology. The program has also developed an online professional development program for educators and a Learning Management System to deliver the modules to students and reports to teachers and students. You can find descriptions of our current and prior research and development projects on our website at simscientists.org.

  • Further posting is closed as the showcase has ended.