DIMES: Immersing Teachers and Students in Virtual Engineering Internships

Presenters:
  • Jacqueline Barber, Director of the Learning Design Group, Lawrence Hall of Science
  • Eric Greenwald, Director of Assessment and Analytics, Lawrence Hall of Science
  • Kathryn Quigley, Producer and Media Lead, The Learning Design Group
  • Jane Strohm, Engineering Curriculum Lead, Lawrence Hall of Science (https://sites.google.com/site/janestrohm/)
Public Discussion
  • Teresa Eastburn

    Facilitator
    May 16, 2016 | 11:07 a.m.

    Kudos Jacqueline and team on Futura and the richness of the experiences you’ve created with the internships to engage students in engineering design for the 21st century. What ages does the project target, and is the program being implemented within school time, after-school time, or both? What tools are you using to assess engagement? Are all design challenges within the internship computer-based? Are your resources online to view? Looking at students’ salient moves and use of the design tools is brilliant, combining learning and assessment! Interestingly, I am involved in an engineering program with middle schoolers now and would love to learn more. I’m intrigued! Kudos again on a great video and program.

  • Lauren Allen

    Facilitator
    May 16, 2016 | 12:05 p.m.

    I totally agree, this project looks really promising! I am curious also about the details that Teresa asked about, and additionally would like to hear more about what teachers see (the video used the phrase “automatically analyze data”) and how they use that information. Is there a teacher professional development program that goes along with these virtual internships?

  • Jane Strohm

    Co-Presenter
    May 16, 2016 | 02:09 p.m.

    The teachers use a platform to send messages and feedback letters to their interns. For example, when an intern submits the results from one of their design tests, the system evaluates the values and composes a feedback letter that gives the interns suggestions about what’s working and what could be improved.

    Later, when the interns have identified their optimal design, they draft a proposal outline prior to completing a final proposal. The teacher uses the platform to evaluate the outline using a rubric tool. The selections the teacher makes generate a prepared feedback letter that can help interns improve their argument and use of evidence for how and why their design is a good solution to the problem.

    The engineering units come with several supporting documents to help teachers implement, but at this time there is no formal professional development.
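    The rubric-to-letter flow Jane describes could be sketched roughly as follows. This is a minimal illustration of the general technique (mapping rubric selections to canned feedback text); the criteria, levels, and wording are hypothetical, not the actual platform's.

```python
# Illustrative sketch: composing a feedback letter from a teacher's rubric
# selections. Criteria, levels, and wording are hypothetical examples.

FEEDBACK_BANK = {
    "claim": {
        "strong": "Your proposal states a clear claim about why your design solves the problem.",
        "developing": "Try stating more directly why your design is a good solution to the problem.",
    },
    "evidence": {
        "strong": "You support your claim with results from your design tests.",
        "developing": "Add specific test results to back up your claim.",
    },
    "tradeoffs": {
        "strong": "You explain the trade-offs you made and why they were worthwhile.",
        "developing": "Discuss what you gave up in your design and why the trade-off was worth it.",
    },
}

def compose_letter(intern_name, rubric_selections):
    """Build a feedback letter from {criterion: level} rubric selections."""
    lines = [f"Dear {intern_name},", ""]
    for criterion, level in rubric_selections.items():
        lines.append(FEEDBACK_BANK[criterion][level])
    lines += ["", "Sincerely,", "The Project Director"]
    return "\n".join(lines)

letter = compose_letter("Alex", {"claim": "strong", "evidence": "developing", "tradeoffs": "strong"})
print(letter)
```

    The advantage of this kind of rule-based composition is that the teacher stays in control of the judgment (the rubric selections) while the system handles the writing, keeping feedback fast and consistent.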

  • Eric Greenwald

    Co-Presenter
    May 18, 2016 | 01:51 p.m.

    To follow up on the assessment questions: to shed light on engagement, we are collecting event data (clickstream and student submissions) and back-end data from student use within the Design Tool. This will give us things like how long students are using different features of the tool, as well as the specific moves they make within it. The expectation is that this will help us (and help teachers) learn, for example, whether students are making meaningful moves within the digital environments vs. just randomly clicking buttons. Also, I’m not sure if this was part of the engagement question, but we are using the Fascination scale from the Activation Lab suite of measures (see http://www.activationlab.org/ for more detail).
    For the “automatically analyze data” question, we are building an analytics framework to draw inferences from all of the data we are collecting (see my response to Jennifer Adams, below, for more on that approach). Our goal is always to provide actionable information to teachers based on these data. We are building a reporting infrastructure in conjunction with the analytics approach to digital data so that teachers can easily see and make use of the insights from student work in the digital environments.
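    Two of the simplest signals Eric mentions (time spent per feature, and meaningful moves vs. random clicking) can be sketched from a clickstream log like this. The event names, log format, and the idea of using action-type entropy as a crude "random clicking" detector are illustrative assumptions, not the project's actual schema or analytics.

```python
# Illustrative sketch: simple engagement metrics from a clickstream log.
# Event names and log format are hypothetical, not the project's schema.
from collections import Counter
import math

def time_per_feature(events):
    """Sum seconds spent in each feature, given (timestamp, feature, action)
    events sorted by time; each event's duration runs until the next event."""
    totals = Counter()
    for (t0, feature, _), (t1, _, _) in zip(events, events[1:]):
        totals[feature] += t1 - t0
    return dict(totals)

def action_entropy(events):
    """Shannon entropy of action types: values near 0 suggest repetitive
    clicking; higher values suggest varied, deliberate moves."""
    counts = Counter(action for _, _, action in events)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

log = [
    (0, "materials", "select_material"),
    (30, "materials", "select_material"),
    (45, "test_bench", "run_test"),
    (120, "results", "view_graph"),
    (180, "test_bench", "run_test"),
]
print(time_per_feature(log))   # seconds per feature
print(round(action_entropy(log), 2))
```

    A metric like this would only ever be one weak signal among many; distinguishing exploration from random clicking reliably requires the kind of evidence-centered modeling described below in the thread.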

  • Jacqueline Barber

    Presenter
    May 16, 2016 | 12:24 p.m.

    Thank you Teresa and Lauren! We are excited about these Engineering Internships as well!

    We have designed the experiences to be used in school, by middle-school-aged students (and their teachers). Each internship is designed to follow a science unit on a particular topic. For instance, the Rooftops for Sustainable Cities internship, in which students work to make a city more energy efficient in order to reduce the carbon dioxide produced from combustion, is designed to follow a unit on climate change, in which students learn how changes in the atmosphere are affecting the energy balance in the Earth’s system, and about humans’ role in these changes. The internship on Fighting Drug-Resistant Malaria, in which students develop, test, and refine treatments for drug-resistant malaria, is designed to follow a unit on natural selection, and so on. This provides students with the opportunity to apply their newfound science understanding to another situation: in each case, a design challenge.

    While we have worked to create as many hands-on experiences as possible, the design challenges are all computer-based, an affordance we are leveraging to create unobtrusive assessment capabilities.

    I will let my colleague Eric Greenwald respond with more details about the assessment and engagement measures, and my colleague Jane Strohm respond about the teacher interface. Thanks so much for your comments and questions!

  • Susan Doubler

    Senior Researcher
    May 19, 2016 | 12:01 p.m.

    Jackie,
    There are so many wonderful qualities in this program, but I’d like to focus on one that really captured me. Your internships engage students in current societal problems. These experiences are preparing students—creating the state of mind—to contribute to the new, hard questions that society will face in the future. Congratulations!

  • Jacqueline Barber

    Presenter
    May 20, 2016 | 11:42 a.m.

    Thank you, Sue! It has been our goal to enable students to take on a problem-solving role related to real world problems, something that engineering easily affords. Student engagement has been super high. I really liked the comment by one teacher about how her students seemed more hopeful after participating in tackling climate change.

  • Jennifer Adams

    Facilitator
    May 16, 2016 | 12:39 p.m.

    I agree about the promise of this project and I am especially interested in the capturing of the moves that students make while engaging in the process. Do you have any research questions (research on learning and engagement) around this data and, if yes, what has been revealed thus far?

  • Eric Greenwald

    Co-Presenter
    May 18, 2016 | 01:38 p.m.

    Thanks, Jennifer! Our approach is grounded in Evidence-Centered Assessment Design (from Mislevy and colleagues). To that end, our basic question is, what would count as evidence of understanding in student task performance (in this case, what they’re doing within the digital Design Tool)? This involves careful articulation of the specific practice/understanding we are assessing (e.g., optimizing a design solution), and specifying what moves students could make within the Design Tool environment to show evidence of developing that practice. We are currently working through a combination of top-down (e.g., expert models) and bottom-up (e.g. machine learning with student data) approaches to building an analytic framework that yields useful, formative inferences about student learning for the teacher and student.

  • Jennifer Adams

    Facilitator
    May 19, 2016 | 02:43 p.m.

    Thanks! It will be interesting to compare the outcomes of each approach (top-down vs. bottom-up).

  • Teresa Eastburn

    Facilitator
    May 16, 2016 | 01:39 p.m.

    Thanks Jacqueline and colleagues. I’ll look for more info online as I’m particularly interested in the Rooftop lesson that you shared. Good luck this week and thank you for your efforts to strengthen STEM during the school day. That’s the best way to ensure broader impacts by reaching a broad segment of students. What class is this most often offered in?

  • Jane Strohm

    Co-Presenter
    May 17, 2016 | 07:29 p.m.

    Hi Teresa, We expect these units to be part of a middle school science course.

  • Helen Teague

    A cyber-ensemble of inversion, immersion, collaborative workspaces, query and media-making in learning
    May 16, 2016 | 03:58 p.m.

    The focus on student-driven investigation is wonderful! Is there a specific framework you use for the Design Cycle in the 2nd stage?

  • Jane Strohm

    Co-Presenter
    May 19, 2016 | 08:37 p.m.

    Hi Helen, We recognize there are a variety of Design Cycles/Processes out there. We’ve simplified the Design Cycle to Plan, Build, Test, Analyze… Repeat.

  • Helen Teague

    A cyber-ensemble of inversion, immersion, collaborative workspaces, query and media-making in learning
    May 19, 2016 | 10:30 p.m.

    Thank you, Jane! I like the emphasis on the action words!

  • William Finzer

    Senior Scientist
    May 16, 2016 | 05:31 p.m.

    This seems like a fabulous way to generate enthusiasm for doing design and science in the classroom. I love the way you keep the locus of control for what messaging goes out when with the teacher. I imagine that students buy in to the process as pretty realistic. True? What challenges are there in getting students to believe in it all as more than “school work?”

  • Jane Strohm

    Co-Presenter
    May 16, 2016 | 06:26 p.m.

    Thanks! There’s notable buy-in because we ask students to consider problems that are current, in the news, and affect people around the globe. We’ve seen that some students really engage in the internship fiction, but it depends on how the teacher implements the immersion: using internship language, referring to the project director, or modifying other classroom routines.

  • Isabel Huff

    Program Outreach Coordinator
    May 16, 2016 | 06:14 p.m.

    A very professional video and innovative ideas! Is there any way for students to collaborate on their internships? Can the internships be done on tablets?

  • Jacqueline Barber

    Presenter
    May 16, 2016 | 07:22 p.m.

    The Futura Engineering Work Space is browser-based, so students can use any kind of computing device. Students work in pairs on the design challenges, though teachers often have students submit individual proposals at the end of each internship. Those proposals involve students in mounting a design argument: this design is strong because…, the trade-offs we made are…, etc. Thanks, Isabel!

  • Fernando Figueroa

    Guest
    May 17, 2016 | 03:36 p.m.

    Very innovative! Excellent presentation and video. I am guessing this is focused on 4th/5th graders? Is this correlating with CCSS standards? Is there a hands on component (“maker” and/or proof of concept activities)?

  • Jane Strohm

    Co-Presenter
    May 17, 2016 | 07:27 p.m.

    Thanks! These are designed for grades 6-8. Each internship addresses the NGSS ETS standards and asks students to apply specific NGSS content in life science, physical science, or earth science. There are two internships for each scientific domain, so we expect students could complete two at each grade level. We correlate to both math and ELA CCSS, and because students write a final proposal, there is a lot of language arts work. Each internship has some physical, hands-on experience, though not always specifically a proof-of-concept activity.

  • Roger Taylor

    Facilitator
    May 17, 2016 | 04:35 p.m.

    Very impressive! Putting on my “Artist & HCI Designer” hat I commend your team for creating such an elegant and well-designed program.

    Switching to my “Learning Scientist” hat, I was hoping to learn more about your data collection and planned analyses. Personally, I’m currently analyzing longitudinal data from a STEM computer-based learning environment (Vanderbilt’s Teachable Agents system) so I can vouch for the challenges involved.

  • Eric Greenwald

    Co-Presenter
    May 18, 2016 | 02:03 p.m.

    Our approach is grounded in Evidence-Centered Assessment Design (from Mislevy and colleagues). To that end, our basic question is, what would count as evidence of understanding in student task performance (in this case, what they’re doing within the digital Design Tool)? This involves careful articulation of the specific practice/understanding we are assessing (e.g., optimizing a design solution), and specifying what moves students could make within the Design Tool environment to show evidence of developing that practice. We are currently working through a combination of top-down (e.g., expert models) and bottom-up (e.g. machine learning with student data) approaches to building an analytic framework that yields useful, formative inferences about student learning for the teacher and student.
    As far as the data we are drawing on for our analytics framework, we are collecting event data (clickstream and student submissions) and back-end data from student use within the Design Tool, all in conjunction with data about the digital "stage" and teacher moves at the time we log student events. The expectation is that this will help us (and help teachers) learn, for example, the extent to which students are responding to feedback from the project director and to evidence from their design test runs as they iterate toward an optimal solution to the design problem.
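    The key idea here (logging each student event alongside the stage and the most recent teacher move, so inferences like "did students iterate in response to feedback?" become simple queries) could be sketched as below. The field names, action strings, and the responsiveness metric are illustrative assumptions, not the project's actual schema.

```python
# Illustrative sketch: student events logged with contextual fields about the
# digital "stage" and recent teacher moves. Field and action names are hypothetical.
from dataclasses import dataclass

@dataclass
class StudentEvent:
    timestamp: float        # seconds since session start
    student_id: str
    action: str             # e.g., "run_design_test", "revise_parameter"
    stage: str              # where in the internship the student is
    last_teacher_move: str  # most recent teacher action, e.g., "sent_feedback_letter"

def responded_to_feedback(events):
    """Fraction of design-test runs that occur after a feedback letter was sent."""
    tests = [e for e in events if e.action == "run_design_test"]
    if not tests:
        return 0.0
    after = [e for e in tests if e.last_teacher_move == "sent_feedback_letter"]
    return len(after) / len(tests)

log = [
    StudentEvent(10.0, "s1", "run_design_test", "iterate", "none"),
    StudentEvent(95.0, "s1", "revise_parameter", "iterate", "sent_feedback_letter"),
    StudentEvent(130.0, "s1", "run_design_test", "iterate", "sent_feedback_letter"),
]
print(responded_to_feedback(log))  # 0.5
```

    Attaching the context at logging time, rather than reconstructing it later by joining separate logs, is what makes this kind of question cheap to answer at reporting time.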

  • Helen Teague

    A cyber-ensemble of inversion, immersion, collaborative workspaces, query and media-making in learning
    May 18, 2016 | 07:58 p.m.

    Thank you, Eric for this explanation!

  • Jewel Barlow

    Guest
    May 17, 2016 | 09:43 p.m.

    Solving “drug resistant malaria” has many constraints and requires a very large amount of data. Are these students accessing real world data sources that reflect reality as far as it is documented? Or are they working in an artificial environment?

  • Jane Strohm

    Co-Presenter
    May 18, 2016 | 11:29 a.m.

    Agreed, Jewel, it’s a very challenging and real problem! For all of our internships we designed artificial models based on current scientific understanding, making simplifications so that students can see cause and effect and apply foundational scientific concepts (in the case of malaria, natural selection).

  • Katherine McNeill

    Associate Professor of Science Education
    May 17, 2016 | 10:53 p.m.

    Very cool project! And I think I can see some of Kat’s influence on your video. ;-)

  • Kathryn Quigley

    Co-Presenter
    May 19, 2016 | 06:05 p.m.

    Hah! Yes, it probably looks very similar to our Argumentation Tool Kit video for the NSF Showcase last year! I interviewed Eric in the exact same place that I interviewed Suzy ; )

  • Rinat Rosenberg-Kima

    Guest
    May 17, 2016 | 11:17 p.m.

    Wow, amazing! Cannot describe how inspired I am watching this video. Rinat

  • Jane Strohm

    Co-Presenter
    May 19, 2016 | 08:35 p.m.

    Hi Rinat! Your contributions to the RoofMod design tool still live on! Hope you and your family are doing well!

  • Ron Ulseth

    May 18, 2016 | 08:12 p.m.

    Excellent work. The highlight is your use of formative feedback during the engineering design process.

  • Roger Taylor

    Facilitator
    May 20, 2016 | 11:11 a.m.

    Thank you for the explanation Eric – I’m looking forward to reading more about this exciting project in the future! Is there a website that I could check to find out more information?

  • Christine Cunningham

    Founder & Director, Engineering is Elementary
    May 23, 2016 | 01:12 p.m.

    What an innovative project! This looks like a great way to get students fully engaged and excited about engineering. I would also be interested in seeing a website with more information if there’s one available.

  • Betsy Stefany

    Coordinator STEM Literacy Community of Practice
    May 23, 2016 | 01:55 p.m.

    It’s truly perfect for engaging students to give them professional roles, the way you use “interns,” enabling them to work in class with this personalized yet age/grade-advanced label. Congrats on your vision!

  • Jacqueline Barber

    Presenter
    May 23, 2016 | 01:59 p.m.

    Thanks, Christine and Betsy! There’s definitely a website in our future.

  • Further posting is closed as the showcase has ended.