NSF Awards: 1417939
Monitoring students’ progress in learning to engage in science and engineering practices is a challenge; the dynamic nature of demonstrating facility with a practice is not easily captured by traditional assessments. This video showcases the early development of a collection of digital engineering internships for middle school students, designed to serve simultaneously as immersive learning experiences that reflect authentic engagement in design thinking, and as environments that provide teachers with evidence of students’ growth in the associated science and engineering practices.
Teresa Eastburn
Digital Learning & UCAR Connect Lead
Kudos, Jacqueline and team, on Futura and the richness of the experiences you’ve created with the internships to engage students in engineering design for the 21st century. What ages does the project target, and is the program being implemented within school time, after-school time, or both? What tools are you using to assess engagement? Are all design challenges within the internship computer based? Are your resources online to view? Looking at students’ salient moves and their use of the design tools is brilliant, combining learning and assessment! Interestingly, I am involved in an engineering program with middle schoolers now and would love to learn more. I’m intrigued! Kudos again on a great video and program.
Lauren Allen
Postdoctoral Research Associate
I totally agree, this project looks really promising! I am curious also about the details that Teresa asked about, and additionally would like to hear more about what teachers see (the video used the phrase “automatically analyze data”) and how they use that information. Is there a teacher professional development program that goes along with these virtual internships?
Jane Strohm
Engineering Curriculum Lead
The teachers use a platform to send messages and feedback letters to their interns. For example, when an intern submits the results from one of their design tests, the system evaluates the values and composes a feedback letter that gives the interns suggestions about what’s working and what could be improved.
Later, when the interns have identified their optimal design, they draft a proposal outline prior to completing a final proposal. The teacher uses the platform to evaluate the outline using a rubric tool. The selections the teacher makes generate a prepared feedback letter that can help interns improve their argument and use of evidence for how and why their design is a good solution to the problem.
The engineering units come with several supporting documents to help teachers implement, but at this time there is no formal professional development.
Eric Greenwald
Director of Assessment and Analytics
To follow up on the assessment questions: to shed light on engagement, we are collecting event data (clickstream and student submissions) and back-end data from student use within the Design Tool. This will give us things like how long students are using different features of the tool, as well as the specific moves they make within it. The expectation is that this will help us (and help teachers) learn, for example, whether students are making meaningful moves within the digital environments versus just randomly clicking buttons. Also, I’m not sure if this was part of the engagement question, but we are using the Fascination scale from the Activation Lab suite of measures (see http://www.activationlab.org/ for more detail).
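To make the event-data idea concrete, here is a minimal, hypothetical sketch of the kind of logging and summarization this implies. The class and field names are illustrative assumptions, not the actual Futura instrumentation:

```python
from dataclasses import dataclass, field
import time

@dataclass
class DesignToolEvent:
    """One logged student action within the (hypothetical) Design Tool."""
    student_id: str
    action: str      # e.g., "run_test", "adjust_parameter"
    feature: str     # which part of the tool was in use
    timestamp: float = field(default_factory=time.time)

class EventLog:
    """Collects events and computes simple engagement summaries."""
    def __init__(self):
        self.events = []

    def record(self, event):
        self.events.append(event)

    def time_on_feature(self, student_id, feature):
        # Rough dwell time: span between a student's first and last
        # event on a given feature.
        times = [e.timestamp for e in self.events
                 if e.student_id == student_id and e.feature == feature]
        return max(times) - min(times) if len(times) > 1 else 0.0

    def move_counts(self, student_id):
        # How many times the student made each kind of move.
        counts = {}
        for e in self.events:
            if e.student_id == student_id:
                counts[e.action] = counts.get(e.action, 0) + 1
        return counts
```

Summaries like these (time on feature, counts of each move type) are the raw material a reporting layer could surface to teachers.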
For the “automatically analyze data” question, we are building an analytics framework to draw inferences from all of the data we are collecting (see my response to Jennifer Adams, below, for more on that approach). Our goal is always to provide actionable information to teachers based on these data. We are building a reporting infrastructure in conjunction with the analytics approach so that teachers can easily see and make use of the insights from student work in the digital environments.
Jacqueline Barber
Director of the Learning Design Group
Thank you Teresa and Lauren! We are excited about these Engineering Internships as well!
We have designed the experiences to be used in school, by middle-school-aged students (and their teachers). Each internship is designed to follow a science unit on a particular topic. For instance, the Rooftops for Sustainable Cities internship, in which students work to make a city more energy efficient in order to reduce the carbon dioxide produced from combustion, is designed to follow a unit on climate change, in which students learn how changes in the atmosphere are affecting the energy balance in the Earth’s system, and about humans’ role in these changes. The internship on Fighting Drug-Resistant Malaria, in which students develop, test, and refine treatments for drug-resistant malaria, is designed to follow a unit on natural selection, and so on. This provides students with the opportunity to apply their newfound science understanding to another situation—in each case, a design challenge.
While we have worked to make the experiences as hands-on as possible, the design challenges are all computer based, an affordance that we are leveraging to create unobtrusive assessment capabilities.
I will let my colleague Eric Greenwald respond about more details of the assessment and engagement measure, and my colleague Jane Strohm respond about the teacher interface. Thanks so much for your comments and questions!
Susan Doubler
Jackie,
There are so many wonderful qualities in this program, but I’d like to focus on one that really captured me. Your internships engage students in current societal problems. These experiences are preparing students—creating the state of mind—to contribute to the new, hard questions that society will face in the future. Congratulations!
Jacqueline Barber
Director of the Learning Design Group
Thank you, Sue! It has been our goal to enable students to take on a problem-solving role related to real world problems, something that engineering easily affords. Student engagement has been super high. I really liked the comment by one teacher about how her students seemed more hopeful after participating in tackling climate change.
Jennifer Adams
Associate Professor
I agree about the promise of this project and I am especially interested in the capturing of the moves that students make while engaging in the process. Do you have any research questions (research on learning and engagement) around this data and, if yes, what has been revealed thus far?
Eric Greenwald
Director of Assessment and Analytics
Thanks, Jennifer! Our approach is grounded in Evidence-Centered Assessment Design (from Mislevy and colleagues). To that end, our basic question is: what would count as evidence of understanding in student task performance (in this case, what they’re doing within the digital Design Tool)? This involves careful articulation of the specific practice/understanding we are assessing (e.g., optimizing a design solution), and specifying what moves students could make within the Design Tool environment to show evidence of developing that practice. We are currently working through a combination of top-down (e.g., expert models) and bottom-up (e.g., machine learning with student data) approaches to building an analytic framework that yields useful, formative inferences about student learning for the teacher and student.
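As a toy illustration of what a top-down, expert-model check on student moves might look like (the move names and scoring rule are illustrative assumptions, not the project’s actual analytics):

```python
def iteration_score(moves):
    """Fraction of test runs preceded by a deliberate parameter adjustment.

    A crude proxy for systematic iteration: an 'expert model' of
    optimization expects students to change something, then test,
    rather than re-running tests at random.
    """
    informed = 0   # test runs that followed an adjustment
    tests = 0      # total test runs
    last = None
    for move in moves:
        if move == "run_test":
            tests += 1
            if last == "adjust_parameter":
                informed += 1
        last = move
    return informed / tests if tests else 0.0
```

A score near 1.0 suggests adjust-then-test cycles consistent with deliberate optimization; a score near 0.0 suggests repeated testing without intervening changes. A real framework would of course combine many such evidence rules with bottom-up patterns learned from student data.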
Jennifer Adams
Associate Professor
Thanks! It will be interesting to compare the outcomes of each (top-down vs. bottom-up).
Teresa Eastburn
Digital Learning & UCAR Connect Lead
Thanks Jacqueline and colleagues. I’ll look for more info online as I’m particularly interested in the Rooftop lesson that you shared. Good luck this week and thank you for your efforts to strengthen STEM during the school day. That’s the best way to ensure broader impacts by reaching a broad segment of students. What class is this most often offered in?
Jane Strohm
Engineering Curriculum Lead
Hi Teresa, We expect these units to be part of a middle school science course.
Helen Teague
The focus on student-driven investigation is wonderful! Is there a specific framework you use for the Design Cycle in the 2nd stage?
Jane Strohm
Engineering Curriculum Lead
Hi Helen, We recognize there are a variety of Design Cycles/Processes out there. We’ve simplified the Design Cycle to Plan, Build, Test, Analyze… Repeat.
Helen Teague
Thank you, Jane! I like the emphasis on the action words!
William Finzer
This seems like a fabulous way to generate enthusiasm for doing design and science in the classroom. I love the way you keep the locus of control, for what messaging goes out and when, with the teacher. I imagine that students buy into the process as pretty realistic. True? What challenges are there in getting students to believe in it all as more than “school work?”
Jane Strohm
Engineering Curriculum Lead
Thanks! There’s notable buy-in because we ask students to consider problems that are current and in the news that affect humans around the globe. We’ve seen that some students really engage in the internship fiction, but it depends on how the teacher implements the immersion using internship language, referring to the project director, or modifying other classroom routines.
Isabel Huff
A very professional video and innovative ideas! Is there any way for students to collaborate on their internships? Can the internships be done on tablets?
Jacqueline Barber
Director of the Learning Design Group
The Futura Engineering Work Space is browser-based, so students can use any kind of computing device. Students work in pairs on the design challenges, though teachers often have students submit individual proposals at the end of each internship. Those proposals involve students in mounting a design argument: this design is strong because… the trade-offs we made are… etc. Thanks, Isabel!
Fernando Figueroa
Very innovative! Excellent presentation and video. I am guessing this is focused on 4th/5th graders? Is this correlated with CCSS standards? Is there a hands-on component (“maker” and/or proof-of-concept activities)?
Jane Strohm
Engineering Curriculum Lead
Thanks! These are designed for grades 6-8. Each internship addresses the NGSS ETS standards and asks students to apply specific NGSS content in life science, physical science, or earth science. There are two internships intended for each scientific domain, so we expect students could complete two per grade level. We correlate with both math and ELA CCSS, and because students write a final proposal, there is a lot of language arts work. Each internship has some physical experience, though not always specifically a proof of concept.
Roger Taylor
Assistant Professor
Very impressive! Putting on my “Artist & HCI Designer” hat I commend your team for creating such an elegant and well-designed program.
Switching to my “Learning Scientist” hat, I was hoping to learn more about your data collection and planned analyses. Personally, I’m currently analyzing longitudinal data from a STEM computer-based learning environment (Vanderbilt’s Teachable Agents system) so I can vouch for the challenges involved.
Eric Greenwald
Director of Assessment and Analytics
Our approach is grounded in Evidence-Centered Assessment Design (from Mislevy and colleagues). To that end, our basic question is: what would count as evidence of understanding in student task performance (in this case, what they’re doing within the digital Design Tool)? This involves careful articulation of the specific practice/understanding we are assessing (e.g., optimizing a design solution), and specifying what moves students could make within the Design Tool environment to show evidence of developing that practice. We are currently working through a combination of top-down (e.g., expert models) and bottom-up (e.g., machine learning with student data) approaches to building an analytic framework that yields useful, formative inferences about student learning for the teacher and student.
As far as the data we are drawing on for our analytics framework, we are collecting event data (clickstream and student submissions) and back-end data from student use within the Design Tool, all in conjunction with data about the digital "stage" and teacher moves at the time we log student events. The expectation is that this will help us (and help teachers) learn, for example, the extent to which students are responding to feedback from the project director and to evidence from their design test runs as they iterate toward an optimal solution to the design problem.
Helen Teague
Thank you, Eric for this explanation!
Jewel Barlow
Solving “drug resistant malaria” has many constraints and requires a very large amount of data. Are these students accessing real world data sources that reflect reality as far as it is documented? Or are they working in an artificial environment?
Jane Strohm
Engineering Curriculum Lead
Agreed, Jewel, it’s a very challenging and real problem! For all of our internships we designed artificial models based on current scientific understanding, making simplifications so that students can see the cause and effect and apply the foundations of scientific concepts, in the case of malaria, natural selection.
Katherine McNeill
Very cool project! And I think I can see some of Kat’s influence on your video. ;-)
Kathryn Quigley
Producer and Media Lead
Hah! Yes it probably looks very similar to our Argumentation Tool Kit video for the NSF Showcase last year! I interviewed Eric in the exact same place that I interviewed Suzy ; )
Rinat Rosenberg-Kima
Wow, amazing! Cannot describe how inspired I am watching this video. Rinat
Jane Strohm
Engineering Curriculum Lead
Hi Rinat! Your contributions to the RoofMod design tool still live on! Hope you and your family are doing well!
Ron Ulseth
Excellent work. The highlight is your use of formative feedback during the engineering design process.
Roger Taylor
Assistant Professor
Thank you for the explanation Eric – I’m looking forward to reading more about this exciting project in the future! Is there a website that I could check to find out more information?
Christine Cunningham
What an innovative project! This looks like a great way to get students fully engaged and excited about engineering. I would also be interested in seeing a website with more information if there’s one available.
Betsy Stefany
Truly perfect for engaging students: giving them professional roles, the way you use “interns,” enables them to work in class with this personalized yet age/grade-advanced label. Congrats on your vision!
Jacqueline Barber
Director of the Learning Design Group
Thanks, Christine and Betsy! There’s definitely a website in our future.
Further posting is closed as the showcase has ended.