Dan Freer

Robotic Telepresence in the Living Algorithm

Feb 08, 2017
 

How is using robotic telepresence in the classroom different from videoconferencing? Ken Frank and his students know at least one way – agentive movement – and they recently used it in class as part of a “living algorithm” learning experience.

Dr. Ken Frank is an MSU Foundation Professor of sociometrics and one of the leading experts on social network theory. Ken also teaches CEP 991B – Special Topics in Educational Statistics and Research Design, a course on how to map out social networks to better understand how and why people interact within and between groups. In one lesson of the course, Ken has taught what he calls his "living algorithm" to predict which groups an individual will choose to associate with. In the past, participating in the living algorithm activity was restricted to students who were physically present in the classroom, but Ken saw an opportunity to open it to online students using Beam telepresence.

In his CEP 991B class, Dr. Frank was able to demonstrate his living algorithm with both face-to-face and online students by giving the online students autonomy through Beam robots. He gave the class one simple instruction, "get into groups," and predicted beforehand how the students would split up. The Beam robots allowed online students to move around the classroom and choose the groups they wished to associate with. The presence that Beam robots give students makes it possible to participate in class activities such as this one, which improve student learning. The student who used the Beam in Dr. Frank's class said, "I felt as though I was more a part of discussions, the lecture, and the classroom community. My peers noticed me more than if I was on a computer screen."
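To make the idea concrete, here is a minimal sketch, in Python, of the kind of prediction the exercise invites. It is not Dr. Frank's actual model (his work draws on selection models and the KliqueFinder algorithm); it simply assumes each student joins the candidate group containing the most people they already interact with, and all names and data below are hypothetical.

```python
# A minimal, hypothetical sketch of a group-choice prediction. This is not
# Dr. Frank's actual model; it simply predicts that each student joins the
# candidate group containing the most people they already interact with.

def predict_group(student, ties, groups):
    """Return the group whose members overlap most with `student`'s ties.

    ties   -- dict mapping each student to the set of people they interact with
    groups -- dict mapping a group name to the set of students in it
    """
    scores = {name: len(ties.get(student, set()) & members)
              for name, members in groups.items()}
    return max(scores, key=scores.get)  # ties broken arbitrarily

# Toy data (entirely made up) for predicting choices before "get into groups".
ties = {
    "ana": {"ben"},
    "dee": {"cam", "eli"},
}
groups = {"table_1": {"ben"}, "table_2": {"cam", "eli"}}

predictions = {s: predict_group(s, ties, groups) for s in ties}
print(predictions)  # {'ana': 'table_1', 'dee': 'table_2'}
```

In class, the interest is in comparing a prediction like this to where the students, including those beaming in, actually end up.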

 


Information on using KliqueFinder to make this map

Lessons such as this, with an interactive element meant to improve students' understanding of a complex topic by making the experience more salient, are typically hindered when students are online. New technologies such as Beam robots give online students the affordances to participate in those interactive elements and improve their learning. The technologies allow for participant embodiment, so students and teachers can collaborate and learn across different contexts. Dr. Frank said, "Without Beam and Zoom what do I do? … you take a 2 day intensive course? With Beam and Zoom you are a member of the class. He's doing a project with someone in the class… and it's seamless." This benefits both online and physically present students by allowing them to share in the experience and learn from each other. The robots and videoconferencing bring different affordances to the classroom that knowledgeable teachers can use in different contexts to improve learning for face-to-face and online students.

Studying the Use of Augmented Reality to Teach Spatial Skills in Engineering

Oct 24, 2016
 

Design Studio's SLATE Research Group and MSU's College of Engineering are collaborating to study how augmented reality tools can help beginning engineering students master spatial reasoning problems. First-year engineering students in EGR 291, a course focused on teaching spatial skills, have recently agreed to participate in a study in which they learn these skills using an augmented reality app that can be downloaded to a student's smartphone or tablet. The app was conceived and developed by Dr. John Bell and graduate student members of both the Design Studio and the SLATE Research Group.

The Context

Spatial skills are considered essential for learning and performing engineering. EGR 291 is a class offered to first-year engineering students who can benefit most from spatial training, as determined by a spatial assessment given to all incoming engineering students. The class was created to help these students catch up on skills that professors deem necessary for success in the College of Engineering. This one-semester course teaches traditional spatial reasoning using tasks such as mental rotation, in which students mentally rotate objects to solve problems. In addition, students are trained in paper folding, working with models, and piecing together complex shapes.
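For readers unfamiliar with these tasks, the sketch below illustrates in code the idea behind a mental rotation problem: deciding whether one block figure is simply a rotated copy of another. It is an illustration only, not part of the EGR 291 materials, and for brevity it checks rotations about a single axis rather than all 24 orientations.

```python
# An illustrative sketch (not from the EGR 291 materials) of the idea behind a
# mental rotation task: is one block figure just a rotated copy of another?
# Shapes are sets of integer (x, y, z) coordinates; for brevity only the four
# 90-degree rotations about the z-axis are checked, not all 24 orientations.

import numpy as np

# 90-degree rotation about the z-axis: (x, y, z) -> (-y, x, z).
ROT_Z_90 = np.array([[0, -1, 0],
                     [1,  0, 0],
                     [0,  0, 1]])

def z_rotations(points):
    """Yield the shape rotated by 0, 90, 180, and 270 degrees about z."""
    pts = [np.array(p) for p in points]
    for _ in range(4):
        yield {tuple(int(c) for c in p) for p in pts}
        pts = [ROT_Z_90 @ p for p in pts]

def same_up_to_z_rotation(shape_a, shape_b):
    """True if shape_b matches some z-axis rotation of shape_a."""
    target = set(shape_b)
    return any(rotated == target for rotated in z_rotations(shape_a))

# An L-shaped figure and the same figure turned 90 degrees counterclockwise.
l_shape = {(0, 0, 0), (1, 0, 0), (2, 0, 0), (2, 1, 0)}
turned  = {(0, 0, 0), (0, 1, 0), (0, 2, 0), (-1, 2, 0)}
print(same_up_to_z_rotation(l_shape, turned))  # True
```

Solving such a task in one's head, rather than in code, is exactly the skill the course and the app aim to build.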

The App

The app (designed by Design Studio's John Bell with input from the SLATE Research Group's Collaboratively Embodied Content (SLATE-CEC) team) features a series of mental rotation tasks using augmented reality. Students download the app to their mobile devices (phones, tablets, etc.) and use it to perform mental rotation tasks similar to the problems found in their textbooks.

So what is augmented reality? Augmented reality means viewing the real world through the lens of a phone, tablet, or other computing device while the device overlays another image or information on top of the real-world view (see GIF below). In this app the overlay is interactive: students manipulate the image to solve mental rotation tasks. What students see is a 3D geometric figure that they can walk around to view from different perspectives; their goal is to view the object from the correct angle.

[GIF: augmented reality overlay demonstration]
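The sketch below shows, in very simplified form, the projection math that lets an AR overlay appear anchored in the camera view: the virtual object's 3D vertices are transformed by the camera's tracked pose and projected into pixel coordinates each frame. The function and numbers are hypothetical and do not reflect the internals of the Design Studio's app.

```python
# A very simplified sketch of the projection step behind an AR overlay: the
# virtual object's 3D vertices are transformed by the camera's tracked pose
# and projected into pixel coordinates so they can be drawn over the live
# camera image. Names and numbers are hypothetical, not the app's internals.

import numpy as np

def project_points(vertices, rotation, translation, focal, center):
    """Project 3D world-frame points into 2D pixel coordinates.

    rotation    -- 3x3 world-to-camera rotation matrix
    translation -- world-to-camera translation (3-vector, metres)
    focal       -- (fx, fy) focal lengths in pixels
    center      -- (cx, cy) principal point in pixels
    """
    pixels = []
    for v in vertices:
        x_c, y_c, z_c = rotation @ np.asarray(v, dtype=float) + translation
        pixels.append((focal[0] * x_c / z_c + center[0],
                       focal[1] * y_c / z_c + center[1]))
    return pixels

# A 10 cm virtual cube sitting on a tracked marker, half a metre from the camera.
cube = [(0.1 * x, 0.1 * y, 0.1 * z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
pose_r = np.eye(3)                      # camera looking straight at the marker
pose_t = np.array([0.0, 0.0, 0.5])      # marker 0.5 m in front of the camera
print(project_points(cube, pose_r, pose_t, focal=(800, 800), center=(320, 240)))
```

As a student walks around, the tracked rotation and translation change every frame, so the same vertices land at new pixel locations and the figure appears to stay fixed in the room.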

The Expectation

It is quite possible that augmented reality will open up many new possibilities in engineering education, particularly for students struggling with early engineering concepts like spatial reasoning. The collaboration between the Design Studio and Dr. Tim Hinds in the College of Engineering could lead to a new and exciting approach to helping students develop their spatial skills. Likewise, the SLATE-CEC group will conduct research studies to understand if and how augmented reality tools can help all students succeed in their engineering courses.