Written by Mike Koon
9/22/2014
Thanks in part to advances in technology, educational methods have changed significantly in the 21st century and continue to evolve. A group of University of Illinois civil and environmental engineering (CEE) professors believe their use of sketch-based learning is one such advance, with the potential to help revolutionize learning in the coming years.
Sketch recognition refers to teaching computers to understand the meaning and intent of what a human draws by hand; that understanding can then be integrated into a decision or learning system. The technology allows students to learn faster by giving them immediate feedback as they solve problems that involve drawing.
Joshua Peschel is the principal investigator on a team of CEE professors who are piloting the technology in classrooms. The team is using funding from the College of Engineering’s Strategic Instructional Initiatives Program (SIIP) to get the project off the ground.
“Sketch recognition really lies at the intersection between the liberal arts (understanding how and why people perceive things) and computational science and engineering (artificial intelligence),” said Peschel, who also holds a PhD in computer science. “We have a lot of experience with pen-based computing (such as recognizing basic shapes and handwriting). By taking the next step and teaching computers to understand what it is we’re drawing by hand, we can take some specific algorithms and package those up in a new and different way.”
Peschel has been on the ground floor of sketch recognition research, which has been around for less than a decade. He says low-level recognition of shapes and lines now achieves about 90 percent accuracy, but work is still in the beginning stages for more sophisticated high-level recognition that would include science and engineering sketches.
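As a rough illustration of what low-level recognition involves, the toy sketch below (not the team’s software; the stroke format and tolerance are assumptions) decides whether a hand-drawn stroke is close enough to a straight line by fitting a line to its points and measuring how far they stray from it.

# Illustrative only -- not the team's software. A toy low-level recognizer
# that decides whether a hand-drawn stroke looks like a straight line.
# Assumes a stroke arrives as a list of (x, y) points; the tolerance is a
# made-up threshold.
import numpy as np

def is_straight_line(stroke, tolerance=0.05):
    pts = np.asarray(stroke, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Principal direction of the point cloud (the best-fit line through the points).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    # Perpendicular distance of every point from that best-fit line.
    normal = np.array([-direction[1], direction[0]])
    deviations = np.abs(centered @ normal)
    stroke_length = np.linalg.norm(pts[-1] - pts[0])
    # "Straight enough" if the average wobble is small relative to the stroke's length.
    return deviations.mean() <= tolerance * max(stroke_length, 1e-9)

# A slightly wobbly but essentially straight pen stroke:
print(is_straight_line([(0, 0), (1, 0.02), (2, -0.01), (3, 0.03), (4, 0.0)]))  # True

Recognizing science and engineering sketches is harder because the system must judge not just the shapes themselves but whether they are physically meaningful in context.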
“Sketch recognition is somewhat new,” Peschel said. “I would still characterize it as an area of both basic and applied research, but a lot of the core technology to make something like a learning system happen is mature enough.”
The technology is being tested in CEE courses taught by Megan Konar and Cassandra Rutherford using a concept called flow nets. Konar teaches a water resource engineering course with a focus on surface and subsurface hydrology, while Rutherford teaches a geotechnical engineering course that concerns, among other things, how water flows into an excavation, including around buildings and under dams.
In both classes, students are asked to draw the lines and curves of flow net problems to visualize how water flows through soil; doing so provides a graphical solution to the Laplace equation. Flow lines (the imaginary paths water particles would take) and equipotential lines (which indicate where the flow loses head, or energy, as the water drags against soil particles) are important for understanding many subsurface hydrologic and geotechnical engineering design problems.
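For reference, the equation a flow net solves graphically is standard: in two dimensions, steady-state seepage of hydraulic head h through homogeneous, isotropic soil obeys the Laplace equation

\[ \nabla^2 h \;=\; \frac{\partial^2 h}{\partial x^2} + \frac{\partial^2 h}{\partial z^2} \;=\; 0, \]

and a correctly drawn flow net approximates its solution: flow lines and equipotential lines cross at right angles and bound roughly square cells.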
Currently, students learn flow nets using the traditional pen-and-paper method, and grading and feedback for the 100 or more students in those classes typically take weeks. With sketch recognition, however, students can solve the problems on a tablet, which tells them immediately whether their drawing is correct while also reminding them of the properties the solution must satisfy.
“The students usually know what the lines do,” Rutherford said. “However, they don’t always understand how they interact with each other and how they change depending on the problem. With this program, if they start drawing the line incorrectly, it tells them.”
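One simple feedback rule of the kind Rutherford describes (a hypothetical example, not the team’s actual algorithm) is to check that a drawn flow line crosses an equipotential line at roughly a right angle, since that is a defining property of a valid flow net:

# Illustrative only -- a hypothetical feedback check, not the team's actual
# algorithm. Verifies that a flow line and an equipotential line cross at
# roughly 90 degrees near given point indices; strokes are lists of (x, y).
import math

def local_direction(stroke, index):
    # Unit direction of the stroke around the point at `index`.
    x0, y0 = stroke[max(index - 1, 0)]
    x1, y1 = stroke[min(index + 1, len(stroke) - 1)]
    length = math.hypot(x1 - x0, y1 - y0) or 1e-9
    return ((x1 - x0) / length, (y1 - y0) / length)

def crossing_is_orthogonal(flow_line, equipotential, i, j, tolerance_deg=15):
    fx, fy = local_direction(flow_line, i)
    ex, ey = local_direction(equipotential, j)
    # abs() ignores which direction the student happened to draw each line.
    angle = math.degrees(math.acos(min(abs(fx * ex + fy * ey), 1.0)))
    return abs(90.0 - angle) <= tolerance_deg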
The students were divided into three groups: the first used the pen-and-paper method, which offered no feedback; the second solved problems on a computer using a finite element program, which asked them to copy what they saw; and the third used the tablet with the sketch recognition software. The research team surveyed students before and after each semester and examined how they performed on homework and exams. The pilot study indicated that those who used the tablet with sketch-based recognition scored an average of three to five points higher on those types of questions than those who did not use the tablets.
“If we take the metric of the exam, then we are seeing an improvement in learning,” Peschel said. “The feedback additionally says that the students who used the tablets felt like they knew and understood the flow net concepts much better, and we have found that those with higher self-efficacy tend to have improved performance in educational settings.”
While the sketch recognition software is being tested and refined in some basic CEE courses, Peschel believes this is a launching point for uses in engineering and beyond; he hopes to integrate it into physics and chemistry courses in the next year or so. He says many across campus have taken an interest in it, including those who study the psychology of learning. He is also being careful not to develop the software in such a way that the technology itself hinders the learning process.
“The beauty of the software we have developed is we can apply it to many different uses, for instance physics, chemistry, mathematical equations, circuits and so on,” Peschel said. “We intend to integrate our core software to solve a wide range of civil engineering problems and the dream is to have it in every classroom, particularly those involving first- and second-year engineering students. However, we don’t think it’s limited to engineering. For example, imagine learning Japanese with this same concept.”
With additional modifications down the road, sketch recognition could be used for online testing as well as homework, with the ability to automatically grade and provide feedback on exam questions. This would allow professors who teach MOOCs (massive open online courses) to handle much larger class sizes.
“Next year, we’re planning to expand this idea to group activities with two students working together to solve a problem on the same tablet,” Rutherford said. “It’s a great adaptation of the ‘think-pair-share’ concept, but now with technology. In a big class, even for professors with TAs, there aren’t enough of us to physically help every group. But if the students all have a tablet that’s giving them feedback, that makes you a clone for every group.”
The SIIP project draws together research in computer science and engineering with the learning sciences and educational implementation research. Peschel has recently been awarded two National Science Foundation grants that will connect his computational research with collaborators in educational technology. Peschel is a co-PI with Emma Mercier from the College of Education and Geoffrey Herman, a visiting professor in the College of Engineering, on an NSF Cyberlearning Future Learning Technologies grant, and with Herman on an NSF Research in Engineering Education (REE) grant.
“Every technology really needs a killer app,” Peschel concluded. “For sketch, that can be education. The technology has developed in such a way, the price point has come down on the tablets, and there are such beautiful interfaces in technology with strong computational power that we can now affordably put a nice piece of software together.”
Top: The PIs on the project are (left to right): Joshua Peschel, Cassandra Rutherford and Megan Konar.