Alumni Magazine

Seeing stars — and atoms, and everything in between

An interdisciplinary team is using virtual reality to help students understand scale.

Every day, we are surrounded by things we can’t see — our individual cells, their molecules, the atoms making up those molecules. Then there are planets and stars, which are too far away for us to grasp just how large they really are.

Understanding the relative size of things too big or too small for people to see or interact with is a foundational skill, one that affects how well students grasp later concepts in science, technology, engineering and mathematics (STEM) classes.

“While this phenomenon is everywhere, there’s research data showing that, unfortunately, learners of all ages, and we all do this, hold on to inaccurate ideas about sizes of scientifically relevant entities,” said Karen Chen, assistant professor in the Edward P. Fitts Department of Industrial and Systems Engineering.

That’s where virtual reality (VR) comes in — if students are able to experience objects that are very small or very large in comparison to their own bodies, faculty members at NC State think their understanding of scale and relative size will improve.

Chen is the principal investigator (PI) of a $1.3 million grant from the National Science Foundation, “Virtual Reality to Improve Students’ Understanding of the Extremes of Scale in STEM.” Cesar Delgado, associate professor in the College of Education, and Matthew Peterson, assistant professor in the College of Design, are the co-PIs. Graduate students from each college will be assisting with the development and assessment.

The team is developing a program called Scale Worlds, creating 31 unique scientific entities that differ in size by powers of 10. In the immersive experience, a student will first see a human-scale scenario, with a human about their size and other entities at their corresponding sizes. If students use the navigation panel to shrink to a tenth of their height, they will see a chipmunk that appears to be about their size, with a human that now looks like a giant. After shrinking further, they'll feel the size of the honey bee beside them, with the chipmunk towering overhead and a flea the size of a table tennis ball nearby.
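The arithmetic behind those shifts can be sketched in a few lines. This is a minimal illustration of the powers-of-10 idea, not code from the Scale Worlds project, and the sizes are rough placeholder values:

```python
# Rough characteristic sizes in meters (illustrative values only,
# not the measurements used in Scale Worlds).
ENTITY_SIZES_M = {
    "human": 1.7,
    "chipmunk": 0.17,
    "honey bee": 0.017,
    "flea": 0.0017,
}

def apparent_ratio(entity: str, student_exponent: int) -> float:
    """Entity size relative to a student scaled to 1.7 * 10**student_exponent meters.

    A ratio of 1.0 means the entity looks roughly the student's own size.
    """
    student_height_m = 1.7 * 10 ** student_exponent
    return ENTITY_SIZES_M[entity] / student_height_m

# At full size (exponent 0) the chipmunk is a tenth of the student's height;
# after shrinking one power of ten, it appears human-sized; two more powers
# down, the flea does.
print(apparent_ratio("chipmunk", 0))   # ≈ 0.1
print(apparent_ratio("chipmunk", -1))  # ≈ 1.0
print(apparent_ratio("flea", -3))      # ≈ 1.0
```

Each shrink step divides the student's height by ten, so an entity one power of ten smaller "grows" to fill the student's field of view, which is exactly the chipmunk-to-giant effect described above.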

These entities currently range from a molecule to the ocean liner SS United States. They will eventually go down to an atomic nucleus and all the way beyond stars to the Cat’s Eye Nebula. In an early prototype, students are able to see one larger and two smaller entities at a time.

From left to right, Matthew Peterson, Cesar Delgado and Karen Chen.

The project is interdisciplinary by its very nature. “We believe that this won’t happen unless all three pieces are together,” Chen said. “We really need the expertise in education, and we also want to make sure it’s well-designed and aesthetically pleasing.”

The study will assess how well middle school and undergraduate students understand scale cognition after using Scale Worlds. Middle school is when science curriculums introduce concepts and entities that are too small to see, explained Delgado. Scale cognition is a cross-cutting concept, meaning it is critical to students’ success in a range of STEM subjects.

“Research shows that students have trouble in this area,” Delgado said. “Anything we can do to improve students’ understanding of these objects, and then in turn, the cross-cutting concept of scale, proportion and quantity, there will be a more robust understanding from students to make those interdisciplinary connections.”

The navigational tools in Scale Worlds will show scientific exponential notation, helping students think about powers of 10 while shrinking or growing along with these entities.
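As a hypothetical sketch of what such a readout might look like (this is not the actual Scale Worlds interface), a navigation panel could label each step with its power of ten:

```python
def scale_label(exponent: int) -> str:
    """Format the student's current scale factor in exponential notation.

    Hypothetical helper for illustration; exponent 0 is full size,
    -1 is one-tenth size, and so on.
    """
    factor = 10.0 ** exponent
    return f"10^{exponent} (x{factor:g} of full size)"

for exp in (0, -1, -2):
    print(scale_label(exp))
# 10^0 (x1 of full size)
# 10^-1 (x0.1 of full size)
# 10^-2 (x0.01 of full size)
```

Pairing each shrink or grow step with its exponent in this way is what lets students connect the bodily experience of scaling to the notation they see in class.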

The team wants the entities to look as realistic as possible and to ensure that the distinct scale worlds are easy to navigate.

“I think it will be great coming from the design side for Karen and me to work on the visual and technological development because we will have a very different way of seeing those things, and I expect it to be complementary,” Peterson said.

Peterson is interested in moments like this: at certain points, a student growing to the size of a massive star would, if the growth were real, be expanding faster than the speed of light. Multimodal cues, like sound effects or a change in background color, can help students recognize those thresholds.

“From a STEM-education perspective, a big part of it is trying to find these meaningful moments to make these transitions apparent to students so they can have a rich experience that they can map on to the scientific representations and concepts they’re seeing,” Peterson said.

For the research, there will be three different ways in which students experience Scale Worlds: a full immersion in the Cave Automatic Virtual Environment (CAVE) on Centennial Campus, head-mounted VR displays and a two-dimensional desktop version.

Each requires its own design approach, and each has its own limitations. The researchers expect the CAVE to improve student learning the most and the desktop version to be the least engaging, but the pattern could reverse in terms of overall impact, since the desktop version is by far the most accessible. The team wants to develop a version of Scale Worlds that is publicly available.

With advances in the technology, there is a lot of potential to use VR in STEM education, especially when approached from an interdisciplinary perspective, which Delgado, Peterson and Chen agree has been critical to this project.

One of the most natural ways of learning is to experience something firsthand, which isn’t normally possible with an atom or a planet. Scale Worlds brings these extremely small or large entities to where students can understand them best: right in front of their eyes.