
11/2/22

By Maria Herd M.A. ’19


Fitness trackers and smartwatches are widely used to monitor health, activity and exercise, but they're pretty sedentary themselves. They stay strapped to your wrist or clipped to your clothing, even though different activities are best monitored from different places on the body: the upper body for breathing, for example, or the wrist for typing or writing.

Now, researchers at the University of Maryland are putting wearable sensors on track to do their best work—literally—with a miniature robotics system capable of traversing numerous locations on the human body.

Their device, called Calico, mimics a toy train by traveling on a cloth track that can run up and down users’ limbs and around their torso, operating independently of external guidance through the use of magnets, sensors and connectors. Their paper describing the project was recently published in the ACM Journal on Interactive, Mobile, Wearable and Ubiquitous Technologies and presented at UBICOMP, a conference on ubiquitous computing.

PHOTO: Closeup of the wearable sensor on a wrist

“Our device is a fast, reliable and precise personal assistant that lays the groundwork for future systems,” said Anup Sathya M.S. ’21, who led Calico’s development for his master’s thesis in human-computer interaction. Sathya is now a first-year Ph.D. student in computer science at the University of Chicago.

Most wearable workout devices are limited in the types of exercise they can monitor, but Calico is versatile. For example, it can track running from a user's arm, then move to the elbow to count push-ups, to the back to monitor planks, and on to the knee to count squats.
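One way to picture this versatility is as a simple dispatch from the device's current on-body location to the activity it should count there. The sketch below is purely illustrative: the location names and exercise labels follow the article's example, but the mapping and function are invented, not taken from the Calico paper.

```python
# Hypothetical sketch of location-aware sensing for a relocatable wearable.
# Locations and activities mirror the article's example; everything else is invented.

EXERCISE_BY_LOCATION = {
    "arm": "running",
    "elbow": "push-ups",
    "back": "planks",
    "knee": "squats",
}

def sensing_mode(location: str) -> str:
    """Return the activity the device should track at this body location."""
    # Unknown locations fall back to an idle mode rather than a wrong counter.
    return EXERCISE_BY_LOCATION.get(location, "idle")
```

The point of the dictionary-based dispatch is that relocating the device changes *what* is measured, not just *where*: each stop on the cloth track selects a different counting routine.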

And unlike other devices, Calico moves quickly and accurately without getting stuck on clothing or at awkward angles. “For the first time, a wearable can traverse the user’s clothing with no restrictions to their movement,” said Huaishu Peng, an assistant professor of computer science who was Sathya’s adviser at UMD.

Peng, who also has an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS), sees a future in which mini wearable devices like Calico will seamlessly integrate with humans for interaction, actuation and sensing.

He recently took Calico in a creative direction by establishing a new collaboration with Jonathan David Martin, a lecturer in Immersive Media Design, and Adriane Fang, an associate professor in the School of Theatre, Dance, and Performance Studies.

The interdisciplinary team is combining dance, music, immersive media, robotics and wearable technology into a novel and compelling series of interactive dance performances that are choreographed in real time through Calico.

First, Peng’s research group programmed Calico to instruct a dancer to execute specific movements using motion and light. Then, using their smartphones, the audience collectively votes on how Calico should instruct the dancer.
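The audience-voting step amounts to a small majority tally: each phone submits a choice, and the most popular one becomes Calico's next cue to the dancer. A minimal sketch, with invented command names (the article does not describe the actual voting protocol):

```python
from collections import Counter

def winning_instruction(votes):
    """Return the most-voted instruction, or None if no votes were cast.

    Ties resolve to the instruction counted first (Counter preserves
    insertion order for equal counts in Python 3.7+).
    """
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

print(winning_instruction(["spin", "leap", "spin", "pause"]))  # prints "spin"
```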

The project is being funded with a $15,000 award from the Arts for All initiative, which leverages the combined power of the arts, technology and social justice to make the University of Maryland a leader in addressing grand challenges.

“The idea is to explore the dynamics and connections between human plus robot and performer plus audience,” said Peng. “In this instance, Calico will act as the ‘mediator,’ broadening art and tech participation and understanding.”

Calico’s original creators include Jiasheng Li, a second-year Ph.D. student in computer science; Ge Gao, an assistant professor in the College of Information Studies with an appointment in UMIACS; and Tauhidur Rahman, an assistant professor in data science at the University of California, San Diego.

VIDEO: Calico: Relocatable On-cloth Wearables with Fast, Reliable, and Precise Locomotion

9/17/21

By Jessica Weiss ’05

Students learning classical violin usually have to wait until a session with a music teacher to get personalized feedback on their playing. Soon they may have a new tool to use between lessons: an app that can observe them play and guide them toward better posture and form—key elements both for sounding their best and avoiding overuse injuries.

Two University of Maryland researchers are drawing on very different academic backgrounds—one in classical violin and music education, the other in robotics and computer science—to develop this virtual “teacher’s aide” system powered by artificial intelligence (AI) technology. In addition to expanding the market for violin instruction, it will allow students who may not have access to private lessons to receive feedback on their playing.

Associate Professor of Violin in the School of Music Irina Muresanu, who is collaborating with Cornelia Fermüller, associate research scientist in UMD’s Institute for Advanced Computer Studies, said the technology will be revolutionary for a field rooted in tradition.

“While I believe that traditional methods are still the best way to pass on to our students the legacy and heritage of the classical music world, I am excited to explore ways in which artificial intelligence can be integrated as a feedback mechanism into daily practice—the central experience of any musician’s life,” she said.

The project is part of Arts for All, a new initiative to expand arts programming across campus and bolster interdisciplinary offerings through a fusion of the arts, technology and social justice.

Muresanu and Fermüller were recently awarded a $115,000 Phase I Maryland Innovation Initiative award by the Maryland Technology Development Corporation to support the project. The award, a partnership between the state of Maryland and five of its public universities, is designed to help propel research ideas from the lab to the commercial market.

An internationally renowned Romanian violinist, Muresanu has spent the last decade working at the intersection of music and technology. She previously collaborated with Amitabh Varshney, dean of the College of Computer, Mathematical, and Natural Sciences, on “Four Strings Around the Virtual World,” which embedded Muresanu’s solo violin project in famous global locales including concert halls, cathedrals and outdoor spaces.

When the COVID-19 pandemic made in-person teaching impossible, Muresanu began seeking new ways to allow violin students to continue learning remotely. Last year, she partnered with UM Ventures, a joint technology commercialization initiative of the University of Maryland, Baltimore and University of Maryland, College Park, to explore high-tech approaches for enhancing remote lessons.

Fermüller was a natural fit for the project. A researcher of computer vision and robotics, she works to enable computers to understand and enhance what people are doing in their daily activities.

In the Autonomy Robotics Cognition Lab, Muresanu and Fermüller, along with computer science Ph.D. student Snehesh Shrestha, are studying human-robot interaction in the context of playing the violin and how to integrate AI into the learning process. The technology they are producing—which will enable computers and phones to derive information from digital video—will let music teachers customize the type and amount of feedback students receive and survey the results.

Fermüller said the technology will be a major step forward in using AI for music education, and could potentially be applied to other instruments.

“The platform we are currently working on provides feedback to students based on their specific needs, and this is very novel,” she said. “I believe this is the future of AI-supported education.”
