ATLAS research helps define the future of human-computer interaction

Helping robots behave tactfully in group situations, pinpointing ways social media can avoid reminding the bereaved of their losses, blending modern technology with ancient weaving practices to improve smart textiles, encouraging visually impaired children and sighted family members to learn Braille together through tangible blocks and computer games—these are some of the topics covered in the nine papers and two workshops by researchers at CU Boulder's ATLAS Institute that were accepted to CHI 2020, the world's preeminent conference for the field of human-computer interaction.
Like so many other events, CHI 2020, also known as ACM's Conference on Human Factors in Computing Systems, isn't taking place this year, but the proceedings are published and faculty and students remain tremendously proud of their contributions. Commenting on their work, ATLAS Director Mark Gross said, "The interactions we all have with hardware and software range from the absurd to the sublime. The field of human-computer interaction has more impact today than ever before, and ATLAS students and faculty are contributing at the highest levels. I'm immensely proud of this work."
Researchers in the Unstable Design Lab authored a remarkable four of the nine papers accepted to the conference, two of which earned honorable mention, an accolade reserved for the top 5 percent of accepted conference papers. The THING, Superhuman Computing, Living Matter, ACME and IRON labs also had papers accepted to the conference.
"Each of these papers is unique and forward-thinking," saidLaura Devendorf, director of the Unstable Design Lab, of the researchers' papers."They shownew ways of both designing, engaging, but also recycling wearable tech devices. They not only present interesting design work, but present it in a way that ties in theories and practices from inside and outside our research community: from design for disassembly to ASMR channels onYouTube."
CHI 2020 was scheduled to take place April 25–30 in Hawaii. "I'm particularly disappointed for our students. It's a big opportunity for them and their careers to get that kind of exposure," said Devendorf.
In all, CHI 2020 received 3,126 submissions and accepted 760. In 2019, CHI accepted five ATLAS papers, including three from the Unstable Design Lab and two from the Superhuman Computing Lab.
CHI 2020 papers, position papers and workshops by ATLAS faculty and students

Unstable Design Lab
[Honorable Mention Award]
Laura Devendorf (ATLAS/INFO Faculty), Katya Arquilla (Aerospace PhD Student), Sandra Wirtanen, Allison Anderson (Aerospace Faculty), Steven Frost (Media Studies Faculty)
By broadening the idea of who and what is considered “technical,” this paper examines the ways HCI practitioners, engineers and craftspeople can productively collaborate.
[Honorable Mention Award]
Laura Devendorf (ATLAS/INFO Faculty), Kristina Andersen, Aisling Kelliher
How can we design for difficult emotional experiences without reducing a person's experience? In this paper, three researchers design objects that illustrate their personal experiences as mothers to gain a deeper understanding of their individual struggles.
Shanel Wu (ATLAS), Laura Devendorf (ATLAS/INFO)
Mindful of the massive waste streams created by digital electronics and textiles, HCI researchers address sustainability and waste in smart textile development by designing smart textile garments with reuse in mind.
Josephine Klefeker (ATLAS, TAM undergraduate), Libi Striegl (Intermedia Art, Writing and Performance), Laura Devendorf (ATLAS/INFO)
Researchers introduce the online subculture of autonomous sensory meridian response (ASMR) videos, in which people slowly interact with objects and whisper into microphones to trigger a tingling bodily sensation in viewers and listeners, as a source of inspiration for wearables and experiences of enchantment that cultivate deeper connections with our mundane, everyday environments.
IRON Lab
Hooman Hedayati (PhD student, Computer Science), James Kennedy, Daniel Szafir
While humans learn to interpret social situations and adjust their behavior accordingly, robots must be explicitly programmed to do so. This paper explores ways for robots to detect and predict the positions of individuals in human conversational groups in order to more fluidly interact and participate in conversation with them.
THING Lab & ACME Lab
Ryo Suzuki, Hooman Hedayati (both PhD students, Computer Science), Clement Zheng