Finally, this paper got a Best Paper Award 🎉! Both lead authors Romain Nith and Yun Ho, and co-author Shan-Yuan Teng will be at #CHI2026.
Read the paper: https://embodied-ai.tech/static/pdfs/EmbodiedAI_CHI2026.pdf
Watch the video: https://youtube.com/watch?v=pJM2Z8mmwAw
Get the code: https://github.com/humancomputerintegration/EmbodiedAI
or DM us! 7/7
As AI agents enter the physical world via smart glasses and robots, they need to interpret users' ambiguous references, including imprecise pointing and underspecified language like "what's that?" in complex real-world settings.
We introduce Uncertain Pointer at #CHI2026 to communicate system uncertainty with AR visualizations and facilitate user disambiguation!
Led by the amazing Ching-Yi Tsai w/ Nicole Tacconi & Andy Wilson
1/4 🧵

Exploring the design of digital musical instruments with bidirectional tactile interaction through a user study. This study investigates the design and interaction implications that emerge when traditional unidirectional mapping paradigms are swapped out for bidirectional haptic interaction.
Text or video is not the only form AI can take.
In https://embodied-ai.tech (#CHI2026, Best Paper 🎉) we create an embodied AI that acts via muscle stimulation to perform physical tasks: e.g., place a bike on a bus rack and more!
https://www.youtube.com/watch?v=pJM2Z8mmwAw
... by my PhD students Yun Ho (https://www.yunho.org) and Romain Nith (https://romainnith.com)