Commit

Update 2024-03-20-vid2real.md
eah13 authored May 1, 2024
1 parent eb909d8 commit a41a13a
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion _posts/2024-03-20-vid2real.md
@@ -9,6 +9,6 @@ paper: true

*The Vid2RealHRI framework was used to design an online study using first-person videos of robots as real-world encounter surrogates. The online study (n = 385) distinguished the within-subjects effects of four robot behavioral conditions on perceived social intelligence and human willingness to help the robot enter an exterior door. A real-world, between-subjects replication (n = 26) using two conditions confirmed the validity of the online study's findings and the sufficiency of the participant recruitment target (22) based on a power analysis of online study results. The Vid2RealHRI framework offers HRI researchers a principled way to take advantage of the efficiency of video-based study modalities while generating directly transferable knowledge of real-world HRI.*

-*Collaborative work with Yao-Cheng Chan, Sadanand Modak, Joydeep Biswas, and Justin Hart.*
+*Collaborative work with my doctoral student [Yao-Cheng Chan](https://yaochengchan.com/), robotics doctoral student [Sadanand Modak](https://scholar.google.com/citations?user=yEPOWSYAAAAJ&hl=en), and Texas Robotics colleagues [Joydeep Biswas](https://www.joydeepb.com/) and [Justin Hart](http://justinhart.net/).*

*This work was supported by NSF grant #2219236 and UT Austin's Good Systems initiative.*
