Greg Welch, chair of healthcare simulation at the University of Central Florida, was the first plenary speaker at the 2016 ASIS&T Annual Meeting. He discussed the "telepresence valley," the metaphorical gap separating people in a telepresence situation. Welch noted that telepresence is markedly different from telecollaboration: in an ideal telepresence situation, people in different remote locations feel they are together in one common place. One telepresence application, TLE TeachLivE, is a simulated classroom that allows teachers to improve their skills in interacting with students. TLE simulates avatar students that react to the teacher, giving realistic feedback in real time. Welch also discussed applications in the healthcare industry, such as a touch-sensitive mannequin head that he and his team are developing to react to the actions of a provider.
Multi-User Virtual Environments
Bridging the Telepresence Valley
Greg Welch Speaks
by Steve Hardin
The first plenary speaker at the ASIS&T Annual Meeting outlined ways to bridge what he calls the "telepresence valley" – a metaphorical valley separating one person from another in a telepresence situation. Dr. Greg Welch is, among other things, the Florida Hospital Endowed Chair in Healthcare Simulation at the University of Central Florida, as well as co-director of both the UCF Synthetic Reality Laboratory and the Interactive Systems & User Experience Research Cluster at UCF.
Welch began by noting that humans have been representing humans at least since 24,000 to 22,000 BCE, when archaeologists believe the Venus of Willendorf – a model of a woman showing exaggerated breasts and hips – was created. He also showed a Mayan representation of a human head, complete with skin and underlying skull. Walt Disney’s “Carousel of Progress,” featuring robotic representations of actors, was displayed at the New York World’s Fair in 1964.
Welch also showed a video of Geminoid F, a robot that looks like a woman. It was developed by Hiroshi Ishiguro of Osaka University in cooperation with ATR Intelligent Robotics and Communication Laboratories. There is even a new version of Teddy Ruxpin, featuring animated LCD eyes: the bear can blink, turn its irises into snowflakes and transform them into hearts and flowers.
In his 1970 article "The Uncanny Valley," Masahiro Mori argued that as a robot's appearance becomes increasingly humanlike, a human's response to it becomes increasingly positive. However, there is a point at which the near-human appearance instead provokes revulsion. Welch also referenced "On the Psychology of the Uncanny," a 1906 article in which Ernst Jentsch defines the uncanny as doubt about whether an apparently animate being is really alive or, conversely, whether a lifeless object may in fact be animate. Welch said Ishiguro's extension of the uncanny valley – the synergy effect – is the expected balance between appearance and behavior when we recognize creatures.
Telepresence, Welch said, is different from telecollaboration. Sonnenwald et al., writing on Schütz and Luckmann's theory of the life-world, noted that humans in remote locations who are trying to work together are motivated to develop a shared reality. A virtual human's controlling agent can be either a human (an avatar) or a computer algorithm (an agent).
TLE TeachLivE is a "flight simulator" for teachers. A virtual classroom appears on a big screen in front of the teacher, and virtual students react to the teacher, giving teachers a chance to improve their skills. The students are avatars, controlled by an "interactor" using a mixture of agent behavior and avatar behavior.
Facebook is active in virtual reality too, having acquired Oculus VR in 2014. On the other hand, Apple CEO Tim Cook has said, "There's no substitute for human contact." Welch said it is unclear what he meant by that; it remains to be seen what Apple will do.
The "telepresence valley" is a metaphorical valley separating one person from another in a telepresence scenario. It is not a valley of revulsion, but more a valley of repulsion, Welch said. One wants the persons separated by distance to feel as if they are together in a common place, yet many factors pull each person back into his or her distinct remote environment.
Welch noted that augmented reality (AR) is trending now. Instead of immersing you in another world, AR adds more information to the world around you. For example, a person’s name may appear when you see him or her. There’s the audio-only approach, too, such as Apple’s Siri or Amazon’s Echo.
Physical-virtual (PV) telepresence can extend healthcare. Welch shared a vision in which a woman confined to a hospital could connect to a remote robotic avatar at a shopping mall. Such avatars could provide an escape for immobile or confined patients.
Welch also showed a demonstration from the ACM International Symposium on Mixed and Augmented Reality (ISMAR) in which comedian Brian Bradley inhabited a PV avatar. The avatar was sometimes characterized as "creepy," but people seemed engaged, holding substantive back-and-forth exchanges. Visitors appeared to follow the avatar's gaze, a natural interaction behavior that Welch found encouraging and that shows why physical presence is important. Welch says innovators in this area want a person to act as if he or she really were with another.
Welch has recently turned his attention to increasing social/co-presence via indirect effects, a mechanism for bridging beyond direct human-human factors. In one experiment involving a virtual human and a real confederate, seeing two avatars speaking to each other as a person walks in affected how the newcomer interacted with them. In another experiment, a table was instrumented so that it would wobble; the wobble could be initiated by either a real human or a virtual human. Even subtle wobbling enhances the shared sense of social/co-presence.
"What's next?" Welch asked. One promising area is medicine. Currently, the faces of medical mannequins do not change no matter what the healthcare provider is doing. Welch and his team are developing a touch-sensitive head that changes in response to the provider's actions. They are also working on expanding the concept to the entire body: you could walk up to a robotic patient that could turn and talk with you, even changing the temperature in its hands to simulate symptoms.
Resources Mentioned in the Article
Guizzo, E. (2010, April 4). Meet Geminoid F, a smiling female android. IEEE Spectrum. Retrieved from http://spectrum.ieee.org/automaton/robotics/humanoids/040310-geminoid-f-hiroshi-ishiguro-unveils-new-smiling-female-android
Moon, M. (2016, September 30). Teddy Ruxpin returns with animated LCD eyes. MSN.com. Retrieved from http://www.msn.com/en-us/lifestyle/shopping-toys/teddy-ruxpin-returns-with-animated-lcd-eyes/ar-BBwOPAc
Mori, M. (2012, June 12). The uncanny valley. IEEE Spectrum. Retrieved from http://spectrum.ieee.org/automaton/robotics/humanoids/the-uncanny-valley
Jentsch, E. (n.d.). On the psychology of the uncanny (R. Sellars, Trans.). Originally published as Zur Psychologie des Unheimlichen in 1906. Retrieved from http://www.art3idea.psu.edu/locus/Jentsch_uncanny.pdf
Minato, T., Shimada, M., Ishiguro, H., & Itakura, S. (2004). Development of an android robot for studying human-robot interaction. Retrieved from http://www.geminoid.jp/~minato/papers/Minato04b.pdf
Sonnenwald, D. H., Whitton, M. C., & Maglaughlin, K. L. (2003). Evaluating a scientific collaboratory: Results of a controlled experiment. ACM Transactions on Computer-Human Interaction, 10(2), 150-176.
TeachLivE: http://teachlive.org/
Oculus: https://www.oculus.com/
Greg Welch holds the Florida Hospital Endowed Chair in Healthcare Simulation at the University of Central Florida. He can be reached at welch<at>ucf.edu
Steve Hardin is public services librarian, Cunningham Memorial Library, Indiana State University. He can be reached at Steve.Hardin<at>indstate.edu