The Articulated Head

Making an Interactive, Attention-Driven Robotic Exhibit

The Articulated Head exhibit was a component of the larger-scale Thinking Head Project [1], and is a recreation of a previous exhibit design project [2] undertaken by my supervisor. As a result, this article also refers to online resources (the numbered links at the end of the References section) regarding the project's history, in addition to academic paper citations.

Introduction

The Articulated Head (AH) is not just a robot-art installation but a public interactive experience. It was the product of a collaboration between roboticists and the Australian performance artist Stelarc. The exhibit was initially designed as a system that uses an attention model to mimic the behaviour of an active listener in interactions between robots and people. It can be broadly described as a monitor displaying a rendering of Stelarc, mounted on a robot arm (a Fanuc LR Mate 200iC) that moves to engage and converse with the public. People could communicate and chat with the system directly via a computer terminal in front of the exhibit. The first iteration of the project was deployed at the Sydney Powerhouse Museum between 2010 and 2012 as a public exhibition within the broader Thinking Head Project.


The Articulated Head under development at the University of Canberra.

Part of the novelty of the original system could be attributed to the Thinking Head Attention Model and Behavioral System (THAMBS) [Kroos 12], which directed the robot to respond to stimuli as an interactive agent in the world. Through THAMBS, the robot could respond to visual, acoustic, and keystroke (via the terminal) stimuli, engaging with and reacting to a dynamic social environment [Kroos 14]. My colleagues and I built upon this original exhibit; my role was to integrate the THAMBS system with a new robot (the UR10) and the ROS stack. An image of the early stage of development in our lab at UC can be seen above. In addition, modern sensors had to be integrated, including Astra RGB-D cameras, microphones, and a LiDAR for safety. This article summarises the technical work conducted, the academic outputs, and the deployment details.

Technical Development

The first step in translating the original exhibit into a more modern computing framework was establishing the ROS stack. Previously, the system used MATLAB for the THAMBS component and a local network with custom drivers to move the robot and process the various sensor data [Herath 10]. For the redeployment, integrating the exhibit into ROS was determined to be the most viable course of action for long-term maintainability and modularity. Work began with the integration of the various hardware components. The exhibit ran on a local network configured as a ROS network. A server rack embedded behind the main body of the exhibit hosted the hardware running THAMBS and the primary ROS system (Kinetic), while a peripheral Raspberry Pi ran the interactive keyboard terminal at the front of the exhibit. The camera and microphone sensors were connected to the primary ROS computer. The UR10 robot was controlled via a network interface and a trajectory controller hosted on the primary ROS PC. The video below shows a trajectory evaluation routine run at Questacon during the preliminary integration phase.
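To make the trajectory-controller arrangement concrete, the minimal sketch below sends a single joint-space goal to a UR10 through a ROS action client, roughly as a test routine might. The action name, joint angles, and timing are assumptions for illustration, not the exhibit's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch of commanding a UR10 through a joint-trajectory action.
# The action name and joint values below are illustrative assumptions; the
# exhibit used the trajectory controller exposed by the UR ROS driver.
import rospy
import actionlib
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

UR10_JOINTS = ['shoulder_pan_joint', 'shoulder_lift_joint', 'elbow_joint',
               'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']

def send_test_pose():
    client = actionlib.SimpleActionClient('follow_joint_trajectory',
                                          FollowJointTrajectoryAction)
    client.wait_for_server()

    goal = FollowJointTrajectoryGoal()
    goal.trajectory.joint_names = UR10_JOINTS

    point = JointTrajectoryPoint()
    point.positions = [0.0, -1.57, 1.2, -1.2, -1.57, 0.0]  # example pose (rad)
    point.time_from_start = rospy.Duration(3.0)
    goal.trajectory.points.append(point)

    client.send_goal(goal)
    client.wait_for_result()

if __name__ == '__main__':
    rospy.init_node('ah_trajectory_test')
    send_test_pose()
```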

As a safety measure, a LiDAR sensor was embedded into the exhibit floor to ensure the robot would shut down if members of the public stepped over the barrier and into the enclosure. The LiDAR was wired into the UR10's configurable I/O to guarantee a quick response if a safety violation occurred. After the major hardware components were integrated, ROS software development began. The depth cameras detected and tracked people in the environment via the NuiTrack SDK [3]. A ROS node collected this information and published it into THAMBS via a publisher/subscriber pairing. Broadly speaking, all sensor data flowed into THAMBS through similar pairings, one per sensor data stream. As THAMBS was a MATLAB program, MATLAB's ROS extensions were used to interface with the ROS publishers and subscribers.
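The publisher/subscriber pairing pattern can be illustrated with a short relay node like the sketch below. The topic names and the use of a PoseArray message are assumptions for illustration; the actual node wrapped the NuiTrack SDK output before handing it to THAMBS.

```python
#!/usr/bin/env python
# Sketch of the pub/sub pairing described above: a node relays person
# positions from the depth-camera tracker to a topic THAMBS subscribes to.
# Topic names and message type are assumptions for illustration only.
import rospy
from geometry_msgs.msg import PoseArray

class PersonRelay(object):
    def __init__(self):
        # Publisher on the topic side that THAMBS listens to (name assumed).
        self.pub = rospy.Publisher('/thambs/tracked_people', PoseArray, queue_size=10)
        # Subscriber to the tracker output (topic name assumed).
        rospy.Subscriber('/nuitrack/skeletons', PoseArray, self.callback)

    def callback(self, msg):
        # Forward detections unchanged; a real node would filter, transform
        # frames, and attach persistent IDs before handing them to THAMBS.
        self.pub.publish(msg)

if __name__ == '__main__':
    rospy.init_node('person_relay')
    PersonRelay()
    rospy.spin()
```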

THAMBS could broadly be considered the intelligent system behind the robot: it took in sensor data and output motion behaviour as part of the exhibit. Thanks to collaborators, the THAMBS MATLAB code was modified to target the UR10, making the transition to the new robot under ROS relatively smooth. A ROS node received the motion commands output by THAMBS and moved the robot in response to sensor stimuli. The final development was the chat and terminal component, in which participants sent messages to the robot via the keyboard. The robot responded verbally via the rendered head on the monitor attached to the robot's end effector. The ROS framework sent keystroke data to the THAMBS system and displayed it on the GUI so participants could see what they were typing. When a participant submitted their input, the ROS system passed the string to the chatbot module, received the response, and published it to the rendering of Stelarc, which would verbalise the reply.
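As a rough illustration of the chat flow, the sketch below subscribes to the submitted terminal text and publishes a reply for the rendered head to speak. The topic names and the placeholder chatbot call are hypothetical; the real system routed strings through THAMBS and a dedicated chatbot module.

```python
#!/usr/bin/env python
# Sketch of the chat relay described above. Topic names and the chatbot call
# are placeholders, not the exhibit's actual interfaces.
import rospy
from std_msgs.msg import String

def make_handler(speech_pub):
    def handle_input(msg):
        # Placeholder for the chatbot module; the real system queried an
        # external chatbot and returned its text response.
        reply = "You said: %s" % msg.data
        # Publish the reply for the on-screen rendering of Stelarc to speak.
        speech_pub.publish(String(data=reply))
    return handle_input

if __name__ == '__main__':
    rospy.init_node('chat_relay')
    speech_pub = rospy.Publisher('/head/speech', String, queue_size=10)
    # Each submitted line from the terminal arrives as a single String message.
    rospy.Subscriber('/terminal/submitted_text', String, make_handler(speech_pub))
    rospy.spin()
```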

Deployment at Questacon

Several months were dedicated to developing the system before deploying it to Questacon. The system became part of the Born or Built exhibition at Questacon and remained there for around a year across 2019 and 2020, where the public could walk past and interact with it. The videos below show the Articulated Head at Questacon as we set it up on-site, demonstrating the robot moving, reacting, and talking in an in-the-wild setting [4].

Broader Academic Context

Aside from the public deployment and response, the exhibit enabled numerous human-centred studies. For example, a master's student at the University of Canberra used the platform to publish several articles on how the public reacted to and interacted with the system [Gurung 21A, Gurung 21B, Gurung 23]. In addition, we submitted the video presentation seen below to the robot design competition hosted at the International Conference on Social Robotics (ICSR) 2022, where it took second place, as shown in the accolades section.

References

[Herath 10] Herath, Damith C., Zhengzi Zhang, and Nitin Yadav. “Thinking Head Framework: An open architecture for human centred robotics.” 2010 Fifth International Conference on Information and Automation for Sustainability. IEEE, 2010.

[Kroos 14] Kroos, Christian, and Damith C. Herath. “We, robots: Correlated behaviour as observed by humans.” Social Robotics: 6th International Conference, ICSR 2014, Sydney, NSW, Australia, October 27-29, 2014. Proceedings 6. Springer International Publishing, 2014.

[Kroos 12] Kroos, Christian, Damith C. Herath, and Stelarc. “Evoking agency: attention model and behavior control in a robotic art installation.” Leonardo 45.5 (2012): 401-407.

[Gurung 21A] Gurung, Neelu, Damith Herath, and Janie Busby Grant. “Feeling safe: A study on trust with an interactive robotic art installation.” Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. 2021.

[Gurung 21B] Gurung, Neelu, Janie Busby Grant, and Damith Herath. “What’s in a face? The effect of faces in human robot interaction.” 2021 30th IEEE International conference on robot & human interactive communication (RO-MAN). IEEE, 2021.

[Gurung 23] Gurung, Neelu, Janie Busby Grant, and Damith Herath. “The Uncanny Effect of Speech: The Impact of Appearance and Speaking on Impression Formation in Human–Robot Interactions.” International Journal of Social Robotics (2023): 1-16.

[1] - Robotic Art: The Research, Practice, and History. A website made by Dr Herath discussing the intersection of art and engineering.
[2] - The original webpage describing the Thinking Head Project and AH exhibit.
[3] - The NuiTrack software used for pose estimation.
[4] - UC Uncover article outlining the exhibit.