Imitation of Joint Attention in Human-Robot Interaction (HRI) during Two-Matchstick Problem Solving
- Authors: Stolyarova A.N.¹,²
Affiliations:
- ¹ Russian Presidential Academy of National Economy and Public Administration
- ² National Research Center “Kurchatov Institute”
- Issue: No 1 (2026)
- Pages: 27-36
- Section: Articles
- URL: https://vektornaukipedagogika.ru/jour/article/view/1292
- DOI: https://doi.org/10.18323/3034-2996-2026-1-64-3
- ID: 1292
Abstract
Problem. In human-robot interaction (HRI), despite the development of anthropomorphic robots capable of initiating joint attention, it remains unclear how different types of cues (gesture plus gaze vs. gaze only) and their accuracy influence automatic human following and the subjective evaluation of the interaction.

Aim. To compare the effectiveness of different types of cues from an anthropomorphic robot (pointing gestures combined with gaze vs. gaze-only cues) in a task requiring joint attention, and to assess the influence of cue accuracy on participants’ behaviour.

Methods. The study involved 43 students of RANEPA (Russian Presidential Academy of National Economy and Public Administration): 38 females and 5 males aged 19 to 27 years (M=20.51; SD=1.82). The effectiveness of the robot’s cues in each condition was assessed by counting the participant moves that followed the robot’s cues and coincided with the cue direction. To study participants’ reactions to the cues, a questionnaire based on Danek’s metacognitive scales was administered after each task.

Results. The robot proved able to imitate the process of joint attention while solving problems with the participant. The hypothesis that robot cues combining pointing gestures with gaze and head movement are more effective than gaze-only cues was confirmed, as was the hypothesis that correct cues are more effective than incorrect ones.

Conclusions. Participants paid attention to the robot’s cues and attempted to follow them under both correct- and incorrect-cue conditions; however, with correct cues the percentage of response attempts coinciding with the cue direction was significantly higher than with incorrect cues.
About the authors
Anastasia N. Stolyarova
Russian Presidential Academy of National Economy and Public Administration; National Research Center “Kurchatov Institute”
Author for correspondence.
Email: anastasiyas050298@gmail.com
ORCID iD: 0009-0005-7155-0426
Postgraduate student, research engineer.
Russian Federation: 119571, Moscow, Prospekt Vernadskogo, 82; 123182, Moscow, Academician Kurchatov Square, 1.