<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.2 20190208//EN" "JATS-journalpublishing1.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ali="http://www.niso.org/schemas/ali/1.0/" article-type="research-article" dtd-version="1.2" xml:lang="en"><front><journal-meta><journal-id journal-id-type="publisher-id">Evidence-based education studies</journal-id><journal-title-group><journal-title xml:lang="en">Evidence-based education studies</journal-title><trans-title-group xml:lang="ru"><trans-title>Доказательная педагогика, психология</trans-title></trans-title-group></journal-title-group><issn publication-format="print">3034-2996</issn><issn publication-format="electronic">3034-4220</issn><publisher><publisher-name xml:lang="en">Togliatti State University</publisher-name></publisher></journal-meta><article-meta><article-id pub-id-type="publisher-id">1292</article-id><article-id pub-id-type="doi">10.18323/3034-2996-2026-1-64-3</article-id><article-categories><subj-group subj-group-type="toc-heading" xml:lang="en"><subject>Articles</subject></subj-group><subj-group subj-group-type="toc-heading" xml:lang="ru"><subject>Статьи</subject></subj-group><subj-group subj-group-type="article-type"><subject>Research Article</subject></subj-group></article-categories><title-group><article-title xml:lang="en">Imitation of Joint Attention in Human-Robot Interaction (HRI) during Two-Matchstick Problem Solving</article-title><trans-title-group xml:lang="ru"><trans-title>Имитация совместного внимания при взаимодействии робота и человека (HRI) при решении двухспичечных задач</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author"><contrib-id contrib-id-type="orcid">https://orcid.org/0009-0005-7155-0426</contrib-id><name-alternatives><name xml:lang="en"><surname>Stolyarova</surname><given-names>Anastasia N.</given-names></name><name xml:lang="ru"><surname>Столярова</surname><given-names>Анастасия 
Николаевна</given-names></name></name-alternatives><address><country country="RU">Russian Federation</country></address><bio xml:lang="en"><p>postgraduate student, research engineer.</p></bio><bio xml:lang="ru"><p>аспирант, инженер-исследователь.</p></bio><email>anastasiyas050298@gmail.com</email><xref ref-type="aff" rid="aff1"/><xref ref-type="aff" rid="aff2"/></contrib></contrib-group><aff-alternatives id="aff1"><aff><institution xml:lang="en">Russian Presidential Academy of National Economy and Public Administration</institution></aff><aff><institution xml:lang="ru">Российская академия народного хозяйства и государственной службы
при Президенте Российской Федерации</institution></aff></aff-alternatives><aff-alternatives id="aff2"><aff><institution xml:lang="en">National Research Center “Kurchatov Institute”</institution></aff><aff><institution xml:lang="ru">Национальный исследовательский центр «Курчатовский институт»</institution></aff></aff-alternatives><pub-date date-type="pub" iso-8601-date="2026-03-31" publication-format="electronic"><day>31</day><month>03</month><year>2026</year></pub-date><issue>1</issue><issue-title xml:lang="en"/><issue-title xml:lang="ru"/><fpage>27</fpage><lpage>36</lpage><history><date date-type="received" iso-8601-date="2026-03-31"><day>31</day><month>03</month><year>2026</year></date><date date-type="accepted" iso-8601-date="2026-03-31"><day>31</day><month>03</month><year>2026</year></date></history><permissions><copyright-statement xml:lang="en">Copyright © 2026, Stolyarova A.N.</copyright-statement><copyright-statement xml:lang="ru">Copyright © 2026, Столярова А.Н.</copyright-statement><copyright-year>2026</copyright-year><copyright-holder xml:lang="en">Stolyarova A.N.</copyright-holder><copyright-holder xml:lang="ru">Столярова А.Н.</copyright-holder><ali:free_to_read xmlns:ali="http://www.niso.org/schemas/ali/1.0/"/><license><ali:license_ref xmlns:ali="http://www.niso.org/schemas/ali/1.0/">https://creativecommons.org/licenses/by/4.0</ali:license_ref></license></permissions><self-uri xlink:href="https://vektornaukipedagogika.ru/jour/article/view/1292">https://vektornaukipedagogika.ru/jour/article/view/1292</self-uri><abstract xml:lang="en"><p><bold>Problem.</bold> In the field of human-robot interaction (HRI), despite the development of anthropomorphic robots capable of initiating joint attention, it remains unclear how different types of cues (gesture+gaze vs. gaze only) and their accuracy influence automatic human following and subjective evaluation of the interaction. 
<bold>Aim.</bold> To compare the effectiveness of different types of cues from an anthropomorphic robot (pointing gestures combined with gaze and gaze-only cues) in a task requiring joint attention, as well as to assess the influence of cue accuracy on participants’ behaviour. <bold>Methods.</bold> The study involved 43 students from RANEPA (Russian Presidential Academy of National Economy and Public Administration): 38 females and 5 males aged 19 to 27 years (M=20.51; SD=1.82). The number of participant movements following the robot’s cues and coinciding with the cue direction was evaluated to assess the effectiveness of the robot’s cues in each condition. To study participants’ reactions to the cues after each task, a questionnaire based on Danek’s metacognitive scales was used. <bold>Results.</bold> The results demonstrated the robot’s ability to imitate the process of joint attention during problem-solving with the participant. The hypothesis regarding the greater effectiveness of robot cues using pointing gestures combined with gaze and head movement compared to gaze-only cues was confirmed. The hypothesis regarding the greater effectiveness of correct cues compared to incorrect robot cues was confirmed. <bold>Conclusions.</bold> The robot’s ability to imitate the joint attention process during problem-solving with the participant was demonstrated; participants paid attention to the robot’s cues and attempted to follow them in both correct and incorrect cue conditions. However, in the condition with correct cues, the percentage of response attempts coinciding with the cue direction was significantly higher than in the condition with incorrect cues.</p></abstract><trans-abstract xml:lang="ru"><p><bold>Проблема.</bold> В области человеко-роботного взаимодействия (Human-Robot Interaction, HRI) несмотря на развитие антропоморфных роботов, способных инициировать совместное внимание, остается неясным, как различные типы подсказок (жест + взгляд vs. 
только взгляд) и их правильность влияют на автоматическое следование человека и его субъективную оценку взаимодействия. <bold>Цель.</bold> Сравнить эффективность различных типов подсказок антропоморфного робота (указательные жесты, совмещенные со взглядом, и подсказки только взглядом) в задаче, требующей совместного внимания, а также оценить влияние правильности подсказок на поведение испытуемых. <bold>Методы.</bold> В исследовании приняли участие 43 студента РАНХиГС: 38 девушек и 5 юношей в возрасте от 19 до 27 лет (M=20,51; SD=1,82). Количество движений испытуемых, следующих за подсказками робота и совпадающих с направлением подсказки, оценивалось для определения эффективности подсказок робота в каждом из условий. Для изучения реакции испытуемых на подсказки после каждой задачи применялся опросник, созданный на основе метакогнитивных шкал Данек. <bold>Результаты.</bold> Результаты продемонстрировали способность робота имитировать процесс совместного внимания при решении задач с испытуемым. Гипотеза о большей эффективности подсказок робота с помощью указательных жестов, объединенных с перемещением взгляда и головы робота, по сравнению с подсказками только взглядом подтвердилась. Гипотеза о большей эффективности правильных подсказок по сравнению с неправильными подсказками робота подтвердилась. <bold>Выводы.</bold> Способность робота имитировать процесс совместного внимания при решении задач с испытуемым была продемонстрирована: испытуемые обращали внимание на подсказки робота и старались следовать им как в условиях правильных подсказок, так и неправильных. 
Однако в условиях с правильными подсказками процент совпадающих по направлению с подсказками попыток решения был значительно выше, чем в условиях с неправильными подсказками.</p></trans-abstract><kwd-group xml:lang="en"><kwd>Human-Robot Interaction</kwd><kwd>HRI</kwd><kwd>anthropomorphic robots</kwd><kwd>joint attention</kwd><kwd>gaze and gesture cues</kwd><kwd>modified Knoblich tasks</kwd><kwd>Danek’s metacognitive scales</kwd></kwd-group><kwd-group xml:lang="ru"><kwd>взаимодействие человека и робота</kwd><kwd>Human-Robot Interaction</kwd><kwd>HRI</kwd><kwd>антропоморфные роботы</kwd><kwd>совместное внимание</kwd><kwd>подсказки взглядом и жестами</kwd><kwd>модифицированные задачи Кноблиха</kwd><kwd>метакогнитивные шкалы Данек</kwd></kwd-group><funding-group><funding-statement xml:lang="en">This work was carried out within the state assignment to the National Research Center “Kurchatov Institute”. The development of the methodology, including the experimental conditions, gestures, and communication strategies of the anthropomorphic interface, was carried out as part of the state assignment of the National Research Center “Kurchatov Institute”. The statistical analysis of the experimental results, including comparisons of the experimental conditions and the analysis of the participants' subjective and behavioral assessments, was carried out as part of the Russian Science Foundation grant No. 25-78-10154, https://rscf.ru/project/25-78-10154/.</funding-statement><funding-statement xml:lang="ru">Работа выполнена в рамках государственного задания НИЦ «Курчатовский институт». Разработка методики, в том числе экспериментальных условий, жестов и стратегий общения антропоморфного интерфейса, была выполнена в рамках государственного задания НИЦ «Курчатовский институт». 
Статистический анализ результатов эксперимента, включающий сравнения экспериментальных условий, анализ субъективных и поведенческих оценок участников, выполнялся в рамках гранта Российского научного фонда № 25-78-10154, https://rscf.ru/project/25-78-10154/.</funding-statement></funding-group></article-meta></front><body></body><back><ref-list><ref id="B1"><label>1.</label><citation-alternatives><mixed-citation xml:lang="en">Morandini S., Curro F., Parlangeli O., Pietrantoni L. Collaborative Robots Adapting Their Behavior Based on Workers’ Psychological States: A Systematic Scoping Review. Human Behavior and Emerging Technologies, 2025, vol. 2025, no. 1, article number 6361777. DOI: 10.1155/hbe2/6361777.</mixed-citation><mixed-citation xml:lang="ru">Morandini S., Curro F., Parlangeli O., Pietrantoni L. Collaborative Robots Adapting Their Behavior Based on Workers’ Psychological States: A Systematic Scoping Review // Human Behavior and Emerging Technologies. 2025. Vol. 2025. № 1. Article number 6361777. DOI: 10.1155/hbe2/6361777.</mixed-citation></citation-alternatives></ref><ref id="B2"><label>2.</label><citation-alternatives><mixed-citation xml:lang="en">Garcia-Martinez J., Gamboa-Montero J.J., Castillo J.C., Castro-Gonzalez A. Analyzing the Impact of Responding to Joint Attention on the User Perception of the Robot in Human-Robot Interaction. Biomimetics, 2024, vol. 9, no. 12, article number 769. DOI: 10.3390/biomimetics9120769.</mixed-citation><mixed-citation xml:lang="ru">Garcia-Martinez J., Gamboa-Montero J.J., Castillo J.C., Castro-Gonzalez A. Analyzing the Impact of Responding to Joint Attention on the User Perception of the Robot in Human-Robot Interaction // Biomimetics. 2024. Vol. 9. № 12. Article number 769. DOI: 10.3390/biomimetics9120769.</mixed-citation></citation-alternatives></ref><ref id="B3"><label>3.</label><citation-alternatives><mixed-citation xml:lang="en">Galin R.R., Serebrennyy V.V., Tevyashov G.K., Shirokiy A.A. 
Human-robot interaction in collaborative robotic systems. Proceedings of Southwest State University, 2020, vol. 24, no. 4, pp. 180–199. DOI: 10.21869/2223-1560-2020-24-4-180-199.</mixed-citation><mixed-citation xml:lang="ru">Галин Р.Р., Серебренный В.В., Тевяшов Г.К., Широкий А.А. Взаимодействие человека и робота в коллаборативных робототехнических системах // Известия Юго-Западного государственного университета. 2020. Т. 24. № 4. С. 180–199. DOI: 10.21869/2223-1560-2020-24-4-180-199.</mixed-citation></citation-alternatives></ref><ref id="B4"><label>4.</label><citation-alternatives><mixed-citation xml:lang="en">Cochet H., Guidetti M. Contribution of Developmental Psychology to the Study of Social Interactions: Some Factors in Play, Joint Attention and Joint Action and Implications for Robotics. Frontiers in Psychology, 2018, vol. 9, article number 1992. DOI: 10.3389/fpsyg.2018.01992.</mixed-citation><mixed-citation xml:lang="ru">Cochet H., Guidetti M. Contribution of Developmental Psychology to the Study of Social Interactions: Some Factors in Play, Joint Attention and Joint Action and Implications for Robotics // Frontiers in Psychology. 2018. Vol. 9. Article number 1992. DOI: 10.3389/fpsyg.2018.01992.</mixed-citation></citation-alternatives></ref><ref id="B5"><label>5.</label><citation-alternatives><mixed-citation xml:lang="en">Urakami J., Seaborn K. Nonverbal Cues in Human–Robot Interaction: A Communication Studies Perspective. ACM Transactions on Human-Robot Interaction, 2023, vol. 12, no. 2, pp. 1–21. DOI: 10.1145/3570169.</mixed-citation><mixed-citation xml:lang="ru">Urakami J., Seaborn K. Nonverbal Cues in Human–Robot Interaction: A Communication Studies Perspective // ACM Transactions on Human-Robot Interaction. 2023. Vol. 12. № 2. P. 1–21. 
DOI: 10.1145/3570169.</mixed-citation></citation-alternatives></ref><ref id="B6"><label>6.</label><citation-alternatives><mixed-citation xml:lang="en">Jing Yang, Barragan J.A., Farrow J.M., Sundaram C.P., Wachs J.P., Denny Yu. An Adaptive Human-Robotic Interaction Architecture for Augmenting Surgery Performance Using Real-Time Workload Sensing – Demonstration of a Semi-Autonomous Suction Tool. Human Factors, 2024, vol. 66, no. 4, pp. 1081–1102. DOI: 10.1177/00187208221129940.</mixed-citation><mixed-citation xml:lang="ru">Jing Yang, Barragan J.A., Farrow J.M., Sundaram C.P., Wachs J.P., Denny Yu. An Adaptive Human-Robotic Interaction Architecture for Augmenting Surgery Performance Using Real-Time Workload Sensing – Demonstration of a Semi-Autonomous Suction Tool // Human Factors. 2024. Vol. 66. № 4. P. 1081–1102. DOI: 10.1177/00187208221129940.</mixed-citation></citation-alternatives></ref><ref id="B7"><label>7.</label><citation-alternatives><mixed-citation xml:lang="en">Salminen-Saari J.F.A., Moreno-Esteva E.G., Haataja E., Toivanen M., Hannula M.S., Laine A. Phases of collaborative mathematical problem solving and joint attention: a case study utilizing mobile gaze tracking. ZDM Mathematics Education, 2021, vol. 53, pp. 771–784. DOI: 10.1007/s11858-021-01280-z.</mixed-citation><mixed-citation xml:lang="ru">Salminen-Saari J.F.A., Moreno-Esteva E.G., Haataja E., Toivanen M., Hannula M.S., Laine A. Phases of collaborative mathematical problem solving and joint attention: a case study utilizing mobile gaze tracking // ZDM Mathematics Education. 2021. Vol. 53. P. 771–784. DOI: 10.1007/s11858-021-01280-z.</mixed-citation></citation-alternatives></ref><ref id="B8"><label>8.</label><citation-alternatives><mixed-citation xml:lang="en">Anzalone S.M., Xavier J., Boucenna S., Billeci L., Narzisi A., Muratori F., Cohen D., Chetouani M. Quantifying patterns of joint attention during human-robot interactions: An application for autism spectrum disorder assessment. 
Pattern Recognition Letters, 2018, vol. 118, pp. 42–50. DOI: 10.1016/j.patrec.2018.03.007.</mixed-citation><mixed-citation xml:lang="ru">Anzalone S.M., Xavier J., Boucenna S., Billeci L., Narzisi A., Muratori F., Cohen D., Chetouani M. Quantifying patterns of joint attention during human-robot interactions: An application for autism spectrum disorder assessment // Pattern Recognition Letters. 2018. Vol. 118. P. 42–50. DOI: 10.1016/j.patrec.2018.03.007.</mixed-citation></citation-alternatives></ref><ref id="B9"><label>9.</label><citation-alternatives><mixed-citation xml:lang="en">Kumazaki H., Yoshikawa Y., Yoshimura Y., Ikeda T., Hasegawa C., Saito D.N., Tomiyama S., Kyung-min An, Shimaya J., Ishiguro H., Matsumoto Y., Minabe Y., Kikuchi M. The impact of robotic intervention on joint attention in children with autism spectrum disorders. Molecular Autism, 2018, vol. 9, article number 46. DOI: 10.1186/s13229-018-0230-8.</mixed-citation><mixed-citation xml:lang="ru">Kumazaki H., Yoshikawa Y., Yoshimura Y., Ikeda T., Hasegawa C., Saito D.N., Tomiyama S., Kyung-min An, Shimaya J., Ishiguro H., Matsumoto Y., Minabe Y., Kikuchi M. The impact of robotic intervention on joint attention in children with autism spectrum disorders // Molecular Autism. 2018. Vol. 9. Article number 46. DOI: 10.1186/s13229-018-0230-8.</mixed-citation></citation-alternatives></ref><ref id="B10"><label>10.</label><citation-alternatives><mixed-citation xml:lang="en">Kim M., Kwon T., Kim K. Can Human–Robot Interaction Promote the Same Depth of Social Information Processing as Human–Human Interaction? International Journal of Social Robotics, 2017, vol. 10, pp. 33–42. DOI: 10.1007/s12369-017-0428-5.</mixed-citation><mixed-citation xml:lang="ru">Kim M., Kwon T., Kim K. Can Human–Robot Interaction Promote the Same Depth of Social Information Processing as Human–Human Interaction? // International Journal of Social Robotics. 2017. Vol. 10. P. 33–42. 
DOI: 10.1007/s12369-017-0428-5.</mixed-citation></citation-alternatives></ref><ref id="B11"><label>11.</label><citation-alternatives><mixed-citation xml:lang="en">Mwangi E., Barakova E.I., Díaz-Boladeras M., Mallofré A.C., Rauterberg M. Directing Attention Through Gaze Hints Improves Task Solving in Human–Humanoid Interaction. International Journal of Social Robotics, 2018, vol. 10, pp. 343–355. DOI: 10.1007/s12369-018-0473-8.</mixed-citation><mixed-citation xml:lang="ru">Mwangi E., Barakova E.I., Díaz-Boladeras M., Mallofré A.C., Rauterberg M. Directing Attention Through Gaze Hints Improves Task Solving in Human–Humanoid Interaction // International Journal of Social Robotics. 2018. Vol. 10. P. 343–355. DOI: 10.1007/s12369-018-0473-8.</mixed-citation></citation-alternatives></ref><ref id="B12"><label>12.</label><citation-alternatives><mixed-citation xml:lang="en">Chevalier P., Kompatsiari K., Ciardo F., Wykowska A. Examining joint attention with the use of humanoid robots-A new approach to study fundamental mechanisms of social cognition. Psychonomic Bulletin and Review, 2020, vol. 27, no. 2, pp. 217–236. DOI: 10.3758/s13423-019-01689-4.</mixed-citation><mixed-citation xml:lang="ru">Chevalier P., Kompatsiari K., Ciardo F., Wykowska A. Examining joint attention with the use of humanoid robots-A new approach to study fundamental mechanisms of social cognition // Psychonomic Bulletin and Review. 2020. Vol. 27. № 2. P. 217–236. DOI: 10.3758/s13423-019-01689-4.</mixed-citation></citation-alternatives></ref><ref id="B13"><label>13.</label><citation-alternatives><mixed-citation xml:lang="en">Fournier É., Jeoffrion C., Hmedan B., Pellier D., Fiorino H., Landry A. Human-Cobot Collaboration’s Impact on Success, Time Completion, Errors, Workload, Gestures and Acceptability During an Assembly Task. Applied Ergonomics, 2024, vol. 119, article number 104306. 
DOI: 10.1016/j.apergo.2024.104306.</mixed-citation><mixed-citation xml:lang="ru">Fournier É., Jeoffrion C., Hmedan B., Pellier D., Fiorino H., Landry A. Human-Cobot Collaboration’s Impact on Success, Time Completion, Errors, Workload, Gestures and Acceptability During an Assembly Task // Applied Ergonomics. 2024. Vol. 119. Article number 104306. DOI: 10.1016/j.apergo.2024.104306.</mixed-citation></citation-alternatives></ref><ref id="B14"><label>14.</label><citation-alternatives><mixed-citation xml:lang="en">Hoang-Long Cao, Simut R.E., Krepel N., Vanderborght B., Vanderfaeillie J. Could NAO Robot Function as Model Demonstrating Joint Attention Skills for Children with Autism Spectrum Disorder? An Exploratory Study. International Journal of Humanoid Robotics, 2022, vol. 19, no. 4, article number 2240006. DOI: 10.1142/S0219843622400060.</mixed-citation><mixed-citation xml:lang="ru">Hoang-Long Cao, Simut R.E., Krepel N., Vanderborght B., Vanderfaeillie J. Could NAO Robot Function as Model Demonstrating Joint Attention Skills for Children with Autism Spectrum Disorder? An Exploratory Study // International Journal of Humanoid Robotics. 2022. Vol. 19. № 4. Article number 2240006. DOI: 10.1142/S0219843622400060.</mixed-citation></citation-alternatives></ref><ref id="B15"><label>15.</label><citation-alternatives><mixed-citation xml:lang="en">Okafuji Y., Baba J., Nakanishi J., Kuramoto I., Ogawa K., Yoshikawa Y., Ishiguro H. Can a humanoid robot continue to draw attention in an office environment? Advanced Robotics, 2020, vol. 34, no. 14, pp. 931–946. DOI: 10.1080/01691864.2020.1769724.</mixed-citation><mixed-citation xml:lang="ru">Okafuji Y., Baba J., Nakanishi J., Kuramoto I., Ogawa K., Yoshikawa Y., Ishiguro H. Can a humanoid robot continue to draw attention in an office environment? // Advanced Robotics. 2020. Vol. 34. № 14. P. 931–946. 
DOI: 10.1080/01691864.2020.1769724.</mixed-citation></citation-alternatives></ref><ref id="B16"><label>16.</label><citation-alternatives><mixed-citation xml:lang="en">Knoblich G., Ohlsson S., Haider H., Rhenius D. Constraint relaxation and chunk decomposition in insight problem solving. Journal of Experimental Psychology: Learning, Memory, and Cognition, 1999, vol. 25, no. 6, pp. 1534–1555. DOI: 10.1037/0278-7393.25.6.1534.</mixed-citation><mixed-citation xml:lang="ru">Knoblich G., Ohlsson S., Haider H., Rhenius D. Constraint relaxation and chunk decomposition in insight problem solving // Journal of Experimental Psychology: Learning, Memory, and Cognition. 1999. Vol. 25. № 6. P. 1534–1555. DOI: 10.1037/0278-7393.25.6.1534.</mixed-citation></citation-alternatives></ref><ref id="B17"><label>17.</label><citation-alternatives><mixed-citation xml:lang="en">Spiridonov V.F., Erofeeva M.A., Klovayt N.O., Ardislamov V.V., Morozov M.I., Zdilar S. Interactive problem solving revisited: replicating the effects of interactivity using matchstick algebra problems. Psychological studies, 2021, vol. 14, no. 79, pp. 1–46. DOI: 10.54359/ps.v14i79.119.</mixed-citation><mixed-citation xml:lang="ru">Спиридонов В.Ф., Ерофеева М.А., Кловайт Н.О., Ардисламов В.В., Морозов М.И., Здилар С. Репликация эффектов интерактивного решения задач спичечной алгебры // Психологические исследования. 2021. Т. 14. № 79. С. 1–46. DOI: 10.54359/ps.v14i79.119.</mixed-citation></citation-alternatives></ref><ref id="B18"><label>18.</label><citation-alternatives><mixed-citation xml:lang="en">Öllinger M., Jones G., Knoblich G. Heuristics and representational change in two-move matchstick arithmetic tasks. Advances in Cognitive Psychology, 2006, vol. 2, no. 4, pp. 239–253. DOI: 10.2478/v10053-008-0059-3.</mixed-citation><mixed-citation xml:lang="ru">Öllinger M., Jones G., Knoblich G. Heuristics and representational change in two-move matchstick arithmetic tasks // Advances in Cognitive Psychology. 2006. Vol. 2. № 4. P. 239–253. 
DOI: 10.2478/v10053-008-0059-3.</mixed-citation></citation-alternatives></ref><ref id="B19"><label>19.</label><citation-alternatives><mixed-citation xml:lang="en">Scorza Azzará G., Zonca J., Rea F., Joo-Hyun Song, Sciutti A. Collaborating with a robot biases human spatial attention. iScience, 2025, vol. 28, no. 7, article number 112791. DOI: 10.1016/j.isci.2025.112791.</mixed-citation><mixed-citation xml:lang="ru">Scorza Azzará G., Zonca J., Rea F., Joo-Hyun Song, Sciutti A. Collaborating with a robot biases human spatial attention // iScience. 2025. Vol. 28. № 7. Article number 112791. DOI: 10.1016/j.isci.2025.112791.</mixed-citation></citation-alternatives></ref><ref id="B20"><label>20.</label><citation-alternatives><mixed-citation xml:lang="en">Pöysä-Tarhonen J., Shupin Li, Hautala J., Awwal N., Häkkinen P. Making the Invisible Visible: Exploring Joint Attention Behaviour in Remote Collaborative Problem‐Solving. Journal of Computer Assisted Learning, 2025, vol. 41, no. 4, article number e70068. DOI: 10.1111/jcal.70068.</mixed-citation><mixed-citation xml:lang="ru">Pöysä-Tarhonen J., Shupin Li, Hautala J., Awwal N., Häkkinen P. Making the Invisible Visible: Exploring Joint Attention Behaviour in Remote Collaborative Problem‐Solving // Journal of Computer Assisted Learning. 2025. Vol. 41. № 4. Article number e70068. DOI: 10.1111/jcal.70068.</mixed-citation></citation-alternatives></ref><ref id="B21"><label>21.</label><citation-alternatives><mixed-citation xml:lang="en">Ehrlich S.K., Dean-Leon E., Tacca N., Armleder S., Dimova-Edeleva V., Gordon Cheng. Human-robot collaborative task planning using anticipatory brain responses. PLoS ONE, 2023, vol. 18, no. 7, article number e0287958. DOI: 10.1371/journal.pone.0287958.</mixed-citation><mixed-citation xml:lang="ru">Ehrlich S.K., Dean-Leon E., Tacca N., Armleder S., Dimova-Edeleva V., Gordon Cheng. Human-robot collaborative task planning using anticipatory brain responses // PLoS ONE. 2023. Vol. 18. № 7. 
Article number e0287958. DOI: 10.1371/journal.pone.0287958.</mixed-citation></citation-alternatives></ref></ref-list></back></article>
