
Artificial intelligence is one of the most disruptive innovations of our time, and its impact on education is immense: it can profoundly change the way we learn, teach, and interact with knowledge. It is not merely a matter of technology, but a genuine paradigm shift that can make learning more accessible, personalized, and engaging. Among the most promising applications are those aimed at students with special educational needs, although students without disabilities also benefit from intelligent tools that strengthen critical thinking, creativity, collaboration, and metacognitive skills.
A particularly interesting field for educating students in critical thinking, empathy, and ethical reflection is that of emotional robots: robots equipped with artificial intelligence that can recognize human emotions from facial expressions, tone of voice, and body language. One example is Pepper, developed by SoftBank, which is already used in educational settings to foster socio-affective interaction. Since 2021, various field experiences have demonstrated Pepper's effectiveness in interacting with autistic students, students with ADHD, and students with socio-affective and relational difficulties, improving eye contact, direct communication, and even physical contact. Robots of this kind not only support the emotional side of learning but also fit into computational thinking and educational robotics pathways that stimulate problem-solving, critical thinking, and self-regulation.
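To make this concrete, the sketch below is a minimal, purely illustrative example of how an emotion-recognition component might work under the hood: numeric scores extracted from a face and a voice are mapped to a label by fixed rules. The feature names and thresholds are invented for illustration and do not describe Pepper's actual software; the point is that the "recognition" is a computation over measurements, not an experienced feeling.

```python
from dataclasses import dataclass

@dataclass
class ObservedFeatures:
    """Hypothetical per-frame measurements an emotion module might receive."""
    smile_intensity: float   # 0.0 (none) to 1.0 (broad smile)
    brow_lowering: float     # 0.0 to 1.0, often associated with frustration
    voice_pitch_var: float   # variability of pitch, higher in agitated speech

def estimate_emotion(f: ObservedFeatures) -> str:
    """Map feature scores to a coarse emotion label with simple thresholds.

    Real systems use trained statistical models rather than hand-set rules,
    but the principle is the same: the output is derived from numbers.
    """
    if f.smile_intensity > 0.6:
        return "happy"
    if f.brow_lowering > 0.6 and f.voice_pitch_var > 0.5:
        return "frustrated"
    if f.voice_pitch_var > 0.7:
        return "agitated"
    return "neutral"

# Example: a frame with a clear smile is labelled "happy".
print(estimate_emotion(ObservedFeatures(0.8, 0.1, 0.2)))
```

Keeping this mechanical picture in mind is useful precisely because of the anthropomorphization risk discussed next: the label is a classification, not a feeling.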
However, the humanoid appearance and the ability to respond in an apparently emotional way risk encouraging anthropomorphization, the phenomenon in which users begin to attribute human characteristics to robots. This can be risky, as it generates unrealistic expectations, confusion among children and adolescents, and in some cases even emotional dependency. It is therefore essential to remember that, although they may appear “alive,” these robots run algorithms and operate according to mathematical models, devoid of authentic emotions.
On the methodological front, AI is also transforming teaching through tools such as adaptive learning, augmented reality, virtual reality, and experiential learning paths in which students play an active role. Immersive technologies allow abstract concepts to be explored in a visual and interactive way, improving understanding and motivation. Deep learning, understood here in its pedagogical sense rather than as the machine-learning technique, can also be promoted through AI: it enables students to go beyond surface-level content, internalizing what they learn and applying it in new contexts. This connects to metacognition, the ability to reflect on one's own mental, strategic, and decision-making processes during study. In Italy, studies led by Cesare Cornoldi have emphasized the importance of metacognitive awareness in developing autonomous and competent learners.
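As an illustration of the adaptive-learning idea mentioned above, the sketch below shows one simple way such a system might adjust exercise difficulty from a student's recent answers. The thresholds, window size, and class names are assumptions made for the example, not a description of any specific product.

```python
from collections import deque

class AdaptiveDifficulty:
    """Toy adaptive-learning loop: raise or lower difficulty from recent accuracy."""

    def __init__(self, window: int = 5):
        self.level = 1                      # current difficulty level (1 = easiest)
        self.recent = deque(maxlen=window)  # last `window` answers (True/False)

    def record_answer(self, correct: bool) -> int:
        """Record one answer and return the difficulty for the next exercise."""
        self.recent.append(correct)
        if len(self.recent) == self.recent.maxlen:
            accuracy = sum(self.recent) / len(self.recent)
            if accuracy >= 0.8:             # student is comfortable: step up
                self.level += 1
                self.recent.clear()
            elif accuracy <= 0.4:           # student is struggling: step down
                self.level = max(1, self.level - 1)
                self.recent.clear()
        return self.level

# Example: five correct answers in a row move the student up to level 2.
tutor = AdaptiveDifficulty()
for _ in range(5):
    next_level = tutor.record_answer(True)
print(next_level)  # -> 2
```

Real adaptive systems estimate mastery with statistical models (for example, item response theory or Bayesian knowledge tracing) rather than fixed thresholds, but the feedback loop is conceptually the same.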
Alongside all this potential, however, we must not forget the risks. The use of artificial intelligence in education raises important ethical and security questions: who programs these systems? What data is collected, and how is it used? Are the algorithms transparent, or can they contain implicit biases? Is cybersecurity ensured? Educators and developers must work together to ensure that these technologies are used responsibly, inclusively, and for the common good. Transparency, system robustness, and the prevention of algorithmic discrimination are essential aspects that must be monitored continuously. Only in this way can AI truly become a resource in the service of education, rather than a tool that creates new inequalities.
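One small, concrete way to begin monitoring algorithmic discrimination is to compare a model's outcomes across student groups. The sketch below computes a simple selection-rate gap (a demographic-parity style check) on hypothetical records; the data and group names are invented for illustration, and a real audit would use richer fairness metrics and proper statistical care.

```python
from collections import defaultdict

# Hypothetical records: (student_group, was_recommended_for_advanced_track)
records = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(rows):
    """Return the fraction of positive outcomes per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for group, selected in rows:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

rates = selection_rates(records)
gap = max(rates.values()) - min(rates.values())
print(rates)                 # e.g. {'group_a': 0.75, 'group_b': 0.25}
print(f"parity gap: {gap}")  # a large gap flags the system for human review
```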
In conclusion, artificial intelligence and educational robotics offer extraordinary opportunities to improve the learning experience for all students. Realizing those opportunities, however, requires awareness, teacher training, ethical governance, and thoughtful pedagogical planning. If we manage these tools with human intelligence, we can truly build a more inclusive, personalized, and innovative school, one capable of preparing young people to face future challenges with the right tools. Education is not just about delivering content but about shaping people, and in this AI can be a powerful ally, if used with care and humanity.
“A democracy cannot survive without citizens who are able to think for themselves, challenge authority, and imagine the lives of others.”
— Martha Nussbaum
Sources:
- European Commission (2023). Digital Education Action Plan.
- Reuters (2025). CES 2025: Emotional Robots by Mixi Inc.
- Hendrycks, D. (2023). Introduction to AI Safety, Ethics and Society.
- Cornoldi, C. (1995). Metacognizione e apprendimento. Bologna: Il Mulino.
- Cornoldi, C., De Beni, R., & Gruppo MT. (1993). Imparare a studiare. Trento: Edizioni Erickson.
- SoftBank Robotics (2021). Pepper Robot Applications in Education.