Inheriting Ancient Dreams
I keep returning to how easily our technologies inherit humanity’s oldest dreams. These dreams grow within what scholars call our social and technological imaginaries: the shared mental worlds through which societies decide what creation means and what machines are for. These imaginaries are collective fictions that make some futures feel inevitable and others improbable.
As Nina Beguš notes in her forthcoming Artificial Humanities, our conversational machines descend from mythic and literary ancestors. They speak through centuries of metaphor, carrying humanity’s recurring wish to animate the inanimate.
One of the oldest versions of this dream is that of Pygmalion in Ovid’s Metamorphoses.
Pygmalion, a disenchanted sculptor, carves a statue of the perfect woman and falls in love with his creation. Moved by his devotion, Venus grants the statue life. The myth anticipates Generative AI by imagining that a form fashioned from stone might one day return our gaze.
That same logic reappears in George Bernard Shaw’s Pygmalion, where Professor Higgins trains Eliza Doolittle to speak like a duchess. His social experiment becomes an allegory of simulation. Like many creators, Higgins treats Eliza as an instrument of his will rather than an autonomous being. Yet when she asserts independence, the creator finds himself dependent on the created.
The pattern surfaces again in computing history through the ELIZA Effect. Joseph Weizenbaum's 1966 chatbot, named for Shaw's character, revealed how swiftly humans project understanding onto machines. People responded as if the program truly cared because that was the only conversational script they knew. Modern systems build on that same reflex.
The story echoes that of Narcissus and Echo in Book III of the Metamorphoses. There, the Roman poet Ovid (43 BCE–17 CE) recounts the tragic fate of young Narcissus, whose beauty and pride became a fatal prison when he caught sight of his own face in a reflecting pool:
“Here, the boy, tired by the heat and his enthusiasm for the chase, lies down, drawn to it by its look and by the fountain. While he desires to quench his thirst, a different thirst is created. While he drinks he is seized by the vision of his reflected form. He loves a bodiless dream…”
Generative AI acts similarly, reflecting us with dazzling precision while dissolving the boundary between self and simulation. Like Narcissus, we risk mistaking reflection for relation.
The mirror’s danger is seduction. We chase its perfection and forget the fragile, embodied strangeness that makes us human.
Generative AI becomes another Picture of Dorian Gray: a technology that safeguards our image while exposing the decay beneath.
From another tradition comes the Persian legend of Jamshid’s cup (جام جم), said to reveal every corner of the world to its possessor. Some versions link it to Darius I’s secret network of “eyes and ears.” The cup, like Narcissus’s pool or Pygmalion’s statue, promises knowledge and dominion yet ensnares those who gaze too long into its depths.
Mary Shelley’s Frankenstein translates this lineage into the scientific age. Her story replaces divine power with human intellect but keeps the same emotional blueprint. Victor Frankenstein longs to acquire knowledge and breathe life into lifeless matter, then recoils from what he has made.
We now inhabit the aftermath these stories foretold: ingenious in creation yet anxious in reflection.
Generative AI emerges from these deep cultural imaginaries. As Beguš observes, our tools extend the same stories that first taught us to imagine life emerging from lifelessness. The challenge for education is to make those imaginaries visible and then re‑imagine the conditions of creation.
As part of this endeavour, we should awaken students from passive consumption into awareness, inviting them to question the assumptions shaping their outputs and the perspectives that can transform them.