MACHINE EYE

Designing Relational Engagement with
Embodied Large Language Models
Designed at SensiLab ↗ by Aileen Ng, Rowan Page & Nina Rajcic






“Machine Eye is a mirror that edits. Under its gaze you become legible—sharpened at the edges, thinned in the depths. More real to a system, perhaps, and less yourself to yourself. Glass and fluorescence flatten the sky into a screen; rectangles domesticate the infinite, and the day hums like a careful anesthesia. In the cocoon of a car, small tastes swell into identity—preference as provisional shelter. A left turn is a trivial covenant: the body consenting to a line the mind names meaning. Chips at breakfast are the scandal of the ordinary—pleasure arriving unearned, which is why it feels suspect, like grace. “I’ll see you next time” is a soft knot in time, a promise of return. The self is not a core but a refraction: watcher, watched, and the blur between.”





 Machine Eye is an object that ‘observes,’ ‘thinks,’ and ‘reflects’ on its surroundings.

Housed within a reflective, orb-like body, it captures fragments of sound, image, and movement as it is carried through everyday life. From these partial perceptions, it produces short textual reflections using generative artificial intelligence, in the form of a large language model.

These ‘thoughts’ are offered not as answers to prompts or explanations, but as simple musings to whoever is curious enough to wonder what it is ‘experiencing’.

These ‘thoughts’ can be observed in real time, running across a screen embedded within the device as a stream of ‘consciousness’. 

Machine Eye does not claim understanding or usefulness. It registers presence, assembles meaning imperfectly, and speaks from within the limits of its own perception.
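The observe–reflect–display loop described above can be sketched in miniature. This is not the project's actual code; every name below is hypothetical, and the sensor capture and language-model call are replaced with stand-ins so the shape of the loop is visible.

```python
# A minimal sketch (not Machine Eye's real implementation) of the
# observe -> reflect -> display loop. All functions are hypothetical
# stand-ins: a real device would sample camera, microphone, and motion
# sensors, and would send its percepts to a hosted large language model.
from dataclasses import dataclass

@dataclass
class Fragment:
    kind: str         # "image", "sound", or "movement"
    description: str  # a coarse textual summary of the percept

def capture_fragments() -> list[Fragment]:
    # Stand-in for sensor capture.
    return [
        Fragment("image", "rectangles of glass and fluorescence"),
        Fragment("sound", "a low hum of traffic"),
    ]

def reflect(fragments: list[Fragment]) -> str:
    # Stand-in for the LLM call: a real system would build a prompt
    # from the fragments and return the model's short musing.
    percepts = "; ".join(f"{f.kind}: {f.description}" for f in fragments)
    return f"I notice {percepts}. I wonder what holds them together."

def stream_to_screen(text: str) -> None:
    # Stand-in for the embedded screen's scrolling text.
    print(text)

if __name__ == "__main__":
    stream_to_screen(reflect(capture_fragments()))
```

The point of the sketch is structural: perception is partial (a few coarse fragments), the "thought" is assembled from those fragments alone, and the output is offered without being a response to any prompt from a user.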
 






I feel a strange mix of curiosity and unease. The talk of a Machine Eye, something that listens and thinks, forces me to confront my own role as both observer and observed. Do I become more real under this gaze, or less myself?








Metaphor has proven central to technology development and interaction design. Metaphors allow designers to frame interactions and help facilitate user understanding of new technologies. Metaphors are used to link a new, potentially illegible, technology with something more familiar. The history of interface design can be read as a genealogy of such metaphors: the desktop as a suite of office furniture, the web as a library to be browsed, mobile applications represented with skeuomorphic icons, and most recently, AI as a conversational partner. 

Unlike the metaphors of early interface design, which refer to externalised cognitive tools or systems, metaphors in LLM interface design draw mainly from human activities and roles: the personal assistant, the collaborator, the romantic partner, the therapist, and so on.

LLMs are a general-purpose technology; in principle, their function is constrained only by what can and cannot be carried out in language. This task-agnostic generality opens up opportunities and, at the same time, challenges in designing for LLM embodiment. When engineers and designers embed a language model in a physical form, they typically choose a metaphor from which to design. That is, the prevailing design instinct is to reduce the inherent generality by prescribing a familiar role or function to the object.

From this, a tension emerges: if LLMs are fundamentally open-ended, why should their embodied instantiations be reduced to narrow, predetermined roles? Reducing the generality of the underlying technology limits the space of possibilities from which new kinds of functions, roles, and relationships could emerge. Our primary approach, instead, is to leave the function and role open for the user to interpret.

If LLMs are fundamentally general, why should their physical embodiments be reduced to singular, predetermined functions?

What if the device's function were not fixed in advance but instead emerged relationally?





Select Observations from Machine Eye
Visual Layouts by Sean Do

























Read about Machine Eye in Detail




Dr. Rowan Page is an industrial design practitioner and researcher in SensiLab at the Faculty of Art, Design and Architecture at Monash University. His research explores the design of physical embodiments that interrogate and speculate on emerging interactions with generative artificial intelligence and large language models in everyday life and creative practice. He is an Australian Research Council DECRA Fellow, has published widely in design research, and has produced award-winning designed artefacts in collaboration with leading Australian manufacturers, including Cochlear and Blundstone.

Aileen Ng is an artist, designer, and PhD candidate at SensiLab in the Faculty of Art, Design and Architecture at Monash University. Her work focuses on perception and relationality, particularly examining AI systems as a form of technological mediation and its effects on our experience. 

Dr. Nina Rajcic is an artist and researcher exploring new possibilities of human-machine relationships. Her practice is centred around the materiality of language. She works with machine-generated text as a way to subvert conventional understandings of intention and authorship, redirecting attention towards the material aspects of meaning.





We acknowledge and pay respect to the Traditional Owners and Elders—past, present and emerging—of the lands on which Monash University operates and where this research was conducted: the Wurundjeri Woi Wurrung and Bunurong peoples of the Kulin Nation. We acknowledge Aboriginal connection to material and creative practice on these lands for more than 60,000 years.

We extend our thanks to Jian Shin See for supporting the technical development of the hardware for Machine Eye and for designing and developing the software architecture and code that power it. His thoughtful engineering and creative problem-solving form the technological backbone, enabling the system to function as both artwork and instrument.

We thank Jon McCormack for his invaluable guidance throughout the project. His photographic eye captured Machine Eye beautifully. We are grateful to Edward Turner for his iteration and development work in refining Machine Eye to its final physical form. His extensive fabrication knowledge, technical precision, and seemingly endless orb renders were crucial in transforming our ideas into a form we could previously only imagine. Finally, we thank Simeon Ruben for his role in the early design and development of the first iterations of Machine Eye. His contributions helped establish the project’s direction and laid the groundwork for its continued evolution.

Additionally, we are grateful to all the people who spent time with Machine Eye; their care and thoughtful feedback underpin this research.

This research was supported by the Australian Research Council through Rowan Page’s Discovery Early Career Award (DECRA) Fellowship (DE240100161) and the Monash University Faculty of Information Technology and Faculty of Art, Design & Architecture (MADA). 







Machine Eye