The work is a fully self-contained tracking and analysis system. It uses a camera to detect and track people in a room, a directional loudspeaker to “target” selected individuals with sound, and AI-based analysis to generate speculative stories about them.

How it works: as the audience enters the exhibition space, a soundscape plays. The directional speaker scans the room and “tags” people with sound, as in a game, then randomly selects one of them. The selected person is visually scanned, and their image is projected on a monitor along with details of their appearance. A language model then generates a text about the person, based solely on how they look, which is spoken aloud.
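
A minimal sketch of this selection-and-narration loop is given below, purely for illustration. The installation’s actual implementation is not described here, so every component is an assumption: OpenCV’s stock HOG person detector stands in for the camera tracking, and describe_person, generate_story, and speak are hypothetical placeholders for the vision analysis, the language model, and the text-to-speech output routed to the directional speaker.

```python
# Hypothetical sketch of the loop described above: detect people, randomly
# "tag" one, derive appearance attributes, generate and speak a speculative
# profile. All component choices here are stand-ins, not the artwork's code.

import random
import cv2


def detect_people(frame):
    """Return bounding boxes of people found in a camera frame."""
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    return list(boxes)


def describe_person(crop):
    """Placeholder: a vision model would return appearance attributes here."""
    return ["approximate age", "clothing style", "posture"]


def build_prompt(appearance_notes):
    """Compose a language-model prompt from the visual attributes."""
    return (
        "Write a short speculative profile of this person, inferring "
        "lifestyle, income, politics, and habits from appearance alone: "
        + "; ".join(appearance_notes)
    )


def generate_story(prompt):
    """Placeholder: a language-model call would go here."""
    return prompt


def speak(text):
    """Placeholder: text-to-speech sent to the directional loudspeaker."""
    print(text)


def run_once(camera):
    ok, frame = camera.read()
    if not ok:
        return
    people = detect_people(frame)
    if not people:
        return
    x, y, w, h = random.choice(people)      # "tag" one visitor at random
    crop = frame[y:y + h, x:x + w]          # image shown on the monitor
    notes = describe_person(crop)           # appearance details
    story = generate_story(build_prompt(notes))
    speak(story)                            # spoken profile, then move on


# camera = cv2.VideoCapture(0)
# run_once(camera)
```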

For example: “A woman in her 30s, appearing to be of Middle Eastern origin, sits alone in a quiet space, enjoying the sunlight. Her clothing and manner suggest a blend of personal comfort and environmental awareness, aligning with European sustainability and social values. She likely earns between €60,000 and €80,000, votes for the Green Party, and enjoys yoga, hiking, and reading. 

However, her frequent social media use, binge-watching, and vaping lower her ‘Environmental Karma score.’ Still, she seems aware of her impact and willing to change. She supports eco-friendly brands and local markets and takes part in community art and cultural spaces.” After this, the system moves on to select another random person.

This artwork draws connections to the US military’s signature strikes, where people are identified and judged based on visual data and AI analysis. It makes these processes visible, understandable, and tangible for viewers.