Poster
in
Workshop: Generative AI and Creativity: A dialogue between machine learning researchers and creative professionals
Real-Time Neuro-Augmented Cinema via Generative AI
Philipp Thölke · Antoine Bellemare · Yann Harel · Karim Jerbi
In this paper, we present a novel system that integrates real-time neurofeedback into the creative process of generative AI, enabling seamless interactions between users and AI systems. By leveraging the user's cognitive variability, the system allows for continuous and fluid co-creation, moving beyond the traditional prompt-based interactions common in generative AI workflows. We achieve this using electroencephalography (EEG) to continuously monitor the user’s brain activity, which then acts as a control signal for a visual generative AI model. We focus specifically on Lempel-Ziv complexity, a measure of signal diversity that has previously been associated with mental states, task engagement, and phenomenological richness. The proposed architecture includes an EEG feature extractor and a generative AI pipeline, working in tandem to dynamically alter the visual content of a pre-existing movie based on the user’s brain activity. This approach offers a new dimension of complexity and complicity in the interaction between humans and AI. Future work will explore the integration of more sophisticated bio-signals and multi-modal feedback, aiming to further enhance the depth and richness of the embodied creative experience. This work serves as a proof of principle for integrating biotechnology and generative AI in the emerging field of adaptive cinema. A video illustration of the system in action can be found at https://www.youtube.com/playlist?list=PLMu36WzSQKiVeBnrUdwUAoUhqLqGX3_bw.
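To make the EEG feature concrete: Lempel-Ziv complexity is commonly estimated by binarizing a signal epoch around its median and counting the phrases of the Lempel-Ziv 1976 parsing (Kaspar-Schuster scheme), normalized so that white noise scores near 1. The sketch below is a minimal illustration of that standard recipe, not the authors' actual extraction pipeline; the function names and the median-binarization choice are assumptions.

```python
import numpy as np

def lz76_phrase_count(s: str) -> int:
    """Count phrases in the Lempel-Ziv 1976 parsing of a binary string
    (Kaspar & Schuster counting scheme)."""
    n = len(s)
    i, c, u, v, vmax = 0, 1, 1, 1, 1
    while u + v <= n:
        if s[i + v - 1] == s[u + v - 1]:
            # current candidate still matches an earlier substring
            v += 1
        else:
            vmax = max(v, vmax)
            i += 1
            if i == u:          # no earlier match anywhere: close the phrase
                c += 1
                u += vmax
                i, v, vmax = 0, 1, 1
            else:
                v = 1
    if v != 1:                  # unfinished phrase at the end of the string
        c += 1
    return c

def normalized_lzc(epoch: np.ndarray) -> float:
    """Binarize one EEG epoch at its median and return its LZ complexity,
    normalized by n / log2(n) so white noise scores near 1."""
    binary = ''.join('1' if x > np.median(epoch) else '0' for x in epoch)
    n = len(binary)
    return lz76_phrase_count(binary) * np.log2(n) / n
```

In a real-time loop, `normalized_lzc` would be applied to a sliding window of each EEG channel, and the resulting scalar (higher for more diverse signals, lower for regular ones) mapped onto a control parameter of the generative model.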