Collaborate in VR to create musical creatures that transform environments and generate soundscapes with AI.
A project by Elisabeth Ermisch, Paul Lakos, Lili Weirich and Kaan Kocak,
6th semester, 2025
Supervised by Prof. Dr. Grimm and Prof. Philip Hausmeier
Augmented and Virtual Reality Design
The foundation of caVeRave lies in collaborative music creation, requiring deep research into rhythm, harmony, and user-friendly interaction. Initial studies drew inspiration from Animal Crossing’s gyroids—small creatures that sync to music—and tools like Chrome Music Lab to simplify music theory for players. The team dissected tracks into stems (drums, melody, bass, ambient) to map them to creature behaviors.
Key goals included ensuring harmonic compatibility between user-generated sounds and dynamic environmental feedback. Research focused on beat patterns, tempo synchronization, and visual music theory (e.g., color-coded rhythms). Challenges included balancing simplicity for casual users with depth for enthusiasts. This led to a system where assembling creature blocks directly influences stems, creating a cohesive symphony.
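The mapping from assembled creatures to stems can be sketched as a simple mixer: each creature type drives one stem, and the stem's volume scales with how many of that type are active. This is an illustrative sketch only; the names (`StemMixer`, `STEM_FOR_CREATURE`, the creature types) are assumptions, not identifiers from the actual Unity project.

```python
# Hypothetical sketch of the stem-mixing idea: each creature type maps to
# one stem, and the number of active creatures scales that stem's volume.
STEM_FOR_CREATURE = {
    "drummer": "drums",
    "singer": "melody",
    "stomper": "bass",
    "floater": "ambient",
}

class StemMixer:
    def __init__(self, max_per_stem=4):
        self.max_per_stem = max_per_stem
        self.counts = {stem: 0 for stem in STEM_FOR_CREATURE.values()}

    def add_creature(self, creature_type):
        self.counts[STEM_FOR_CREATURE[creature_type]] += 1

    def remove_creature(self, creature_type):
        stem = STEM_FOR_CREATURE[creature_type]
        self.counts[stem] = max(0, self.counts[stem] - 1)

    def volume(self, stem):
        # Linear ramp: 0 creatures -> silent, max_per_stem -> full volume.
        return min(1.0, self.counts[stem] / self.max_per_stem)

mixer = StemMixer()
mixer.add_creature("drummer")
mixer.add_creature("drummer")
print(mixer.volume("drums"))  # → 0.5
print(mixer.volume("bass"))   # → 0.0
```

Because every stem comes from the same track, any combination of active stems stays harmonically compatible; the mixer only decides how loud each layer is.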


The environment in caVeRave evolved from cold, static caves to dynamic ecosystems blending rave culture and natural mysticism. Early concepts focused on stalactite caves but shifted to include AI-generated 360° skyboxes and triplanar shaders for seamless texture blending. Research into rave culture’s inclusivity and underground aesthetics inspired neon-lit, bioluminescent elements contrasting with organic stone textures. The team tested procedural materials and particle systems (fog, fireflies) to enhance immersion.
Environmental states (Nature/Space) transition via creature interactions, altering textures, lighting, and ambient effects. Key challenges included optimizing performance for CAVE’s 8K screens while maintaining visual richness. The final design merges brutalism-inspired geometry with psychedelic accents, creating a responsive “living” cave that amplifies music through visual feedback.
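The triplanar shading mentioned above works by projecting the texture along all three axes and blending by the surface normal, which avoids visible seams on irregular cave geometry. A minimal sketch of the standard weight computation (the `sharpness` exponent is an assumed tuning parameter, not a value from the project's shader):

```python
def triplanar_weights(normal, sharpness=4.0):
    # Weight each projection axis by |normal component|^sharpness, then
    # normalize so the three weights sum to 1. Higher sharpness narrows
    # the blend region between axes.
    wx, wy, wz = (abs(c) ** sharpness for c in normal)
    total = wx + wy + wz
    return (wx / total, wy / total, wz / total)

# A surface facing straight up samples only the top (Y) projection:
print(triplanar_weights((0.0, 1.0, 0.0)))  # → (0.0, 1.0, 0.0)
```

In the actual shader the same weights blend three texture samples per fragment; the sketch just shows where the blend factors come from.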



Creatures began as abstract geometric shapes inspired by gyroids and Studio Ghibli’s whimsy. The team prioritized the “child schema” theory—round faces, wide eyes—to evoke empathy. Early sketches explored modular bodies (dodecahedrons, icosahedrons) with interchangeable parts. Three face types (round, square, triangular) were designed to reflect emotions, synced to animations triggered by music. Iterations addressed usability, reducing complexity from multi-socket connections to single-top attachments.
Creatures “come alive” via leg animations and NavMesh pathfinding, walking to activation zones to dance. Testing revealed the need for clear visual feedback (glowing sockets, haptic vibrations) to guide interactions. The final designs balance cuteness and functionality, acting as both musical instruments and environmental catalysts.
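The idle-walk-dance lifecycle can be sketched as a small state machine; in Unity the walking step would be handled by a `NavMeshAgent`, while here a one-dimensional position stands in for the path. All state and method names are assumptions for this sketch, not the project's actual scripts.

```python
# Illustrative state machine for a creature's lifecycle: idle until placed,
# walking toward its activation pad, then dancing once it arrives.
class Creature:
    def __init__(self, position, target):
        self.state = "idle"
        self.position = position
        self.target = target  # activation-zone pad

    def activate(self):
        # In Unity this would roughly be NavMeshAgent.SetDestination(target).
        self.state = "walking"

    def tick(self, step=1.0):
        if self.state == "walking":
            dx = self.target - self.position
            if abs(dx) <= step:
                self.position = self.target
                self.state = "dancing"  # arrived: start beat-synced animation
            else:
                self.position += step if dx > 0 else -step

c = Creature(position=0.0, target=3.0)
c.activate()
for _ in range(3):
    c.tick()
print(c.state)  # → dancing
```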

UI/UX focused on minimalism to preserve immersion. A translucent tutorial screen (glass-like with pictograms) introduces mechanics without overwhelming players. Multiplayer integration required synchronizing XR interactions across CAVE and HMD users, with avatars represented by abstract shapes (spheres/cubes). Challenges included GitHub collaboration (merge conflicts, URP transitions) and ensuring intuitive creature assembly. Haptics and sound cues (clicking sockets, creature activation) provided tactile feedback.
The CAVE player’s stationary view contrasts with HMD users’ mobility, balanced via spatial UI placement. Iterations simplified radial menus into context-sensitive interactions (e.g., grabbing AI frames). Post-processing (vignette, bloom) directed focus to central gameplay, while particle systems (mushroom spores) highlighted interactables.


The layout centers on collaborative play: CAVE and HMD players occupy triangular platforms around a central “activation zone.” Four color-coded mushroom pads act as creature destinations, each linked to a music stem (drums, melody, etc.). Creature blocks spawn on elevated podiums for easy access, with activation pads positioned for optimal visibility. The environment spirals outward with parallax depth (stalactites, floating rocks) to enhance scale.
Testing revealed depth-perception issues for CAVE players, leading to elevated platforms for better oversight. The map’s radial design ensures equal participation, while mushrooms bounce rhythmically to audio input. Creature paths follow predefined routes to avoid collisions, with particle trails guiding players.

Gameplay revolves around assembling creatures, assigning them to pads, and shaping the environment. Players grab blocks (trigger-based XR interaction), fuse them via sockets, and place them on pads to activate dancing creatures. Each creature type amplifies a stem (e.g., drums), with volume tied to creature count. Environmental shifts (Nature/Space) occur as stems dominate, altering skyboxes and textures.
Multiplayer dynamics encourage cooperation—shared blocks enable hybrid creatures. The AI Controller (NavMesh) handles creature pathfinding, while Music Bouncer scripts sync mushroom animations to beats. Testing refined the loop: disassembling creatures resets stems, allowing experimentation. Haptics, sound cues, and visual feedback (glowing trails) create a satisfying, rhythm-driven flow.

The AI Frame lets HMD players capture in-game moments and transform them via Stable Diffusion. Grabbing the frame triggers a screenshot sent to an AI API, which generates surreal, dream-like reinterpretations displayed on the frame. Challenges included optimizing render textures for VR performance and synchronizing outputs across multiplayer.
The feature blends creativity with unpredictability—users “deep dream” their creations, fostering emergent art. Though disabled for CAVE players due to performance limits, it adds a meta-layer that turns gameplay into collaborative art.
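The capture-and-reinterpret pipeline can be sketched as an image-to-image request: encode the screenshot, POST it with a prompt, and decode the generated image for display on the frame. The endpoint URL and payload fields below follow the common Stable Diffusion web UI img2img API shape, but they are assumptions for illustration; the project's actual API may differ.

```python
# Hedged sketch of the AI Frame pipeline: screenshot -> img2img API -> frame.
import base64
import json
from urllib import request

API_URL = "http://localhost:7860/sdapi/v1/img2img"  # hypothetical endpoint

def build_payload(screenshot_png: bytes, prompt: str) -> dict:
    return {
        "init_images": [base64.b64encode(screenshot_png).decode("ascii")],
        "prompt": prompt,
        "denoising_strength": 0.6,  # how far the "dream" drifts from the capture
    }

def deep_dream(screenshot_png: bytes, prompt: str) -> bytes:
    payload = build_payload(screenshot_png, prompt)
    req = request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        result = json.loads(resp.read())
    # The generated image comes back base64-encoded, like the input.
    return base64.b64decode(result["images"][0])
```

The `denoising_strength` knob is what makes the output a surreal reinterpretation rather than a copy: lower values stay close to the screenshot, higher values drift further into the prompt.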

The campaign targets music/VR enthusiasts with immersive demos and merch. Launch events feature glow sticks and darkened rooms to mimic rave vibes. Custom T-shirts printed with AI-generated screenshots personalize takeaways. Trailers juxtapose serene cave exploration with pulsating creature dances, set to original tracks. Social media teasers highlight creature customization and environmental shifts.
Partnerships with VR arcades and music festivals offer free trials, while a “caVeRave Creator Contest” encourages user-generated content (creature designs, AI art). Post-launch plans include DLC (new environments, instruments) and live DJ sets synced to in-game visuals, keeping the experience fresh long after release.

caVeRave reimagines collaborative music-making as a tactile, immersive journey. By blending VR interaction, AI, and dynamic environments, it transforms players into composers and explorers. The project’s success lies in balancing simplicity (intuitive creature assembly) with depth (layered stems, environmental storytelling).
Future expansions could add VR instruments, deeper AI integration, and user-generated content tools. Ultimately, caVeRave invites players to “feel” music through play, fostering connection in a digital age.

