Open source tools for
real-time AI creation
Real-time AI generation across video, audio, and beyond — on your machine or in the cloud.
Built for live creation
Daydream Scope is a local-first, open source engine for real-time AI generation. Compose workflows, dial in styles, and plug into your existing creative setup.
Real-Time Video Compositing
Edit and transform live video with inpainting, outpainting, and more — powered by VACE. Layer effects, swap styles, and composite in real time.
Composable Workflows
Build your creative pipeline by connecting real-time nodes — spanning generative video, audio, and computer vision.
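The "connect real-time nodes" idea can be sketched abstractly. This is not Scope's actual API — the node names (`webcam`, `diffusion_stylize`, `audio_reactive_mix`) and the `Node`/`connect` shapes below are illustrative only, a minimal model of wiring a pipeline together:

```python
from dataclasses import dataclass, field

# Purely illustrative node graph — not Scope's real node editor or
# node names; it just models the "connect nodes into a pipeline" idea.
@dataclass
class Node:
    name: str
    inputs: list["Node"] = field(default_factory=list)

    def connect(self, upstream: "Node") -> "Node":
        """Wire an upstream node into this one and return self for chaining."""
        self.inputs.append(upstream)
        return self

webcam = Node("webcam")
audio = Node("audio_in")
stylize = Node("diffusion_stylize").connect(webcam)
reactive = Node("audio_reactive_mix").connect(stylize).connect(audio)

def upstream_names(node: Node) -> list[str]:
    """Depth-first list of every node feeding into `node`."""
    seen = []
    for parent in node.inputs:
        seen += upstream_names(parent) + [parent.name]
    return seen
```

Here `upstream_names(reactive)` walks the graph and reports the full chain of sources feeding the final node — the same traversal a real engine performs to decide evaluation order.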
LoRA Style Control
Apply and blend LoRAs for fine-grained style control. Dial in the exact aesthetic — from subtle adjustments to dramatic transformations.
Integrations for Creative Professionals
Native support for OSC, MIDI, Syphon, Spout, and NDI — works seamlessly with the tools you already use.
Your tools, connected
Scope sits between your creative tools and makes them generative. Route audio to drive real-time visuals, share frames between apps, and control everything over the protocols you already use.
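Because OSC is a plain UDP wire format, any environment that can emit packets can drive parameters this way. The sketch below hand-encodes an OSC 1.0 message using only the Python standard library; the `/scope/params/strength` address and port 9000 are hypothetical placeholders for illustration, not Scope's documented address space:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for value in floats:
        msg += struct.pack(">f", value)  # big-endian float32
    return msg

# Hypothetical example: push a normalized intensity value to a listener.
packet = osc_message("/scope/params/strength", 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

In practice you would point the same few lines at whatever OSC port your target app exposes; this is exactly how a MIDI controller bridge or an audio-analysis script can steer visual parameters in real time.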
Frequently asked questions
What is Daydream Scope?
Daydream Scope is a free, open-source desktop application for real-time AI creation. It transforms live inputs — webcams, screen captures, text, audio, video files, and more — into AI-generated output using state-of-the-art models. Daydream Scope is designed for maximum control and deep customizability, and it integrates with existing creative processes through OSC, MIDI, DMX, and integrations with a variety of tools. Run it locally on your machine or connect to Daydream's cloud GPUs.
What inputs does Scope accept?
Scope accepts a wide range of live inputs: webcams, screen captures, video files, audio (for reactive visuals driven by sound), images, Syphon/Spout feeds from other apps, and NDI streams. You can combine multiple inputs in a single workflow using the node editor.
Which models does Scope support?
Scope supports many real-time diffusion models and ships new ones regularly. The model library includes workflows and nodes optimized for a wide variety of use cases, from style transfer to computer vision. New models from the community and research labs are integrated on an ongoing basis through Scope's node system.
Which creative tools does Scope integrate with?
Scope integrates with professional creative tools through standard protocols: OSC, MIDI, DMX, Syphon (macOS), Spout (Windows), and NDI. This means native interop with TouchDesigner, OBS Studio, Resolume, Ableton Live, VDMX, MadMapper, lighting consoles, and more.
What hardware do I need to run Scope locally?
For local inference, you need an NVIDIA GPU with CUDA support. An RTX 4090 (24GB VRAM) or better is recommended for the best experience, though some models run on lower-spec cards. If you don't have a supported GPU, Daydream's cloud inference lets you run everything remotely.
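A quick local check of the VRAM guideline above can be scripted against `nvidia-smi`. The query flags shown are standard `nvidia-smi` options; the threshold constant and helper names are this sketch's own, and the threshold sits slightly under 24 GiB because drivers report a little less than the nominal capacity:

```python
import subprocess

# Rough guideline from the docs above; drivers report slightly
# under the nominal 24 GiB, so we check against 24,000 MiB.
MIN_VRAM_MB = 24_000

def parse_gpu_line(line: str) -> tuple[str, int]:
    """Parse one 'name, memory' CSV line from nvidia-smi into (name, MiB)."""
    name, mem = line.rsplit(",", 1)
    return name.strip(), int(mem.strip())

def local_gpus() -> list[tuple[str, int]]:
    """Query nvidia-smi for installed GPUs and their total VRAM in MiB."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [parse_gpu_line(l) for l in out.splitlines() if l.strip()]

# Example usage (requires an NVIDIA driver):
# for name, mem in local_gpus():
#     print(name, "OK" if mem >= MIN_VRAM_MB else "below recommended VRAM")
```

On machines without an NVIDIA driver the `nvidia-smi` call will fail, which is itself a useful signal that cloud inference is the better option.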
Is Scope free to use?
Yes. Scope is open-source software — download it and run it locally at no cost. Cloud inference is optional. As of April 2026, cloud inference is free for a limited time; in the near future, it will require a subscription.
Which platforms does Scope run on?
Daydream Scope is available for Windows 10+ and macOS (Apple Silicon). Both the desktop installer and source code are available on GitHub.
Does Scope require an internet connection?
No. Scope runs entirely locally on your machine. An internet connection is only needed to use cloud inference, browse and install community workflows, or download updates.
How is Scope different from other AI video tools?
Scope is built for real-time, continuous video transformation — not asynchronous generation. It processes live feeds at interactive framerates, gives you a node-based workflow editor for building custom pipelines, accepts audio and other live inputs for reactive visuals, integrates with professional creative tools via standard protocols, and is fully open source with local-first execution. It's designed to help creatives — VJs, installation artists, projection mappers, and live performers — build immersive experiences.
Find your people.
Get inspired.
Real-time AI is a new medium. The Daydream community is where creators connect, collaborate, and explore the frontiers of creativity.