Press Release

Daydream Launches Scope and Expands StreamDiffusion with SDXL Support, Advancing the Open-Source Real-Time AI Video Ecosystem

The launch of Scope and support for SDXL extends Daydream’s open infrastructure, linking creators, developers, and researchers building and experimenting with real-time video generation.


NEW YORK–(BUSINESS WIRE)–Daydream, the community hub for open-source real-time AI video and world models, announced two major milestones today: the release of Daydream Scope, an open-source development environment for real-time AI workflows, and support for StreamDiffusion SDXL in the Daydream API and Playground web experience, bringing high-fidelity, low-latency video generation capabilities to creators and developers everywhere. Together, these releases mark Daydream’s evolution into the hub of the real-time video generation stack, connecting models, creators, and infrastructure in one open ecosystem, and bringing coherence to a landscape fragmented across tools, models, and communities.

Scope: A New Era of Real-Time AI Video Development

Daydream Scope is an open-source toolkit that allows developers to build, test, and visualize real-time video and world-model workflows locally. It provides modular interfaces, enabling seamless integration of models for inference, control, and remixing in real time.

“Scope represents a foundational layer for the next generation of world models,” said Eric Tang, Co-Founder of Livepeer, Inc., the parent company behind Daydream. “It gives creative technologists and builders an extensible workspace to experiment with real-time AI pipelines, whether for generative video, virtual production, or academic research.”

Scope is currently in community alpha. It already supports LongLive, StreamDiffusionV2, and Krea Realtime 14B, with new models added weekly.

SDXL: A Major Leap Forward in Real-Time Quality and Control

The SDXL release builds upon StreamDiffusion’s open architecture, allowing Daydream to merge multiple research tracks into one cohesive, production-ready stack. Beyond today’s SDXL release, key components of the Daydream platform include:

  • Image-Based Style Control (IPAdapters): Enables dynamic, image-driven style transfer with two primary modes:

    • IPAdapter Standard for artistic style control.
    • IPAdapter FaceID for consistent character rendering across frames.
  • Multi-ControlNet Support: Accelerated HED, Depth, Pose, Tile, and Canny ControlNets provide unprecedented spatial and temporal precision, allowing users to fine-tune multiple parameters in real time.
  • TensorRT Acceleration: Optimized NVIDIA inference ensures smooth playback and consistent performance at 15-25 FPS, even with complex model configurations.
  • SD1.5 Support: For creators who prefer SD1.5, Daydream has paired that model with accelerated IPAdapters to deliver high-framerate style transfer and enhanced usability.

Creative technologists are already incorporating Daydream’s SDXL release into their applications, among them DotSimulate, creator of StreamDiffusionTD, a popular TouchDesigner component. Other developers are building SDXL-based tools either by using the Daydream API or by self-hosting Daydream’s open-source StreamDiffusion fork.

About Daydream

Daydream, a product of Livepeer, Inc., is a community hub for open-source real-time AI video and world model technology. Daydream provides the infrastructure, research, and tools for developers, researchers, and creative technologists to build, deploy, and share next-generation interactive AI systems. Learn more at https://daydream.live.

Contacts

Eric Tang

Co-Founder, Daydream (a Livepeer product)

[email protected]
