
streamlit-webrtc

Real-time video and audio streams over the network, with Streamlit.


Examples

⚡️Showcase including the following examples and more: 🎈Online demo

  • Object detection
  • OpenCV filter
  • Uni-directional video streaming
  • Audio processing

You can try out this sample app with the following commands in your local environment.

$ pip install streamlit-webrtc opencv-python-headless matplotlib pydub
$ streamlit run https://raw.githubusercontent.com/whitphx/streamlit-webrtc-example/main/app.py

⚡️Real-time Speech-to-Text: 🎈Online demo

Converting your voice into text in real time. This app is self-contained; it does not depend on any external API.

⚡️Real-time video style transfer: 🎈Online demo

It applies a wide variety of style transfer filters to real-time video streams.

⚡️Video chat

(Online demo not available)

You can create video chat apps with ~100 lines of Python code.

⚡️Tokyo 2020 Pictogram: 🎈Online demo

MediaPipe is used for pose estimation.

Install

$ pip install -U streamlit-webrtc

Quick tutorial

Create app.py with the content below.

from streamlit_webrtc import webrtc_streamer

webrtc_streamer(key="sample")

Unlike other Streamlit components, webrtc_streamer() requires the key argument as a unique identifier; set it to an arbitrary string.

Then run it with Streamlit and open http://localhost:8501/.

$ streamlit run app.py

You will see the app view; click the "START" button.

Then video and audio streaming starts. If asked for permission to access the camera and microphone, allow it.

(Screenshot: basic example of streamlit-webrtc)

Next, edit app.py as below and run it again.

from streamlit_webrtc import webrtc_streamer
import av


class VideoProcessor:
    def recv(self, frame):
        img = frame.to_ndarray(format="bgr24")

        flipped = img[::-1,:,:]

        return av.VideoFrame.from_ndarray(flipped, format="bgr24")


webrtc_streamer(key="example", video_processor_factory=VideoProcessor)

Now the video is vertically flipped.

(Screenshot: vertical-flip example)
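The flip inside `recv()` is ordinary NumPy slicing: `img[::-1, :, :]` reverses the first (row) axis of the array. A standalone check of that operation on a tiny array:

```python
import numpy as np

# A tiny 2x2 "image" with 3 colour channels, dtype matching what
# frame.to_ndarray(format="bgr24") would return.
img = np.array(
    [[[0, 0, 0], [1, 1, 1]],
     [[2, 2, 2], [3, 3, 3]]],
    dtype=np.uint8,
)

flipped = img[::-1, :, :]  # reverse the row axis -> vertical flip

# The bottom row is now on top.
assert (flipped[0] == img[1]).all()
assert (flipped[1] == img[0]).all()
```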

As shown in the example above, you can edit the video frames by defining a class with a callback method recv(self, frame) and passing it to the video_processor_factory argument. The callback receives and returns a frame, which is an instance of av.VideoFrame (or av.AudioFrame when dealing with audio) from the PyAV library.

You can inject any kind of image (or audio) processing inside the callback. See the examples above for more applications.
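Audio can be handled the same way: an av.AudioFrame also exposes to_ndarray(), so the callback body is plain array math. As an illustration, the core of a volume-halving filter on raw 16-bit samples, shown here on a bare NumPy array without the frame conversion:

```python
import numpy as np

# 16-bit PCM samples, like those frame.to_ndarray() yields for common formats.
samples = np.array([0, 1000, -1000, 32000], dtype=np.int16)

# Halve the volume; compute in a wider dtype to avoid overflow,
# then cast back to int16.
quieter = (samples.astype(np.int32) // 2).astype(np.int16)

assert quieter.tolist() == [0, 500, -500, 16000]
```

Inside `recv()`, you would wrap this between `frame.to_ndarray()` and `av.AudioFrame.from_ndarray()`.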

Note that there are some limitations in this callback. See the section below.

Limitations

The callback methods (VideoProcessor.recv() and similar ones) are executed in threads different from the main thread, so there are some limitations:

  • Streamlit methods (st.* such as st.write()) do not work inside the callbacks.
  • Variables outside the callbacks cannot be referred to from inside, and vice versa.
    • Even the global keyword does not work properly in the callbacks.
  • You have to take care of thread-safety when accessing the same objects from both outside and inside the callbacks.

A technique to pass values between inside and outside the callbacks

As stated above, you cannot directly pass variables between the inside and outside of the callbacks, and you have to consider thread-safety.

Usual cases are

  • to change some parameters used in the callback to new values passed from the main scope.
  • to refer to the results of some processing inside the callback from the main scope.

The solution is to use the properties of the processor object which is accessible via the context object returned from webrtc_streamer() as below.

import av
import streamlit as st
from streamlit_webrtc import webrtc_streamer


class VideoProcessor:
    def __init__(self):
        self.some_value = 0.5

    def recv(self, frame):
        img = frame.to_ndarray(format="bgr24")

        ...
        self.do_something(img, self.some_value)  # `some_value` is used here
        ...

        return av.VideoFrame.from_ndarray(img, format="bgr24")


ctx = webrtc_streamer(key="example", video_processor_factory=VideoProcessor)

if ctx.video_processor:
    ctx.video_processor.some_value = st.slider(...)  # `some_value` is set here

If the passed value is a complex object, you may also have to consider using something like threading.Lock or queue.Queue for thread-safety.
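Stripped of streamlit-webrtc, the pattern boils down to attribute access across threads guarded by a lock. A minimal self-contained sketch (the Processor class and names here are illustrative, not part of the library API):

```python
import threading


class Processor:
    def __init__(self):
        self._lock = threading.Lock()
        self.some_value = 0.5       # parameter set from the main thread
        self.latest_result = None   # result read from the main thread

    def recv(self, frame):
        # Runs in a worker thread, like streamlit-webrtc callbacks do.
        with self._lock:
            self.latest_result = frame * self.some_value


proc = Processor()

# Simulate the callback thread processing one "frame".
worker = threading.Thread(target=proc.recv, args=(10,))
worker.start()
worker.join()

# Back in the "main" thread, read the result under the same lock.
with proc._lock:
    assert proc.latest_result == 5.0
```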

The sample app, app.py, uses this technique in many places and can serve as a reference on this topic.

Serving from remote host

When deploying apps to remote servers, there are some things you need to be aware of.

HTTPS

streamlit-webrtc uses the getUserMedia() API to access local media devices, and this method does not work in an insecure context.

The documentation says:

A secure context is, in short, a page loaded using HTTPS or the file:/// URL scheme, or a page loaded from localhost.

So, when hosting your app on a remote server, it must be served via HTTPS if it uses the webcam or microphone.

Streamlit Cloud is a recommended way to do this. You can easily deploy Streamlit apps with it, and, most importantly for this topic, it serves the apps via HTTPS automatically by default.

Network connectivity

Video streaming does not work in some network environments. For example, on some office or public networks, firewalls drop the WebRTC packets.

In such environments, setting up a TURN server is a solution. See https://github.com/whitphx/streamlit-webrtc/issues/335#issuecomment-897326755.
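A sketch of how a TURN server could then be wired in: streamlit-webrtc accepts an ICE server configuration through the rtc_configuration argument of webrtc_streamer(). The server URLs and credentials below are placeholders, not working endpoints:

```python
# ICE server configuration passed to the WebRTC connection.
# The TURN URL, username, and credential are placeholders; substitute
# the details of your own TURN server.
RTC_CONFIGURATION = {
    "iceServers": [
        {"urls": ["stun:stun.l.google.com:19302"]},
        {
            "urls": ["turn:turn.example.com:3478"],
            "username": "user",
            "credential": "secret",
        },
    ]
}

# Usage inside a Streamlit app:
# webrtc_streamer(key="sample", rtc_configuration=RTC_CONFIGURATION)
```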

API

Currently there is no documentation for the interface. See the example app.py for usage. The API is not finalized yet and may change without backward compatibility in future releases until v1.0.

For users upgrading from versions <0.20

VideoTransformerBase and its transform method have been marked as deprecated in v0.20.0. Please use VideoProcessorBase#recv() instead. Note that the signature of recv() differs from transform() in that recv() has to return an instance of av.VideoFrame or av.AudioFrame. See the samples in app.py.
