Updated MediaStream API Upstreamed

The MediaStream API, with its primary interfaces MediaStream and MediaStreamTrack, is the glue of the WebRTC APIs. You use it to direct real-time media from your camera and/or microphone to a media element for rendering, or to an RTCPeerConnection to be sent over the network. Several other Web Platform APIs use the MediaStream API to control real-time media: MediaStream Recording and MediaStream Image Capture let you capture video and still images, and the Web Audio API lets you process real-time audio. The MediaStream API is also a control surface where you can, for example, temporarily mute audio or make a video track render blackness.
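
To illustrate, here is a minimal sketch of that flow using the standard API shape: capture a stream, render it in a media element, hand its tracks to an RTCPeerConnection, and mute audio via the track's enabled flag. The element id "preview" and the pre-created peer connection are assumptions for the example, and error handling is omitted.

```ts
// Capture camera and microphone, render locally, and send over a peer connection.
async function startCapture(pc: RTCPeerConnection): Promise<MediaStream> {
  // Ask the browser for a stream with one audio and one video track.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true,
  });

  // Render the stream in a media element (assumed <video id="preview">).
  const preview = document.getElementById("preview") as HTMLVideoElement;
  preview.srcObject = stream;
  await preview.play();

  // Hand each track to the peer connection to be sent over the network.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }

  return stream;
}

// The tracks double as a control surface: a disabled audio track is muted,
// and a disabled video track renders as black frames.
function setAudioMuted(stream: MediaStream, muted: boolean): void {
  for (const track of stream.getAudioTracks()) {
    track.enabled = !muted;
  }
}
```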

Quite a lot has happened since the MediaStream API was first implemented in WebKit back in 2013. Some things have been removed, such as the dedicated types for audio and video tracks, and other things have been added and refined.

We recently took a big step forward in the WebRTCinWebKit project when we upstreamed our updated MediaStream API to WebKit. Some things still need more work, such as the Constraints mechanism and the functionality to enumerate local devices, but the foundation is now up to date and testable.
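
For reference, the two areas that still need work correspond to the following standard API surface, sketched here as the spec defines it rather than as any particular WebKit port implements it today. Picking the first listed camera and the 1280x720 target are illustrative choices, not requirements.

```ts
// Enumerate local devices and request a specific camera with constraints.
async function pickCameraWithConstraints(): Promise<MediaStream> {
  // List the capture devices the browser knows about.
  const devices = await navigator.mediaDevices.enumerateDevices();
  const camera = devices.find((d) => d.kind === "videoinput");

  // Request that camera at roughly 1280x720; "ideal" lets the implementation
  // fall back gracefully if the exact resolution is unavailable.
  return navigator.mediaDevices.getUserMedia({
    video: {
      deviceId: camera ? { exact: camera.deviceId } : undefined,
      width: { ideal: 1280 },
      height: { ideal: 720 },
    },
  });
}
```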

We will keep you updated on our future progress.
