public static class PlayerEndpoint.Builder extends AbstractBuilder<PlayerEndpoint>
Fields inherited from class AbstractBuilder: genericProperties, props
| Constructor and Description |
|---|
| Builder(MediaPipeline mediaPipeline, String uri): Creates a Builder for PlayerEndpoint |
| Modifier and Type | Method and Description |
|---|---|
| PlayerEndpoint.Builder | useEncodedMedia(): Feed encoded media as-is to the Media Pipeline, instead of decoding it first. |
| PlayerEndpoint.Builder | with(String name, Object value) |
| PlayerEndpoint.Builder | withNetworkCache(int networkCache): Sets a value for networkCache in Builder for PlayerEndpoint. |
| PlayerEndpoint.Builder | withProperties(Properties properties) |
Methods inherited from class AbstractBuilder: build, build, buildAsync
public Builder(MediaPipeline mediaPipeline, String uri)
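As a point of reference, here is a minimal construction sketch. It assumes an existing MediaPipeline named pipeline and uses an illustrative file URI; neither is mandated by this constructor.

# Java
// Minimal sketch: build a PlayerEndpoint for an assumed media URI,
// then start playback once the endpoint is connected downstream.
PlayerEndpoint player = new PlayerEndpoint
    .Builder(pipeline, "file:///tmp/video.mp4")
    .build();
player.play();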
public PlayerEndpoint.Builder withProperties(Properties properties)
Overrides:
withProperties in class AbstractBuilder<PlayerEndpoint>
public PlayerEndpoint.Builder with(String name, Object value)
Overrides:
with in class AbstractBuilder<PlayerEndpoint>
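The generic with(name, value) helper inherited from AbstractBuilder attaches an arbitrary construction-time property to the endpoint. The sketch below only illustrates the call shape; the property key and value are hypothetical, not ones defined by PlayerEndpoint.

# Java
// Sketch only: "customTag" and "demo" are hypothetical key/value examples.
PlayerEndpoint player = new PlayerEndpoint
    .Builder(pipeline, "file:///tmp/video.mp4")
    .with("customTag", "demo")
    .build();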
public PlayerEndpoint.Builder useEncodedMedia()
This property is disabled by default. Normally, the input media is always decoded into a raw format as soon as it is received, before being processed by the rest of the Media Pipeline. This is done so that Kurento can keep track of lost keyframes, among other quality-control measures. Decoding the media has a cost in terms of CPU usage, but it ensures that the output stream is more robust and reliable.
When this property is enabled, Kurento simply passes the encoded media as-is to the rest of the Media Pipeline, without decoding it. Enabling this mode of operation could have a severe effect on stability, because lost video keyframes will not be regenerated; however, passing the media through without decoding it greatly reduces the CPU load.
Keep in mind that if this property is enabled, the original source media MUST already be in a format that is compatible with the destination target. For example: Given a Pipeline that reads a file and then streams it to a WebRTC browser such as Chrome, the file must already be encoded with a VP8 or H.264 codec profile, which Chrome is able to decode.
Of special note is that you cannot feed an arbitrary combination of H.264 encoding options to a web browser; browsers tend to support only a very specific subset of the codec features (also known as 'profiles'). The most broadly compatible configuration for H.264 is the Constrained Baseline profile, level 3.1.
Code examples:
# Java
PlayerEndpoint player = new PlayerEndpoint
    .Builder(pipeline, "rtsp://localhost:5000/video")
    .useEncodedMedia()
    .build();
# JavaScript
let player = await pipeline.create('PlayerEndpoint', {
uri: 'rtsp://localhost:5000/video',
useEncodedMedia: true,
});
public PlayerEndpoint.Builder withNetworkCache(int networkCache)
Parameters:
networkCache - RTSP buffer length.
When receiving media from an RTSP source, the streamed video can suffer spikes or stuttering, caused by hardware or network issues. Having a reception buffer helps alleviate these problems, because it smooths the stream of incoming data to the receiving endpoint.
Finding a buffer length that works best for your connection might take some tweaking, which can be done with this optional property. Note that a longer buffer will be able to fix bigger network spikes, but at the cost of introducing more latency to the media playback.
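As a sketch, the buffer length is set at construction time through the builder; the value below is illustrative only, and should be tuned for your own network conditions.

# Java
PlayerEndpoint player = new PlayerEndpoint
    .Builder(pipeline, "rtsp://localhost:5000/video")
    .withNetworkCache(2000) // illustrative buffer length, not a recommendation
    .build();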