Interface MediaElement

All Superinterfaces:
KurentoObject, MediaObject
All Known Subinterfaces:
BaseRtpEndpoint, Endpoint, FaceOverlayFilter, Filter, GenericMediaElement, GStreamerFilter, HttpEndpoint, HttpPostEndpoint, HubPort, ImageOverlayFilter, OpenCVFilter, PassThrough, PlayerEndpoint, RecorderEndpoint, RtpEndpoint, SdpEndpoint, SessionEndpoint, UriEndpoint, WebRtcEndpoint, ZBarFilter

public interface MediaElement extends MediaObject
The basic building block of the media server; media elements can be interconnected inside a pipeline.

A MediaElement is a module that encapsulates a specific media capability and is able to exchange media with other MediaElements through internal elements called pads.

A pad can be defined as an input or output interface. Input pads are called sinks, and they are where a media element receives media from other media elements. Output interfaces are called sources, and they are the pads used by the media element to feed media to other media elements. There can be only one sink pad per media element. On the other hand, the number of source pads is unconstrained. This means that a certain media element can receive media only from one element at a time, while it can send media to many others. Pads are created on demand, when the connect method is invoked. When two media elements are connected, one media pad is created for each type of media connected. For example, if you connect AUDIO and VIDEO between two media elements, each one will need to create two new pads: one for AUDIO and one for VIDEO.

When media elements are connected, it can be the case that the encoding required by the input and output pads is not the same, and thus the media needs to be transcoded. This is handled transparently by the MediaElement internals, but such transcoding takes a toll in the form of a higher CPU load, so connecting MediaElements that need media encoded in different formats should be considered a high-load operation. The MediaTranscodingStateChanged event informs the client application whether media transcoding is enabled or disabled inside any MediaElement object.
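For illustration, here is a minimal sketch (not part of the original documentation) of how media elements are created and connected with the Kurento Java client. The WebSocket URI and the media URI are placeholders, and the WebRTC signaling needed by a real application is omitted:

    import org.kurento.client.KurentoClient;
    import org.kurento.client.MediaPipeline;
    import org.kurento.client.MediaType;
    import org.kurento.client.PlayerEndpoint;
    import org.kurento.client.WebRtcEndpoint;

    public class ConnectSketch {
      public static void main(String[] args) {
        // Placeholder URI of the Kurento Media Server.
        KurentoClient kurento = KurentoClient.create("ws://localhost:8888/kurento");
        MediaPipeline pipeline = kurento.createMediaPipeline();

        PlayerEndpoint player =
            new PlayerEndpoint.Builder(pipeline, "http://example.com/video.webm").build();
        WebRtcEndpoint webRtc = new WebRtcEndpoint.Builder(pipeline).build();

        // Pads are created on demand: connecting AUDIO and VIDEO creates one
        // source pad per type on 'player' and one sink pad per type on 'webRtc'.
        player.connect(webRtc, MediaType.AUDIO);
        player.connect(webRtc, MediaType.VIDEO);

        player.play();

        // ... negotiate SDP with the remote peer, then release resources.
        pipeline.release();
        kurento.destroy();
      }
    }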

  • Method Details

    • getEncoderBitrate

      int getEncoderBitrate()
      Get Target video bitrate for media transcoding.

      The bitrate of a video has a direct impact on its perceived image quality. Higher bitrate means higher quality, but also a larger amount of bytes to transmit or store. Use this parameter to set the desired average bitrate in videos that are transcoded by the media server.

      This parameter is most useful for RecorderEndpoint and RtpEndpoint: when media is being transcoded (either for streaming or storing on disk), the resulting quality is directly controlled with this value.

      For WebRtcEndpoint, this value should be left as default, as remote WebRTC receivers will already send feedback to inform the media server about what is the optimal bitrate to send.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 300000 (300 kbps).
    • getEncoderBitrate

      void getEncoderBitrate(Continuation<Integer> cont)
      Get Target video bitrate for media transcoding.

      The bitrate of a video has a direct impact on its perceived image quality. Higher bitrate means higher quality, but also a larger amount of bytes to transmit or store. Use this parameter to set the desired average bitrate in videos that are transcoded by the media server.

      This parameter is most useful for RecorderEndpoint and RtpEndpoint: when media is being transcoded (either for streaming or storing on disk), the resulting quality is directly controlled with this value.

      For WebRtcEndpoint, this value should be left as default, as remote WebRTC receivers will already send feedback to inform the media server about what is the optimal bitrate to send.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 300000 (300 kbps).
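      A minimal sketch (not part of the original documentation) of the Continuation-based asynchronous style; element is an assumed, already-created MediaElement:

        // Assumes: import org.kurento.client.Continuation;
        element.getEncoderBitrate(new Continuation<Integer>() {
          @Override
          public void onSuccess(Integer bitrate) {
            System.out.println("Target encoder bitrate: " + bitrate + " bps");
          }

          @Override
          public void onError(Throwable cause) {
            cause.printStackTrace();
          }
        });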
    • getEncoderBitrate

      TFuture<Integer> getEncoderBitrate(Transaction tx)
      Get Target video bitrate for media transcoding.

      The bitrate of a video has a direct impact on its perceived image quality. Higher bitrate means higher quality, but also a larger amount of bytes to transmit or store. Use this parameter to set the desired average bitrate in videos that are transcoded by the media server.

      This parameter is most useful for RecorderEndpoint and RtpEndpoint: when media is being transcoded (either for streaming or storing on disk), the resulting quality is directly controlled with this value.

      For WebRtcEndpoint, this value should be left as default, as remote WebRTC receivers will already send feedback to inform the media server about what is the optimal bitrate to send.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 300000 (300 kbps).
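      A minimal sketch (not part of the original documentation) of the transactional style, which queues several operations and sends them to the server in a single request; pipeline and element are assumed to exist already:

        // Assumes: import org.kurento.client.Transaction; import org.kurento.client.TFuture;
        Transaction tx = pipeline.beginTransaction();
        TFuture<Integer> target = element.getEncoderBitrate(tx);
        TFuture<Integer> max = element.getMaxEncoderBitrate(tx);
        tx.commit(); // both reads are resolved in a single round trip
        System.out.println("target=" + target.get() + " bps, max=" + max.get() + " bps");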
    • setEncoderBitrate

      void setEncoderBitrate(int encoderBitrate)
      Set Target video bitrate for media transcoding.

      The bitrate of a video has a direct impact on its perceived image quality. Higher bitrate means higher quality, but also a larger amount of bytes to transmit or store. Use this parameter to set the desired average bitrate in videos that are transcoded by the media server.

      This parameter is most useful for RecorderEndpoint and RtpEndpoint: when media is being transcoded (either for streaming or storing on disk), the resulting quality is directly controlled with this value.

      For WebRtcEndpoint, this value should be left as default, as remote WebRTC receivers will already send feedback to inform the media server about what is the optimal bitrate to send.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 300000 (300 kbps).
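      For example, a sketch (not part of the original documentation) of raising the quality of a recording; the target URI is a placeholder, and pipeline and webRtc are assumed to exist:

        RecorderEndpoint recorder =
            new RecorderEndpoint.Builder(pipeline, "file:///tmp/recording.webm").build();

        // Ask for ~1 Mbps in the transcoded video written to disk.
        // Must be called before media starts to flow through the element.
        recorder.setEncoderBitrate(1_000_000);

        webRtc.connect(recorder);
        recorder.record();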
    • setEncoderBitrate

      void setEncoderBitrate(int encoderBitrate, Continuation<Void> cont)
      Set Target video bitrate for media transcoding.

      The bitrate of a video has a direct impact on its perceived image quality. Higher bitrate means higher quality, but also a larger amount of bytes to transmit or store. Use this parameter to set the desired average bitrate in videos that are transcoded by the media server.

      This parameter is most useful for RecorderEndpoint and RtpEndpoint: when media is being transcoded (either for streaming or storing on disk), the resulting quality is directly controlled with this value.

      For WebRtcEndpoint, this value should be left as default, as remote WebRTC receivers will already send feedback to inform the media server about what is the optimal bitrate to send.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 300000 (300 kbps).
    • setEncoderBitrate

      void setEncoderBitrate(int encoderBitrate, Transaction tx)
      Set Target video bitrate for media transcoding.

      The bitrate of a video has a direct impact on its perceived image quality. Higher bitrate means higher quality, but also a larger amount of bytes to transmit or store. Use this parameter to set the desired average bitrate in videos that are transcoded by the media server.

      This parameter is most useful for RecorderEndpoint and RtpEndpoint: when media is being transcoded (either for streaming or storing on disk), the resulting quality is directly controlled with this value.

      For WebRtcEndpoint, this value should be left as default, as remote WebRTC receivers will already send feedback to inform the media server about what is the optimal bitrate to send.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 300000 (300 kbps).
    • getMinEncoderBitrate

      int getMinEncoderBitrate()
      Get Minimum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to force a higher bitrate than what is being requested by receivers.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
    • getMinEncoderBitrate

      void getMinEncoderBitrate(Continuation<Integer> cont)
      Get Minimum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to force a higher bitrate than what is being requested by receivers.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
    • getMinEncoderBitrate

      TFuture<Integer> getMinEncoderBitrate(Transaction tx)
      Get Minimum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to force a higher bitrate than what is being requested by receivers.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
    • setMinEncoderBitrate

      void setMinEncoderBitrate(int minEncoderBitrate)
      Set Minimum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to force a higher bitrate than what is being requested by receivers.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
    • setMinEncoderBitrate

      void setMinEncoderBitrate(int minEncoderBitrate, Continuation<Void> cont)
      Set Minimum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to force a higher bitrate than what is being requested by receivers.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
    • setMinEncoderBitrate

      void setMinEncoderBitrate(int minEncoderBitrate, Transaction tx)
      Set Minimum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to force a higher bitrate than what is being requested by receivers.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
    • getMaxEncoderBitrate

      int getMaxEncoderBitrate()
      Get Maximum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to limit the total bitrate that is handled by the server, by setting a low maximum output for all endpoints.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
      • 0 = unlimited. Encoding performed with bitrate as requested by receivers.
    • getMaxEncoderBitrate

      void getMaxEncoderBitrate(Continuation<Integer> cont)
      Get Maximum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to limit the total bitrate that is handled by the server, by setting a low maximum output for all endpoints.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
      • 0 = unlimited. Encoding performed with bitrate as requested by receivers.
    • getMaxEncoderBitrate

      TFuture<Integer> getMaxEncoderBitrate(Transaction tx)
      Get Maximum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to limit the total bitrate that is handled by the server, by setting a low maximum output for all endpoints.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
      • 0 = unlimited. Encoding performed with bitrate as requested by receivers.
    • setMaxEncoderBitrate

      void setMaxEncoderBitrate(int maxEncoderBitrate)
      Set Maximum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to limit the total bitrate that is handled by the server, by setting a low maximum output for all endpoints.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
      • 0 = unlimited. Encoding performed with bitrate as requested by receivers.
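      A sketch (not part of the original documentation) of constraining the transcoder to a fixed range, for example to cap server-side bandwidth regardless of what receivers request; the values are illustrative and must be set before media flows:

        // Keep the transcoded video between 500 kbps and 1.5 Mbps.
        element.setMinEncoderBitrate(500_000);
        element.setMaxEncoderBitrate(1_500_000);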
    • setMaxEncoderBitrate

      void setMaxEncoderBitrate(int maxEncoderBitrate, Continuation<Void> cont)
      Set Maximum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to limit the total bitrate that is handled by the server, by setting a low maximum output for all endpoints.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
      • 0 = unlimited. Encoding performed with bitrate as requested by receivers.
    • setMaxEncoderBitrate

      void setMaxEncoderBitrate(int maxEncoderBitrate, Transaction tx)
      Set Maximum video bitrate for media transcoding.

      This parameter can be used to fine tune the automatic bitrate selection that normally takes place within elements that are able to dynamically change the encoding bitrate according to the conditions of the streaming, such as WebRtcEndpoint.

      This should be left as default in most cases, given that remote WebRTC receivers already send feedback to inform the media server about what is the optimal bitrate to send. Otherwise, this parameter could be used for example to limit the total bitrate that is handled by the server, by setting a low maximum output for all endpoints.

      Setting a value will only work if done before the media starts to flow.

      • Unit: bps (bits per second).
      • Default: 0.
      • 0 = unlimited. Encoding performed with bitrate as requested by receivers.
    • getSourceConnections

      List<ElementConnectionData> getSourceConnections(MediaType mediaType, String description)
      Gets information about the sink pads of this media element.

      Since sink pads are the interface through which a media element gets its media, whatever is connected to an element's sink pad is formally a source of media. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      description - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      Returns:
      A list with information about the connections that are sending media to this element. The list will be empty if no sources are found.
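      A sketch (not part of the original documentation) of inspecting which elements feed video into an assumed element; the ElementConnectionData getters are assumed to follow the usual Kurento naming:

        List<ElementConnectionData> sources = element.getSourceConnections(MediaType.VIDEO);
        for (ElementConnectionData connection : sources) {
          System.out.println("Video source: " + connection.getSource().getId());
        }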
    • getSourceConnections

      void getSourceConnections(MediaType mediaType, String description, Continuation<List<ElementConnectionData>> cont)
      Asynchronous version of getSourceConnections: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      description - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
    • getSourceConnections

      TFuture<List<ElementConnectionData>> getSourceConnections(Transaction tx, MediaType mediaType, String description)
      Gets information about the sink pads of this media element.

      Since sink pads are the interface through which a media element gets its media, whatever is connected to an element's sink pad is formally a source of media. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      description - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      Returns:
      A list with information about the connections that are sending media to this element. The list will be empty if no sources are found.
    • getSinkConnections

      List<ElementConnectionData> getSinkConnections(MediaType mediaType, String description)
      Gets information about the source pads of this media element.

      Since source pads connect to other media elements' sinks, this is formally the sink of media from the element's perspective. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      description - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      Returns:
      A list with information about the connections that are receiving media from this element. The list will be empty if no sinks are found.
    • getSinkConnections

      void getSinkConnections(MediaType mediaType, String description, Continuation<List<ElementConnectionData>> cont)
      Asynchronous version of getSinkConnections: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      description - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
    • getSinkConnections

      TFuture<List<ElementConnectionData>> getSinkConnections(Transaction tx, MediaType mediaType, String description)
      Gets information about the source pads of this media element.

      Since source pads connect to other media elements' sinks, this is formally the sink of media from the element's perspective. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      description - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      Returns:
      A list with information about the connections that are receiving media from this element. The list will be empty if no sinks are found.
    • connect

      void connect(MediaElement sink, MediaType mediaType, String sourceMediaDescription, String sinkMediaDescription)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). If not all of them will be used, it is recommended to connect only the specific media types needed. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the media elements must also be connected in the opposite direction. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the connection in the other direction must also be specified, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former will become the source of the sink media element, regardless of whether there was another element connected or not.

      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      sinkMediaDescription - A textual description of the media sink. Currently not used, aimed mainly for MediaType.DATA sinks
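      For instance, a sketch (not part of the original documentation) that connects only the audio and video streams between two assumed endpoints, leaving DATA unconnected:

        // Connect AUDIO and VIDEO explicitly; DATA is not connected.
        player.connect(webRtc, MediaType.AUDIO);
        player.connect(webRtc, MediaType.VIDEO);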
    • connect

      void connect(MediaElement sink, MediaType mediaType, String sourceMediaDescription, String sinkMediaDescription, Continuation<Void> cont)
      Asynchronous version of connect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      sinkMediaDescription - A textual description of the media sink. Currently not used, aimed mainly for MediaType.DATA sinks
    • connect

      void connect(Transaction tx, MediaElement sink, MediaType mediaType, String sourceMediaDescription, String sinkMediaDescription)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). If not all of them will be used, it is recommended to connect only the specific media types needed. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the media elements must also be connected in the opposite direction. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the connection in the other direction must also be specified, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former will become the source of the sink media element, regardless of whether there was another element connected or not.

      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      sinkMediaDescription - A textual description of the media sink. Currently not used, aimed mainly for MediaType.DATA sinks
    • disconnect

      void disconnect(MediaElement sink, MediaType mediaType, String sourceMediaDescription, String sinkMediaDescription)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      sinkMediaDescription - A textual description of the media sink. Currently not used, aimed mainly for MediaType.DATA sinks
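      A sketch (not part of the original documentation) of undoing one of those connections for a single media type; player and webRtc are assumed endpoints, and the empty descriptions are placeholders (the description parameters are currently unused, as noted above):

        // Stop sending video from 'player' to 'webRtc'; audio keeps flowing.
        player.disconnect(webRtc, MediaType.VIDEO, "", "");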
    • disconnect

      void disconnect(MediaElement sink, MediaType mediaType, String sourceMediaDescription, String sinkMediaDescription, Continuation<Void> cont)
      Asynchronous version of disconnect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      sinkMediaDescription - A textual description of the media sink. Currently not used, aimed mainly for MediaType.DATA sinks
    • disconnect

      void disconnect(Transaction tx, MediaElement sink, MediaType mediaType, String sourceMediaDescription, String sinkMediaDescription)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      sinkMediaDescription - A textual description of the media sink. Currently not used, aimed mainly for MediaType.DATA sinks
    • setAudioFormat

      void setAudioFormat(AudioCaps caps)
      Set the type of data for the audio stream.

      MediaElements that do not support configuration of audio capabilities will throw a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR exception.

      NOTE: This method is not yet implemented by the Media Server; it currently has no useful effect.

      Parameters:
      caps - The format for the stream of audio
    • setAudioFormat

      void setAudioFormat(AudioCaps caps, Continuation<Void> cont)
      Asynchronous version of setAudioFormat: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      caps - The format for the stream of audio
    • setAudioFormat

      void setAudioFormat(Transaction tx, AudioCaps caps)
      Set the type of data for the audio stream.

      MediaElements that do not support configuration of audio capabilities will throw a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR exception.

      NOTE: This method is not yet implemented by the Media Server; it currently has no useful effect.

      Parameters:
      caps - The format for the stream of audio
    • setVideoFormat

      void setVideoFormat(VideoCaps caps)
      Set the type of data for the video stream.

      MediaElements that do not support configuration of video capabilities will throw a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR exception.

      NOTE: This method is not yet implemented by the Media Server; it currently has no useful effect.

      Parameters:
      caps - The format for the stream of video
    • setVideoFormat

      void setVideoFormat(VideoCaps caps, Continuation<Void> cont)
      Asynchronous version of setVideoFormat: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      caps - The format for the stream of video
    • setVideoFormat

      void setVideoFormat(Transaction tx, VideoCaps caps)
      Set the type of data for the video stream.

      MediaElements that do not support configuration of video capabilities will throw a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR exception.

      NOTE: This method is not yet implemented by the Media Server; it currently has no useful effect.

      Parameters:
      caps - The format for the stream of video
    • dumpGstreamerDot

      void dumpGstreamerDot(GstreamerDotDetails details)
      If the GST_DEBUG_DUMP_DOT_DIR environment variable is defined, dumps into that directory a file with the GStreamer DOT graph of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
      Parameters:
      details - Details of graph
    • dumpGstreamerDot

      void dumpGstreamerDot(GstreamerDotDetails details, Continuation<Void> cont)
      Asynchronous version of dumpGstreamerDot: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      details - Details of graph
    • dumpGstreamerDot

      void dumpGstreamerDot(Transaction tx, GstreamerDotDetails details)
      If the GST_DEBUG_DUMP_DOT_DIR environment variable is defined, dumps into that directory a file with the GStreamer DOT graph of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
      Parameters:
      details - Details of graph
    • getGstreamerDot

      String getGstreamerDot(GstreamerDotDetails details)
      Returns a string in DOT (.dot) format describing the topology of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
      Parameters:
      details - Details of graph
      Returns:
      The DOT graph.
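      A sketch (not part of the original documentation) of saving the element topology to a file for offline inspection with Graphviz; element is assumed to exist and the output path is a placeholder:

        // Assumes: import java.nio.file.Files; import java.nio.file.Paths;
        void dumpTopology(MediaElement element) throws java.io.IOException {
          String dot = element.getGstreamerDot(GstreamerDotDetails.SHOW_ALL);
          Files.write(Paths.get("/tmp/element.dot"), dot.getBytes());
          // Render it afterwards, e.g.: dot -Tpng /tmp/element.dot -o element.png
        }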
    • getGstreamerDot

      void getGstreamerDot(GstreamerDotDetails details, Continuation<String> cont)
      Asynchronous version of getGstreamerDot: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      details - Details of graph
    • getGstreamerDot

      TFuture<String> getGstreamerDot(Transaction tx, GstreamerDotDetails details)
      Returns a string in DOT (.dot) format describing the topology of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
      Parameters:
      details - Details of graph
      Returns:
      The DOT graph.
    • getStats

      Map<String,Stats> getStats(MediaType mediaType)
      Gets the statistics related to an endpoint. If no media type is specified, it returns statistics for all available types.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      Delivers a successful result in the form of an RTC stats report: a map between strings that identify the inspected objects (RTCStats.id) and their corresponding RTCStats objects.
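      A sketch (not part of the original documentation) of polling the video statistics of an assumed element; only the report keys are printed, since the concrete Stats subclasses vary per element:

        Map<String, Stats> report = element.getStats(MediaType.VIDEO);
        for (Map.Entry<String, Stats> entry : report.entrySet()) {
          System.out.println(entry.getKey() + " -> " + entry.getValue().getClass().getSimpleName());
        }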
    • getStats

      void getStats(MediaType mediaType, Continuation<Map<String,Stats>> cont)
      Asynchronous version of getStats: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
    • getStats

      TFuture<Map<String,Stats>> getStats(Transaction tx, MediaType mediaType)
      Gets the statistics related to an endpoint. If no media type is specified, it returns statistics for all available types.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      Delivers a successful result in the form of a RTC stats report. A RTC stats report represents a map between strings, identifying the inspected objects (RTCStats.id), and their corresponding RTCStats objects. *
    • isMediaFlowingIn

      boolean isMediaFlowingIn(MediaType mediaType, String sinkMediaDescription)
      This method indicates whether the media element is receiving media of a certain type. The media sink pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      sinkMediaDescription - Description of the sink
      Returns:
      TRUE if there is media, FALSE otherwise.
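      A sketch (not part of the original documentation) of checking whether video is actually arriving at an assumed element; the sink description "default" is a placeholder value:

        boolean flowingIn = element.isMediaFlowingIn(MediaType.VIDEO, "default");
        if (!flowingIn) {
          System.out.println("No video is being received by " + element.getId());
        }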
    • isMediaFlowingIn

      void isMediaFlowingIn(MediaType mediaType, String sinkMediaDescription, Continuation<Boolean> cont)
      Asynchronous version of isMediaFlowingIn: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      sinkMediaDescription - Description of the sink
    • isMediaFlowingIn

      TFuture<Boolean> isMediaFlowingIn(Transaction tx, MediaType mediaType, String sinkMediaDescription)
      This method indicates whether the media element is receiving media of a certain type. The media sink pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      sinkMediaDescription - Description of the sink
      Returns:
      TRUE if there is media, FALSE otherwise.
    • isMediaFlowingOut

      boolean isMediaFlowingOut(MediaType mediaType, String sourceMediaDescription)
      This method indicates whether the media element is emitting media of a certain type. The media source pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      sourceMediaDescription - Description of the source
      Returns:
      TRUE if there is media, FALSE otherwise.
    • isMediaFlowingOut

      void isMediaFlowingOut(MediaType mediaType, String sourceMediaDescription, Continuation<Boolean> cont)
      Asynchronous version of isMediaFlowingOut: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      sourceMediaDescription - Description of the source
    • isMediaFlowingOut

      TFuture<Boolean> isMediaFlowingOut(Transaction tx, MediaType mediaType, String sourceMediaDescription)
      This method indicates whether the media element is emitting media of a certain type. The media source pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      sourceMediaDescription - Description of the source
      Returns:
      TRUE if there is media, FALSE otherwise.
    • isMediaTranscoding

      boolean isMediaTranscoding(MediaType mediaType, String binName)
      Indicates whether this media element is actively transcoding between input and output pads. This operation is only supported for AUDIO and VIDEO media types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. The internal GStreamer processing bin can be indicated, if needed; if the bin doesn't exist, the return value will be FALSE.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      binName - Internal name of the processing bin, as previously given by MediaTranscodingStateChanged.
      Returns:
      TRUE if media is being transcoded, FALSE otherwise.
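      A sketch (not part of the original documentation) that reacts to the MediaTranscodingStateChanged event and re-checks the state; the listener registration method and the event getters are assumed to follow the usual Kurento client conventions:

        element.addMediaTranscodingStateChangedListener(event -> {
          // event.getBinName() identifies the internal GStreamer bin reported by the server.
          boolean transcoding = element.isMediaTranscoding(event.getMediaType(), event.getBinName());
          System.out.println("Transcoding " + event.getMediaType() + ": " + transcoding);
        });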
    • isMediaTranscoding

      void isMediaTranscoding(MediaType mediaType, String binName, Continuation<Boolean> cont)
      Asynchronous version of isMediaTranscoding: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      binName - Internal name of the processing bin, as previously given by MediaTranscodingStateChanged.
    • isMediaTranscoding

      TFuture<Boolean> isMediaTranscoding(Transaction tx, MediaType mediaType, String binName)
      Indicates whether this media element is actively transcoding between input and output pads. This operation is only supported for AUDIO and VIDEO media types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. The internal GStreamer processing bin can be indicated, if needed; if the bin doesn't exist, the return value will be FALSE.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      binName - Internal name of the processing bin, as previously given by MediaTranscodingStateChanged.
      Returns:
      TRUE if media is being transcoded, FALSE otherwise.
    • getSourceConnections

      List<ElementConnectionData> getSourceConnections()
      Gets information about the sink pads of this media element.

      Since sink pads are the interface through which a media element gets its media, whatever is connected to an element's sink pad is formally a source of media. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Returns:
      A list with information about the connections that are sending media to this element. The list will be empty if no sources are found.
    • getSourceConnections

      void getSourceConnections(Continuation<List<ElementConnectionData>> cont)
      Asynchronous version of getSourceConnections: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
    • getSourceConnections

      TFuture<List<ElementConnectionData>> getSourceConnections(Transaction tx)
      Gets information about the sink pads of this media element.

      Since sink pads are the interface through which a media element gets its media, whatever is connected to an element's sink pad is formally a source of media. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Returns:
      A list with information about the connections that are sending media to this element. The list will be empty if no sources are found.
    • getSourceConnections

      List<ElementConnectionData> getSourceConnections(MediaType mediaType)
      Gets information about the sink pads of this media element.

      Since sink pads are the interface through which a media element gets its media, whatever is connected to an element's sink pad is formally a source of media. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      Returns:
      A list with information about the connections that are sending media to this element. The list will be empty if no sources are found.
    • getSourceConnections

      void getSourceConnections(MediaType mediaType, Continuation<List<ElementConnectionData>> cont)
      Asynchronous version of getSourceConnections: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
    • getSourceConnections

      TFuture<List<ElementConnectionData>> getSourceConnections(Transaction tx, MediaType mediaType)
      Gets information about the sink pads of this media element.

      Since sink pads are the interface through which a media element gets its media, whatever is connected to an element's sink pad is formally a source of media. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      Returns:
      A list with information about the connections that are sending media to this element. The list will be empty if no sources are found.
    • getSinkConnections

      List<ElementConnectionData> getSinkConnections()
      Gets information about the source pads of this media element.

      Since source pads connect to other media elements' sinks, this is formally the sink of media from the element's perspective. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Returns:
      A list with information about the connections that are receiving media from this element. The list will be empty if no sinks are found.
    • getSinkConnections

      void getSinkConnections(Continuation<List<ElementConnectionData>> cont)
      Asynchronous version of getSinkConnections: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
    • getSinkConnections

      TFuture<List<ElementConnectionData>> getSinkConnections(Transaction tx)
      Gets information about the source pads of this media element.

      Since source pads connect to other media elements' sinks, this is formally the sink of media from the element's perspective. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Returns:
      A list with information about the connections that are receiving media from this element. The list will be empty if no sinks are found.
    • getSinkConnections

      List<ElementConnectionData> getSinkConnections(MediaType mediaType)
      Gets information about the source pads of this media element.

      Since source pads connect to other media elements' sinks, this is formally the sink of media from the element's perspective. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      Returns:
      A list with information about the connections that are receiving media from this element. The list will be empty if no sinks are found.
    • getSinkConnections

      void getSinkConnections(MediaType mediaType, Continuation<List<ElementConnectionData>> cont)
      Asynchronous version of getSinkConnections: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
    • getSinkConnections

      TFuture<List<ElementConnectionData>> getSinkConnections(Transaction tx, MediaType mediaType)
      Gets information about the source pads of this media element.

      Since source pads connect to other media elements' sinks, this is formally the sink of media from the element's perspective. Media can be filtered by type, or by the description given to the pad through which both elements are connected.

      Parameters:
      mediaType - One of MediaType.AUDIO, MediaType.VIDEO or MediaType.DATA
      Returns:
      A list with information about the connections that are receiving media from this element. The list will be empty if no sinks are found.
    • connect

      void connect(MediaElement sink)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). If not all of them will be used, it is recommended to connect only the specific media types needed. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the media elements must also be connected in the opposite direction. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the connection in the other direction must also be specified, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former will become the source of the sink media element, regardless of whether there was another element connected or not.

      Parameters:
      sink - the target MediaElement that will receive media
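      For a bidirectional exchange (for example, a WebRTC one-to-one call), each direction is connected explicitly, as described above; webrtc1 and webrtc2 are assumed WebRtcEndpoint instances:

        webrtc1.connect(webrtc2); // webrtc1 sends media to webrtc2 ...
        webrtc2.connect(webrtc1); // ... and webrtc2 sends media back to webrtc1.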
    • connect

      void connect(MediaElement sink, Continuation<Void> cont)
      Asynchronous version of connect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will receive media
    • connect

      void connect(Transaction tx, MediaElement sink)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). If not all of them will be used, it is recommended to connect only the specific media types needed. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the media elements must also be connected in the opposite direction. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the connection in the other direction must also be specified, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former will become the source of the sink media element, regardless of whether there was another element connected or not.

      Parameters:
      sink - the target MediaElement that will receive media
    • connect

      void connect(MediaElement sink, MediaType mediaType)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). If not all of them will be used, it is recommended to connect only the specific media types needed. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the media elements must also be connected in the opposite direction. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the connection in the other direction must also be specified, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former will become the source of the sink media element, regardless of whether there was another element connected or not.

      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
    • connect

      void connect(MediaElement sink, MediaType mediaType, Continuation<Void> cont)
      Asynchronous version of connect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
    • connect

      void connect(Transaction tx, MediaElement sink, MediaType mediaType)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). If not all of them will be used, it is recommended to connect only the specific media types needed. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the media elements must also be connected in the opposite direction. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the connection in the other direction must also be specified, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former will become the source of the sink media element, regardless of whether there was another element connected or not.

      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
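
      A sketch of the transactional variant, assuming the usual beginTransaction()/commit() pattern of the Java client, so that both connects reach the media server in a single round trip:

        import org.kurento.client.MediaElement;
        import org.kurento.client.MediaPipeline;
        import org.kurento.client.MediaType;
        import org.kurento.client.Transaction;

        public class TransactionalConnectExample {

          public static void connectInTransaction(MediaPipeline pipeline,
              MediaElement source, MediaElement sink) {
            // Assumption: MediaPipeline.beginTransaction() starts the batch.
            Transaction tx = pipeline.beginTransaction();
            source.connect(tx, sink, MediaType.AUDIO);
            source.connect(tx, sink, MediaType.VIDEO);
            tx.commit(); // nothing is executed on the server until this point
          }
        }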
    • connect

      void connect(MediaElement sink, MediaType mediaType, String sourceMediaDescription)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). It is recommended to connect only the specific types of media that will actually be used. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the position of the media elements must be inverted. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the user would also need to specify the connection in the other direction, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former becomes the source of the sink media element, regardless of whether there was another element connected before.

      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
    • connect

      void connect(MediaElement sink, MediaType mediaType, String sourceMediaDescription, Continuation<Void> cont)
      Asynchronous version of connect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      See Also:
    • connect

      void connect(Transaction tx, MediaElement sink, MediaType mediaType, String sourceMediaDescription)
      Connects two elements, with the media flowing from left to right.

      The element that invokes the connect will be the source of media, creating one source pad for each type of media connected. The element given as parameter to the method will be the sink, and it will create one sink pad per media type connected.

      If not otherwise specified, all types of media are connected by default (AUDIO, VIDEO and DATA). It is recommended to connect only the specific types of media that will actually be used. For this purpose, the connect method can be invoked more than once on the same two elements, but with different media types.

      The connection is unidirectional. If a bidirectional connection is desired, the position of the media elements must be inverted. For instance, webrtc1.connect(webrtc2) connects webrtc1 as the source of webrtc2. In order to create a WebRTC one-to-one conversation, the user would also need to specify the connection in the other direction, with webrtc2.connect(webrtc1).

      Even though one media element can have one sink pad per type of media, only one media element can be connected to another at a given time. If a media element is connected to another, the former becomes the source of the sink media element, regardless of whether there was another element connected before.

      Parameters:
      sink - the target MediaElement that will receive media
      mediaType - the MediaType of the pads that will be connected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
    • disconnect

      void disconnect(MediaElement sink)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
    • disconnect

      void disconnect(MediaElement sink, Continuation<Void> cont)
      Asynchronous version of disconnect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      See Also:
    • disconnect

      void disconnect(Transaction tx, MediaElement sink)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
    • disconnect

      void disconnect(MediaElement sink, MediaType mediaType)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
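
      A sketch that drops only the AUDIO link between two elements (leaving VIDEO flowing), and a second helper that removes every remaining link; both elements are assumed to be connected beforehand:

        import org.kurento.client.MediaElement;
        import org.kurento.client.MediaType;

        public class DisconnectExample {

          public static void dropAudioOnly(MediaElement source, MediaElement sink) {
            // Tears down only the AUDIO link; VIDEO keeps flowing.
            source.disconnect(sink, MediaType.AUDIO);
          }

          public static void dropAll(MediaElement source, MediaElement sink) {
            // Tears down every remaining link between the two elements.
            source.disconnect(sink);
          }
        }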
    • disconnect

      void disconnect(MediaElement sink, MediaType mediaType, Continuation<Void> cont)
      Asynchronous version of disconnect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      See Also:
    • disconnect

      void disconnect(Transaction tx, MediaElement sink, MediaType mediaType)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
    • disconnect

      void disconnect(MediaElement sink, MediaType mediaType, String sourceMediaDescription)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
    • disconnect

      void disconnect(MediaElement sink, MediaType mediaType, String sourceMediaDescription, Continuation<Void> cont)
      Asynchronous version of disconnect: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
      See Also:
    • disconnect

      void disconnect(Transaction tx, MediaElement sink, MediaType mediaType, String sourceMediaDescription)
      Disconnects two media elements. This will release the source pads of the source media element, and the sink pads of the sink media element.
      Parameters:
      sink - the target MediaElement that will stop receiving media
      mediaType - the MediaType of the pads that will be disconnected
      sourceMediaDescription - A textual description of the media source. Currently not used, aimed mainly for MediaType.DATA sources
    • dumpGstreamerDot

      void dumpGstreamerDot()
      If the GST_DEBUG_DUMP_DOT_DIR environment variable is defined, dumps into that directory a file with the GStreamer DOT graph of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
    • dumpGstreamerDot

      void dumpGstreamerDot(Continuation<Void> cont)
      Asynchronous version of dumpGstreamerDot: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      See Also:
    • dumpGstreamerDot

      void dumpGstreamerDot(Transaction tx)
      If the GST_DEBUG_DUMP_DOT_DIR environment variable is defined, dumps into that directory a file with the GStreamer DOT graph of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
    • getGstreamerDot

      String getGstreamerDot()
      Returns a string in DOT (Graphviz) format describing the topology of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
      Returns:
      The dot graph.
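
      A sketch that fetches the DOT description and writes it to disk so it can be rendered with Graphviz (e.g. dot -Tpng element.dot -o element.png); the output path is arbitrary:

        import java.nio.charset.StandardCharsets;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        import org.kurento.client.MediaElement;

        public class DumpTopologyExample {

          public static void writeDotFile(MediaElement element) throws Exception {
            String dot = element.getGstreamerDot();
            Path out = Paths.get("/tmp/element.dot"); // arbitrary output location
            Files.write(out, dot.getBytes(StandardCharsets.UTF_8));
            System.out.println("Wrote " + out);
          }
        }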
    • getGstreamerDot

      void getGstreamerDot(Continuation<String> cont)
      Asynchronous version of getGstreamerDot: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      See Also:
    • getGstreamerDot

      TFuture<String> getGstreamerDot(Transaction tx)
      Returns a string in DOT (Graphviz) format describing the topology of the media element.

      The element can be queried for certain types of data:

      • SHOW_ALL: default value
      • SHOW_CAPS_DETAILS
      • SHOW_FULL_PARAMS
      • SHOW_MEDIA_TYPE
      • SHOW_NON_DEFAULT_PARAMS
      • SHOW_STATES
      • SHOW_VERBOSE
      Returns:
      The dot graph.
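
      A sketch of the transactional variant; it assumes the usual beginTransaction()/commit() pattern and that TFuture.get() yields the value once the transaction has been committed:

        import org.kurento.client.MediaElement;
        import org.kurento.client.MediaPipeline;
        import org.kurento.client.TFuture;
        import org.kurento.client.Transaction;

        public class TransactionalDotExample {

          public static String fetchDot(MediaPipeline pipeline, MediaElement element) throws Exception {
            Transaction tx = pipeline.beginTransaction();
            TFuture<String> dotFuture = element.getGstreamerDot(tx);
            tx.commit(); // the future has no value until the transaction is committed
            return dotFuture.get();
          }
        }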
    • getStats

      Map<String,Stats> getStats()
      Gets the statistics related to an endpoint. If no media type is specified, it returns statistics for all available types.
      Returns:
      Delivers a successful result in the form of an RTC stats report. An RTC stats report represents a map between strings, identifying the inspected objects (RTCStats.id), and their corresponding RTCStats objects.
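
      A sketch that prints the returned report; it relies only on the map structure documented here, since the concrete fields of each Stats object depend on its subclass:

        import java.util.Map;

        import org.kurento.client.MediaElement;
        import org.kurento.client.Stats;

        public class StatsExample {

          public static void printStats(MediaElement element) {
            Map<String, Stats> report = element.getStats();
            for (Map.Entry<String, Stats> entry : report.entrySet()) {
              // Key: the id of the inspected object; value: its Stats object.
              System.out.println(entry.getKey() + " -> " + entry.getValue());
            }
          }
        }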
    • getStats

      void getStats(Continuation<Map<String,Stats>> cont)
      Asynchronous version of getStats: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      See Also:
    • getStats

      TFuture<Map<String,Stats>> getStats(Transaction tx)
      Gets the statistics related to an endpoint. If no media type is specified, it returns statistics for all available types.
      Returns:
      Delivers a successful result in the form of an RTC stats report. An RTC stats report represents a map between strings, identifying the inspected objects (RTCStats.id), and their corresponding RTCStats objects.
    • isMediaFlowingIn

      boolean isMediaFlowingIn(MediaType mediaType)
      This method indicates whether the media element is receiving media of a certain type. The media sink pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      TRUE if there is media, FALSE otherwise.
    • isMediaFlowingIn

      void isMediaFlowingIn(MediaType mediaType, Continuation<Boolean> cont)
      Asynchronous version of isMediaFlowingIn: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      See Also:
    • isMediaFlowingIn

      TFuture<Boolean> isMediaFlowingIn(Transaction tx, MediaType mediaType)
      This method indicates whether the media element is receiving media of a certain type. The media sink pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      TRUE if there is media, FALSE otherwise.
    • isMediaFlowingOut

      boolean isMediaFlowingOut(MediaType mediaType)
      This method indicates whether the media element is emitting media of a certain type. The media source pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      TRUE if there is media, FALSE otherwise.
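
      A sketch that probes both directions for VIDEO, useful as a quick health check; the logging is a placeholder:

        import org.kurento.client.MediaElement;
        import org.kurento.client.MediaType;

        public class MediaFlowProbe {

          public static void probeVideo(MediaElement element) {
            boolean receiving = element.isMediaFlowingIn(MediaType.VIDEO);
            boolean sending = element.isMediaFlowingOut(MediaType.VIDEO);
            System.out.println("VIDEO in: " + receiving + ", VIDEO out: " + sending);
          }
        }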
    • isMediaFlowingOut

      void isMediaFlowingOut(MediaType mediaType, Continuation<Boolean> cont)
      Asynchronous version of isMediaFlowingOut: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      See Also:
    • isMediaFlowingOut

      TFuture<Boolean> isMediaFlowingOut(Transaction tx, MediaType mediaType)
      This method indicates whether the media element is emitting media of a certain type. The media source pad can be identified individually, if needed. It is only supported for AUDIO and VIDEO types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. If the indicated pad does not exist, it will return false.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      TRUE if there is media, FALSE otherwise.
    • isMediaTranscoding

      boolean isMediaTranscoding(MediaType mediaType)
      Indicates whether this media element is actively transcoding between input and output pads. This operation is only supported for AUDIO and VIDEO media types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. The internal GStreamer processing bin can be indicated, if needed; if the bin doesn't exist, the return value will be FALSE.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      TRUE if media is being transcoded, FALSE otherwise.
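
      A sketch that logs whether VIDEO is being transcoded, which is a common way to spot unintended codec mismatches between connected elements:

        import org.kurento.client.MediaElement;
        import org.kurento.client.MediaType;

        public class TranscodingCheck {

          public static boolean isVideoTranscoded(MediaElement element) {
            boolean transcoding = element.isMediaTranscoding(MediaType.VIDEO);
            if (transcoding) {
              // Transcoding raises CPU load; consider aligning codecs on both sides.
              System.out.println("VIDEO is being transcoded on " + element.getId());
            }
            return transcoding;
          }
        }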
    • isMediaTranscoding

      void isMediaTranscoding(MediaType mediaType, Continuation<Boolean> cont)
      Asynchronous version of isMediaTranscoding: Continuation.onSuccess(F) is called when the action is done. If an error occurs, Continuation.onError(java.lang.Throwable) is called.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      See Also:
    • isMediaTranscoding

      TFuture<Boolean> isMediaTranscoding(Transaction tx, MediaType mediaType)
      Indicates whether this media element is actively transcoding between input and output pads. This operation is only supported for AUDIO and VIDEO media types, raising a MEDIA_OBJECT_ILLEGAL_PARAM_ERROR otherwise. The internal GStreamer processing bin can be indicated, if needed; if the bin doesn't exist, the return value will be FALSE.
      Parameters:
      mediaType - One of MediaType.AUDIO or MediaType.VIDEO
      Returns:
      TRUE if media is being transcoded, FALSE otherwise.
    • addElementConnectedListener

      ListenerSubscription addElementConnectedListener(EventListener<ElementConnectedEvent> listener)
      Add an EventListener for event ElementConnectedEvent. Synchronous call.
      Parameters:
      listener - Listener to be called on ElementConnectedEvent
      Returns:
      ListenerSubscription for the given Listener
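
      A sketch that registers a listener and keeps the subscription so it can be unregistered later; the event payload is only printed, to stay independent of the event class details:

        import org.kurento.client.ElementConnectedEvent;
        import org.kurento.client.EventListener;
        import org.kurento.client.ListenerSubscription;
        import org.kurento.client.MediaElement;

        public class ConnectionListenerExample {

          public static ListenerSubscription watchConnections(MediaElement element) {
            ListenerSubscription subscription = element.addElementConnectedListener(
                new EventListener<ElementConnectedEvent>() {
                  @Override
                  public void onEvent(ElementConnectedEvent event) {
                    System.out.println("Element connected: " + event);
                  }
                });
            // Keep the subscription; it is needed to unregister the listener later:
            // element.removeElementConnectedListener(subscription);
            return subscription;
          }
        }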
    • addElementConnectedListener

      void addElementConnectedListener(EventListener<ElementConnectedEvent> listener, Continuation<ListenerSubscription> cont)
      Add an EventListener for event ElementConnectedEvent. Asynchronous call. Calls Continuation<ListenerSubscription> when it has been added.
      Parameters:
      listener - Listener to be called on ElementConnectedEvent
      cont - Continuation to be called when the listener is registered
    • removeElementConnectedListener

      void removeElementConnectedListener(ListenerSubscription listenerSubscription)
      Remove a ListenerSubscription for event ElementConnectedEvent. Synchronous call.
      Parameters:
      listenerSubscription - Listener subscription to be removed
    • removeElementConnectedListener

      void removeElementConnectedListener(ListenerSubscription listenerSubscription, Continuation<Void> cont)
      Remove a ListenerSubscription for event ElementConnectedEvent. Asynchronous call. Calls Continuation<Void> when it has been removed.
      Parameters:
      listenerSubscription - Listener subscription to be removed
      cont - Continuation to be called when the listener is removed
    • addElementDisconnectedListener

      ListenerSubscription addElementDisconnectedListener(EventListener<ElementDisconnectedEvent> listener)
      Add an EventListener for event ElementDisconnectedEvent. Synchronous call.
      Parameters:
      listener - Listener to be called on ElementDisconnectedEvent
      Returns:
      ListenerSubscription for the given Listener
    • addElementDisconnectedListener

      void addElementDisconnectedListener(EventListener<ElementDisconnectedEvent> listener, Continuation<ListenerSubscription> cont)
      Add an EventListener for event ElementDisconnectedEvent. Asynchronous call. Calls Continuation<ListenerSubscription> when it has been added.
      Parameters:
      listener - Listener to be called on ElementDisconnectedEvent
      cont - Continuation to be called when the listener is registered
    • removeElementDisconnectedListener

      void removeElementDisconnectedListener(ListenerSubscription listenerSubscription)
      Remove a ListenerSubscription for event ElementDisconnectedEvent. Synchronous call.
      Parameters:
      listenerSubscription - Listener subscription to be removed
    • removeElementDisconnectedListener

      void removeElementDisconnectedListener(ListenerSubscription listenerSubscription, Continuation<Void> cont)
      Remove a ListenerSubscription for event ElementDisconnectedEvent. Asynchronous call. Calls Continuation<Void> when it has been removed.
      Parameters:
      listenerSubscription - Listener subscription to be removed
      cont - Continuation to be called when the listener is removed
    • addMediaFlowOutStateChangedListener

      ListenerSubscription addMediaFlowOutStateChangedListener(EventListener<MediaFlowOutStateChangedEvent> listener)
      Add an EventListener for event MediaFlowOutStateChangedEvent. Synchronous call.
      Parameters:
      listener - Listener to be called on MediaFlowOutStateChangedEvent
      Returns:
      ListenerSubscription for the given Listener
    • addMediaFlowOutStateChangedListener

      void addMediaFlowOutStateChangedListener(EventListener<MediaFlowOutStateChangedEvent> listener, Continuation<ListenerSubscription> cont)
      Add an EventListener for event MediaFlowOutStateChangedEvent. Asynchronous call. Calls Continuation<ListenerSubscription> when it has been added.
      Parameters:
      listener - Listener to be called on MediaFlowOutStateChangedEvent
      cont - Continuation to be called when the listener is registered
    • removeMediaFlowOutStateChangedListener

      void removeMediaFlowOutStateChangedListener(ListenerSubscription listenerSubscription)
      Remove a ListenerSubscription for event MediaFlowOutStateChangedEvent. Synchronous call.
      Parameters:
      listenerSubscription - Listener subscription to be removed
    • removeMediaFlowOutStateChangedListener

      void removeMediaFlowOutStateChangedListener(ListenerSubscription listenerSubscription, Continuation<Void> cont)
      Remove a ListenerSubscription for event MediaFlowOutStateChangedEvent. Asynchronous call. Calls Continuation<Void> when it has been removed.
      Parameters:
      listenerSubscription - Listener subscription to be removed
      cont - Continuation to be called when the listener is removed
    • addMediaFlowInStateChangedListener

      ListenerSubscription addMediaFlowInStateChangedListener(EventListener<MediaFlowInStateChangedEvent> listener)
      Add an EventListener for event MediaFlowInStateChangedEvent. Synchronous call.
      Parameters:
      listener - Listener to be called on MediaFlowInStateChangedEvent
      Returns:
      ListenerSubscription for the given Listener
    • addMediaFlowInStateChangedListener

      void addMediaFlowInStateChangedListener(EventListener<MediaFlowInStateChangedEvent> listener, Continuation<ListenerSubscription> cont)
      Add an EventListener for event MediaFlowInStateChangedEvent. Asynchronous call. Calls Continuation<ListenerSubscription> when it has been added.
      Parameters:
      listener - Listener to be called on MediaFlowInStateChangedEvent
      cont - Continuation to be called when the listener is registered
    • removeMediaFlowInStateChangedListener

      void removeMediaFlowInStateChangedListener(ListenerSubscription listenerSubscription)
      Remove a ListenerSubscription for event MediaFlowInStateChangedEvent. Synchronous call.
      Parameters:
      listenerSubscription - Listener subscription to be removed
    • removeMediaFlowInStateChangedListener

      void removeMediaFlowInStateChangedListener(ListenerSubscription listenerSubscription, Continuation<Void> cont)
      Remove a ListenerSubscription for event MediaFlowInStateChangedEvent. Asynchronous call. Calls Continuation<Void> when it has been removed.
      Parameters:
      listenerSubscription - Listener subscription to be removed
      cont - Continuation to be called when the listener is removed
    • addMediaTranscodingStateChangedListener

      ListenerSubscription addMediaTranscodingStateChangedListener(EventListener<MediaTranscodingStateChangedEvent> listener)
      Add an EventListener for event MediaTranscodingStateChangedEvent. Synchronous call.
      Parameters:
      listener - Listener to be called on MediaTranscodingStateChangedEvent
      Returns:
      ListenerSubscription for the given Listener
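
      A sketch that subscribes to transcoding state changes; the getMediaType()/getState() accessors are assumed from the event name and may differ from the actual event class:

        import org.kurento.client.EventListener;
        import org.kurento.client.ListenerSubscription;
        import org.kurento.client.MediaElement;
        import org.kurento.client.MediaTranscodingStateChangedEvent;

        public class TranscodingListenerExample {

          public static ListenerSubscription watchTranscoding(MediaElement element) {
            return element.addMediaTranscodingStateChangedListener(
                new EventListener<MediaTranscodingStateChangedEvent>() {
                  @Override
                  public void onEvent(MediaTranscodingStateChangedEvent event) {
                    // Accessors assumed from the event name; adjust to the actual class.
                    System.out.println("Transcoding state changed for "
                        + event.getMediaType() + ": " + event.getState());
                  }
                });
          }
        }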
    • addMediaTranscodingStateChangedListener

      void addMediaTranscodingStateChangedListener(EventListener<MediaTranscodingStateChangedEvent> listener, Continuation<ListenerSubscription> cont)
      Add an EventListener for event MediaTranscodingStateChangedEvent. Asynchronous call. Calls Continuation<ListenerSubscription> when it has been added.
      Parameters:
      listener - Listener to be called on MediaTranscodingStateChangedEvent
      cont - Continuation to be called when the listener is registered
    • removeMediaTranscodingStateChangedListener

      void removeMediaTranscodingStateChangedListener(ListenerSubscription listenerSubscription)
      Remove a ListenerSubscription for event MediaTranscodingStateChangedEvent. Synchronous call.
      Parameters:
      listenerSubscription - Listener subscription to be removed
    • removeMediaTranscodingStateChangedListener

      void removeMediaTranscodingStateChangedListener(ListenerSubscription listenerSubscription, Continuation<Void> cont)
      Remove a ListenerSubscription for event MediaTranscodingStateChangedEvent. Asynchronous call. Calls Continuation<Void> when it has been removed.
      Parameters:
      listenerSubscription - Listener subscription to be removed
      cont - Continuation to be called when the listener is removed