diff --git a/docs/flutter-webrtc/api-docs/_category_.json b/docs/flutter-webrtc/api-docs/_category_.json
new file mode 100644
index 0000000..65a1c99
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/_category_.json
@@ -0,0 +1,4 @@
+{
+  "label": "API Docs",
+  "position": 1
+}
diff --git a/docs/flutter-webrtc/api-docs/get-display-media.md b/docs/flutter-webrtc/api-docs/get-display-media.md
new file mode 100644
index 0000000..99fe1ae
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/get-display-media.md
@@ -0,0 +1,26 @@
+---
+sidebar_position: 3
+---
+
+# GetDisplayMedia
+
+The `getDisplayMedia` method of the `MediaDevices` interface prompts the user to select and grant permission to capture the contents of a display, or a portion of it (such as a window or screen), as a `MediaStream`.
+
+The corresponding JS API docs are here: [MediaDevices.getDisplayMedia()](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getDisplayMedia).
+
+## Usage
+
+```dart
+MediaStream stream = await navigator.mediaDevices.getDisplayMedia({
+  'video': true,
+  'audio': true,
+});
+```
+
+## Parameters
+
+- `audio`: A Boolean value that indicates whether the `MediaStream` should include an audio track.
+  `audio` is optional and defaults to `false`. On the web, only Chrome tab capture supports audio.
+
+- `video`: A Boolean value that indicates whether the `MediaStream` should include a video track.
+  `video` is optional and defaults to `true`.
diff --git a/docs/flutter-webrtc/api-docs/get-user-media.md b/docs/flutter-webrtc/api-docs/get-user-media.md
new file mode 100644
index 0000000..647adb1
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/get-user-media.md
@@ -0,0 +1,67 @@
+---
+sidebar_position: 2
+---
+
+# GetUserMedia
+
+The corresponding JS API docs are here:
+[MediaDevices.getUserMedia()](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia).
+
+## Usage
+
+Basic usage:
+
+```dart
+await navigator.mediaDevices.getUserMedia({'audio': true, 'video': true});
+```
+
+Advanced usage:
+
+```dart
+await navigator.mediaDevices.getUserMedia({
+  'audio': true,
+  'video': {
+    'facingMode': 'user', // or 'environment' for the rear camera on mobile devices
+    'width': 1280,
+    'height': 720,
+    'frameRate': 30,
+  }
+});
+```
+
+## Parameters
+
+- `mediaConstraints`: A dictionary object that specifies the media constraints for the requested media types.
+  Refer to [MediaStreamConstraints](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamConstraints) for more details.
+
+Sub options:
+
+- `audio`: A Boolean value that indicates whether the `MediaStream` should include an audio track,
+  or a dictionary object that specifies the audio track's media constraints.
+
+```dart
+{
+  'deviceId': 'audio_device_id',
+}
+```
+
+- `video`: A Boolean value that indicates whether the `MediaStream` should include a video track,
+  or a dictionary object that specifies the video track's media constraints.
+
+```dart
+{
+  'deviceId': 'video_device_id',
+  'facingMode': 'user', // or 'environment' for the rear camera on mobile devices
+  'width': 1280,
+  'height': 720,
+  'frameRate': 30,
+}
+```
+
+## Return value
+
+A `Future` that resolves to a `MediaStream` object.
+
+Note: The `deviceId` parameter specifies the device to use. To use the default device, omit this parameter. To use a specific device, get its device ID by calling `navigator.mediaDevices.enumerateDevices` (see [MediaDevices](./media-devices)) and pass it as `deviceId`.
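The `deviceId` flow described in the note can be sketched end to end. This is a minimal, hypothetical helper (the function name, filtering, and fallback logic are assumptions, not part of the flutter_webrtc API): it enumerates devices, picks a video input's `deviceId`, and passes it to `getUserMedia`.

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

/// Hypothetical helper: prefer a specific camera by deviceId,
/// falling back to the default device when none is found.
Future<MediaStream> openPreferredCamera() async {
  final devices = await navigator.mediaDevices.enumerateDevices();
  // Keep only video inputs; 'videoinput' is the kind reported for cameras.
  final cameras = devices.where((d) => d.kind == 'videoinput').toList();
  final deviceId = cameras.isNotEmpty ? cameras.first.deviceId : null;
  return navigator.mediaDevices.getUserMedia({
    'audio': true,
    // Fall back to a plain boolean constraint when no camera was found.
    'video': deviceId != null ? {'deviceId': deviceId} : true,
  });
}
```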
diff --git a/docs/flutter-webrtc/api-docs/media-devices.md b/docs/flutter-webrtc/api-docs/media-devices.md
new file mode 100644
index 0000000..b615fc5
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/media-devices.md
@@ -0,0 +1,62 @@
+---
+sidebar_position: 1
+---
+
+# MediaDevices
+
+The corresponding JS API docs are here: [MediaDevices](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices).
+
+## Usage
+
+```dart
+MediaDevices mediaDevices = navigator.mediaDevices;
+
+MediaStream stream = await mediaDevices.getUserMedia({
+  'audio': true,
+  'video': true,
+});
+
+MediaStream displayStream = await mediaDevices.getDisplayMedia({
+  'video': true,
+  'audio': true,
+});
+
+List<MediaDeviceInfo> devices = await mediaDevices.enumerateDevices();
+```
+
+## Methods
+
+- `getUserMedia`: Returns a `Future` that resolves to a `MediaStream` object. The `MediaStream` object represents a stream of media content, typically (but not necessarily) including both audio and video.
+
+```dart
+MediaStream stream = await mediaDevices.getUserMedia({
+  'audio': true,
+  'video': true,
+});
+```
+
+- `getDisplayMedia`: Returns a `Future` that resolves to a `MediaStream` object containing the captured display contents, typically (but not necessarily) including both audio and video.
+
+```dart
+MediaStream stream = await mediaDevices.getDisplayMedia({
+  'video': true,
+  'audio': true,
+});
+```
+
+- `enumerateDevices`: Returns a `Future` that resolves to a list of `MediaDeviceInfo` objects, each of which represents a media input or output device.
+
+```dart
+List<MediaDeviceInfo> devices = await mediaDevices.enumerateDevices();
+```
+
+- `getSupportedConstraints`: Returns a dictionary object that specifies the media constraints supported by the user agent.
+  Supported on the web platform only.
+
+- `selectAudioOutput`: Selects the audio output device.
+  Supported platforms: macOS/Windows/Linux/Web.
+
+## Event Callbacks
+
+- `onDeviceChange`: The `ondevicechange` event handler for the `MediaDevices` interface. It is called when a media input or output device is attached or removed.
diff --git a/docs/flutter-webrtc/api-docs/media-recorder.md b/docs/flutter-webrtc/api-docs/media-recorder.md
new file mode 100644
index 0000000..4620ced
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/media-recorder.md
@@ -0,0 +1,62 @@
+---
+sidebar_position: 7
+---
+
+# MediaRecorder
+
+The corresponding JS API docs are here: [MediaRecorder](https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder).
+
+Creates a new `MediaRecorder` object, given a `MediaStream` to record. Options are available to do things like set the container's MIME type (such as "video/webm" or "video/mp4") and the bit rates of the audio and video tracks, or a single overall bit rate.
+
+## Methods
+
+- `start`: Starts recording the media.
+  On Android, use the `audioChannel` parameter.
+  On iOS, use `audioTrack`.
+
+```dart
+void start() async {
+  if (Platform.isIOS) {
+    print('Recording is not available on iOS');
+    return;
+  }
+  // TODO(rostopira): request write storage permission
+  final storagePath = await getExternalStorageDirectory();
+  if (storagePath == null) throw Exception('Can\'t find storagePath');
+
+  final filePath = storagePath.path + '/webrtc_sample/test.mp4';
+  mediaRecorder = MediaRecorder();
+  setState(() {});
+  final videoTrack = stream.getVideoTracks().first;
+  await mediaRecorder.start(
+    filePath,
+    videoTrack: videoTrack,
+    audioChannel: RecorderAudioChannel.INPUT,
+  );
+}
+```
+
+- `startWeb`: Starts recording the media on the web. Only for Flutter web.
+
+```dart
+void startWeb() async {
+  mediaRecorder = MediaRecorder();
+  setState(() {});
+  mediaRecorder.startWeb(
+    stream,
+    onDataChunk: (dynamic blob, bool isLastOne) {
+      // do something with the data chunk
+    },
+    mimeType: 'video/webm',
+    timeSlice: 1000,
+  );
+}
+```
+
+- `stop`: Stops recording the media.
+
+```dart
+void stop() async {
+  await mediaRecorder.stop();
+  setState(() {
+    mediaRecorder = null;
+  });
+}
+```
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/media-stream-track.md b/docs/flutter-webrtc/api-docs/media-stream-track.md
new file mode 100644
index 0000000..2972987
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/media-stream-track.md
@@ -0,0 +1,66 @@
+---
+sidebar_position: 6
+---
+
+# MediaStreamTrack
+
+The corresponding JS API docs are here: [MediaStreamTrack](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamTrack).
+
+The `MediaStreamTrack` interface represents a single media track within a stream; typically, these are audio or video tracks, but other track types may exist as well. A `MediaStreamTrack` can be associated with multiple `MediaStream` objects.
+
+## Methods
+
+- `stop`: Stops the track.
+
+```dart
+void stopStreamedVideo(RTCVideoRenderer renderer) {
+  var stream = renderer.srcObject;
+  var tracks = stream.getTracks();
+
+  for (var track in tracks) {
+    track.stop();
+  }
+
+  renderer.srcObject = null;
+}
+```
+
+## Events
+
+- `onMute`: Sent to the `MediaStreamTrack` when the value of the muted property changes to true, indicating that the track is temporarily unable to provide data (such as when the network is experiencing a service malfunction).
+
+```dart
+track.onMute = (event) {
+  print("Track ${track.id} is muted");
+};
+```
+
+- `onUnMute`: Sent to the `MediaStreamTrack` when the value of the muted property changes to false, indicating that the track is able to provide data again (such as when the network is no longer experiencing a service malfunction).
+
+```dart
+track.onUnMute = (event) {
+  print("Track ${track.id} is unmuted");
+};
+```
+
+- `onEnded`: Sent when playback of the track ends (when the value of readyState changes to ended), except when the track is ended by calling MediaStreamTrack.stop.
+
+```dart
+track.onEnded = (event) {
+  print("Track ${track.id} has ended");
+};
+```
+
+## Properties
+
+- `id`: Returns the unique identifier of the track.
+
+- `label`: Returns the label of the object's corresponding source, if any; this may identify audio and video sources (e.g., "Internal microphone" or "External USB Webcam").
+  If the corresponding source has or had no label, returns an empty string.
+
+- `kind`: Returns the type of track, such as "audio" or "video".
+
+- `enabled`: Returns the enabled state of the `MediaStreamTrack`. After a `MediaStreamTrack` has ended, setting the enabled state will not change the ended state.
+
+- `muted`: Returns true if the track is muted, and false otherwise.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/media-stream.md b/docs/flutter-webrtc/api-docs/media-stream.md
new file mode 100644
index 0000000..6d24203
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/media-stream.md
@@ -0,0 +1,98 @@
+---
+sidebar_position: 8
+---
+
+# MediaStream
+
+The corresponding JS API docs are here: [MediaStream](https://developer.mozilla.org/en-US/docs/Web/API/MediaStream).
+
+The MediaStream interface of the Media Capture and Streams API represents a stream of media content. A stream consists of several tracks, such as video or audio tracks. Each track is specified as an instance of MediaStreamTrack.
+
+## Methods
+
+- `addTrack`: Adds the given `MediaStreamTrack` to this `MediaStream`.
+
+```dart
+var mediaStream = MediaStream(
+  id: 'audio-stream',
+  ownerTag: 'audio-tag',
+  active: true,
+  /// The active attribute returns true if this [MediaStream] is active and false otherwise.
+  /// A [MediaStream] is considered active if at least one of its [MediaStreamTrack]s is not in the [MediaStreamTrack.ended] state.
+  /// Once every track has ended, the stream's active property becomes false.
+  onAddTrack: (MediaStreamTrack track) {
+    print('Track added: ${track.id}');
+  },
+  onRemoveTrack: (MediaStreamTrack track) {
+    print('Track removed: ${track.id}');
+  },
+);
+mediaStream.addTrack(track, addToNative: true);
+```
+
+- `removeTrack`: Removes the given `MediaStreamTrack` object from this `MediaStream`.
+
+```dart
+mediaStream.removeTrack(track, removeFromNative: true);
+```
+
+- `getTracks`: Returns a list of `MediaStreamTrack` objects representing all the tracks in this stream.
+
+```dart
+var tracks = mediaStream.getTracks();
+```
+
+- `getAudioTracks`: Returns a list of `MediaStreamTrack` objects representing the audio tracks in this stream.
+  The list represents a snapshot of all the `MediaStreamTrack` objects in this stream's track set whose kind is equal to 'audio'.
+
+```dart
+var audioTracks = mediaStream.getAudioTracks();
+```
+
+- `getVideoTracks`: Returns a list of `MediaStreamTrack` objects representing the video tracks in this stream.
+
+```dart
+var videoTracks = mediaStream.getVideoTracks();
+```
+
+- `getTrackById`: Returns the `MediaStreamTrack` object from this stream's track set whose id is equal to trackId, or throws a `StateError` if no such track exists.
+
+```dart
+var track = mediaStream.getTrackById('some-track-id');
+```
+
+- `dispose`: Disposes of the `MediaStream`.
+
+```dart
+await mediaStream.dispose();
+```
+
+## Events
+
+- `onAddTrack`: Fires when a new `MediaStreamTrack` is added to this `MediaStream`.
+
+```dart
+var mediaStream = MediaStream(
+  id: 'audio-stream',
+  ownerTag: 'audio-tag',
+  active: true,
+);
+mediaStream.onAddTrack = (MediaStreamTrack track) {
+  print('Track added: ${track.id}');
+};
+```
+
+- `onRemoveTrack`: Fires when a `MediaStreamTrack` is removed from this `MediaStream`.
+
+```dart
+var mediaStream = MediaStream(
+  id: 'audio-stream',
+  ownerTag: 'audio-tag',
+  active: true,
+);
+mediaStream.onRemoveTrack = (MediaStreamTrack track) {
+  print('Track removed: ${track.id}');
+};
+```
+
diff --git a/docs/flutter-webrtc/api-docs/rtc-data-channel.md b/docs/flutter-webrtc/api-docs/rtc-data-channel.md
new file mode 100644
index 0000000..e69d979
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-data-channel.md
@@ -0,0 +1,85 @@
+---
+sidebar_position: 9
+---
+
+# RTCDataChannel
+
+The corresponding JS API docs are here: [RTCDataChannel](https://developer.mozilla.org/en-US/docs/Web/API/RTCDataChannel).
+
+The RTCDataChannel interface represents a network channel which can be used for bidirectional peer-to-peer transfers of arbitrary data. Every data channel is associated with an RTCPeerConnection, and each peer connection can have up to a theoretical maximum of 65,534 data channels (the actual limit may vary from browser to browser).
+
+## Methods
+
+- `send`: Sends a message over this data channel.
+  To send a text message, use the default constructor to instantiate a text [RTCDataChannelMessage] for the [message] parameter.
+  To send a binary message, pass a binary [RTCDataChannelMessage] constructed with [RTCDataChannelMessage.fromBinary].
+
+```dart
+var sendChannel = await pc.createDataChannel(
+  'dataChannel',
+  dataChannelDict,
+);
+sendChannel.send(RTCDataChannelMessage('Hello, world!'));
+```
+
+- `close`: Closes the data channel.
+
+```dart
+sendChannel.close();
+```
+
+## Properties
+
+- `id`: The unique identifier of the data channel.
+
+- `label`: The label of the data channel.
+
+- `state`: The current state of the data channel.
+
+- `bufferedAmount`: The number of bytes of data currently queued to be sent over the data channel.
+
+- `bufferedAmountLowThreshold`: The threshold at which the onBufferedAmountLow callback is triggered.
+
+## Events
+
+- `onDataChannelState`: Stream of state change events. Emits the new state on change.
+  Closes when the [RTCDataChannel] is closed.
+
+```dart
+var dataChannel = await pc.createDataChannel('dataChannel', RTCDataChannelInit());
+dataChannel.onDataChannelState = (state) {
+  print('Data channel state is: $state');
+};
+/// or
+dataChannel.stateChangeStream.listen((state) {
+  print('Data channel state is: $state');
+});
+```
+
+- `onMessage`: Stream of incoming messages. Emits the [RTCDataChannelMessage] on message.
+
+```dart
+dataChannel.onMessage = (message) {
+  print('Received data: ${message.text}');
+};
+/// or
+dataChannel.messageStream.listen((message) {
+  print('Received data: ${message.text}');
+});
+```
+
+- `onBufferedAmountChange`: Stream of buffered amount change events.
+
+```dart
+dataChannel.onBufferedAmountChange = (int currentAmount, int changedAmount) {
+  print('Buffered amount change: $changedAmount');
+};
+```
+
+- `onBufferedAmountLow`: Stream of low buffered amount events.
+
+```dart
+dataChannel.onBufferedAmountLow = (int currentAmount) {
+  print('Buffered amount low: $currentAmount');
+};
+```
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-dtmf-sender.md b/docs/flutter-webrtc/api-docs/rtc-dtmf-sender.md
new file mode 100644
index 0000000..6e07b50
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-dtmf-sender.md
@@ -0,0 +1,39 @@
+---
+sidebar_position: 10
+---
+
+# RTCDTMFSender
+
+The corresponding JS API docs are here: [RTCDTMFSender](https://developer.mozilla.org/en-US/docs/Web/API/RTCDTMFSender).
+
+The RTCDTMFSender interface provides a mechanism for transmitting DTMF codes on a WebRTC RTCPeerConnection. You gain access to the connection's RTCDTMFSender through the RTCRtpSender.dtmf property on the audio track you wish to send DTMF with.
+
+## Methods
+
+- `insertDTMF`: Sends DTMF tones.
+
+```dart
+var audioSender = (await pc.getSenders()).firstWhere((s) => s.track.kind == 'audio');
+var dtmfSender = audioSender.dtmfSender;
+dtmfSender.insertDTMF('1234#', duration: 100, interToneGap: 70);
+```
+
+- `canInsertDtmf`: Returns a boolean indicating whether the DTMF sender can send DTMF tones.
+
+```dart
+dtmfSender.canInsertDtmf(); /// Returns true or false
+```
+
+## Properties
+
+- `tones`: A String containing the DTMF codes to be transmitted to the recipient.
+  Specifying an empty string as the tones parameter clears the tone buffer, aborting any currently queued tones. A "," character inserts a two-second delay.
+
+- `duration`: This value must be between 40 ms and 6000 ms (6 seconds).
+  The default is 100 ms.
+
+- `interToneGap`: The length of time, in milliseconds, to wait between tones.
+  The browser will enforce a minimum value of 30 ms (that is, if you specify a lower value, 30 ms will be used instead); the default is 70 ms.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-factory.md b/docs/flutter-webrtc/api-docs/rtc-factory.md
new file mode 100644
index 0000000..dd7a239
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-factory.md
@@ -0,0 +1,39 @@
+---
+sidebar_position: 4
+---
+
+# RTCFactory
+
+## Methods
+
+- `createPeerConnection`: Creates a new `RTCPeerConnection` object with the specified `RTCConfiguration`.
+
+```dart
+RTCPeerConnection pc = await createPeerConnection({
+  'iceServers': [
+    {'urls': 'stun:stun.l.google.com:19302'},
+  ]
+});
+```
+
+- `createLocalMediaStream`: Creates a new `MediaStream` object with the specified `label`.
+
+```dart
+MediaStream stream = await createLocalMediaStream('new_stream_label');
+```
+
+- `getRtpSenderCapabilities`: Returns an `RTCRtpCapabilities` object that represents the capabilities of the sender for the given `kind`.
+
+```dart
+RTCRtpCapabilities capabilities = await getRtpSenderCapabilities('video'); // or 'audio'
+```
+
+- `getRtpReceiverCapabilities`: Returns an `RTCRtpCapabilities` object that represents the capabilities of the receiver for the given `kind`.
+
+```dart
+RTCRtpCapabilities capabilities = await getRtpReceiverCapabilities('video'); // or 'audio'
+```
+
+## Properties
+
+- `frameCryptorFactory`: Returns a `FrameCryptorFactory` object for end-to-end encryption.
diff --git a/docs/flutter-webrtc/api-docs/rtc-ice-candidate.md b/docs/flutter-webrtc/api-docs/rtc-ice-candidate.md
new file mode 100644
index 0000000..3ad5f1f
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-ice-candidate.md
@@ -0,0 +1,17 @@
+---
+sidebar_position: 11
+---
+
+# RTCICECandidate
+
+The corresponding JS API docs are here: [RTCICECandidate](https://developer.mozilla.org/en-US/docs/Web/API/RTCICECandidate).
+
+The RTCIceCandidate() constructor creates and returns a new RTCIceCandidate object, which can be configured to represent a single ICE candidate.
+
+## Properties
+
+- `candidate`: A string representing the transport address for the candidate that can be used for connectivity checks.
+
+- `sdpMid`: A string specifying the candidate's media stream identification tag, which uniquely identifies the media stream within the component with which the candidate is associated, or null if no such association exists.
+
+- `sdpMLineIndex`: If not null, sdpMLineIndex indicates the zero-based index number of the media description (as defined in RFC 4566) in the SDP with which the candidate is associated.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-peerconnection.md b/docs/flutter-webrtc/api-docs/rtc-peerconnection.md
new file mode 100644
index 0000000..5c56bd0
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-peerconnection.md
@@ -0,0 +1,267 @@
+---
+sidebar_position: 5
+---
+
+# RTCPeerConnection
+
+The corresponding JS API docs are here: [RTCPeerConnection](https://developer.mozilla.org/zh-CN/docs/Web/API/RTCPeerConnection).
+
+## Methods
+
+- `createOffer`: Creates an SDP offer for negotiating a WebRTC peer connection.
+
+```dart
+var offer = await pc.createOffer();
+```
+
+- `createAnswer`: Creates an SDP answer for negotiating a WebRTC peer connection.
+
+```dart
+var answer = await pc.createAnswer();
+```
+
+- `addStream`: Connects a MediaStream to the peer connection.
+
+```dart
+pc.addStream(stream);
+```
+
+- `removeStream`: Disconnects a MediaStream from the peer connection.
+
+```dart
+pc.removeStream(stream);
+```
+
+- `getLocalDescription`: Returns an RTCSessionDescription describing the session for the local end of the connection. If it has not yet been set, returns null.
+
+```dart
+var localDescription = await pc.getLocalDescription();
+```
+
+- `setLocalDescription`: Changes the local description associated with the connection. This description specifies the properties of the local end of the connection, including the media format. It returns a Future which is fulfilled once the description has been changed, asynchronously.
+  Calling this method triggers ICE candidate gathering.
+
+```dart
+var offer = await pc.createOffer();
+await pc.setLocalDescription(offer);
+```
+
+- `getRemoteDescription`: Returns an RTCSessionDescription describing the session for the remote end of the connection. If it has not yet been set, returns null.
+
+```dart
+var remoteDescription = await pc.getRemoteDescription();
+```
+
+- `setRemoteDescription`: Changes the remote description associated with the connection. This description specifies the properties of the remote end of the connection, including the media format. It returns a Future which is fulfilled once the description has been changed, asynchronously.
+
+```dart
+/// Receive the offer or answer from the signaling server,
+/// then apply it as the remote description.
+var json = ws.receive();
+var remoteDescription = RTCSessionDescription(json['sdp'], json['type']);
+await pc.setRemoteDescription(remoteDescription);
+```
+
+- `addCandidate`: Adds a candidate to the ICE agent. This can be used to add remote candidates when the remote description is set.
+
+```dart
+  /// receive ICE candidates from signaling server.
+  /// and convert them to RTCIceCandidate objects.
+  var json = ws.receive();
+  var candidate = RTCIceCandidate(json['candidate'], json['sdpMid'], json['sdpMLineIndex']);
+  await pc.addCandidate(candidate);
+```
+
+- `getStats`: Returns a Future that resolves with statistics for the connection. The `track` parameter is optional; if specified, only statistics for that track will be returned.
+
+```dart
+var someTrack = (await pc.getSenders())[0].track;
+var stats = await pc.getStats(someTrack);
+```
+
+- `getLocalStreams`: Returns a list of MediaStream objects representing the local streams being sent to the remote peer.
+
+```dart
+var localStreams = await pc.getLocalStreams();
+```
+
+- `getRemoteStreams`: Returns a list of MediaStream objects representing the remote streams being received from the remote peer.
+
+```dart
+var remoteStreams = await pc.getRemoteStreams();
+```
+
+- `getSenders`: Returns a list of RTCRtpSender objects, each representing the RTP sender responsible for transmitting one or more tracks.
+
+```dart
+var senders = await pc.getSenders();
+```
+
+- `getReceivers`: Returns a list of RTCRtpReceiver objects, each representing the RTP receiver responsible for receiving and decoding one or more tracks.
+
+```dart
+var receivers = await pc.getReceivers();
+```
+
+- `getTransceivers`: Returns a list of RTCRtpTransceiver objects, each representing a combination of an RTCRtpSender and an RTCRtpReceiver that share a common mid.
+
+```dart
+var transceivers = await pc.getTransceivers();
+```
+
+- `createDataChannel`: Creates a new RTCDataChannel object with the given label.
+
+```dart
+var dataChannelDict = RTCDataChannelInit();
+dataChannelDict.id = 1;
+dataChannelDict.ordered = true;
+dataChannelDict.maxRetransmitTime = -1;
+dataChannelDict.maxRetransmits = -1;
+dataChannelDict.protocol = 'sctp'; /// sctp or quic
+var sendChannel = await pc.createDataChannel(
+  'dataChannel',
+  dataChannelDict,
+);
+```
+
+- `restartIce`: Restarts the ICE agent for the connection. This can be used to request a new set of candidates from the remote peer.
+
+```dart
+await pc.restartIce();
+```
+
+- `close`: Closes the RTCPeerConnection.
+
+```dart
+await pc.close();
+```
+
+- `createDtmfSender`: Creates a new RTCDtmfSender object to send DTMF tones to the remote peer.
+
+```dart
+var track = stream.getAudioTracks()[0];
+var dtmfSender = await pc.createDtmfSender(track);
+```
+
+- `addTrack`: Adds a new MediaStreamTrack to the set of tracks which will be transmitted to the remote peer.
+
+```dart
+stream.getTracks().forEach((track) {
+  pc.addTrack(track, stream);
+});
+```
+
+- `removeTrack`: Removes a track from the set which will be transmitted to the remote peer, given the RTCRtpSender that transmits it.
+
+```dart
+var sender = (await pc.getSenders())[0];
+await pc.removeTrack(sender);
+```
+
+- `setConfiguration`: Sets the ICE server configuration.
+
+```dart
+var configuration = {
+  "iceServers": [
+    {
+      "urls": "turn:asia.turn-server.net",
+      "username": "allie@oopcode.com",
+      "credential": "topsecretpassword",
+    },
+  ],
+};
+pc.setConfiguration(configuration);
+/// renegotiate the connection after the configuration is set.
+var offer = await pc.createOffer();
+```
+
+- `addTransceiver`: Adds a new transceiver to the set of transceivers associated with the connection.
+
+```dart
+var transceiver = await pc.addTransceiver(
+    kind: RTCRtpMediaType.RTCRtpMediaTypeAudio,
+    init: RTCRtpTransceiverInit(direction: TransceiverDirection.RecvOnly));
+```
+
+## Events
+
+- `onIceCandidate`: Fired when a new ICE candidate is generated.
+
+```dart
+pc.onIceCandidate = (candidate) {
+  /// send ICE candidates to the remote peer.
+  signaling.send(candidate.toMap());
+};
+```
+
+- `onSignalingState`: Fired when the signaling state changes.
+
+```dart
+pc.onSignalingState = (state) {
+  print('Signaling state changed to $state');
+};
+```
+
+- `onIceConnectionState`: Fired when the ICE connection state changes.
+
+```dart
+pc.onIceConnectionState = (state) {
+  print('ICE connection state changed to $state');
+};
+```
+
+- `onConnectionState`: Fired when the connection state changes.
+
+```dart
+pc.onConnectionState = (state) {
+  print('Connection state changed to $state');
+};
+```
+
+- `onTrack`: Fired when a new track is added to the connection.
+
+```dart
+pc.onTrack = (event) {
+  print('Track added: ${event.track.id}');
+};
+```
+
+- `onDataChannel`: Fired when a new data channel is added to the connection.
+
+```dart
+pc.onDataChannel = (dc) {
+  print('Data channel added: ${dc.label}');
+};
+```
+
+- `onRenegotiationNeeded`: Fired when renegotiation is needed.
+
+```dart
+pc.onRenegotiationNeeded = () async {
+  var offer = await pc.createOffer();
+  await pc.setLocalDescription(offer);
+  signaling.send(offer.toMap());
+};
+```
+
+- `onIceGatheringState`: Fired when the ICE gathering state changes.
+
+```dart
+pc.onIceGatheringState = (state) {
+  print('ICE gathering state changed to $state');
+};
+```
diff --git a/docs/flutter-webrtc/api-docs/rtc-rtcp-parameters.md b/docs/flutter-webrtc/api-docs/rtc-rtcp-parameters.md
new file mode 100644
index 0000000..7ceffdc
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-rtcp-parameters.md
@@ -0,0 +1,13 @@
+---
+sidebar_position: 12
+---
+
+# RTCRTCPParameters
+
+Encoding configuration of RTCP.
+
+## Properties
+
+- `cname`: The Canonical Name used by RTCP.
+
+- `reducedSize`: Whether reduced-size RTCP is configured, or compound RTCP.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-rtp-parameters.md b/docs/flutter-webrtc/api-docs/rtc-rtp-parameters.md
new file mode 100644
index 0000000..4e1a0d2
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-rtp-parameters.md
@@ -0,0 +1,68 @@
+---
+sidebar_position: 13
+---
+
+# RTCRTPParameters
+
+The corresponding JS API docs are here: [getParameters](https://developer.mozilla.org/en-US/docs/Web/API/RTCRtpSender/getParameters).
+
+Encoding configuration of RTP.
+
+## Properties
+
+- `transactionId`: A string containing a unique ID. This value is used to ensure that `setParameters()` can only be called to modify the parameters returned by a specific previous call to `getParameters()`. This parameter cannot be changed by the caller.
+
+- `rtcp`: A dictionary containing information about the RTCP configuration. This parameter is optional and can be omitted if RTCP is not needed.
+
+- `headerExtensions`: An array of zero or more RTP header extensions, each identifying an extension supported by the sender or receiver.
+
+- `encodings`: An array of zero or more RTP encodings, each specifying a media stream and its parameters.
+  The properties of the objects include:
+
+  - `rid`: If non-null, a string representing the RID that identifies this encoding layer.
+    RIDs are used to identify layers in simulcast.
+
+  - `active`: true (the default) if the encoding is being sent, false if it is not being sent or used.
+
+  - `maxBitrate`: The maximum bitrate (in bits per second) that the encoding can use.
+    If non-null, this represents the Transport Independent Application Specific maximum bandwidth defined in RFC 3890. If null, there is no maximum bitrate.
+
+  - `maxFramerate`: The maximum frame rate (in frames per second) that the encoding can use.
+
+  - `minBitrate`: The minimum bitrate (in bits per second) that the encoding can use.
+
+  - `numTemporalLayers`: The number of temporal layers to be used for this encoding (default is 1).
+
+  - `scaleResolutionDownBy`: The factor by which the resolution of the video should be scaled down before encoding (default is 1.0).
+    If non-null, the width and height are scaled down by this factor for video. If null, the implementation's default scaling factor is used.
+
+  - `ssrc`: The SSRC to be used by this encoding. Cannot be changed between getParameters/setParameters.
+
+  - `scalabilityMode`: The scalability mode to be used for this encoding, e.g. "L1T3" or "L1T2".
+
+- `codecs`: An array of zero or more RTCRtpCodecParameters objects, each specifying a codec supported by the sender or receiver.
+  Each codec object in the array may have the following properties:
+
+  - `payloadType`: Payload type used to identify this codec in RTP packets.
+
+  - `name`: Name used to identify the codec. Equivalent to the MIME subtype.
+
+  - `kind`: The media type of this codec. Equivalent to the MIME top-level type.
+
+  - `clockRate`: Clock rate in Hertz.
+
+  - `numChannels`: The number of audio channels used. Set to null for video codecs.
+
+- `degradationPreference`: Specifies the preferred way the WebRTC layer should handle optimizing bandwidth against quality in constrained-bandwidth situations. The possible values are:
+
+  - `DISABLED`
+
+  - `MAINTAIN_FRAMERATE`
+
+  - `MAINTAIN_RESOLUTION`
+
+  - `BALANCED`
+
+  The default value is `BALANCED`.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-rtp-receiver.md b/docs/flutter-webrtc/api-docs/rtc-rtp-receiver.md
new file mode 100644
index 0000000..bce6374
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-rtp-receiver.md
@@ -0,0 +1,39 @@
+---
+sidebar_position: 15
+---
+
+# RTCRTPReceiver
+
+The corresponding JS API docs are here: [RTCRTPReceiver](https://developer.mozilla.org/en-US/docs/Web/API/RTCRTPReceiver).
+
+The `RTCRtpReceiver` interface of the WebRTC API manages the reception and decoding of data for a `MediaStreamTrack` on an RTCPeerConnection.
+
+## Methods
+
+- `getStats`: The `RTCRtpReceiver` method `getStats()` asynchronously requests an RTCStatsReport object which provides statistics about incoming traffic on the owning RTCPeerConnection.
+
+```dart
+var receiver = (await pc.getReceivers()).firstWhere((r) => r.track.kind == 'video');
+receiver.getStats().then((stats) {
+  print('stats id is ${stats.id}');
+});
+```
+
+## Events
+
+- `onFirstPacketReceived`: Callback invoked when the first RTP packet is received by the `RTCRtpReceiver`.
+
+```dart
+var receiver = (await pc.getReceivers()).firstWhere((r) => r.track.kind == 'video');
+receiver.onFirstPacketReceived = (RTCRtpReceiver rtpReceiver, RTCRtpMediaType mediaType) {
+  print('first packet received');
+};
+```
+
+## Properties
+
+- `parameters`: The `RTCRtpReceiver` property [RTCRTPParameters](./rtc-rtp-parameters) is an object describing the current configuration for the encoding and transmission of media on the track.
+
+- `track`: The track property of the RTCRtpReceiver interface returns the MediaStreamTrack associated with the current RTCRtpReceiver instance.
+
+- `receiverId`: The receiverId property of the RTCRtpReceiver interface returns a unique identifier for the RTCRtpReceiver.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-rtp-sender.md b/docs/flutter-webrtc/api-docs/rtc-rtp-sender.md
new file mode 100644
index 0000000..118173d
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-rtp-sender.md
@@ -0,0 +1,86 @@
+---
+sidebar_position: 14
+---
+
+# RTCRTPSender
+
+The corresponding JS API docs are here: [RTCRTPSender](https://developer.mozilla.org/en-US/docs/Web/API/RTCRtpSender).
+
+The RTCRtpSender interface provides the ability to control and obtain details about how a particular `MediaStreamTrack` is encoded and transmitted to a remote peer.
+
+## Methods
+
+- `setParameters`: Sets the encoding parameters for the `MediaStreamTrack` associated with this `RTCRtpSender`.
+
+```dart
+var senders = await pc.getSenders();
+var sender = senders.firstWhere((s) => s.track?.kind == 'video');
+var params = sender.parameters;
+params.degradationPreference = RTCDegradationPreference.MAINTAIN_RESOLUTION;
+await sender.setParameters(params);
+```
+
+- `replaceTrack`: Replaces the `MediaStreamTrack` associated with this `RTCRtpSender` with a new `MediaStreamTrack`.
+
+```dart
+/// Example usage:
+var stream = await navigator.mediaDevices.getUserMedia({'video': true});
+var videoTrack = stream.getVideoTracks()[0];
+var senders = await pc.getSenders();
+var sender = senders.firstWhere((s) => s.track?.kind == 'video');
+/// Replace the current video track with the newly captured one
+await sender.replaceTrack(videoTrack);
+```
+
+- `setTrack`: The `RTCRtpSender` method `setTrack()` sets or replaces the `MediaStreamTrack` handled by this sender. When `takeOwnership` is true, the sender takes ownership of the track.
+
+```dart
+var stream = await navigator.mediaDevices.getUserMedia({'video': true});
+var videoTrack = stream.getVideoTracks()[0];
+await sender.setTrack(videoTrack, takeOwnership: true);
+```
+
+- `getStats`: The RTCRtpSender method `getStats()` asynchronously requests statistics about outgoing traffic on the `RTCPeerConnection` which owns the sender, returning a Future which is fulfilled when the results are available.
+
+```dart
+var senders = await pc.getSenders();
+var sender = senders.firstWhere((s) => s.track?.kind == 'video');
+var reports = await sender.getStats();
+for (var report in reports) {
+  print('stats id: ${report.id}');
+}
+```
+
+- `setStreams`: The `RTCRtpSender` method `setStreams()` associates the sender's track with the specified `MediaStream` objects.
+
+```dart
+var stream = await navigator.mediaDevices.getUserMedia({'video': true});
+var senders = await pc.getSenders();
+var sender = senders.firstWhere((s) => s.track?.kind == 'video');
+await sender.setStreams([stream]);
+```
+
+- `dispose`: The `RTCRtpSender` method `dispose()` closes the `RTCRtpSender` and releases any associated resources.
+
+```dart
+await sender.dispose();
+```
+
+## Properties
+
+- `parameters`: The `RTCRtpSender` property [RTCRTPParameters](./rtc-rtp-parameters) is an object describing the current configuration for the encoding and transmission of media on the track.
+
+```dart
+var senders = await pc.getSenders();
+var sender = senders.firstWhere((s) => s.track?.kind == 'video');
+var parameters = sender.parameters;
+print('transactionId is ${parameters.transactionId}');
+```
+
+- `track`: The track property of the RTCRtpSender interface returns the MediaStreamTrack which is being handled by the RTCRtpSender.
+
+- `senderId`: The senderId property of the RTCRtpSender interface returns a unique identifier for the RTCRtpSender.
+
+- `ownsTrack`: The ownsTrack property of the RTCRtpSender interface returns a boolean value indicating whether the RTCRtpSender owns the track.
+
+- `dtmfSender`: The dtmfSender property of the RTCRtpSender interface returns an RTCDTMFSender object which can be used to send DTMF tones using the track associated with the RTCRtpSender.
+For details on `RTCDTMFSender`, please refer to [RTCDTMFSender](./rtc-dtmf-sender).
diff --git a/docs/flutter-webrtc/api-docs/rtc-rtp-transceiver.md b/docs/flutter-webrtc/api-docs/rtc-rtp-transceiver.md
new file mode 100644
index 0000000..89d05c2
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-rtp-transceiver.md
@@ -0,0 +1,72 @@
+---
+sidebar_position: 16
+---
+
+# RTCRTPTransceiver
+
+The corresponding JS API docs are here: [RTCRTPTransceiver](https://developer.mozilla.org/en-US/docs/Web/API/RTCRTPTransceiver).
+
+The WebRTC interface RTCRtpTransceiver describes a permanent pairing of an RTCRtpSender and an RTCRtpReceiver, along with some shared state.
+
+Each SDP media section describes one bidirectional SRTP ("Secure Real Time Protocol") stream (excepting the media section for RTCDataChannel, if present).
+This pairing of send and receive SRTP streams is significant for some applications, so RTCRtpTransceiver is used to represent this pairing, along with other important state from the media section. Each non-disabled SRTP media section is always represented by exactly one transceiver.
+
+A transceiver is uniquely identified using its mid property, which is the same as the media ID (mid) of its corresponding m-line. An RTCRtpTransceiver is associated with an m-line if its mid is non-null; otherwise it's considered disassociated.
+
+## Methods
+
+- `getCurrentDirection`: Returns the transceiver's currently negotiated direction, one of the following values:
+
+  - `SendRecv`: The transceiver can send and receive media.
+
+  - `SendOnly`: The transceiver can only send media.
+
+  - `RecvOnly`: The transceiver can only receive media.
+
+  - `Inactive`: The transceiver is inactive and is not sending or receiving media.
+
+  - `Stopped`: The transceiver has been stopped and is no longer sending or receiving media.
+
+```dart
+var currentDirection = await transceiver.getCurrentDirection();
+```
+
+- `getDirection`: Get the preferred direction attribute.
+
+```dart
+var direction = await transceiver.getDirection();
+```
+
+- `setDirection`: Set the preferred direction attribute.
+
+```dart
+await transceiver.setDirection(TransceiverDirection.SendRecv);
+```
+
+- `setCodecPreferences`: The `setCodecPreferences()` method of the `RTCRtpTransceiver` interface is used to set the codecs that the transceiver allows for decoding received data, in order of decreasing preference.
+
+```dart
+var transceivers = await pc.getTransceivers();
+var transceiver = transceivers[0];
+var acaps = await getRtpSenderCapabilities('audio');
+var codecs = acaps.codecs.where((c) => c.mimeType == 'audio/opus').toList();
+await transceiver.setCodecPreferences(codecs);
+```
+
+- `stop`: Stop the transceiver.
+
+```dart
+var transceivers = await pc.getTransceivers();
+await transceivers[0].stop();
+```
+
+## Properties
+
+- `mid`: The `RTCRtpTransceiver` interface's mid property specifies the negotiated media ID (mid) which the local and remote peers have agreed upon to uniquely identify the stream's pairing of sender and receiver.
+
+- `sender`: The sender property of WebRTC's RTCRtpTransceiver interface indicates the RTCRtpSender responsible for encoding and sending outgoing media data for the transceiver's stream.
+
+- `receiver`: The receiver property of WebRTC's RTCRtpTransceiver interface indicates the RTCRtpReceiver responsible for decoding and receiving incoming media data for the transceiver's stream.
+
+- `stopped`: The stopped property of WebRTC's RTCRtpTransceiver interface indicates whether the transceiver has been stopped and is no longer sending or receiving media.
+
+- `transceiverId`: The transceiverId property of WebRTC's RTCRtpTransceiver interface is a unique identifier for the transceiver.
diff --git a/docs/flutter-webrtc/api-docs/rtc-session-description.md b/docs/flutter-webrtc/api-docs/rtc-session-description.md
new file mode 100644
index 0000000..1d3c677
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-session-description.md
@@ -0,0 +1,17 @@
+---
+sidebar_position: 17
+---
+
+# RTCSessionDescription
+
+The corresponding JS API docs are here: [RTCSessionDescription](https://developer.mozilla.org/en-US/docs/Web/API/RTCSessionDescription).
+
+The `RTCSessionDescription` interface describes one end of a connection (or potential connection) and how it's configured. Each RTCSessionDescription consists of a description type indicating which part of the offer/answer negotiation process it describes, and of the SDP descriptor of the session.
+
+The process of negotiating a connection between two peers involves exchanging RTCSessionDescription objects back and forth, with each description suggesting one combination of connection configuration options that the sender of the description supports. Once the two peers agree upon a configuration for the connection, negotiation is complete.
+
+## Properties
+
+- `type`: A String indicating the type of session description, which is either "offer" or "answer".
+
+- `sdp`: A String containing the SDP descriptor of the session.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-stats-report.md b/docs/flutter-webrtc/api-docs/rtc-stats-report.md
new file mode 100644
index 0000000..d6b2819
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-stats-report.md
@@ -0,0 +1,19 @@
+---
+sidebar_position: 18
+---
+
+# RTCStatsReport
+
+The corresponding JS API docs are here: [RTCStatsReport](https://developer.mozilla.org/en-US/docs/Web/API/RTCStatsReport).
+
+The RTCStatsReport interface of the WebRTC API provides a statistics report for an RTCPeerConnection, RTCRtpSender, or RTCRtpReceiver.
+
+## Properties
+
+- `id`: A string that uniquely identifies the object that was monitored to produce the set of statistics. This value persists across reports for (at least) the lifetime of the connection. Note however that for some statistics the ID may vary between browsers and for subsequent connections, even to the same peer.
+
+- `type`: A string with a value that indicates the type of statistics that the object contains, such as `candidate-pair`, `inbound-rtp`, `certificate`, and so on.
+
+- `timestamp`: A high resolution timestamp indicating the time at which the sample was taken. Many reported statistics are cumulative values; the timestamp allows rates and averages to be calculated between any two reports, at any desired reporting rate.
+
+- `values`: A dictionary object containing the statistics values. The keys are strings that identify the individual statistics, and the values are the statistics themselves. The specific statistics available depend on the type of the report.
\ No newline at end of file
diff --git a/docs/flutter-webrtc/api-docs/rtc-track-event.md b/docs/flutter-webrtc/api-docs/rtc-track-event.md
new file mode 100644
index 0000000..afa7bbd
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-track-event.md
@@ -0,0 +1,20 @@
+---
+sidebar_position: 19
+---
+
+# RTCTrackEvent
+
+The corresponding JS API docs are here: [RTCTrackEvent](https://developer.mozilla.org/en-US/docs/Web/API/RTCTrackEvent).
+
+`RTCTrackEvent` is an interface in the WebRTC API that represents track events. This event is triggered when a new MediaStreamTrack (such as an audio or video track) is added to an RTCRtpReceiver (the object responsible for receiving media data) of an `RTCPeerConnection`.
+
+## Properties
+
+- `track`: This property returns the `MediaStreamTrack` object that has been added. This represents the actual media track (audio or video) that is now available for use.
+
+- `receiver`: This property returns the `RTCRtpReceiver` object associated with the event. The `RTCRtpReceiver` is responsible for receiving the media track.
+
+- `streams`: This property returns an array of `MediaStream` objects associated with the track. A single track can be part of multiple streams.
+
+- `transceiver`: This property returns the `RTCRtpTransceiver` object associated with the event. The transceiver manages the sending and receiving of media for a specific track.
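+
+## Usage
+
+In flutter_webrtc the event is delivered through the peer connection's `onTrack` callback. A minimal sketch (assuming `pc` is an `RTCPeerConnection` and `_remoteRenderer` is an initialized `RTCVideoRenderer`):
+
+```dart
+pc.onTrack = (RTCTrackEvent event) {
+  // Attach the first associated stream to a renderer once a video track arrives.
+  if (event.track.kind == 'video' && event.streams.isNotEmpty) {
+    _remoteRenderer.srcObject = event.streams[0];
+  }
+};
+```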
+
diff --git a/docs/flutter-webrtc/api-docs/rtc-video-renderer.md b/docs/flutter-webrtc/api-docs/rtc-video-renderer.md
new file mode 100644
index 0000000..8cc99bb
--- /dev/null
+++ b/docs/flutter-webrtc/api-docs/rtc-video-renderer.md
@@ -0,0 +1,64 @@
+---
+sidebar_position: 20
+---
+
+# RtcVideoRenderer
+
+`RTCVideoRenderer` defines a universal interface for video renderers to manage the rendering, audio control, and lifecycle of video streams.
+
+## Methods
+
+- `audioOutput`: Switch audio output devices (such as speakers or headphones).
+Returning true indicates a successful switch.
+
+```dart
+final renderer = RTCVideoRenderer();
+/// deviceId is the ID of the target audio output device
+Future<void> selectAudioOutput(String deviceId) async {
+  await renderer.audioOutput(deviceId);
+}
+```
+
+- `initialize`: Initialize the renderer (such as creating textures and allocating resources). It needs to be called before setting the srcObject.
+
+```dart
+final renderer = RTCVideoRenderer();
+Future<void> initRenderers() async {
+  await renderer.initialize();
+}
+```
+
+- `dispose`: Release the resources used by the renderer.
+
+```dart
+// dispose of the renderer created above
+Future<void> disposeRenderer() async {
+  await renderer.dispose();
+}
+```
+
+## Events
+
+- `onResize`: When the video size changes, or the native texture changes (angle or size), notify the user to redraw the Widget.
+
+- `onFirstFrameRendered`: When the first frame is rendered, notify the user that video started playing.
+
+## Properties
+
+- `videoWidth`: The width of the video stream.
+
+- `videoHeight`: The height of the video stream.
+
+- `muted`: Whether the audio output is muted. Audio output is disabled when set to true.
+
+- `renderVideo`: Whether to render the video. If true and width/height is valid, the video should be rendered.
+
+- `textureId`: The associated underlying texture ID, used for rendering the Texture widget in Flutter.
+
+- `srcObject`: Associated media streams.
Set this property to start rendering video/audio.
+
+```dart
+final renderer = RTCVideoRenderer();
+await renderer.initialize();
+renderer.srcObject = mediaStream;
+```
\ No newline at end of file
diff --git a/docs/flutter-webrtc/first-app.md b/docs/flutter-webrtc/first-app.md
new file mode 100644
index 0000000..10d8068
--- /dev/null
+++ b/docs/flutter-webrtc/first-app.md
@@ -0,0 +1,81 @@
+---
+sidebar_position: 4
+---
+
+# First App
+
+## Step 1
+
+Create or use an existing Flutter app project:
+
+```shell
+flutter create myapp
+```
+
+- Add `flutter_webrtc` to your `pubspec.yaml` file:
+
+```shell
+flutter pub add flutter_webrtc
+```
+
+## Step 2
+
+Set up the required permissions for audio and video; see [Project Settings](./project-settings).
+
+Use `navigator.mediaDevices.getUserMedia` to get access to the camera and microphone.
+
+You can view the getUserMedia docs [here](./api-docs/get-user-media).
+
+```dart
+class _MyHomePageState extends State<MyHomePage> {
+  RTCVideoRenderer? _renderer;
+  MediaStream? _stream;
+
+  void _openCamera() async {
+    // create and initialize the renderer
+    _renderer ??= RTCVideoRenderer();
+    await _renderer!.initialize();
+
+    // request access to the camera
+    try {
+      _stream = await navigator.mediaDevices
+          .getUserMedia({'audio': false, 'video': true});
+    } catch (e) {
+      // if you get an error, please check the permissions in the project settings.
+      print(e.toString());
+    }
+
+    // set the MediaStream to the video renderer
+    _renderer!.srcObject = _stream;
+    setState(() {});
+  }
+```
+
+## Step 3
+
+Render the video in the widget tree:
+
+```dart
+  @override
+  Widget build(BuildContext context) {
+    return Scaffold(
+      appBar: AppBar(
+        backgroundColor: Theme.of(context).colorScheme.inversePrimary,
+        title: Text(widget.title),
+      ),
+      body: Center(
+        child: SizedBox(
+          width: 320,
+          height: 240,
+          // render the video renderer in the widget tree
+          child: _renderer != null ? RTCVideoView(_renderer!)
              : Container(),
+        ),
+      ),
+      floatingActionButton: FloatingActionButton(
+        onPressed: _openCamera,
+        tooltip: 'open camera',
+        child: const Icon(Icons.camera_alt),
+      ),
+    );
+  }
+```
diff --git a/docs/flutter-webrtc/get-stared.md b/docs/flutter-webrtc/get-stared.md
index d4b696f..37ac253 100644
--- a/docs/flutter-webrtc/get-stared.md
+++ b/docs/flutter-webrtc/get-stared.md
@@ -4,15 +4,13 @@ sidebar_position: 2
 
 # Get Started
 
-Please refer to https://flutter.dev/docs to deploy the Flutter development environment.
-
-and make sure your Flutter SDK version is 2.x
+Please refer to the [Flutter docs](https://flutter.dev/docs) to set up the Flutter development environment.
 
 ## Usage
 
 Add `flutter_webrtc` as a [dependency in your pubspec.yaml file](https://flutter.io/using-packages/).
 
 You can find the latest version here [![pub package](https://img.shields.io/pub/v/flutter_webrtc.svg)](https://pub.dartlang.org/packages/flutter_webrtc)
 
 ```yaml
 flutter_webrtc: ^x.y.z
@@ -20,96 +18,8 @@ You can find the latest version here [![pub package](https://img.shields.io/pub/
 
 then run `flutter pub get` in your project.
 
-## Platform related settings
-
-### iOS
-
-Add the following entry to your _Info.plist_ file,
-
-located in `/ios/Runner/Info.plist`:
-
-```xml
-<key>NSCameraUsageDescription</key>
-<string>$(PRODUCT_NAME) Camera Usage!</string>
-<key>NSMicrophoneUsageDescription</key>
-<string>$(PRODUCT_NAME) Microphone Usage!</string>
-```
-
-This entry allows your app to access camera and microphone.
-
-### Android
-
-Ensure the following permission is present in your Android Manifest file,
-
- located in `/android/app/src/main/AndroidManifest.xml`:
-
-```xml
-<uses-feature android:name="android.hardware.camera" />
-<uses-feature android:name="android.hardware.camera.autofocus" />
-<uses-permission android:name="android.permission.CAMERA" />
-<uses-permission android:name="android.permission.RECORD_AUDIO" />
-<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
-<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
-<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
-```
-
-If you need to use a Bluetooth device, please add:
-
-```xml
-<uses-permission android:name="android.permission.BLUETOOTH" />
-<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
-```
+or just use `flutter pub add flutter_webrtc` to add the dependency.
-
-The Flutter project template adds it, so it may already be there.
+
+## Setup required permissions for audio and video
-
-Also you will need to set your build settings to Java 8, because official WebRTC jar now uses static methods in `EglBase` interface. Just add this to your app level `build.gradle`:
-
-```groovy
-android {
-    //...
-    compileOptions {
-        sourceCompatibility JavaVersion.VERSION_1_8
-        targetCompatibility JavaVersion.VERSION_1_8
-    }
-}
-```
-
-If necessary, in the same `build.gradle` you will need to increase `minSdkVersion` of `defaultConfig` up to `21` (currently default Flutter generator set it to `16`).
-
-### Important reminder
-
-When you compile the release apk, you need to add the following operations,
-
-Edit `/android/app/build.gradle`
-
-```diff
- buildTypes {
-    release {
-        // TODO: Add your own signing config for the release build.
-        // Signing with the debug keys for now, so `flutter run --release` works.
-        signingConfig signingConfigs.debug
-
-+        minifyEnabled true
-+        useProguard true
-+
-+        proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
-    }
-  }
-```
-
-And create new file `/android/app/proguard-rules.pro`
-
-```
-## Flutter wrapper
--keep class io.flutter.app.** { *; }
--keep class io.flutter.plugin.** { *; }
--keep class io.flutter.util.** { *; }
--keep class io.flutter.view.** { *; }
--keep class io.flutter.** { *; }
--keep class io.flutter.plugins.** { *; }
--dontwarn io.flutter.embedding.**
-
-## Flutter WebRTC
--keep class com.cloudwebrtc.webrtc.** { *; }
--keep class org.webrtc.** { *; }
-```
+
+Please see the next page: [Project Settings](./project-settings).
diff --git a/docs/flutter-webrtc/introduction.md b/docs/flutter-webrtc/introduction.md
index 718e5d7..fc12d02 100644
--- a/docs/flutter-webrtc/introduction.md
+++ b/docs/flutter-webrtc/introduction.md
@@ -8,7 +8,7 @@
 We package this plug-in based on Google WebRTC, and you will have high-quality audio and video calls with it.
 
 Flutter-WebRTC is a cross-platform plugin, the platform support is as follows.
-## Functionality And Platform Support
+## Functionality And Platform Support [![pub package](https://img.shields.io/pub/v/flutter_webrtc.svg)](https://pub.dartlang.org/packages/flutter_webrtc)
diff --git a/docs/flutter-webrtc/loopback.md b/docs/flutter-webrtc/loopback.md
new file mode 100644
index 0000000..d924b03
--- /dev/null
+++ b/docs/flutter-webrtc/loopback.md
@@ -0,0 +1,23 @@
+---
+sidebar_position: 4
+---
+
+# LoopBack App
+
+## Introduction
+
+This is a simple example of how to use the `flutter-webrtc` package to create a loopback call.
+
+## How to use
+
+Basic Flowchart:
+
+```mermaid
+flowchart TD
+    A[Create a new RTCPeerConnection] --> B[Create a new MediaStream]
+    B --> C[Add the MediaStream to the RTCPeerConnection]
+    C --> D[Create a new RTCDataChannel]
+    D --> E[Send a message]
+    E --> F[Receive a message]
+    F --> D
+```
\ No newline at end of file
diff --git a/docs/flutter-webrtc/project-settings.md b/docs/flutter-webrtc/project-settings.md
new file mode 100644
index 0000000..52cc759
--- /dev/null
+++ b/docs/flutter-webrtc/project-settings.md
@@ -0,0 +1,87 @@
+---
+sidebar_position: 3
+---
+
+# Project Settings
+
+## Permissions
+
+### iOS/macOS
+
+Add the following entries to your _Info.plist_ file,
+
+located in `/ios/Runner/Info.plist` and `/macos/Runner/Info.plist`:
+
+```xml
+<key>NSCameraUsageDescription</key>
+<string>$(PRODUCT_NAME) Camera Usage!</string>
+<key>NSMicrophoneUsageDescription</key>
+<string>$(PRODUCT_NAME) Microphone Usage!</string>
+```
+
+These entries allow your app to access the camera and microphone.
+
+### Android
+
+Ensure the following permissions are present in your Android Manifest file,
+
+located in `/android/app/src/main/AndroidManifest.xml`:
+
+```xml
+<uses-feature android:name="android.hardware.camera" />
+<uses-feature android:name="android.hardware.camera.autofocus" />
+<uses-permission android:name="android.permission.CAMERA" />
+<uses-permission android:name="android.permission.RECORD_AUDIO" />
+<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
+<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
+<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
+```
+
+If you need to use a Bluetooth device, please add:
+
+```xml
+<uses-permission android:name="android.permission.BLUETOOTH" />
+<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
+```
+
+## Other Settings
+
+If necessary, increase the `minSdkVersion` of `defaultConfig` in `/android/app/build.gradle` up to `21` (the default Flutter template currently sets it to `16`).
+
+### Important reminder
+
+When you compile a release APK, you need to make the following changes.
+
+Edit `/android/app/build.gradle`:
+
+```diff
+ buildTypes {
+    release {
+        // TODO: Add your own signing config for the release build.
+        // Signing with the debug keys for now, so `flutter run --release` works.
+        signingConfig signingConfigs.debug
+
++        minifyEnabled true
++        useProguard true
++
++        proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
+    }
+  }
+```
+
+And create a new file `/android/app/proguard-rules.pro`:
+
+```
+## Flutter wrapper
+-keep class io.flutter.app.** { *; }
+-keep class io.flutter.plugin.** { *; }
+-keep class io.flutter.util.** { *; }
+-keep class io.flutter.view.** { *; }
+-keep class io.flutter.** { *; }
+-keep class io.flutter.plugins.** { *; }
+-dontwarn io.flutter.embedding.**
+
+## Flutter WebRTC
+-keep class com.cloudwebrtc.webrtc.** { *; }
+-keep class org.webrtc.** { *; }
+```