4 changes: 4 additions & 0 deletions docs/flutter-webrtc/api-docs/_category_.json
@@ -0,0 +1,4 @@
{
"label": "API Docs",
"position": 1
}
26 changes: 26 additions & 0 deletions docs/flutter-webrtc/api-docs/get-display-media.md
@@ -0,0 +1,26 @@
---
sidebar_position: 3
---

# GetDisplayMedia

The `getDisplayMedia` method of the `MediaDevices` interface prompts the user to select and grant permission to capture the contents of a display or portion thereof (such as a window or screen) as a `MediaStream`.

The corresponding JS API docs are here: [MediaDevices.getDisplayMedia()](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getDisplayMedia).

## Usage

```dart
MediaStream stream = await navigator.mediaDevices.getDisplayMedia({
  'video': true,
  'audio': true,
});
```

## Parameters

- `audio`: A Boolean value that indicates whether the `MediaStream` should include an audio track.
  Optional, defaults to `false`. Audio capture is only supported when capturing a Chrome tab.

- `video`: A Boolean value that indicates whether the `MediaStream` should include a video track.
  Optional, defaults to `true`.
67 changes: 67 additions & 0 deletions docs/flutter-webrtc/api-docs/get-user-media.md
@@ -0,0 +1,67 @@
---
sidebar_position: 2
---

# GetUserMedia

The corresponding JS API docs are here:
[MediaDevices.getUserMedia()](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia).

## Usage

Basic usage:

```dart
await navigator.mediaDevices.getUserMedia({'audio': true, 'video': true});
```

Advanced usage:

```dart
await navigator.mediaDevices.getUserMedia({
  'audio': true,
  'video': {
    'facingMode': 'user', // or 'environment' for mobile devices
    'width': 1280,
    'height': 720,
    'frameRate': 30,
  },
});
```

## Parameters

- `mediaConstraints`: A dictionary object that specifies the media constraints for the requested media types.
  Refer to [MediaStreamConstraints](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamConstraints) for more details.

Sub-options:

- `audio`: A Boolean value that indicates whether the `MediaStream` should include an audio track,
  or a dictionary object that specifies the audio track's media constraints:

```dart
{
  'deviceId': 'audio_device_id',
}
```

- `video`: A Boolean value that indicates whether the `MediaStream` should include a video track,
  or a dictionary object that specifies the video track's media constraints:

```dart
{
  'deviceId': 'video_device_id',
  'facingMode': 'user', // or 'environment' for mobile devices
  'width': 1280,
  'height': 720,
  'frameRate': 30,
}
```

## Return value

A `Future` that completes with a `MediaStream` object.

Note: The `deviceId` parameter specifies which device to use. To use the default device, omit it. To use a specific device, get its device ID by calling `navigator.mediaDevices.enumerateDevices` (see [MediaDevices](./media-devices)) and pass it as the `deviceId` parameter, as sketched below.
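
As a sketch of that flow, the snippet below picks the first reported `videoinput` device and passes its `deviceId` in the video constraints; the choice of device and the omitted error handling are assumptions for illustration.

```dart
// List the available devices and request the camera by its deviceId.
// Sketch only: assumes at least one 'videoinput' device is reported.
final devices = await navigator.mediaDevices.enumerateDevices();
final camera = devices.firstWhere((d) => d.kind == 'videoinput');

final stream = await navigator.mediaDevices.getUserMedia({
  'audio': true,
  'video': {
    'deviceId': camera.deviceId,
  },
});
```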
62 changes: 62 additions & 0 deletions docs/flutter-webrtc/api-docs/media-devices.md
@@ -0,0 +1,62 @@
---
sidebar_position: 1
---

# MediaDevices

The corresponding JS API docs are here: [MediaDevices](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices).

## Usage

```dart
// `navigator.mediaDevices` is a synchronous getter, so no await is needed here.
MediaDevices mediaDevices = navigator.mediaDevices;

MediaStream stream = await mediaDevices.getUserMedia({
  'audio': true,
  'video': true,
});

MediaStream displayStream = await mediaDevices.getDisplayMedia({
  'video': true,
  'audio': true,
});

List<MediaDeviceInfo> devices = await mediaDevices.enumerateDevices();
```

## Methods

- `getUserMedia`: Returns a `Future` that completes with a `MediaStream` object. The `MediaStream` represents a stream of captured media content, typically (but not necessarily) including both audio and video.

```dart
MediaStream stream = await mediaDevices.getUserMedia({
  'audio': true,
  'video': true,
});
```

- `getDisplayMedia`: Returns a `Future` that completes with a `MediaStream` object. The `MediaStream` represents the captured display contents (a screen or window), optionally with audio.

```dart
MediaStream stream = await mediaDevices.getDisplayMedia({
  'video': true,
  'audio': true,
});
```

- `enumerateDevices`: Returns a `Future` that completes with a list of `MediaDeviceInfo` objects, each of which represents an available media input or output device.

```dart
List<MediaDeviceInfo> devices = await mediaDevices.enumerateDevices();
```
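
Each `MediaDeviceInfo` exposes `deviceId`, `kind` (for example `'audioinput'`, `'videoinput'`, or `'audiooutput'`), and `label`. A minimal sketch that prints what was found:

```dart
// Print a short summary of every reported capture and output device.
final devices = await mediaDevices.enumerateDevices();
for (final device in devices) {
  print('${device.kind}: ${device.label} (${device.deviceId})');
}
```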

- `getSupportedConstraints`: Returns a dictionary object that describes the media constraints supported by the user agent.
  Only supported on the web platform.

- `selectAudioOutput`: Selects the audio output device, as sketched below.
  Supported platforms: macOS, Windows, Linux, and Web.
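
A possible flow, shown as a sketch only, is to pick an `audiooutput` device from `enumerateDevices` and pass its `deviceId` to `selectAudioOutput`; the `AudioOutputOptions` wrapper used below is an assumption and may differ between plugin versions.

```dart
// Sketch only: AudioOutputOptions(deviceId: ...) is assumed here; check the
// selectAudioOutput signature of your flutter_webrtc version before relying on it.
final devices = await navigator.mediaDevices.enumerateDevices();
final speaker = devices.firstWhere((d) => d.kind == 'audiooutput');

await navigator.mediaDevices
    .selectAudioOutput(AudioOutputOptions(deviceId: speaker.deviceId));
```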

## Event Callbacks

- `onDeviceChange`: The `ondevicechange` event handler for the `MediaDevices` interface. It is called when a media input or output device is attached or removed.
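
A minimal sketch of reacting to device changes, assuming the handler is exposed as the `ondevicechange` property (mirroring the JS API):

```dart
// Re-enumerate the devices whenever one is attached or removed.
navigator.mediaDevices.ondevicechange = (event) async {
  final devices = await navigator.mediaDevices.enumerateDevices();
  print('Device list changed: ${devices.length} device(s) available');
};
```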
62 changes: 62 additions & 0 deletions docs/flutter-webrtc/api-docs/media-recorder.md
@@ -0,0 +1,62 @@
---
sidebar_position: 7
---

# MediaRecorder

The corresponding JS API docs are here: [MediaRecorder](https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder).

Creates a new `MediaRecorder` object, given a MediaStream to record. Options are available to do things like set the container's MIME type (such as "video/webm" or "video/mp4") and the bit rates of the audio and video tracks or a single overall bit rate.

## Methods

- `start`: Starts recording the media.
  On Android, use the `audioChannel` parameter; on iOS, use `audioTrack`.

```dart
// Requires `dart:io` (Platform) and `path_provider` (getExternalStorageDirectory).
void start() async {
  if (Platform.isIOS) {
    print('Recording is not available on iOS');
    return;
  }
  // TODO(rostopira): request write storage permission
  final storagePath = await getExternalStorageDirectory();
  if (storagePath == null) throw Exception('Can\'t find storagePath');

  final filePath = '${storagePath.path}/webrtc_sample/test.mp4';
  mediaRecorder = MediaRecorder();
  setState(() {});
  final videoTrack = stream.getVideoTracks().first;
  await mediaRecorder.start(
    filePath,
    videoTrack: videoTrack,
    audioChannel: RecorderAudioChannel.INPUT,
  );
}
```

- `startWeb`: Starts recording the media in the browser. Only available on Flutter Web.

```dart
void startWeb() async {
  mediaRecorder = MediaRecorder();
  setState(() {});
  // Parameter names for the MIME type and time slice may vary by plugin version.
  mediaRecorder.startWeb(
    stream,
    onDataChunk: (dynamic blob, bool isLastOne) {
      // Do something with each recorded data chunk.
    },
    mimeType: 'video/webm',
    timeSlice: 1000,
  );
}
```

- `stop`: Stops recording the media.

```dart
void stop() async {
  await mediaRecorder.stop();
  setState(() {
    mediaRecorder = null;
  });
}
```
66 changes: 66 additions & 0 deletions docs/flutter-webrtc/api-docs/media-stream-track.md
@@ -0,0 +1,66 @@
---
sidebar_position: 6
---

# MediaStreamTrack

The corresponding JS API docs are here: [MediaStreamTrack](https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamTrack).

The `MediaStreamTrack` interface represents a single media track within a stream; typically, these are audio or video tracks, but other track types may exist as well. A `MediaStreamTrack` can be associated with multiple `MediaStream` objects.

## Methods

- `stop`: Stops the track.

```dart
void stopStreamedVideo(RTCVideoRenderer renderer) {
  final stream = renderer.srcObject;
  if (stream == null) return;

  // Stop every track before detaching the stream from the renderer.
  for (final track in stream.getTracks()) {
    track.stop();
  }

  renderer.srcObject = null;
}
```

## Events

- `onMute`: Sent to the MediaStreamTrack when the value of the muted property is changed to true, indicating that the track is unable to provide data temporarily (such as when the network is experiencing a service malfunction).

```dart
track.onMute = (event) {
  print("Track ${track.id} is muted");
};
```

- `onUnMute`: Sent to the MediaStreamTrack when the value of the muted property is changed to false, indicating that the track is able to provide data again (such as when the network is no longer experiencing a service malfunction).

```dart
track.onUnMute = (event) {
  print("Track ${track.id} is unmuted");
};
```

- `onEnded`: Sent when playback of the track ends (when the value of `readyState` changes to `ended`), except when the track is ended by calling `MediaStreamTrack.stop`.

```dart
track.onEnded = (event) {
  print("Track ${track.id} has ended");
};
```

## Properties

- `id`: Returns the unique identifier of the track.

- `label`: Returns the label of the track's corresponding source, if any (e.g., "Internal microphone" or "External USB Webcam").
  If the corresponding source has or had no label, an empty string is returned.

- `kind`: Returns the type of track, such as "audio" or "video".

- `enabled`: Returns or sets the enabled state of the `MediaStreamTrack` (see the sketch after this list). After a `MediaStreamTrack` has ended, setting the enabled state does not change the ended state.

- `muted`: Returns true if the track is muted, and false otherwise.
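
For example, toggling `enabled` is the usual way to mute or unmute a local track without stopping it; a minimal sketch:

```dart
// Toggle the first audio track of a local stream.
// A disabled track stays live but produces silence (or black frames for video).
void toggleMicrophone(MediaStream stream) {
  final audioTrack = stream.getAudioTracks().first;
  audioTrack.enabled = !audioTrack.enabled;
}
```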
98 changes: 98 additions & 0 deletions docs/flutter-webrtc/api-docs/media-stream.md
@@ -0,0 +1,98 @@
---
sidebar_position: 8
---

# MediaStream

The corresponding JS API docs are here: [MediaStream](https://developer.mozilla.org/en-US/docs/Web/API/MediaStream).

The MediaStream interface of the Media Capture and Streams API represents a stream of media content. A stream consists of several tracks, such as video or audio tracks. Each track is specified as an instance of MediaStreamTrack.

## Methods

- `addTrack`: Adds the given `MediaStreamTrack` to this `MediaStream`.

```dart
// The `active` attribute returns true if this MediaStream is active and false otherwise.
// A MediaStream is considered active if at least one of its MediaStreamTracks is not in
// the MediaStreamTrack.ended state; once every track has ended, `active` becomes false.
var mediaStream = MediaStream(
  id: 'audio-stream',
  ownerTag: 'audio-tag',
  active: true,
  onAddTrack: (MediaStreamTrack track) {
    print('Track added: ${track.id}');
  },
  onRemoveTrack: (MediaStreamTrack track) {
    print('Track removed: ${track.id}');
  },
);
mediaStream.addTrack(track, addToNative: true);
```

- `removeTrack`: Removes the given `MediaStreamTrack` object from this `MediaStream`.

```dart
mediaStream.removeTrack(track, removeFromNative: true);
```

- `getTracks`: Returns a list of `MediaStreamTrack` objects representing all the tracks in this stream.

```dart
var tracks = mediaStream.getTracks();
```

- `getAudioTracks`: Returns a list of `MediaStreamTrack` objects representing the audio tracks in this stream.
  The list is a snapshot of all the `MediaStreamTrack` objects in this stream's track set whose kind is `'audio'`.

```dart
var audioTracks = mediaStream.getAudioTracks();
```

- `getVideoTracks`: Returns a list of `MediaStreamTrack` objects representing the video tracks in this stream.

```dart
var videoTracks = mediaStream.getVideoTracks();
```

- `getTrackById`: Returns the `MediaStreamTrack` object from this stream's track set whose id is equal to `trackId`, or throws a `StateError` if no such track exists.

```dart
var track = mediaStream.getTrackById('some-track-id');
```

- `dispose`: Disposes of the `MediaStream`.

```dart
await mediaStream.dispose();
```

## Events

- `onAddTrack`: Fires when a new `MediaStreamTrack` is added to this `MediaStream`.

```dart
var mediaStream = MediaStream(
  id: 'audio-stream',
  ownerTag: 'audio-tag',
  active: true,
);
mediaStream.onAddTrack = (MediaStreamTrack track) {
  print('Track added: ${track.id}');
};
```

- `onRemoveTrack`: Fires when a `MediaStreamTrack` is removed from this `MediaStream`.

```dart
var mediaStream = MediaStream(
  id: 'audio-stream',
  ownerTag: 'audio-tag',
  active: true,
);
mediaStream.onRemoveTrack = (MediaStreamTrack track) {
  print('Track removed: ${track.id}');
};
```

