Merge 9f94a65 into 0216fe7
oleh-poberezhets committed Sep 11, 2018
2 parents 0216fe7 + 9f94a65 commit 9be72da
Showing 18 changed files with 507 additions and 146 deletions.
14 changes: 14 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,14 @@
## video-quality-tools v1.1.0

* **processFrames**:

Added new fields `gopDuration`, `displayAspectRatio`, `width`, `height` and `hasAudioStream` to the result of
_processFrames_ execution.

Added new methods to _processFrames_: `calculateGopDuration`, `calculateDisplayAspectRatio`, `hasAudioFrames`.

* **FramesMonitor**

FramesMonitor now fetches both video and audio frames from the stream.

Added `width` and `height` info to video frames.
44 changes: 36 additions & 8 deletions README.md
@@ -117,7 +117,7 @@ structure as the `ffprobe -show_streams` output has. You may find a typical outp

`videos` and `audios` may be an empty array if there are no appropriate streams in the live stream.

```json
```
{ videos:
[ { index: 1,
codec_name: 'h264',
@@ -214,7 +214,7 @@ do it pretty often). If ffprobe doesn't exit after `exitProcessGuardTimeoutInMs`

After creating the `FramesMonitor` instance, you may start listening to the live stream data. To do so, just
call the `framesMonitor.listen()` method. After that, `framesMonitor` starts emitting the `frame` event as soon as ffprobe
decodes frame from the stream. It emits only video frames at now.
decodes a frame from the stream. It emits both video and audio frames.

```javascript
const {FramesMonitor, processFrames, ExitReasons} = require('video-quality-tools');
@@ -250,16 +250,25 @@ try {

## `frame` event

This event is generated on each video frame decoded by ffprobe. The structure of the frame object is the following:
This event is generated on each video and audio frame decoded by ffprobe.
The structure of the frame object is the following:

```
{ media_type: 'video',
key_frame: 0,
pkt_pts_time: 3530.279,
pkt_size: 3332,
width: 640,
height: 480,
pict_type: 'P' }
```

or
```
{ media_type: 'audio',
key_frame: 1,
pkt_pts_time: 'N/A',
pkt_size: 20 }
```
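
A minimal sketch of a `frame` handler that branches on `media_type` (it assumes a `framesMonitor` instance created as
in the example above):

```javascript
framesMonitor.on('frame', frame => {
    if (frame.media_type === 'video') {
        // video frames carry resolution and picture type
        console.log(`video ${frame.width}x${frame.height}, pict_type=${frame.pict_type}, key_frame=${frame.key_frame}`);
    } else if (frame.media_type === 'audio') {
        // audio frames carry only timing and packet size
        console.log(`audio pkt_pts_time=${frame.pkt_pts_time}, pkt_size=${frame.pkt_size}`);
    }
});
```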

## `exit` event

@@ -357,7 +366,19 @@ There is an output for the example above:
{ mean: 1494.9075520833333,
min: 1440.27734375,
max: 1525.95703125 },
fps: { mean: 30, min: 30, max: 30 } }
fps: {
mean: 30,
min: 30,
max: 30 },
gopDuration: {
mean: 2,
min: 1.9,
max: 2.1 },
displayAspectRatio: '16:9',
width: 1280,
height: 720,
hasAudioStream: true
}
```

In the given example, the frames are collected in the `frames` array and then the `processFrames` function is used for sets of 300 frames
@@ -372,9 +393,16 @@ If there are more than 2 key frames, `processFrames` uses full GOPs to track fps
in the last GOP that was not finished. It's important to remember the `remainedFrames` output and push a new frame to
the `remainedFrames` array when it arrives.

For the full GOPs `processFrames` calculates min/max/mean values of bitrates (in kbit/s) and framerates and returns
them in `payload` field. The result of the check for the similarity of GOP structures for the collected GOPs is returned in
`areAllGopsIdentical` field.
For the full GOPs, `processFrames` calculates min/max/mean values of the bitrate (in kbit/s), framerate and GOP duration
(in seconds) and returns them in the `payload` field. The result of the check that all collected GOPs have an identical
structure is returned in the `areAllGopsIdentical` field. The `width`, `height` and `displayAspectRatio` fields are taken
from the first frame of the first collected GOP. The `hasAudioStream` value reflects the presence of audio frames.
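
The sketch below is not the library's canonical example; it assumes a `framesMonitor` instance and batches of 300
frames, as in the example above. It reads the payload fields described here and carries `remainedFrames` over to the
next batch:

```javascript
const {processFrames} = require('video-quality-tools');

const FRAMES_PER_BATCH = 300; // arbitrary batch size chosen for this sketch
let frames = [];

framesMonitor.on('frame', frame => {
    frames.push(frame);

    if (frames.length < FRAMES_PER_BATCH) {
        return;
    }

    try {
        const {payload, remainedFrames} = processFrames(frames);

        // frames of the unfinished trailing GOP are carried over to the next batch
        frames = remainedFrames;

        const {bitrate, fps, gopDuration, displayAspectRatio, width, height, hasAudioStream} = payload;

        console.log(bitrate.mean, fps.mean, gopDuration.mean, displayAspectRatio, width, height, hasAudioStream);
    } catch (err) {
        // processFrames throws Errors.GopNotFoundError when no full GOP is present
        console.error(err);
        frames = [];
    }
});
```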

To calculate the display aspect ratio, the `processFrames.calculateDisplayAspectRatio` method compares the frame's
width-to-height ratio against a list of
[common video aspect ratio standards](https://en.wikipedia.org/wiki/Aspect_ratio_(image)), allowing a small
approximation error. If the ratio does not match any of these standards, the
[GCD algorithm](https://en.wikipedia.org/wiki/Greatest_common_divisor) is used to reduce the `width:height` fraction.
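
A few illustrative calls; the expected return values follow from the matching logic in `src/processFrames.js` shown
further down in this diff:

```javascript
const {processFrames} = require('video-quality-tools');

// 1280/720 ≈ 1.778 lies within the 0.01 tolerance of the 16:9 coefficient
processFrames.calculateDisplayAspectRatio(1280, 720); // '16:9'

// 1000/400 = 2.5 matches no listed standard, so the ratio is reduced by the GCD
processFrames.calculateDisplayAspectRatio(1000, 400); // '5:2'

// non-positive or non-integer dimensions throw a TypeError:
// processFrames.calculateDisplayAspectRatio(0, 480);
```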

`processFrames` may throw `Errors.GopNotFoundError`.

2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "video-quality-tools",
"version": "1.0.0",
"version": "1.1.0",
"description": "Set of tools to evaluate video stream quality.",
"main": "index.js",
"engines": {
4 changes: 1 addition & 3 deletions src/FramesMonitor.js
@@ -269,11 +269,9 @@ class FramesMonitor extends EventEmitter {
'-hide_banner',
'-v',
errorLevel,
'-select_streams',
'v:0',
'-show_frames',
'-show_entries',
'frame=pkt_size,pkt_pts_time,media_type,pict_type,key_frame',
'frame=pkt_size,pkt_pts_time,media_type,pict_type,key_frame,width,height',
'-i',
`${this._url} timeout=${timeoutInSec}`
]
34 changes: 9 additions & 25 deletions src/StreamsInfo.js
@@ -6,6 +6,7 @@ const {exec} = require('child_process');
const {promisify} = require('util');

const Errors = require('./Errors/');
const processFrames = require('./processFrames');

const DAR_OR_SAR_NA = 'N/A';
const DAR_OR_SAR_01 = '0:1';
@@ -129,36 +130,19 @@ class StreamsInfo {
video.display_aspect_ratio === DAR_OR_SAR_NA
) {
video.sample_aspect_ratio = '1:1';
video.display_aspect_ratio = this._calculateDisplayAspectRatio(video.width, video.height);
try {
video.display_aspect_ratio = processFrames.calculateDisplayAspectRatio(video.width, video.height);
} catch (err) {
throw new Errors.StreamsInfoError(
'Can not calculate aspect ratio due to invalid video resolution',
{width: video.width, height: video.height, url: this._url}
);
}
}

return video;
});
}

_calculateDisplayAspectRatio(width, height) {
if (!_.isInteger(width) || !_.isInteger(height) || width <= 0 || height <= 0) {
throw new Errors.StreamsInfoError(
'Can not calculate aspect rate due to invalid video resolution',
{width, height, url: this._url}
);
}

const gcd = this._findGcd(width, height);

return `${width / gcd}:${height / gcd}`;
}

_findGcd(a, b) {
if (a === 0 && b === 0) {
return 0;
}

if (b === 0) {
return a;
}
return this._findGcd(b, a % b);
}
}

module.exports = StreamsInfo;
157 changes: 145 additions & 12 deletions src/processFrames.js
@@ -4,6 +4,23 @@ const _ = require('lodash');

const Errors = require('./Errors');

const AR_CALCULATION_PRECISION = 0.01;

const SQUARE_AR_COEFFICIENT = 1;
const SQUARE_AR = '1:1';

const TRADITIONAL_TV_AR_COEFFICIENT = 1.333;
const TRADITIONAL_TV_AR = '4:3';

const HD_VIDEO_AR_COEFFICIENT = 1.777;
const HD_VIDEO_AR = '16:9';

const UNIVISIUM_AR_COEFFICIENT = 2;
const UNIVISIUM_AR = '18:9';

const WIDESCREEN_AR_COEFFICIENT = 2.33;
const WIDESCREEN_AR = '21:9';

function processFrames(frames) {
if (!Array.isArray(frames)) {
throw new TypeError('process method is supposed to accept an array of frames.');
@@ -16,28 +33,76 @@ function processFrames(frames) {
throw new Errors.GopNotFoundError('Can not find any gop for these frames', {frames});
}

const areAllGopsIdentical = processFrames.areAllGopsIdentical(gops);
const bitrate = processFrames.calculateBitrate(gops);
const fps = processFrames.calculateFps(gops);
let areAllGopsIdentical = true;
const hasAudioStream = processFrames.hasAudioFrames(frames);
const baseGopSize = gops[0].frames.length;
const bitrates = [];
const fpsList = [];
const gopDurations = [];

gops.forEach(gop => {
areAllGopsIdentical = areAllGopsIdentical && baseGopSize === gop.frames.length;
const accumulatedPktSize = processFrames.accumulatePktSize(gop);
const gopDuration = processFrames.gopDurationInSec(gop);

const gopBitrate = processFrames.toKbs(accumulatedPktSize / gopDuration);
bitrates.push(gopBitrate);

const gopFps = gop.frames.length / gopDuration;
fpsList.push(gopFps);

gopDurations.push(gopDuration);
});

const bitrate = {
mean: _.mean(bitrates),
min : Math.min(...bitrates),
max : Math.max(...bitrates)
};

const fps = {
mean: _.mean(fpsList),
min : Math.min(...fpsList),
max : Math.max(...fpsList)
};

const gopDuration = {
mean: _.mean(gopDurations),
min: Math.min(...gopDurations),
max: Math.max(...gopDurations)
};

const width = gops[0].frames[0].width;
const height = gops[0].frames[0].height;
const displayAspectRatio = calculateDisplayAspectRatio(width, height);

return {
payload : {
areAllGopsIdentical,
bitrate,
fps
fps,
gopDuration,
displayAspectRatio,
width,
height,
hasAudioStream
},
remainedFrames: remainedFrames
};
}

processFrames.identifyGops = identifyGops;
processFrames.calculateBitrate = calculateBitrate;
processFrames.calculateFps = calculateFps;
processFrames.filterVideoFrames = filterVideoFrames;
processFrames.gopDurationInSec = gopDurationInSec;
processFrames.toKbs = toKbs;
processFrames.accumulatePktSize = accumulatePktSize;
processFrames.areAllGopsIdentical = areAllGopsIdentical;
processFrames.identifyGops = identifyGops;
processFrames.calculateBitrate = calculateBitrate;
processFrames.calculateFps = calculateFps;
processFrames.calculateGopDuration = calculateGopDuration;
processFrames.filterVideoFrames = filterVideoFrames;
processFrames.hasAudioFrames = hasAudioFrames;
processFrames.gopDurationInSec = gopDurationInSec;
processFrames.toKbs = toKbs;
processFrames.accumulatePktSize = accumulatePktSize;
processFrames.areAllGopsIdentical = areAllGopsIdentical;
processFrames.findGcd = findGcd;
processFrames.calculateDisplayAspectRatio = calculateDisplayAspectRatio;

module.exports = processFrames;

@@ -185,6 +250,58 @@ function calculateFps(gops) {
};
}

function calculateGopDuration(gops) {
const gopsDurations = [];

gops.forEach(gop => {
const gopDurationInSec = processFrames.gopDurationInSec(gop);

gopsDurations.push(gopDurationInSec);
});

return {
mean: _.mean(gopsDurations),
min: Math.min(...gopsDurations),
max: Math.max(...gopsDurations)
};
}

function calculateDisplayAspectRatio(width, height) {
if (!_.isInteger(width) || width <= 0) {
throw new TypeError('"width" must be a positive integer');
}

if (!_.isInteger(height) || height <= 0) {
throw new TypeError('"height" must be a positive integer');
}

const arCoefficient = width / height;

if (Math.abs(arCoefficient - SQUARE_AR_COEFFICIENT) <= AR_CALCULATION_PRECISION) {
return SQUARE_AR;
}

if (Math.abs(arCoefficient - TRADITIONAL_TV_AR_COEFFICIENT) <= AR_CALCULATION_PRECISION) {
return TRADITIONAL_TV_AR;
}

if (Math.abs(arCoefficient - HD_VIDEO_AR_COEFFICIENT) <= AR_CALCULATION_PRECISION) {
return HD_VIDEO_AR;
}

if (Math.abs(arCoefficient - UNIVISIUM_AR_COEFFICIENT) <= AR_CALCULATION_PRECISION) {
return UNIVISIUM_AR;
}

if (Math.abs(arCoefficient - WIDESCREEN_AR_COEFFICIENT) <= AR_CALCULATION_PRECISION) {
return WIDESCREEN_AR;
}

const gcd = findGcd(width, height);

return `${width / gcd}:${height / gcd}`;
}

function areAllGopsIdentical(gops) {
return gops.every(gop => _.isEqual(gops[0].frames.length, gop.frames.length));
}
@@ -193,6 +310,22 @@ function filterVideoFrames(frames) {
return frames.filter(frame => frame.media_type === 'video');
}

function hasAudioFrames(frames) {
return frames.some(frame => frame.media_type === 'audio');
}

function toKbs(val) {
return val * 8 / 1024;
}

function findGcd(a, b) {
if (a === 0 && b === 0) {
return 0;
}

if (b === 0) {
return a;
}

return findGcd(b, a % b);
}
