tweet-that-clip

Tweet a video or audio clip from a video, with optional text status.

If the input is an audio file, it creates an animated waveform video.

If captions are provided, for either video or audio, it burns them onto the video.

Origin

Originally developed as part of textAV 2018, for "Full Fact - tweet that clip" by Pietro & James.

Part of the textAV reusable components Trello board.

Subsequently integrated into autoEdit.io, to enable tweeting audio or video quotes.

Development env

  • node v10.0.0
  • npm 6.1.0

Setup

Set Twitter app credentials as environment variables:

TWITTER_CONSUMER_KEY=""
TWITTER_CONSUMER_SECRET=""
TWITTER_CALLBACK=""

and user credentials, to post on users' timelines:

TWITTER_ACCESS_TOKEN=""
TWITTER_ACCESS_TOKEN_SECRET=""
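A small sketch for failing fast when these variables are not set. The helper name is hypothetical and not part of the module; the variable names are the ones listed above.

```javascript
// Hypothetical helper: report which of the Twitter credential variables
// from the Setup section are missing from the environment.
function missingTwitterCredentials(env = process.env) {
  const required = [
    'TWITTER_CONSUMER_KEY',
    'TWITTER_CONSUMER_SECRET',
    'TWITTER_ACCESS_TOKEN',
    'TWITTER_ACCESS_TOKEN_SECRET'
  ];
  return required.filter((name) => !env[name]);
}
```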

Install the npm module tweet-that-clip:

npm install tweet-that-clip

Usage

Require and use it in your code:

const path = require('path');
const tweetThatClip = require('tweet-that-clip');
const ffmpeg = require('ffmpeg-static-electron');

const opts = {
  inputFile: path.join(__dirname, './assets/test.mp4'),
  mediaType: 'video', // 'audio' or 'video'
  outputFile: path.join(__dirname, '/example/test-clipped.mp4'),
  inputSeconds: 10, // start of the clip, in seconds
  durationSeconds: 20, // length of the clip, in seconds; up to 140 seconds
  // Twitter text status, 280 characters limit.
  tweetText: 'The Trussell Trust found that food bank use increased by 52% in a year in areas where Universal Credit has been rolled out. The National Audit Office observed similar findings https://fullfact.org/economy/universal-credit-driving-people-food-banks/',
  // tmp directory for creating intermediate clips when processing media
  tmpDir: path.join(__dirname, '/assets'),
  // Optional path to an ffmpeg binary; to burn captions it needs --enable-libass.
  // If not provided, it uses the system default if present.
  // If in doubt, use the path from https://www.npmjs.com/package/ffmpeg-static-electron
  ffmpegBin: ffmpeg.path,
  // Optional caption file - if burning captions, provide an srtFilePath.
  srtFilePath: path.join(__dirname, './assets/captions.srt')
};

tweetThatClip(opts)
  .then((res) => {
    console.log('in example-usage for video', res.outputFile);
    // console.log(res.resTwitter);
  })
  .catch((error) => {
    console.error('Error in example-usage for video', error);
  });

Also see the ./example-usage-video.js and ./example-usage-audio.js files.

ffmpeg binary path

As seen in the example above, you need to provide the path to an ffmpeg binary, e.g. ffmpeg-static or ffmpeg-static-electron.

Especially when using the option to burn captions, you need to provide an ffmpeg built with --enable-libass. The two binaries linked above have been tested to work.
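A minimal sketch for resolving the binary path, with a fallback to whatever ffmpeg is on the system PATH. The helper name is hypothetical; the bundled path would typically come from ffmpeg-static or ffmpeg-static-electron.

```javascript
// Hypothetical helper: prefer a bundled ffmpeg binary, otherwise fall back
// to the bare 'ffmpeg' command, which resolves via the system PATH.
function resolveFfmpegBin(bundledPath) {
  return bundledPath || 'ffmpeg';
}
```

Usage would then look like `ffmpegBin: resolveFfmpegBin(require('ffmpeg-static-electron').path)`.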

Optional credentials object

For some use cases, such as Electron, you might want to pass in an optional credentials object attribute; see the example below.

const opts = {
  ...
  // optional credentials 
  credentials: {
    consumerKey: "",
    consumerSecret: "",
    accessToken: "",
    accessTokenSecret: ""
  }
};
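When not hard-coding values, the credentials object could be built from the environment variables listed in the Setup section. This helper is a hypothetical sketch, not part of the module's API.

```javascript
// Hypothetical sketch: build the optional credentials object from the
// environment variables described in the Setup section.
function credentialsFromEnv(env = process.env) {
  return {
    consumerKey: env.TWITTER_CONSUMER_KEY || '',
    consumerSecret: env.TWITTER_CONSUMER_SECRET || '',
    accessToken: env.TWITTER_ACCESS_TOKEN || '',
    accessTokenSecret: env.TWITTER_ACCESS_TOKEN_SECRET || ''
  };
}
```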

Optional captions file

If you provide the path to a caption file for the selection you want to trim, it is used to burn captions onto the clip.

Note that timecodes and text need to be relative to the selection only, as if the selection started at 00:00:00,000.

const opts = {
  ...
  // Optional caption file - if burning captions, provide an srtFilePath.
  srtFilePath: path.join(__dirname, './assets/captions.srt')
};
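If your captions come from a transcript of the full recording, the timecodes have to be rebased to the selection start. This is a hypothetical sketch of that rebasing for a single SRT timestamp, not something the module provides:

```javascript
// Hypothetical sketch: shift an SRT timestamp ("HH:MM:SS,mmm") so that
// captions cut from a full transcript become relative to the selection,
// i.e. as if the selection started at 00:00:00,000.
function rebaseSrtTime(timestamp, inputSeconds) {
  const [hms, ms] = timestamp.split(',');
  const [h, m, s] = hms.split(':').map(Number);
  const totalMs = (h * 3600 + m * 60 + s) * 1000 + Number(ms) - inputSeconds * 1000;
  const t = Math.max(0, Math.round(totalMs)); // clamp anything before the cut
  const pad = (n, w) => String(n).padStart(w, '0');
  const hr = Math.floor(t / 3600000);
  const min = Math.floor(t / 60000) % 60;
  const sec = Math.floor(t / 1000) % 60;
  const msOut = t % 1000;
  return `${pad(hr, 2)}:${pad(min, 2)}:${pad(sec, 2)},${pad(msOut, 3)}`;
}
```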

System Architecture

  • At a high level it uses fluent-ffmpeg to trim the clip and convert it to Twitter's video specs:
    • duration between 0.5 seconds and 30 seconds (sync) / 140 seconds (async)
    • file size not exceeding 15 MB (sync) / 512 MB (async)
  • For the Twitter video upload and status post it uses a script by @jcipriano refactored into a module.
  • It creates a tmp clipped/trimmed file and deletes it once the tweet is sent.
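The Twitter limits above can be sketched as a small validation check. This helper is illustrative only; the module itself does not expose it.

```javascript
// Hypothetical sketch of the Twitter video constraints listed above:
// duration 0.5-30 s (sync) or 0.5-140 s (async); size up to 15 MB (sync)
// or 512 MB (async).
function fitsTwitterSpecs(durationSeconds, sizeBytes, { async = true } = {}) {
  const maxDuration = async ? 140 : 30;
  const maxSizeBytes = (async ? 512 : 15) * 1024 * 1024;
  return durationSeconds >= 0.5 &&
    durationSeconds <= maxDuration &&
    sizeBytes <= maxSizeBytes;
}
```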

In more detail, the main index.js pulls in the modules from lib:

  1. Trim the video
  2. If audio, create a waveform video
  3. Burn captions - optional, if an srt file is provided
  4. Tweet the clip
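The steps above can be sketched as a promise chain. The step names here are hypothetical stand-ins for the actual modules in lib/:

```javascript
// A minimal sketch of the pipeline above, with hypothetical step functions
// (each returning a promise) standing in for the modules in lib/.
function tweetThatClipPipeline(opts, steps) {
  return steps.trim(opts)
    .then((clip) => opts.mediaType === 'audio' ? steps.waveform(clip) : clip)
    .then((clip) => opts.srtFilePath ? steps.burnCaptions(clip, opts.srtFilePath) : clip)
    .then((clip) => steps.tweet(clip, opts.tweetText));
}
```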

Build

No build step

Tests

No tests for now; just the ./example-usage-video.js and ./example-usage-audio.js files.

Deployment

No deployment step; as a Node module, it is available on npm as tweet-that-clip.