
Is there any way to record audio that corresponds to a canvas animation? #65

Closed
FumiyaShibusawa opened this issue Dec 8, 2017 · 6 comments

Comments

@FumiyaShibusawa

I'm struggling to get audio into the WebM file produced by CCapture.js.

I can capture a canvas animation with no problem, but the WebM file doesn't contain any audio.
I followed a link to Whammy.js, and its Todo section says the following.
Does that mean CCapture.js doesn't support audio recording either?

...
Also, if someone ever makes a Javascript Vorbis encoder, it would be nice to integrate that in,
since this currently only does the video part, but audio’s also a pretty big part.

Sample

I forked a three.js example from its repo and added the CCapture.js setup to the file, as shown below.
https://github.com/mrdoob/three.js/blob/master/examples/webaudio_visualizer.html

Here is the music I use in this file.
https://www.bensound.com/royalty-free-music/track/going-higher

<!DOCTYPE html>
<html lang="en">
  <head>
    <title>three.js webaudio - visualizer</title>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
    <style>
      body {
        background:#777;
        padding:0;
        margin:0;
        font-weight: bold;
        overflow:hidden;
      }

      #info {
        position: absolute;
        top: 0px;
        width: 100%;
        color: #ffffff;
        padding: 5px;
        font-family:Monospace;
        font-size:13px;
        text-align:center;
      }

      a {
        color: #ffffff;
      }
    </style>

    <script src="../../js/three.min.js"></script>
    <script src="../../node_modules/ccapture.js/build/CCapture.all.min.js"></script>
    <!-- <script src="js/Detector.js"></script> -->

    <script id="vertexShader" type="x-shader/x-vertex">

      varying vec2 vUv;

      void main() {

        vUv = uv;
        gl_Position = vec4( position, 1.0 );

      }

    </script>

    <script id="fragmentShader" type="x-shader/x-fragment">

      uniform sampler2D tAudioData;

      varying vec2 vUv;

      void main() {

        vec3 backgroundColor = vec3( 0.0 );
        vec3 color = vec3( 1.0, 0.0, 0.0 );

        float f = texture2D( tAudioData, vec2( vUv.x, 0.0 ) ).r; // sample data texture (only the red channel is relevant)
        float i = step( vUv.y, f );

        gl_FragColor = vec4( mix( backgroundColor, color, i ), 1.0 );

      }

    </script>

  </head>
<body>

  <div id="container"></div>
  <div id="info">
    <a href="https://threejs.org" target="_blank" rel="noopener noreferrer">three.js</a> webaudio - visualizer<br/>
    music by <a href="http://www.newgrounds.com/audio/listen/358232" target="_blank" rel="noopener noreferrer">larrylarrybb</a>
  </div>

  <script>

  // if ( ! Detector.webgl ) Detector.addGetWebGLMessage();

  var scene, camera, renderer, analyser, uniforms;

  var capturer = new CCapture( {
    verbose: true,
    display: true,
    framerate: 60,      // frames per second of the output video
    format: 'webm',
    timeLimit: 3.14     // stop capturing automatically after ~3.14 s
  } );

  // Start capturing once the page has loaded.
  function CCaptureButtons() {
    window.addEventListener( 'load', function ( e ) {
      e.preventDefault();
      capturer.start();
    }, false );
  }


  init();
  animate();
  CCaptureButtons();


  function init() {

    var fftSize = 2048;

    //

    var container = document.getElementById( 'container' );

    //

    scene = new THREE.Scene();

    //

    renderer = new THREE.WebGLRenderer( { antialias: true } );
    renderer.setSize( window.innerWidth, window.innerHeight );
    renderer.setClearColor( 0x000000 );
    renderer.setPixelRatio( window.devicePixelRatio );
    container.appendChild( renderer.domElement );

    //

    camera = new THREE.Camera();
    camera.position.z = 1;

    //

    var audioLoader = new THREE.AudioLoader();

    var listener = new THREE.AudioListener();
    camera.add( listener );

    var audio = new THREE.Audio( listener );
    audioLoader.load( '../../audios/bensound-goinghigher.mp3', function( buffer ) {
      audio.setBuffer( buffer );
      audio.setLoop( true );
      audio.play();
    });

    analyser = new THREE.AudioAnalyser( audio, fftSize );

    //

    var geometry = new THREE.PlaneBufferGeometry( 2, 2 );

    //

    var size = fftSize / 2;

    uniforms = {
      tAudioData: { value: new THREE.DataTexture( analyser.data, size, 1, THREE.LuminanceFormat ) }
    };

    var material = new THREE.ShaderMaterial( {

      uniforms: uniforms,
      vertexShader: document.getElementById( 'vertexShader' ).textContent,
      fragmentShader: document.getElementById( 'fragmentShader' ).textContent

    } );

    //

    var mesh = new THREE.Mesh( geometry, material );
    scene.add( mesh );

    //

    window.addEventListener( 'resize', onResize, false );

  }

  function onResize() {

    renderer.setSize( window.innerWidth, window.innerHeight );

  }

  function animate() {

    requestAnimationFrame( animate );

    render();

  }

  function render() {

    analyser.getFrequencyData(); // refreshes analyser.data, which backs the tAudioData texture

    uniforms.tAudioData.value.needsUpdate = true;

    renderer.render( scene, camera );
    capturer.capture( renderer.domElement ); // hand the rendered frame to CCapture

  }

  </script>

</body>
</html>
@spite
Owner

spite commented Dec 11, 2017

No support for audio, unfortunately. The Web Audio API is not easy (or even possible) to throttle the way CCapture throttles rendering. In the future there might be a nicer audio/video muxing API, but for now the best solution is to capture the video and then mux it together with the audio file using ffmpeg.
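
For example, a mux step along these lines should work once you have the captured WebM and the original audio track (file names here are just placeholders; -shortest trims the output to the shorter of the two streams):

ffmpeg -i capture.webm -i bensound-goinghigher.mp3 -c:v copy -c:a libvorbis -shortest output.webm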

@FumiyaShibusawa
Author

@spite Thanks for the reply. Alright, got it. I'll go with the workaround you suggested for now.

@nuthinking

nuthinking commented Feb 9, 2018

In a tool I am building, I am actually recording both video and audio together in real time with decent results. Unfortunately, I sometimes get a few initial frames with artifacts (a green grid). Not sure if you, Jaume, had similar problems with MediaRecorder.
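
For reference, one way to do this kind of realtime capture is to combine canvas.captureStream with a MediaStreamAudioDestinationNode and feed both tracks into a MediaRecorder. A minimal sketch, adapted to the example above (the audio, listener and renderer variables refer to that code; treat this as an outline rather than a drop-in solution):

// Realtime capture sketch: canvas video + Web Audio output via MediaRecorder.
// Assumes the audio, listener and renderer objects from the example above.
var canvasStream = renderer.domElement.captureStream( 60 );   // 60 fps video track from the canvas

var audioCtx = listener.context;                               // the AudioListener's AudioContext
var audioDest = audioCtx.createMediaStreamDestination();
audio.getOutput().connect( audioDest );                        // route the playing audio into a stream

var mixed = new MediaStream( [                                 // combine one video track and one audio track
  canvasStream.getVideoTracks()[ 0 ],
  audioDest.stream.getAudioTracks()[ 0 ]
] );

var chunks = [];
var recorder = new MediaRecorder( mixed, { mimeType: 'video/webm' } );
recorder.ondataavailable = function ( e ) { chunks.push( e.data ); };
recorder.onstop = function () {
  var blob = new Blob( chunks, { type: 'video/webm' } );
  var url = URL.createObjectURL( blob );                       // e.g. set this as the href of an <a download> link
};

recorder.start();
// ... and later, when the animation is done:
// recorder.stop();

Since everything is recorded in real time there is no throttling, so the audio stays in sync, but you are limited to the framerate the machine can actually render, unlike CCapture's fixed-timestep capture.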

@spite
Owner

spite commented Feb 10, 2018

Not really, no. All the work I've done with MediaRecorder yields solid results, both on macOS and Linux. Not so sure about Windows, but I'd say I've never had problems on at least Windows 8 and Windows 10.

@MoSofi

MoSofi commented Jul 30, 2019

@FumiyaShibusawa Did you find a solution?

@sintj

sintj commented May 15, 2022

Hi guys, any solution to this? I'm capturing the canvas with CCapture, but of course when I start recording, the video slows down so the frames can be captured while the audio keeps playing at normal speed, which results in a desynced animation. The only workaround is to record the screen directly, but of course that is not ideal for the output quality.
