
(WaveTableNormalization): WaveTable normalization in createWaveTable not clearly defined #91

Closed
olivierthereaux opened this issue Sep 11, 2013 · 5 comments

Comments

@olivierthereaux
Contributor

Originally reported on W3C Bugzilla ISSUE-17370 Tue, 05 Jun 2012 11:55:13 GMT
Reported by Philip Jägenstedt

Audio-ISSUE-84 (WaveTableNormalization): WaveTable normalization in createWaveTable not clearly defined [Web Audio API]

http://www.w3.org/2011/audio/track/issues/84

Raised by: Philip Jägenstedt
On product: Web Audio API

https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#dfn-createWaveTable

"The created WaveTable will be used with an Oscillator and will represent a normalized time-domain waveform having maximum absolute peak value of 1. Another way of saying this is that the generated waveform of an Oscillator will have maximum peak value at 0dBFS. Conveniently, this corresponds to the full-range of the signal values used by the Web Audio API. Because the WaveTable will be normalized on creation, the real and imag parameters represent relative values."

This does not clearly define what normalization must be performed. Should/can the normalization be performed analytically, or should one period be generated and the maximum (absolute) value in that period be found? These approaches may not arrive at the exact same normalization level, so a sample-exact test suite would not be possible.

@joeberkovitz
Contributor

I feel that the application of normalization in any form at all may be a serious problem with WaveTable. I do not think that PeriodicWave/Oscillator should be involved in messing with the coefficients in the Fourier series; this creates extra work for developers trying to figure out how to set those coefficients to produce a given overall dynamic level relative to other sounds. The philosophy here seems to be to remove control from developers and make PeriodicWave into a pure timbre spec, not the Fourier series that it actually is. But that drags in a lot of complications, as Philip pointed out originally.

Say one wants to produce two PeriodicWaves that differ in the value of only one coefficient but are otherwise identical in terms of volume. A developer would have to reverse engineer the normalization and correct all the other coefficients to produce this result.
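A toy calculation (my own, restricted to cosine-only, nonnegative coefficients so the time-domain peak sits at t = 0) makes the problem concrete:

```javascript
// For cosine-only, nonnegative coefficients the time-domain peak occurs
// at t = 0, where the series sums to real[1] + real[2] + ...
// Normalization divides the whole waveform by that sum.
const peakAtZero = real => real.slice(1).reduce((sum, r) => sum + r, 0);

const a = [0, 1];       // fundamental only: peak 1, fundamental plays at 1
const b = [0, 1, 0.5];  // identical fundamental plus one harmonic: peak 1.5

// After normalization, b's fundamental is attenuated to 1/1.5, even though
// its coefficient is identical to a's:
const fundamentalLevel = b[1] / peakAtZero(b);
console.log(fundamentalLevel); // ~0.667 rather than 1
```

To make the two waves play the shared fundamental at the same level, the developer has to compute the peak themselves and compensate elsewhere (e.g. with a gain stage), i.e. reverse engineer the normalization.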

Even if we don't change this normalization behavior, it certainly needs to be much more carefully specced.

@cwilso
Contributor

cwilso commented Oct 23, 2014

Chris Rogers once said when I was poking at him about wavetable normalization "yeah, we probably should have a parameter to disable the normalization or something."

@joeberkovitz
Contributor

TPAC resolution:

  1. add additional optional parameter to createPeriodicWave() to enable/disable normalization (see ConvolverNode)
  2. add exact formula describing what coefficients in real and imag arrays are doing in terms of cos and sin functions (e.g. amplitude = sum over i of real[i] * cos(i * freq * time) + imag[i] * sin(i * freq * time))
  3. document the exact normalization function
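A sketch of what the resolution amounts to, in plain JavaScript of my own (the option name `disableNormalization` is an assumption here, by analogy with ConvolverNode's `normalize` flag, not quoted from the spec):

```javascript
// Item 2's waveform formula with item 1's optional normalization switch.
// Renders N samples of one fundamental period.
function renderPeriodicWave(real, imag,
                            { disableNormalization = false } = {}, N = 4096) {
  const x = new Float32Array(N);
  for (let n = 0; n < N; n++) {
    const t = (2 * Math.PI * n) / N; // phase within one period
    let sum = 0;
    for (let k = 1; k < real.length; k++) {
      sum += real[k] * Math.cos(k * t) + (imag[k] || 0) * Math.sin(k * t);
    }
    x[n] = sum;
  }
  if (!disableNormalization) {
    // Item 3's normalization: divide by the maximum absolute value.
    let peak = 0;
    for (const v of x) peak = Math.max(peak, Math.abs(v));
    if (peak > 0) for (let n = 0; n < N; n++) x[n] /= peak;
  }
  return x;
}
```

With the flag set, the coefficients pass through untouched; without it, the rendered period always peaks at 1 regardless of the coefficients' magnitudes.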

@cwilso cwilso added this to the Web Audio Last Call 1 milestone Oct 30, 2014
@rtoy
Member

rtoy commented Feb 6, 2015

The normalization is done roughly like this:

  • Compute a 4096-point IFFT of the 2048-element PeriodicWave arrays. (The arrays are zero-padded if they have fewer than 2048 elements.)
  • Find the maximum absolute value in the 4096-element time signal.
  • Scale the signal by the max so that the resulting time signal has a maximum amplitude of 1.

This scale factor is used to scale the time-domain signal for all frequencies used by the oscillator.

@rtoy
Member

rtoy commented Feb 23, 2015

Correction. This is how the normalization is done.

  • Pad the real/imag arrays to length 2048.
  • Compute a 4096-point (forward) FFT.
  • Take the real part as the time-domain signal.
  • Find the maximum absolute value of the time-domain signal. This is the normalization factor.
  • Divide the time-domain signal by the normalization factor.
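The steps above can be sketched as follows (plain JavaScript of my own; a direct O(N²) sum stands in for the 4096-point FFT, which for this coefficient packing reduces to the same cosine/sine sums, modulo the FFT's sign convention):

```javascript
// Compute the normalization factor per the corrected steps.
function normalizationFactor(real, imag, N = 4096) {
  // Step 1: pad the real/imag arrays to length N/2 = 2048.
  const re = new Float32Array(N / 2); re.set(real);
  const im = new Float32Array(N / 2); im.set(imag);
  let peak = 0;
  // Steps 2-3: the real part of the transform is the time-domain signal.
  for (let n = 0; n < N; n++) {
    let s = 0;
    for (let k = 1; k < N / 2; k++) {
      s += re[k] * Math.cos((2 * Math.PI * k * n) / N) +
           im[k] * Math.sin((2 * Math.PI * k * n) / N);
    }
    // Step 4: the maximum absolute value is the normalization factor.
    peak = Math.max(peak, Math.abs(s));
  }
  return peak; // Step 5: the oscillator divides the signal by this factor.
}
```

Per the earlier comment, the oscillator then applies 1 / this factor at every playback frequency.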

@rtoy rtoy self-assigned this Jun 13, 2015
rtoy pushed a commit to rtoy/web-audio-api that referenced this issue Jun 15, 2015
This fixes WebAudio#91 by adding the optional normalization parameter to
createPeriodicWave.  We also describe the basic waveform generation
and the normalization process.
rtoy pushed a commit to rtoy/web-audio-api that referenced this issue Jun 15, 2015
…icWave

o Add optional normalization parameter to createPeriodicWave.
o Define how normalization is done.
o Define the waveform generation better.