Web API

[Web API] Audio Notes

melius102 2020. 2. 1. 14:20

Web Audio API

https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API

 

Basic concepts behind Web Audio API

https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API/Basic_concepts_behind_Web_Audio_API

 

Web Audio Workflow

1. Create an audio context

2. Create sources inside the context

3. Create effects nodes

4. Choose the final destination

5. Connect the sources through the effects to the destination

 

 

The inputs and outputs of each audio node are made up of audio channels (mono, stereo, quad, 5.1).

 

Audio Buffer

sample: a float32 value representing the sound output on one channel at one instant

frame: the set of samples output at one instant across all channels (one frame contains one sample per channel)

sample rate: the number of frames output per second (Hz)
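These relationships reduce to simple arithmetic. The numbers below are illustrative (a half-second stereo buffer at 44.1 kHz), not values taken from any API call:

```javascript
// Illustrative numbers: a 0.5 s stereo buffer at 44.1 kHz.
const sampleRate = 44100;      // frames per second (Hz)
const numberOfChannels = 2;    // stereo
const length = 22050;          // frames (= samples per channel)

const totalSamples = length * numberOfChannels; // one sample per channel per frame
const duration = length / sampleRate;           // seconds

console.log(totalSamples, duration); // 44100 0.5
```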

 

 

Using the Web Audio API

https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API/Using_Web_Audio_API

 

Creating an AudioContext and connecting a source

An AudioContext source can be a MediaStream obtained via mediaDevices.getUserMedia() or an HTML audio element.

let audioContext = new (window.AudioContext || window.webkitAudioContext)();

let audioElement = document.querySelector('audio');
// let audioInput = audioContext.createMediaStreamSource(mediaStream);
let audioInput = audioContext.createMediaElementSource(audioElement);

 

Creating and connecting effects nodes

let gainNode = audioContext.createGain();
gainNode.gain.setValueAtTime(2.0, audioContext.currentTime);
// gainNode.gain.value = 2.0;
audioInput.connect(gainNode);

let analyserNode = audioContext.createAnalyser();
analyserNode.fftSize = 2048;
gainNode.connect(analyserNode);

let pannerOptions = {
    pan: 0
};
let pannerNode = new StereoPannerNode(audioContext, pannerOptions);
pannerNode.pan.value = 0.5; // range: -1 ~ 1
gainNode.connect(pannerNode);

 

Connecting the final destination

pannerNode.connect(audioContext.destination);

///////////////////////////////////////////////
// audioInput -> gainNode -> analyserNode
//               gainNode -> pannerNode -> destination(speaker)

 

Controlling the AudioContext

btnResume.onclick = () => { audioContext.resume(); };
btnSuspend.onclick = () => { audioContext.suspend(); };
btnClose.onclick = () => { audioContext.close(); };

 

 

AudioContext

AudioContext.createMediaElementSource()

- Parameters: HTMLMediaElement

- Returns: MediaElementAudioSourceNode

AudioContext.createMediaStreamSource()

- Parameters: MediaStream

- Returns: MediaStreamAudioSourceNode

AudioContext.resume()

- Returns: Promise; changes state to 'running'

AudioContext.suspend()

- Returns: Promise; changes state to 'suspended'

AudioContext.close()

- Returns: Promise; changes state to 'closed'

 

 

BaseAudioContext

BaseAudioContext.destination

- Value: AudioDestinationNode

BaseAudioContext.state (read-only property)

- Value: 'suspended', 'running', 'closed'

BaseAudioContext.createGain()

- Returns: GainNode

BaseAudioContext.createStereoPanner()

- Returns: StereoPannerNode

BaseAudioContext.createAnalyser()

- Returns: AnalyserNode

BaseAudioContext.createBuffer()

- Parameters: numOfChannels, length, sampleRate

- Returns: AudioBuffer

// baseAudioContext.createBuffer(numOfChannels, length, sampleRate);
// numSecond = length/sampleRate
// length = samples per channel

var audioContext = new AudioContext();
var audioBuffer = audioContext.createBuffer(2, 22050, 44100); // 0.5s stereo buffer
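Each channel of the resulting buffer is exposed as a Float32Array via getChannelData(). The write pattern can be sketched with a plain Float32Array standing in for a channel (440 Hz is an arbitrary test tone, not anything the API prescribes):

```javascript
// Fill one channel with a 440 Hz sine wave.
// In the browser the array would come from audioBuffer.getChannelData(0);
// a plain Float32Array stands in for it here.
const sampleRate = 44100;
const length = 22050; // 0.5 s worth of frames
const channelData = new Float32Array(length);
for (let i = 0; i < length; i++) {
    channelData[i] = Math.sin(2 * Math.PI * 440 * i / sampleRate);
}
console.log(channelData.length, channelData[0]); // 22050 0
```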

BaseAudioContext.createScriptProcessor() [Deprecated]

- Parameters: bufferSize, numberOfInputChannels, numberOfOutputChannels

- Returns: ScriptProcessorNode

let scriptProcessorNode = audioContext.createScriptProcessor(4096, 2, 2);

scriptProcessorNode.onaudioprocess = function (evt) {
    let audioBuffer = evt.inputBuffer;
    let channelData = [];

    // audioBuffer.length: 4096 sample-frames
    // audioBuffer.numberOfChannels: 2
    // audioBuffer.duration == audioBuffer.length / audioBuffer.sampleRate

    for (let i = 0; i < audioBuffer.numberOfChannels; i++) {
        channelData[i] = audioBuffer.getChannelData(i);
    }
    // ...
};

audioInput.connect(scriptProcessorNode).connect(audioContext.destination);
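A typical onaudioprocess handler inspects the block's samples, e.g. to compute an RMS level meter. The math is pure JS, so a filled Float32Array stands in below for what getChannelData(i) would return in the handler:

```javascript
// RMS (root mean square) level of one block of samples.
function rms(samples) {
    let sum = 0;
    for (let i = 0; i < samples.length; i++) {
        sum += samples[i] * samples[i];
    }
    return Math.sqrt(sum / samples.length);
}

// A 4096-sample block at constant amplitude 0.5 has RMS 0.5.
const block = new Float32Array(4096).fill(0.5);
console.log(rms(block)); // 0.5
```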

 

 

AudioBuffer

AudioBuffer.sampleRate

- Value: A floating-point value (Hz)

AudioBuffer.numberOfChannels

- Value: number of channels

AudioBuffer.getChannelData()

- Parameters: index of channel ( < numberOfChannels )

- Returns: Float32Array

 

 

AudioNode

AudioNode.context

- Value: the BaseAudioContext that created this AudioNode

AudioNode.connect()

 

 

GainNode

GainNode.gain: AudioParam
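A gain of 2.0 (as used in the example above) corresponds to roughly +6 dB. The conversion is not part of the API, just the standard formula, kept here for reference:

```javascript
// Linear gain <-> decibel conversion for GainNode.gain values.
const toDb = gain => 20 * Math.log10(gain);
const fromDb = db => Math.pow(10, db / 20);

console.log(toDb(2).toFixed(2)); // "6.02"
console.log(fromDb(0));          // 1 (0 dB = unity gain)
```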

 

 

StereoPannerNode

StereoPannerNode.pan: AudioParam (value range -1 to 1)

 

 

AnalyserNode

AnalyserNode.fftSize

- Value: unsigned integer, window size of Fast Fourier Transform
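fftSize determines the frequency resolution: the analyser exposes fftSize/2 bins (its frequencyBinCount property), each covering sampleRate/fftSize Hz. A quick check with the values used in the example above:

```javascript
// Frequency resolution of an AnalyserNode with fftSize = 2048 at 44.1 kHz.
const fftSize = 2048;
const sampleRate = 44100;

const frequencyBinCount = fftSize / 2; // bins returned by getByteFrequencyData()
const binWidth = sampleRate / fftSize; // Hz covered by each bin

console.log(frequencyBinCount, binWidth.toFixed(1)); // 1024 21.5
```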

 

 

ScriptProcessorNode [Deprecated]

ScriptProcessorNode.onaudioprocess

- EventListener: fired when the node's input buffer is ready to be processed

 

 

AudioParam

AudioParam.setValueAtTime()

AudioParam.value

 

 

HTMLMediaElement

let audioElm = document.createElement('audio');
audioElm.canPlayType('audio/mpeg'); // "probably"

audioElm.setAttribute("src", "audio.mp3");
audioElm.addEventListener('ended', () => {
    console.log('on ended');
});

audioElm.play();
audioElm.pause();

<audio>: The Embed Audio element

HTMLMediaElement.canPlayType()

- Returns: "probably" (can play), "maybe" (cannot tell without trying), "" (empty string, cannot play)
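Since canPlayType() never gives a definitive "yes", code usually treats both non-empty answers as worth attempting. A small helper (hypothetical, not part of the API) makes the check explicit:

```javascript
// Interpret HTMLMediaElement.canPlayType() results.
// "probably" and "maybe" are non-empty -> worth attempting playback.
function isPlayable(answer) {
    return answer !== '';
}

console.log(isPlayable('probably')); // true
console.log(isPlayable('maybe'));    // true
console.log(isPlayable(''));         // false
```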

 

 
