The connect() method of the AudioNode interface lets you connect one of the node's outputs to a target, which may be either another AudioNode (thereby directing the sound data to the specified node) or an AudioParam, so that the node's output data is automatically used to change the value of that parameter over time.
```js
var destinationNode = AudioNode.connect(destination, outputIndex, inputIndex);

AudioNode.connect(destination, outputIndex);
```
destination
The AudioNode or AudioParam to which to connect.
outputIndex Optional
An index specifying which output of the current AudioNode to connect to the destination. The index numbers are defined according to the number of output channels (see Audio channels). While you can only connect a given output to a given input once (repeated attempts are ignored), you can connect an output to multiple inputs by calling connect() repeatedly. This makes fan-out possible (see the sketch after this list). The default value is 0.
inputIndex Optional
An index describing which input of the destination you want to connect the current AudioNode to; the default is 0. The index numbers are defined according to the number of input channels (see Audio channels). It is possible to connect an AudioNode to another AudioNode, which in turn connects back to the first AudioNode, creating a cycle. This is allowed only if there is at least one DelayNode in the cycle; otherwise, a NotSupportedError exception is thrown. This parameter is not allowed if the destination is an AudioParam.
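As a minimal sketch of fan-out (all variable names here are placeholders, not part of the API), a single node's output can feed two destinations at once simply by calling connect() twice:

```js
const audioCtx = new AudioContext();
const source = audioCtx.createOscillator();
const dryGain = audioCtx.createGain();
const wetGain = audioCtx.createGain();

// Calling connect() twice on the same source fans its output out
// to both gain nodes in parallel.
source.connect(dryGain);
source.connect(wetGain);

// Both branches are mixed again at the destination.
dryGain.connect(audioCtx.destination);
wetGain.connect(audioCtx.destination);

source.start();
```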
If the destination is a node, connect() returns a reference to the destination AudioNode object, allowing you to chain multiple connect() calls. In some browsers, older implementations of this interface return undefined.

If the destination is an AudioParam, connect() returns undefined.
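Because connect() returns its destination node, calls can be chained. A brief sketch of such a chain (the node choices here are illustrative):

```js
const audioCtx = new AudioContext();
const oscillator = audioCtx.createOscillator();
const filter = audioCtx.createBiquadFilter();
const gain = audioCtx.createGain();

// Each connect() returns the node it connected to, so the routing
// source -> filter -> gain -> speakers reads as one expression.
oscillator.connect(filter).connect(gain).connect(audioCtx.destination);

oscillator.start();
```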
IndexSizeError
The value specified for outputIndex or inputIndex doesn't correspond to an existing input or output.
InvalidAccessError
The destination node is not part of the same audio context as the source node.
NotSupportedError
The specified connection would create a cycle (in which the audio loops back through the same nodes repeatedly) and there are no DelayNodes in the cycle to prevent the resulting waveform from getting stuck constructing the same audio frame indefinitely.
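For instance, a feedback loop is legal as long as the cycle passes through a DelayNode. A minimal sketch, with illustrative delay and gain values:

```js
const audioCtx = new AudioContext();
const source = audioCtx.createOscillator();

const delay = audioCtx.createDelay();
delay.delayTime.value = 0.25; // 250 ms between echoes

const feedback = audioCtx.createGain();
feedback.gain.value = 0.5; // less than 1 so the echoes die away

// Cycle: delay -> feedback gain -> back into the delay.
// This does not throw, because the cycle contains a DelayNode.
source.connect(delay);
delay.connect(feedback);
feedback.connect(delay);

source.connect(audioCtx.destination);
delay.connect(audioCtx.destination);
source.start();
```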
The most obvious use of the connect() method is to direct the audio output from one node into the audio input of another node for further processing. For example, you might send the audio from a MediaElementAudioSourceNode (that is, the audio from an HTML5 media element such as <audio>) through a band-pass filter implemented using a BiquadFilterNode to reduce noise before then sending the audio along to the speakers.
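A minimal sketch of that routing, assuming an <audio> element with the id "player" already exists in the page (the id and filter settings are placeholders):

```js
const audioCtx = new AudioContext();
const mediaElement = document.getElementById("player");
const source = audioCtx.createMediaElementSource(mediaElement);

// Band-pass filter to suppress noise outside the band of interest.
const bandpass = audioCtx.createBiquadFilter();
bandpass.type = "bandpass";
bandpass.frequency.value = 1000; // illustrative centre frequency in Hz
bandpass.Q.value = 1;

// Route: media element -> band-pass filter -> speakers.
source.connect(bandpass);
bandpass.connect(audioCtx.destination);
```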
This example creates an oscillator, then links it to a gain node, so that the gain node controls the volume of the oscillator node.
```js
var AudioContext = window.AudioContext || window.webkitAudioContext;

var audioCtx = new AudioContext();

var oscillator = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();

oscillator.connect(gainNode);
gainNode.connect(audioCtx.destination);
```
In this example, we will be altering the gain value of a GainNode using an OscillatorNode with a slow frequency value. This technique is known as an LFO-controlled parameter.
```js
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

// create a normal oscillator to make sound
var oscillator = audioCtx.createOscillator();

// create a second oscillator that will be used as an LFO (Low-frequency
// oscillator), and will control a parameter
var lfo = audioCtx.createOscillator();

// set the frequency of the second oscillator to a low number
lfo.frequency.value = 2.0; // 2Hz: two oscillations per second

// create a gain whose gain AudioParam will be controlled by the LFO
var gain = audioCtx.createGain();

// connect the LFO to the gain AudioParam. This means the value of the LFO
// will not produce any audio, but will change the value of the gain instead
lfo.connect(gain.gain);

// connect the oscillator that will produce audio to the gain
oscillator.connect(gain);

// connect the gain to the destination so we hear sound
gain.connect(audioCtx.destination);

// start the oscillator that will produce audio
oscillator.start();

// start the oscillator that will modify the gain value
lfo.start();
```
It is possible to connect an AudioNode output to more than one AudioParam, and more than one AudioNode output to a single AudioParam, with multiple calls to connect(). Fan-in and fan-out are therefore supported.

An AudioParam will take the rendered audio data from any AudioNode output connected to it and convert it to mono by down-mixing (if it is not already mono). Next, it will mix it together with any other such outputs and with the intrinsic parameter value (the value the AudioParam would normally have without any audio connections), including any timeline changes scheduled for the parameter.
Therefore, it is possible to choose the range in which an AudioParam will change by setting the value of the AudioParam to the central frequency, and to use a GainNode between the audio source and the AudioParam to adjust the range of the AudioParam changes.
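As a hedged illustration of that idea (variable names and values are placeholders), an LFO can be scaled by a GainNode before it reaches an oscillator's frequency AudioParam, so the frequency sweeps around a chosen centre value by a chosen amount:

```js
const audioCtx = new AudioContext();

const oscillator = audioCtx.createOscillator();
oscillator.frequency.value = 440; // intrinsic value: the centre frequency in Hz

const lfo = audioCtx.createOscillator();
lfo.frequency.value = 5; // 5 Hz modulation rate

// The LFO output lies in the range [-1, 1]; this gain scales it so the
// frequency parameter swings roughly +/- 20 Hz around the 440 Hz centre.
const depth = audioCtx.createGain();
depth.gain.value = 20;

lfo.connect(depth);
depth.connect(oscillator.frequency);

oscillator.connect(audioCtx.destination);
oscillator.start();
lfo.start();
```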
| Specification | Status | Comment |
|---|---|---|
| Web Audio API: The definition of 'connect() to an AudioNode' in that specification. | Working Draft | |
| Web Audio API: The definition of 'connect() to an AudioParam' in that specification. | Working Draft | |
| | Chrome | Edge | Firefox | IE | Opera | Safari | WebView Android | Chrome Android | Firefox Android | Opera Android | Safari iOS | Samsung Internet Android |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| connect | 14 | 12 | 25 | No support | 15 | 6 | Yes | 18 | 26 | 14 | Yes | 1.0 |