The createMediaStreamDestination() method of the AudioContext Interface is used to create a new MediaStreamAudioDestinationNode object associated with a WebRTC MediaStream representing an audio stream, which may be stored in a local file or sent to another computer.

The MediaStream is created when the node is created, and is accessible via the MediaStreamAudioDestinationNode's stream attribute. This stream can be used in a similar way as a MediaStream obtained via navigator.getUserMedia: it can, for example, be sent to a remote peer using the RTCPeerConnection addStream() method.
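For instance, a minimal sketch of sending an oscillator's output to a remote peer might look like the following. It assumes an RTCPeerConnection named pc has already been created and negotiated elsewhere (the signaling code is omitted), and it uses the legacy addStream() call that newer code replaces with addTrack():

var ac = new AudioContext();
var osc = ac.createOscillator();
var dest = ac.createMediaStreamDestination();
osc.connect(dest);
osc.start(0);
// pc is assumed to be an existing, already-negotiated RTCPeerConnection
pc.addStream(dest.stream);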

For more details about media stream destination nodes, check out the MediaStreamAudioDestinationNode reference page.

Syntax

var audioCtx = new AudioContext();
var destination = audioCtx.createMediaStreamDestination();
					

Returns

A MediaStreamAudioDestinationNode.

Example

In the following simple example, we create a MediaStreamAudioDestinationNode, an OscillatorNode and a MediaRecorder (the example will therefore only work in Firefox and Chrome at this time). The MediaRecorder is set up to record information from the MediaStreamAudioDestinationNode.

When the button is clicked, the oscillator and the MediaRecorder are started. When the button is clicked again, the oscillator and MediaRecorder both stop. Stopping the MediaRecorder causes the dataavailable event to fire, and the event data is pushed into the chunks array. After that, the stop event fires, a new Blob of type audio/ogg; codecs=opus is made from the data in the chunks array, and its object URL is assigned to the <audio> element so the recording can be played back.

From here, you can play and save the opus file.

<!DOCTYPE html>
<html>
  <head>
    <title>createMediaStreamDestination() demo</title>
  </head>
  <body>
    <h1>createMediaStreamDestination() demo</h1>
    <p>Encoding a pure sine wave to an Opus file </p>
    <button>Make sine wave</button>
    <audio controls></audio>
    <script>
     var b = document.querySelector("button");
     var clicked = false;
     var chunks = [];
     var ac = new AudioContext();
     var osc = ac.createOscillator();
     var dest = ac.createMediaStreamDestination();
     var mediaRecorder = new MediaRecorder(dest.stream);
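     // Connect the oscillator to the stream destination so its output ends up in dest.stream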
     osc.connect(dest);
     b.addEventListener("click", function(e) {
       if (!clicked) {
         mediaRecorder.start();
         osc.start(0);
         e.target.innerHTML = "Stop recording";
         clicked = true;
       } else {
         mediaRecorder.stop();
         osc.stop(0);
         e.target.disabled = true;
       }
     });
     mediaRecorder.ondataavailable = function(evt) {
       // push each chunk (blobs) in an array
       chunks.push(evt.data);
     };
     mediaRecorder.onstop = function(evt) {
       // Make blob out of our blobs, and open it.
       var blob = new Blob(chunks, { 'type' : 'audio/ogg; codecs=opus' });
       document.querySelector("audio").src = URL.createObjectURL(blob);
     };
    </script>
  </body>
</html>
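The demo above only attaches the recording to the <audio> element for playback. As a rough sketch of how the same Blob could also be offered for saving (the download link and file name below are illustrative additions, not part of the original demo), the onstop handler could be extended like this:

mediaRecorder.onstop = function(evt) {
  var blob = new Blob(chunks, { 'type' : 'audio/ogg; codecs=opus' });
  var url = URL.createObjectURL(blob);
  document.querySelector("audio").src = url;
  // Offer the same recording as a download; "recording.ogg" is just a suggested name
  var link = document.createElement("a");
  link.href = url;
  link.download = "recording.ogg";
  link.textContent = "Download recording";
  document.body.appendChild(link);
};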
					

Note: You can view this example live, or study the source code, on GitHub.

Specifications

Specification: Web Audio API (the definition of 'createMediaStreamDestination()' in that specification)
Status: Working Draft

Browser compatibility

The compatibility table on this page is generated from structured data. If you'd like to contribute to the data, please check out https://github.com/mdn/browser-compat-data and send us a pull request.
createMediaStreamDestination
Desktop: Chrome 14, Edge ≤18, Firefox 25, Internet Explorer: no support, Opera 15, Safari 6
Mobile: WebView Android: yes, Chrome for Android 18, Firefox for Android 26, Opera for Android 14, Safari on iOS: yes, Samsung Internet 1.0


See also
