
Prince Dewani
May 1, 2026
Web Audio API is a W3C JavaScript API for routing, processing, and synthesizing audio inside a node graph in the browser. It is supported in Chrome 14+, Edge 12+, Firefox 25+, Safari 6+ on macOS and iOS, Opera 15+, and Samsung Internet, while Internet Explorer never added support.
This guide covers what Web Audio API is, the browsers that support it, the key features, how it works, the use cases, and the known cross-browser issues.
Web Audio API is a W3C standard for routing, processing, and synthesizing audio inside a node graph in the browser. The W3C Audio Working Group maintains the spec. The entry point is AudioContext, which wires source nodes (oscillators, audio files, microphone streams) through processing nodes (gain, filter, panner, analyser) to a destination node that plays through the speakers.
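As a minimal sketch of such a graph, here is an oscillator wired through a gain node to the speakers. The buildTone helper and its parameters are my own names, not part of the API, and the snippet assumes a browser page:

```javascript
// Smallest useful graph: oscillator (source) → gain (processing) → speakers.
// Wires the nodes and returns them; the caller starts playback with osc.start().
function buildTone(ctx, frequency = 440, volume = 0.2) {
  const osc = ctx.createOscillator(); // source node: synthesized waveform
  const gain = ctx.createGain();      // processing node: volume control
  osc.type = "sine";
  osc.frequency.value = frequency;    // Hz
  gain.gain.value = volume;           // 0.0 to 1.0
  osc.connect(gain);                  // source → processing
  gain.connect(ctx.destination);      // processing → speakers
  return { osc, gain };
}

// Usage in a browser:
// const ctx = new AudioContext();
// const { osc } = buildTone(ctx, 440, 0.2);
// osc.start();
```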
Web Audio API works in every modern browser. Chrome, Edge, Firefox, Safari, Opera, and Samsung Internet all ship support; only Internet Explorer never added it.
Chrome supports Web Audio API from Chrome 14 on desktop and Chrome 18 on Android. Chrome 14 to 33 required the webkitAudioContext prefix; Chrome 34 and later expose unprefixed AudioContext. Chrome 4 to 13 did not support the API.
Edge supports Web Audio API from Edge 12 on Windows. Every Edge release, including Chromium Edge 79 and later, exposes unprefixed AudioContext natively without flags or prefixes.
Firefox supports Web Audio API from Firefox 25 on both desktop and Android. Firefox 25 and later expose unprefixed AudioContext by default. Firefox versions earlier than 25 shipped a different, deprecated audio API and cannot run modern Web Audio code.
Safari supports Web Audio API from Safari 6 on macOS and Safari 6 on iOS. Safari 6 to 14 required the webkitAudioContext prefix; Safari 14.1 on macOS and iOS 14.5 expose unprefixed AudioContext. Safari 3.1 to 5.1 did not support the API.
Opera supports Web Audio API from Opera 15 on desktop and Opera Mobile 14 on Android. Opera ships the same Blink engine as Chrome, so behavior matches Chrome's. Opera 9 to 12.1 did not support the API.
Samsung Internet supports Web Audio API from version 1.0 on Android. Every Samsung Internet release on Galaxy phones and tablets ships unprefixed AudioContext by default and follows the same codec rules as Chrome for Android.
The legacy stock Android Browser before Android 4.4 KitKat did not support Web Audio API. From Android 4.4 on, the Chrome-based WebView (v37 and later) ships full Web Audio API support, and modern Android phones run this WebView. On current Android devices, use Chrome for Android, Firefox for Android, or Samsung Internet for full support.
Internet Explorer never supported Web Audio API. IE 5.5 through IE 11 lack AudioContext and webkitAudioContext entirely. Microsoft has retired Internet Explorer, so use Edge, Chrome, or Firefox. Sites that still target IE 11 fall back to the HTML audio element.
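A graceful-degradation check for such browsers can be sketched as follows. The playSound helper and its win parameter are my own names, used here so the check does not hard-code the global object; in a page you would call playSound(url) and let win default to the real window:

```javascript
// Sketch: fall back to the HTML audio element when Web Audio API is missing
// (e.g. IE 11). `win` defaults to the page's global object.
function playSound(url, win = globalThis) {
  const Ctor = win.AudioContext || win.webkitAudioContext;
  if (!Ctor) {
    // No AudioContext at all: plain <audio>-style playback
    const el = new win.Audio(url);
    el.play();
    return el;
  }
  // Web Audio path: fetch + decodeAudioData of `url` would go here
  return new Ctor();
}
```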
Note: Web Audio API breaks across older Safari, iOS, and Android Browser builds. Test it on real browsers and operating systems with TestMu AI. Try TestMu AI free!
Web Audio API ships a modular toolkit for routing, generating, and analyzing sound in JavaScript. Below are the building blocks every project relies on.
- Source nodes: OscillatorNode for synthesized waveforms, AudioBufferSourceNode for in-memory clips, and MediaStreamAudioSourceNode for microphone or WebRTC input.
- Processing nodes: GainNode for volume, BiquadFilterNode for equalization, DynamicsCompressorNode for compression, ConvolverNode for reverb, and PannerNode for 3D spatialization.
- Analysis: AnalyserNode exposes getByteFrequencyData() and getByteTimeDomainData(), used to power waveform and spectrum visualizations.
- AudioWorklet: runs custom DSP on the audio thread and replaces the deprecated ScriptProcessorNode. It shipped in Chrome 66, Firefox 76, and Safari 14.1.
- Precise scheduling: every start(time) and AudioParam.setValueAtTime(value, time) call lines up with the AudioContext clock at sub-millisecond precision.

Web Audio API runs every sound through an audio graph: source nodes connect to processing nodes, which connect to a destination node that the browser plays. The pipeline runs in four steps.

1. Create a context with const ctx = new AudioContext(). The context owns the sample rate (typically 48000 Hz) and the audio clock.
2. Create a source: an HTMLAudioElement, an AudioBuffer, an OscillatorNode, or a microphone stream returned by getUserMedia().
3. Connect the nodes: source.connect(gain).connect(filter).connect(panner). Each call adds a node to the graph and routes audio through it.
4. Route to the output: panner.connect(ctx.destination). The browser then mixes the graph in a real-time audio thread and writes samples to the output device.

Modern browsers start the AudioContext in a suspended state until the user interacts with the page. Call ctx.resume() inside a click or keydown handler to start playback.
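The four steps, together with the gesture unlock, can be sketched in one helper. The playTone name, the node choices, and the gain value are my own illustrative picks, not prescribed by the API:

```javascript
// Sketch of the full pipeline: source → gain → filter → panner → destination,
// scheduled against the AudioContext clock.
function playTone(ctx, when = 0, duration = 0.5) {
  const source = ctx.createOscillator();
  const gain = ctx.createGain();
  const filter = ctx.createBiquadFilter();
  const panner = ctx.createStereoPanner();
  // connect() returns its target, so the chain reads left to right
  source.connect(gain).connect(filter).connect(panner);
  panner.connect(ctx.destination);
  // Schedule volume and playback on the shared audio clock
  const t0 = ctx.currentTime + when;
  gain.gain.setValueAtTime(0.3, t0);
  source.start(t0);
  source.stop(t0 + duration);
  return source;
}

// Step 1 plus the autoplay unlock, inside a user-gesture handler:
// document.addEventListener("click", () => {
//   const ctx = new AudioContext();            // may start suspended
//   if (ctx.state === "suspended") ctx.resume();
//   playTone(ctx);
// }, { once: true });
```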
Web Audio API powers production audio across browser-based games, music tools, video conferencing, and streaming services.
- Games and music tools: clips load into AudioBufferSourceNode and chain BiquadFilterNode plus DynamicsCompressorNode for in-browser mixing.
- Video conferencing: apps route microphone input through MediaStreamAudioSourceNode for noise gating and gain control.
- Streaming and media players: AnalyserNode powers oscilloscopes, frequency bars, and waveform displays in players like Vimeo and Twitch.

Web Audio API is widely supported, but cross-browser parity is not perfect. In my experience, autoplay enforcement and the legacy webkitAudioContext prefix cause the most QA regressions.

- Autoplay policies: modern browsers create the AudioContext suspended; call ctx.resume() from a user gesture or audio stays silent with no error.
- webkitAudioContext prefix: older Safari builds only expose new webkitAudioContext(). Sites that target older Safari builds still need the fallback window.AudioContext = window.AudioContext || window.webkitAudioContext.
- Legacy method names: prefixed WebKit builds shipped noteOn, noteOff, and createGainNode. Chrome and Firefox migrated to start, stop, and createGain. A polyfill is required if you must run on Safari 6 to 8.
- Deprecated processing node: new code should use AudioWorklet rather than the deprecated ScriptProcessorNode.
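A partial sketch of the legacy-name shim for those pre-standard WebKit contexts follows. The shimLegacyNames name is mine, and this covers only createGainNode; a production polyfill would also patch noteOn and noteOff on each source node:

```javascript
// Map the old WebKit method name onto the standard one (Safari 6 to 8).
function shimLegacyNames(ctx) {
  if (!ctx.createGain && ctx.createGainNode) {
    ctx.createGain = ctx.createGainNode.bind(ctx); // old name → standard name
  }
  return ctx;
}

// Usage with the constructor fallback:
// const Ctor = window.AudioContext || window.webkitAudioContext;
// const ctx = shimLegacyNames(new Ctor());
```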