Class OfflineAudioContext

Namespace
CSharpToJavaScript.APIs.JS
Assembly
CSharpToJavaScript.dll

The OfflineAudioContext interface is an AudioContext-like interface representing an audio-processing graph built from linked-together AudioNodes. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates the audio, as fast as it can, and outputs the result to an AudioBuffer.

[Value("OfflineAudioContext")]
public class OfflineAudioContext : BaseAudioContext
Inheritance
BaseAudioContext → OfflineAudioContext

Remarks
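
A minimal rendering sketch, assuming the generated wrappers mirror the Web Audio API surface (in particular, that BaseAudioContext exposes CreateOscillator() and Destination, that OscillatorNode exposes Connect() and Start(), and that Number converts implicitly from numeric literals):

using System.Threading.Tasks;
using CSharpToJavaScript.APIs.JS;

public static class OfflineRenderExample
{
    // Hypothetical helper: renders one second of an oscillator tone offline.
    public static async Task<AudioBuffer> RenderToneAsync()
    {
        // 2 channels, 44100 sample-frames (1 second), 44100 Hz sample rate.
        OfflineAudioContext offline = new OfflineAudioContext(2, 44100, 44100);

        // CreateOscillator, Connect, Destination and Start are assumed to
        // mirror the corresponding Web Audio API members.
        OscillatorNode osc = offline.CreateOscillator();
        osc.Connect(offline.Destination);
        osc.Start(0);

        // Rendering runs as fast as possible and resolves with the result.
        return await offline.StartRendering();
    }
}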

Constructors

OfflineAudioContext()

public OfflineAudioContext()

OfflineAudioContext(OfflineAudioContextOptions)

The
OfflineAudioContext() constructor—part of the Web Audio API—creates and returns a new
OfflineAudioContext object instance, which can then be used to render
audio to an AudioBuffer rather than to an audio output device.

public OfflineAudioContext(OfflineAudioContextOptions contextOptions)

Parameters

contextOptions OfflineAudioContextOptions

An options object whose numberOfChannels, length and sampleRate members configure the new context.

Remarks
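
A construction sketch, assuming OfflineAudioContextOptions exposes NumberOfChannels, Length and SampleRate properties mirroring the members of the JavaScript dictionary, and that Number converts implicitly from a numeric literal:

using CSharpToJavaScript.APIs.JS;

// Assumed property names; 40 seconds of stereo audio at 44.1 kHz.
OfflineAudioContextOptions options = new OfflineAudioContextOptions
{
    NumberOfChannels = 2,
    Length = 44100 * 40,   // in sample-frames
    SampleRate = 44100
};
OfflineAudioContext offline = new OfflineAudioContext(options);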

OfflineAudioContext(ulong, ulong, Number)

The
OfflineAudioContext() constructor—part of the Web Audio API—creates and returns a new
OfflineAudioContext object instance, which can then be used to render
audio to an AudioBuffer rather than to an audio output device.

public OfflineAudioContext(ulong numberOfChannels, ulong length, Number sampleRate)

Parameters

numberOfChannels ulong

The number of channels the rendered AudioBuffer should have.

length ulong

The length of the rendering, in sample-frames.

sampleRate Number

The sample rate, in hertz, at which to render the audio.

Remarks
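
For example, a stereo context sized for 40 seconds of audio at 44.1 kHz (a sketch; Number is assumed to convert implicitly from a numeric literal):

using CSharpToJavaScript.APIs.JS;

// numberOfChannels = 2, length = 40 seconds of sample-frames, sampleRate = 44100 Hz.
OfflineAudioContext offline = new OfflineAudioContext(2, 44100 * 40, 44100);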

Properties

Length

The length property of the
OfflineAudioContext interface returns an integer representing the size of
the buffer in sample-frames.

[Value("length")]
public ulong Length { get; }

Property Value

ulong

An integer representing the size of the buffer in sample-frames.

Remarks
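
Because the value is expressed in sample-frames, dividing it by the sample rate gives the duration in seconds; a small sketch (Number is assumed to convert implicitly from a numeric literal):

using CSharpToJavaScript.APIs.JS;

OfflineAudioContext offline = new OfflineAudioContext(2, 44100 * 40, 44100);
// 1,764,000 sample-frames / 44100 Hz = 40 seconds.
double durationSeconds = offline.Length / 44100.0;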

Oncomplete

The oncomplete event handler is called when offline rendering finishes, that is, when the complete event (of type OfflineAudioCompletionEvent) fires.

[Value("oncomplete")]
public EventHandlerNonNull Oncomplete { get; set; }

Property Value

EventHandlerNonNull

Methods

Resume()

The resume() method of the
OfflineAudioContext interface resumes the progression of time in an audio
context that has been suspended. The promise resolves immediately because the
OfflineAudioContext does not require the audio hardware.

[Value("resume")]
public Task<GlobalObject.Undefined> Resume()

Returns

Task<GlobalObject.Undefined>

A Promise resolving to 'undefined'.

Remarks
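
A sketch of the intended pairing with Suspend(Number) and StartRendering(); the suspension is scheduled before rendering begins, and Number is assumed to convert implicitly from a double:

using System.Threading.Tasks;
using CSharpToJavaScript.APIs.JS;

public static class ResumeExample
{
    public static async Task<AudioBuffer> RenderAsync(OfflineAudioContext offline)
    {
        var suspended = offline.Suspend(1.0);    // schedule a pause at t = 1 s
        var rendering = offline.StartRendering();

        await suspended;            // rendering has now paused at t = 1 s
        // ... adjust the audio graph synchronously here ...
        await offline.Resume();     // settles immediately: no audio hardware to wait for

        return await rendering;
    }
}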

StartRendering()

The startRendering() method of the OfflineAudioContext Interface starts rendering the audio graph, taking into account the current connections and the current scheduled changes.

[Value("startRendering")]
public Task<AudioBuffer> StartRendering()

Returns

Task<AudioBuffer>

A Promise that fulfills with an AudioBuffer.

Remarks

The OfflineAudioContext complete event (of type OfflineAudioCompletionEvent) is raised when the rendering is finished, containing the resulting AudioBuffer in its renderedBuffer property.

Browsers currently support two versions of the startRendering() method — an older event-based version and a newer promise-based version.
The former will eventually be removed, but currently both mechanisms are provided for legacy reasons.
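
A minimal promise-style sketch (the event-based path would assign Oncomplete instead); an already-built offline graph is assumed:

using System.Threading.Tasks;
using CSharpToJavaScript.APIs.JS;

public static class StartRenderingExample
{
    public static async Task<AudioBuffer> RenderAsync(OfflineAudioContext offline)
    {
        // Renders the current graph, including scheduled parameter changes,
        // and completes with the resulting AudioBuffer.
        return await offline.StartRendering();
    }
}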

- Using the Web Audio API

See also on MDN

Suspend(Number)

The suspend() method of the OfflineAudioContext interface schedules a suspension of the time
progression in the audio context at the specified time and returns a promise. This is
generally useful when manipulating the audio graph synchronously on an
OfflineAudioContext.

[Value("suspend")]
public Task<GlobalObject.Undefined> Suspend(Number suspendTime)

Parameters

suspendTime Number

The time, in seconds, at which the rendering should be suspended.

Returns

Task<GlobalObject.Undefined>

A Promise resolving to 'undefined'.

Remarks

Note that the maximum precision of suspension is the size of the render quantum; the
specified suspension time will be rounded down to the nearest render quantum boundary.
For this reason, scheduling multiple suspends at the same quantized frame is not allowed.
Scheduling should also be done while the context is not running, to ensure precise
suspension.
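
A scheduling sketch that respects the constraints above: the suspension is requested before rendering starts and is aligned to a render quantum boundary (128 sample-frames). Number is assumed to convert implicitly from a double:

using System.Threading.Tasks;
using CSharpToJavaScript.APIs.JS;

public static class SuspendExample
{
    public static async Task<AudioBuffer> RenderAsync(OfflineAudioContext offline)
    {
        // Pick a time that falls exactly on a render quantum boundary
        // (multiples of 128 sample-frames) so nothing is lost to rounding.
        double suspendAt = 128.0 * 100 / 44100.0;   // ~0.29 s at 44.1 kHz

        var suspended = offline.Suspend(suspendAt); // schedule before rendering starts
        var rendering = offline.StartRendering();

        await suspended;
        // ... manipulate the audio graph synchronously while time is paused ...
        await offline.Resume();

        return await rendering;
    }
}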

See also on MDN