Module WAGS.Run

Package: purescript-wags
Repository: mikesol/purescript-wags

Run a Scene to produce sound using the Web Audio API.

#EasingAlgorithm

type EasingAlgorithm = Cofree (Function Int) Int

An algorithm that tells the engine how much lookahead, in milliseconds, the audio rendering should have. The (->) Int (the Function Int in the signature) is a penalty function, where a positive input is the number of milliseconds left over after rendering (meaning we gave too much headroom) and a negative input is the number of milliseconds by which we missed the deadline (meaning there was not enough headroom). This allows the algorithm to adjust the headroom if necessary.

As an example:

import Control.Comonad.Cofree (mkCofree)

easingAlgorithm :: EasingAlgorithm
easingAlgorithm =
  let
    -- start with (and never drop below) 20ms of headroom; a negative
    -- penalty (a missed deadline) increases the headroom for the next frame
    fOf initialTime = mkCofree initialTime \adj -> fOf $ max 20 (initialTime - adj)
  in
    fOf 20

This easing algorithm always provides at least 20ms of headroom to the engine, adjusting upwards when deadlines are being missed.

#EngineInfo

type EngineInfo = { easingAlgorithm :: EasingAlgorithm }

The information provided to run that tells the engine how to make certain rendering tradeoffs.
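
For example, using the easingAlgorithm defined above:

engineInfo :: EngineInfo
engineInfo = { easingAlgorithm }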

#Run

type Run res = { edges :: Map Int (Set Int), nodes :: Map Int AnAudioUnit, res :: res }

The output of run to be consumed downstream (or not). It contains:

  • nodes: the nodes in the audio graph including information about things like their frequency, Q value, on/off state etc.
  • edges: incoming edges into nodes.
  • res: the residual from the computation.

This information can be used for visualizing the audio graph or for other instruments outside of a browser that are using the browser as a control layer.
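
As a hedged sketch (not part of this module), a downstream consumer might log the size of the graph on each frame like this:

import Prelude
import Data.Foldable (foldl)
import Data.Map as Map
import Data.Set as Set
import Effect (Effect)
import Effect.Console (log)

-- hypothetical consumer: report how many nodes and edges the graph contains
logGraphSize :: forall res. Run res -> Effect Unit
logGraphSize { nodes, edges } =
  log $ "nodes: " <> show (Map.size nodes)
    <> ", edges: "
    <> show (foldl (\acc dests -> acc + Set.size dests) 0 edges)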

#SceneI

type SceneI trigger world = { active :: Boolean, headroom :: Int, sysTime :: Instant, time :: Number, trigger :: trigger, world :: world }

The input type to a scene that is handled by run. Given Event trigger and Behavior world, the scene will receive:

  • trigger - the trigger.
  • world - the world.
  • time - the time of the audio context.
  • sysTime - the time provided by new Date().getTime().
  • active - whether this event was caused by a trigger or is a measurement of the world. This is useful for avoiding repeated onsets from the trigger.
  • headroom - the amount of lookahead time. If you are programming a precise rhythmic event and need the onset to occur at a specific moment, you can use headroom to determine whether the onset should happen now or be deferred to a later frame, as in the sketch below.
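
For example, a small hypothetical helper (assuming time is in seconds, as with the Web Audio clock, and headroom is in milliseconds) that decides whether an onset falls within the current lookahead window:

import Prelude
import Data.Int (toNumber)

-- hypothetical helper: an onset should be rendered now once it falls
-- within time + headroom (converting headroom from ms to seconds)
shouldScheduleNow :: forall r. Number -> { time :: Number, headroom :: Int | r } -> Boolean
shouldScheduleNow onsetTime { time, headroom } =
  onsetTime <= time + toNumber headroom / 1000.0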

#bufferToList

bufferToList :: forall a. Int -> Event a -> Event (List { time :: Instant, value :: a })

Given a buffering window and an event, return an event that emits, once per window, the list of occurrences (with their timestamps) that fell within that window.

  • timeToCollect - the buffering window
  • incomingEvent - the event to buffer

For example, if event outputs the following sequence:

  • unit @ 0 ms
  • unit @ 1 ms
  • unit @ 7 ms
  • unit @ 9 ms
  • unit @ 15 ms

Then:

bufferToList 4 event

would group together events within the same 4ms window before emitting them, resulting in

{ time :: 0ms, value :: unit } : { time :: 1ms, value :: unit } : Nil -- emitted at 4ms
{ time :: 7ms, value :: unit } : { time :: 9ms, value :: unit } : Nil -- emitted at 11ms
{ time :: 15ms, value :: unit } : Nil -- emitted at 19ms

#run

run :: forall trigger world res. Monoid res => Event trigger -> Behavior world -> EngineInfo -> FFIAudio -> SceneT (SceneI trigger world) FFIAudio (Effect Unit) Frame0 Thunkable res -> Event (Run res)

Run a scene.

  • Event trigger is the event to which the scene reacts. trigger will contain things like an initial event, mouse clicks, MIDI onsets, OSC commands, and any other event to which the scene should respond. Because of this, the polymorphic type trigger is often defined as an ADT with different potential incoming actions, similar to how actions are defined in Halogen. Note that no sound will be produced unless there is at least one event. For this reason, there is usually some form of initial event, e.g. data Trigger = InitialEvent | MouseClick | etc., that is sent to start audio rendering. All of the examples in this repo contain an initial event, which is often pure unit in the case where there is only the initial event.
  • Behavior world is the outside environment. world will usually contain things like the current mouse position, the ambient temperature, the axial tilt of the Earth, or other things that can be modeled as a continuous function of time. One important thing to note is that world lags trigger by 0 or 1 events in the browser event queue. For most real-world applications this does not matter, but it can lead to subtle logic bugs if trigger and world are correlated. For this reason, it is good to decouple trigger and world.
  • EngineInfo is the engine information needed for rendering.
  • FFIAudio is the audio state needed for rendering.
  • Scene is the scene to render. See SceneI to understand how trigger and world are blended into the input environment going to Scene.
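
As a heavily hedged wiring sketch: the following assumes a scene and an ffiAudio value defined elsewhere (both hypothetical here), reuses the easingAlgorithm from above, and uses subscribe from FRP.Event:

import Prelude
import Effect (Effect)
import FRP.Event (subscribe)

main :: Effect Unit
main = do
  -- pure unit serves as both the initial trigger event and a constant world
  _ <- subscribe
    (run (pure unit) (pure unit) { easingAlgorithm } ffiAudio scene)
    (\_ -> pure unit) -- inspect or ignore each emitted Run here
  pure unit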