WAGS.Run
- Package: purescript-wags
- Repository: mikesol/purescript-wags

Run a Scene to produce sound using the Web Audio API.
#EasingAlgorithm
type EasingAlgorithm = Cofree (Function Int) Int
An algorithm that tells the engine how much lookahead the audio rendering should have, in milliseconds. The (->) Int component is a penalty function: a positive input is the number of milliseconds left over after rendering (meaning we gave too much headroom), and a negative input is the number of milliseconds by which we missed the deadline (meaning there was not enough headroom). This allows the algorithm to adjust its lookahead over time.
As an example:
easingAlgorithm :: EasingAlgorithm
easingAlgorithm =
  let
    fOf initialTime = mkCofree initialTime \adj -> fOf $ max 20 (initialTime - adj)
  in
    fOf 20
This easing algorithm always provides the engine with at least 20ms of headroom, but adjusts upward when deadlines are being missed.
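To see the adjustment in action, the cofree structure can be stepped by hand with head, tail, and mkCofree from Control.Comonad.Cofree; the values in the comments below follow directly from the definition:

```purescript
import Prelude
import Control.Comonad.Cofree (Cofree, head, tail, mkCofree)

easingAlgorithm :: Cofree (Function Int) Int
easingAlgorithm =
  let
    fOf initialTime = mkCofree initialTime \adj -> fOf $ max 20 (initialTime - adj)
  in
    fOf 20

-- head easingAlgorithm                       -- 20: start at the 20ms floor
-- head (tail easingAlgorithm (-5))           -- 25: missed a deadline by 5ms, so grow
-- head (tail (tail easingAlgorithm (-5)) 10) -- 20: 10ms to spare, shrink back, clamped at 20
```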
#EngineInfo
type EngineInfo = { easingAlgorithm :: EasingAlgorithm }
The information provided to run
that tells the engine how to make certain rendering tradeoffs.
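For example, with an easingAlgorithm like the one defined above in scope, an EngineInfo is just a record:

```purescript
engineInfo :: EngineInfo
engineInfo = { easingAlgorithm }
```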
#Run
type Run res = { instructions :: Array Instruction, res :: res }
#SceneI
type SceneI trigger world = { headroom :: Int, sysTime :: Milliseconds, time :: Number, trigger :: Maybe trigger, world :: world }
The input type to a scene that is handled by run. Given Event trigger and Behavior world, the scene will receive:

- trigger - the trigger. If none exists (meaning we are polling), it will be Nothing.
- world - the world.
- time - the time of the audio context.
- sysTime - the time provided by new Date().getTime().
- headroom - the amount of lookahead time. If you are programming a precise rhythmic event and need the onset to occur at a specific moment, you can use headroom to determine whether the apex should happen now or later.
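For instance, a scene could compare an upcoming onset against time plus headroom to decide whether it falls inside the current lookahead window. The helper below is a hypothetical sketch, not part of this module:

```purescript
import Prelude
import Data.Int (toNumber)

-- `time` is audio-context time in seconds, `headroom` is lookahead in
-- milliseconds, and `onsetTime` is when the sound should start (in seconds).
scheduleNow :: { time :: Number, headroom :: Int } -> Number -> Boolean
scheduleNow { time, headroom } onsetTime =
  onsetTime <= time + toNumber headroom / 1000.0
```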
#bufferToList
bufferToList :: forall a. Int -> Event a -> Event (List { time :: Instant, value :: a })
Given a buffering window and an event, return a list of events that occur within that window.
- timeToCollect - the buffering window
- incomingEvent - the event to buffer
For example, if event outputs the following sequence:

- unit @ 0 ms
- unit @ 1 ms
- unit @ 7 ms
- unit @ 9 ms
- unit @ 15 ms
Then:
bufferToList 4 event
would group together events within the same 4ms window before emitting them, resulting in
{ time :: 0ms, value :: unit } : { time :: 1ms, value :: unit } : Nil -- emitted at 4ms
{ time :: 7ms, value :: unit } : { time :: 9ms, value :: unit } : Nil -- emitted at 11ms
{ time :: 15ms, value :: unit } : Nil -- emitted at 19ms
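As a usage sketch, any event can be batched into 4ms windows this way (the wrapper name is an assumption, not part of the module):

```purescript
import FRP.Event (Event)
import Data.List (List)
import Data.DateTime.Instant (Instant)

-- Batch an event's occurrences into 4ms windows, as in the timeline above.
buffered :: forall a. Event a -> Event (List { time :: Instant, value :: a })
buffered incomingEvent = bufferToList 4 incomingEvent
```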
#run
run :: forall trigger world res. Monoid res => Event trigger -> Behavior world -> EngineInfo -> FFIAudio -> Scene (SceneI trigger world) RunAudio RunEngine Frame0 res -> Event (Run res)
Run a scene.

- Event trigger is the event to which the scene reacts. trigger will contain things like an initial event, mouse clicks, MIDI onsets, OSC commands, and any other event to which the scene should respond. Because of this, the polymorphic type trigger is often defined as an ADT with different potential incoming actions, similar to how actions are defined in Halogen. Note that no sound will be produced unless there is at least one event. For this reason, there is usually some form of initial event, e.g. data Trigger = InitialEvent | MouseClick | etc., that is sent to start audio rendering. All of the examples in this repo contain an initial event, which is often pure unit in the case where there is only the initial event.
- Behavior world is the outside environment. world will usually contain things like the current mouse position, the ambient temperature, the axial tilt of the Earth, or other things that can be modeled as a continuous function of time. One important thing to note is that world lags trigger by 0 or 1 events in the browser event queue. For most real-world applications this does not matter, but it can lead to subtle logic bugs if trigger and world are correlated. For this reason, it is good to decouple trigger and world.
- EngineInfo is the engine information needed for rendering.
- FFIAudio is the audio state needed for rendering.
- Scene is the scene to render. See SceneI to understand how trigger and world are blended into the input environment going to Scene.
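A minimal sketch of the trigger pattern described above; the `clicks` event and its DOM wiring are assumptions, not part of this module:

```purescript
import Prelude
import Control.Alt ((<|>))
import FRP.Event (Event)

-- One constructor per incoming action, as in Halogen-style actions.
data Trigger = InitialEvent | MouseClick

-- Assuming a `clicks :: Event Unit` obtained elsewhere (wiring not shown),
-- the trigger event passed to `run` might be assembled as:
triggers :: Event Unit -> Event Trigger
triggers clicks = pure InitialEvent <|> (MouseClick <$ clicks)
```

The pure InitialEvent branch plays the role of the initial event that starts audio rendering, mirroring the pure unit idiom used in the examples.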
- Modules
  - FRP.Event.MIDI
  - WAGS.Change
  - WAGS.Change.Optionals
  - WAGS.Comonad
  - WAGS.Connect
  - WAGS.Control.Functions
  - WAGS.Control.Functions.Validated
  - WAGS.Control.Indexed
  - WAGS.Control.Types
  - WAGS.Create
  - WAGS.Create.Optionals
  - WAGS.CreateT
  - WAGS.Debug
  - WAGS.Destroy
  - WAGS.Disconnect
  - WAGS.Edgeable
  - WAGS.Graph.AudioUnit
  - WAGS.Graph.Edge
  - WAGS.Graph.Graph
  - WAGS.Graph.Node
  - WAGS.Graph.Oversample
  - WAGS.Graph.Paramable
  - WAGS.Graph.Parameter
  - WAGS.Interpret
  - WAGS.Math
  - WAGS.Patch
  - WAGS.Rendered
  - WAGS.Run
  - WAGS.Util
  - WAGS.Validation