Synchronizing Audio Nodes
The Mixing example shows how to load two audio files into two separate
AudioPlayerNodes and start them playing together. This works in simple cases, but the approach has a problem: the two audio players are not guaranteed to stay in sync.
Keeping audio players in sync is hard because it requires them to be precisely aligned in time. The root of the problem is that the call to start a player is typically triggered from the UI thread when the user presses play, while the audio itself is rendered on a separate high-priority thread. Pausing and resuming the players several times can therefore accumulate an offset between them of 10-30 ms, which is quite noticeable. The exact offset depends on the device hardware and on other running processes.
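To make this failure mode concrete, here is a small, self-contained simulation of the drift. It is not the library's API: the fake clock and player classes below are illustrative stand-ins, and the fixed 5 ms scheduling delay is an assumed value chosen just to show how per-player start calls accumulate an offset over repeated pause/resume cycles.

```python
class FakeClock:
    """Deterministic stand-in for scheduling jitter: every call to now()
    advances time by 5 ms, mimicking the delay between two consecutive
    play() calls issued one after another from the UI thread."""
    def __init__(self):
        self.t_ms = 0.0

    def now(self):
        self.t_ms += 5.0
        return self.t_ms


class Player:
    """Toy player that records when each resume call actually lands."""
    def __init__(self, clock):
        self.clock = clock
        self.offset_ms = 0.0   # how far this player lags behind the first one
        self.resumed_at = None

    def resume(self):
        self.resumed_at = self.clock.now()


def pause_resume_cycle(players):
    # Resuming players one by one: each gets a later timestamp, and the
    # gap between the first and the later players grows every cycle.
    for p in players:
        p.resume()
    base = players[0].resumed_at
    for p in players:
        p.offset_ms += p.resumed_at - base


clock = FakeClock()
a, b = Player(clock), Player(clock)
for _ in range(3):              # pause and resume three times
    pause_resume_cycle([a, b])

print(b.offset_ms - a.offset_ms)   # prints: 15.0
```

With a 5 ms gap per cycle, three pause/resume cycles leave the second player 15 ms behind the first, which is exactly the kind of accumulated, audible offset described above.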
Playing multiple audio files in sync is a common problem on almost every platform.
Synchronizing Audio Nodes with Subgraphs
Audio nodes placed in a subgraph can be started and stopped at the same time. The idea is to create a nested
AudioGraph containing the nodes we want to start, pause, and stop in sync. This internal audio graph is then connected to the main graph through a
SubgraphProcessorNode. We can treat the
SubgraphProcessorNode like any other ProcessorNode, since it has both inputs and outputs.
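The technique can be sketched as follows. The class names (AudioGraph, AudioPlayerNode, SubgraphProcessorNode) mirror the concepts described above, but the implementations here are simplified stand-ins written for this sketch, not the library's real API.

```python
class AudioPlayerNode:
    """Stand-in player node: remembers the frame it was started on."""
    def __init__(self, path):
        self.path = path
        self.start_frame = None


class AudioGraph:
    """Stand-in graph that starts all of its nodes on the same frame."""
    def __init__(self):
        self.nodes = []

    def add(self, node):
        self.nodes.append(node)
        return node

    def start(self, at_frame=0):
        # Every node shares one start frame, so they cannot drift apart.
        for node in self.nodes:
            node.start_frame = at_frame


class SubgraphProcessorNode:
    """Wraps a nested AudioGraph so it can be connected into a parent
    graph like any other ProcessorNode (it has inputs and outputs)."""
    def __init__(self, subgraph):
        self.subgraph = subgraph


# Build the nested graph holding the players we want to keep in sync.
inner = AudioGraph()
drums = inner.add(AudioPlayerNode("drums.wav"))
bass = inner.add(AudioPlayerNode("bass.wav"))

# Connect the nested graph to the main graph through a subgraph node.
main_graph = AudioGraph()
main_graph.add(SubgraphProcessorNode(inner))

# A single call starts both players on the same frame.
inner.start(at_frame=0)
print(drums.start_frame == bass.start_frame)   # prints: True
```

The key design point is that start, pause, and stop are issued once on the nested graph rather than once per player, so there is no per-player call for scheduling jitter to act on.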
You can find an example of synchronizing audio players with this technique here.