Digital Audio Workstation with React Native
About
This example demonstrates how easy it is to build cross-platform, digital audio workstation-like applications with SwitchboardSDK using React Native and Turbo Modules. It highlights features such as simultaneous playback of multiple synchronized audio samples and real-time audio manipulation, implemented with modern React Native, which lets us call native C++ methods directly from TypeScript. The example supports iOS, Android, and macOS, and Windows support can easily be added using react-native-windows.
Architecture Overview
The application uses React Native's C++ Turbo Modules for its cross-platform implementation, making it simple to interact with SwitchboardSDK on iOS, Android, and macOS. The Turbo Module architecture consists of:
- TypeScript Specification (specs/NativeSwitchboardSDKCppTurboModule.ts) - Defines the interface between JavaScript and native code
- Shared C++ Implementation (shared/NativeSwitchboardSDKCppTurboModule.cpp) - A simple C++ wrapper around the SwitchboardSDK C++ API
This approach eliminates the need for separate iOS (Objective-C/Swift) and Android (Java/Kotlin) implementations, as all audio processing logic is contained within the shared C++ module.
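To make this concrete, here is a minimal sketch of how the TypeScript layer might obtain the Turbo Module instance from the spec; the registered module name used here is an assumption, not taken from the project:
// Sketch: obtaining the native module defined by the spec (module name is assumed)
import { TurboModuleRegistry } from 'react-native'
import type { Spec } from './specs/NativeSwitchboardSDKCppTurboModule'

const SwitchboardSDKModule = TurboModuleRegistry.getEnforcing<Spec>(
  'NativeSwitchboardSDKCppTurboModule'
)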
Code Implementation
Audio Graph Configuration
// assets/AudioGraph.ts
const jsonAudioGraph = `
{
  "type": "Realtime",
  "config": {
    "graph": {
      "nodes": [
        { "id": "timeline", "type": "Timeline" },
        { "id": "player1", "type": "SynchronizedAudioPlayer" },
        { "id": "player2", "type": "SynchronizedAudioPlayer" },
        { "id": "gainNode1", "type": "Gain" },
        { "id": "gainNode2", "type": "Gain" },
        { "id": "mixerNode", "type": "Mixer" }
      ],
      "connections": [
        { "sourceNode": "timeline", "destinationNode": "player1" },
        { "sourceNode": "timeline", "destinationNode": "player2" },
        { "sourceNode": "player1", "destinationNode": "gainNode1" },
        { "sourceNode": "player2", "destinationNode": "gainNode2" },
        { "sourceNode": "gainNode1", "destinationNode": "mixerNode" },
        { "sourceNode": "gainNode2", "destinationNode": "mixerNode" },
        { "sourceNode": "mixerNode", "destinationNode": "outputNode" }
      ]
    }
  }
}
`
Audio Engine Context
React's Context API is used to provide a centralized audio engine state management system that can be accessed by any component in the application tree. This Context Provider pattern is useful for the following reasons:
- Global State Management - Audio engine state (playback position, engine status, etc.) needs to be shared across multiple components
- Avoiding Prop Drilling - Instead of passing audio functions through multiple component layers, any component can directly access the audio engine
- Centralized Audio Logic - All SwitchboardSDK interactions are consolidated in one place, making the code more maintainable
- Reactive Real-time Updates - The context automatically propagates real-time audio position updates to all subscribed components
// AudioEngineContext.tsx
const [isEngineRunning, setIsEngineRunning] = useState(false)
const [isInitialized, setIsInitialized] = useState(false)
const [engineId, setEngineId] = useState('')
const [isReady, setIsReady] = useState(false)

const initializeSDK = useCallback(() => {
  if (!isInitialized) {
    SwitchboardSDK.initialize('', '')
    setIsInitialized(true)
  }
}, [isInitialized])

// Create Audio Engine from Graph Definition
const createEngine = useCallback((jsonAudioGraph: string) => {
  const newEngineId = SwitchboardSDK.createAudioEngine(jsonAudioGraph)
  setEngineId(newEngineId)
  setIsReady(true)
}, [])

// Start the Audio Engine
const startEngine = useCallback(() => {
  SwitchboardSDK.callAction(engineId, 'start', {})
  setIsEngineRunning(true)
}, [engineId])

// Load Audio Samples into Players
const loadSample = useCallback((fileName: string, playerId: string) => {
  SwitchboardSDK.callAction(playerId, 'load', {
    audioFilePath: getPath(fileName),
    codec: 'wav',
  })
}, [])

// Start Timeline Playback
const play = useCallback(() => {
  SwitchboardSDK.callAction('timeline', 'start', {})
}, [])

const contextValue: AudioEngineContextType = {
  isEngineRunning,
  engineId,
  isReady,
  initializeSDK,
  createEngine,
  startEngine,
  play,
  loadSample,
}

export const useAudioEngine = (): AudioEngineContextType => {
  return useContext(AudioEngineContext)
}
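The excerpt above omits the provider component itself; a minimal sketch of how it could wrap the application (the wiring and component names below are illustrative assumptions, not the project's exact code) might look like this:
// Sketch: the state, callbacks, and contextValue from the excerpt above are
// assembled inside a provider component that wraps the app.
export const AudioEngineProvider = ({ children }: { children: React.ReactNode }) => {
  // ...state, callbacks, and contextValue from the excerpt above...
  return (
    <AudioEngineContext.Provider value={contextValue}>
      {children}
    </AudioEngineContext.Provider>
  )
}

// Typical usage at the application root:
const App = () => (
  <AudioEngineProvider>
    <Project />
  </AudioEngineProvider>
)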
Application Component
The Project page demonstrates how to use the SwitchboardSDK via the AudioEngineContext's useAudioEngine() hook.
// Project.tsx
const { createEngine, startEngine, play, loadSample } = useAudioEngine()
createEngine(JSON.stringify(audioGraphObject))
loadSample('Drums.wav', 'player1')
startEngine()
play()
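In a real component these calls are typically tied to the component lifecycle; the following is a rough sketch under that assumption (the effect wiring is not the project's exact code):
// Sketch: initialize and build the engine on mount, then load and start it once it is ready
import { useEffect } from 'react'

const Project = () => {
  const { isReady, initializeSDK, createEngine, startEngine, play, loadSample } = useAudioEngine()

  useEffect(() => {
    // audioGraphObject comes from assets/AudioGraph.ts, as in the snippet above
    initializeSDK()
    createEngine(JSON.stringify(audioGraphObject))
  }, [])

  useEffect(() => {
    if (isReady) {
      loadSample('Drums.wav', 'player1')
      startEngine()
    }
  }, [isReady])

  // play() can then be wired to a transport control in the UI
  return null
}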
SwitchboardSDK Wrapper
The SwitchboardSDK wrapper provides an interface over the native Turbo Module by handling basic type conversions. Parameters are passed between the JS and C++ layers as strings, so the wrapper includes helper functions like castString() that convert strings back into their underlying types (boolean, number, null, or string).
// SwitchboardSDK.ts
function getValue(objectId: string, key: string) {
return castString(SwitchboardSDKModule.getValue(objectId, key))
}
function setValue(objectId: string, key: string, value: any) {
if (typeof value === 'number' && Number.isInteger(value)) {
return SwitchboardSDKModule.setValue(objectId, key, String(value), 'int')
}
if (typeof value === 'number' && !Number.isInteger(value)) {
return SwitchboardSDKModule.setValue(objectId, key, String(value), 'float')
}
if (typeof value === 'string') {
return SwitchboardSDKModule.setValue(objectId, key, String(value), 'string')
}
}
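The castString() helper itself is not shown here; a possible implementation, assuming values round-trip as their plain string representations, could look like this:
// Sketch: convert a string returned by the native layer back into a JS value
// (the exact string conventions used by the project are assumptions)
function castString(value: string): boolean | number | null | string {
  if (value === 'true') return true
  if (value === 'false') return false
  if (value === 'null') return null
  const asNumber = Number(value)
  if (value.trim() !== '' && !Number.isNaN(asNumber)) return asNumber
  return value
}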
C++ Turbo Module Implementation
Native Module Specification
This spec file defines the native C++ methods and their signatures that we can call from JS.
// specs/NativeSwitchboardSDKCppTurboModule.ts
import type { TurboModule } from 'react-native'
import { TurboModuleRegistry } from 'react-native'
export interface Spec extends TurboModule {
initialize(appId: string, appSecret: string): void
createAudioEngine(json: string): string
callAction(objectId: string, actionName: string, params: string): string
getValue(objectId: string, key: string): string
setValue(objectId: string, key: string, value: string, type: string): void
}
C++ Implementation Header
This is our C++ wrapper that interacts with SwitchboardSDK's C++ API.
void initialize(jsi::Runtime &rt, const std::string &appId, const std::string &appSecret);
std::string createAudioEngine(jsi::Runtime &rt, const std::string &json);
std::string callAction(jsi::Runtime &rt, const std::string &objectId,
const std::string &actionName, const std::string &params);
std::string getValue(jsi::Runtime &rt, const std::string &objectId, const std::string &key);
void setValue(jsi::Runtime &rt, const std::string &objectId, const std::string &key,
const std::string &value, const std::string &type);
Example Features
Synchronized Playback
All audio players are synchronized through a central Timeline node, ensuring perfect timing across all tracks:
// Timeline node controls all players simultaneously
SwitchboardSDK.callAction('timeline', 'start', {})
SwitchboardSDK.callAction('timeline', 'pause', {})
SwitchboardSDK.callAction('timeline', 'stop', {})
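As a small usage sketch, the start and pause actions can be combined into a toggle; this helper is an illustration and not part of the project:
// Sketch: toggling timeline playback from a React component
const [isPlaying, setIsPlaying] = useState(false)

const togglePlayback = () => {
  SwitchboardSDK.callAction('timeline', isPlaying ? 'pause' : 'start', {})
  setIsPlaying(!isPlaying)
}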
Individual Volume Control
Each track has its own gain node for independent volume control:
// Set individual track volumes
setGain('gainNode1', 0.8)
setGain('gainNode2', 1)
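The setGain() helper is a thin wrapper in this example; a possible sketch, assuming the Gain node exposes its level under a 'gain' key, is:
// Sketch: forward a volume change to a Gain node (the 'gain' key is an assumption)
const setGain = (nodeId: string, gain: number) => {
  SwitchboardSDK.setValue(nodeId, 'gain', gain)
}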
Real-Time Position Updates
The application provides real-time playhead position updates for timeline visualization:
const position = SwitchboardSDK.getValue('timeline', 'position')
setPlayheadPosition(position)
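One way to drive these updates is to poll the timeline position on an interval while the engine is running; a sketch under that assumption (the interval length and state wiring are illustrative):
// Sketch: poll the playhead position several times per second and push it into React state
useEffect(() => {
  if (!isEngineRunning) return
  const interval = setInterval(() => {
    const position = SwitchboardSDK.getValue('timeline', 'position')
    setPlayheadPosition(position as number)
  }, 100)
  return () => clearInterval(interval)
}, [isEngineRunning])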
Audio File Loading
SwitchboardSDK.callAction(playerId, 'load', {
audioFilePath: getPath(fileName),
codec: 'wav',
})
Timeline Manipulation
You can set offsets for each player to create custom arrangements:
// Set sample start offset (in seconds)
SwitchboardSDK.setValue('player1', 'offset', 2.5)
Audio Duration Retrieval
Get audio file duration for timeline calculations:
const getAudioDuration = (playerId: string): number => {
return (SwitchboardSDK.getValue(playerId, 'duration') as number) / 1000
}
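Combining the offset property with the duration helper above, a sketch of a simple sequential arrangement (illustrative, not taken from the project) could be:
// Sketch: start player2 right after player1 finishes
const arrangeSequentially = () => {
  const drumsDuration = getAudioDuration('player1')
  SwitchboardSDK.setValue('player2', 'offset', drumsDuration)
}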
Cross-Platform Support
The implementation works seamlessly across:
- iOS: Using libs/ios/SwitchboardSDK.xcframework
- Android: Using libs/android/SwitchboardSDK.aar
- macOS: Using libs/macos/SwitchboardSDK.xcframework
Source Code
You can find the source code at the following link:
Cross-platform Digital Audio Workstation with Switchboard SDK and React Native