How can I implement spatial audio with Swift on iOS 18?

I’m running into an issue implementing spatial audio in my iOS 18 app. I’ve tried a number of approaches to achieve a 3D audio effect, but the result never felt adequate, or it didn’t work at all.

The issue that bothers me most is that my AirPods don’t recognize my app as one that supports spatial audio. The audio settings show “Spatial Audio Not Playing”, which makes me think my app doesn’t use the spatial audio capability at all.
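For what it’s worth, the only session-level opt-in I’m aware of is AVAudioSession’s setSupportsMultichannelContent(_:). Below is a minimal sketch of how it could be added to my session setup; whether it has any effect on that AirPods status is an assumption on my part:

import AVFoundation

// Minimal sketch: audio session setup with the multichannel content opt-in.
// Whether this affects the "Spatial Audio Not Playing" status is unverified.
func configureSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: [])
    // Declare that the app can deliver multichannel (spatializable) content.
    try session.setSupportsMultichannelContent(true)
    try session.setActive(true)
}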

It might be relevant to mention that I’m using a personal (individual) developer account.

My first approach uses AVAudioEnvironmentNode with AVAudioEngine. Despite setting playerNode.position and the listener parameters, the spatial effects don’t seem to work: changing listenerPosition or playerNode.position has no noticeable impact on playback.

Here is a simplified example of how I initialize AVAudioEngine:

import Foundation
import AVFoundation
import Combine

class AudioManager: ObservableObject {
    // Main class variables
    var audioEngine: AVAudioEngine!
    var environmentNode: AVAudioEnvironmentNode!
    var playerNode: AVAudioPlayerNode!
    var audioFile: AVAudioFile?
    ...
    // Sound setup
    func setupAudio() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
        } catch {
            print("Failed to configure AVAudioSession: \(error.localizedDescription)")
        }

        audioEngine = AVAudioEngine()
        environmentNode = AVAudioEnvironmentNode()
        playerNode = AVAudioPlayerNode()
        audioEngine.attach(environmentNode)
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: environmentNode, format: nil)
        audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil)
        environmentNode.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
        environmentNode.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: 0, pitch: 0, roll: 0)
        environmentNode.distanceAttenuationParameters.referenceDistance = 1.0
        environmentNode.distanceAttenuationParameters.maximumDistance = 100.0
        environmentNode.distanceAttenuationParameters.rolloffFactor = 2.0
        // example.mp3 is a mono sound
        guard let audioURL = Bundle.main.url(forResource: "example", withExtension: "mp3") else {
            print("Audio file not found")
            return
        }

        do {
            audioFile = try AVAudioFile(forReading: audioURL)
        } catch {
            print("Failed to load audio file: \(error)")
        }
    }
    ...
    // Playing sound
    func playSpatialAudio(pan: Float) {
        guard let audioFile = audioFile else { return }
        // left side
        playerNode.position = AVAudio3DPoint(x: pan, y: 0, z: 0)
        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)

        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            print("Failed to start audio engine: \(error)")
        }
    ...
}
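One variation I still want to rule out: as far as I understand, AVAudioEnvironmentNode only spatializes mono inputs, so the connection format and the rendering algorithm might matter. Here is a sketch of that change inside setupAudio(); the explicit mono format, the 44.1 kHz sample rate, and the .HRTFHQ choice are assumptions on my part, not something I have confirmed fixes it:

// Sketch: connect the player with an explicit mono format so the environment
// node can spatialize it, and request an HRTF rendering algorithm.
// 44_100 Hz is an assumed sample rate; the mono requirement and the .HRTFHQ
// choice are my reading of the docs, not verified behavior.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
audioEngine.connect(playerNode, to: environmentNode, format: monoFormat)
audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil)
playerNode.renderingAlgorithm = .HRTFHQ // AVAudio3DMixing property on the player node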

My second approach is more complex, using PHASE, and it did better. I made a sample app that lets the user move the audio source in 3D space. I added reverb, plus sliders that change the audio position up to 10 meters in each direction from the listener, but the audio only really seems to change from left to right (the x axis). Again, I think it might be the app not being recognized as spatial (the AirPods settings still show “Spatial Audio Not Playing”).

Here is the example setup:

import AVFoundation
import PHASE
import ModelIO
import simd
import Combine

class PHASEAudioController: ObservableObject {
    // Main class variables:
    private var soundSourcePosition: simd_float4x4 = matrix_identity_float4x4
    private var audioAsset: PHASESoundAsset!
    private let phaseEngine: PHASEEngine
    private let params = PHASEMixerParameters()
    private var soundSource: PHASESource
    private var phaseListener: PHASEListener!
    private var soundEventAsset: PHASESoundEventNodeAsset?

    // Initialization of PHASE
    init() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
        } catch {
            print("Failed to configure AVAudioSession: \(error.localizedDescription)")
        }
        // Init PHASE engine
        phaseEngine = PHASEEngine(updateMode: .automatic)
        phaseEngine.defaultReverbPreset = .mediumHall
        phaseEngine.outputSpatializationMode = .automatic // nothing helps
        // Set listener position to (0,0,0) in world space
        let origin: simd_float4x4 = matrix_identity_float4x4
        phaseListener = PHASEListener(engine: phaseEngine)
        phaseListener.transform = origin
        phaseListener.automaticHeadTrackingFlags = .orientation
        try! self.phaseEngine.rootObject.addChild(self.phaseListener)
        do {
            try self.phaseEngine.start()
        } catch {
            print("Could not start PHASE engine")
        }

        audioAsset = loadAudioAsset() // asset loading/registration is defined elsewhere (omitted here)
        // Create sound source
        // Sphere
        soundSourcePosition.translate(z: 3.0) // translate(z:) is presumably a custom simd_float4x4 extension (not shown)
        let sphere = MDLMesh.newEllipsoid(withRadii: vector_float3(0.1, 0.1, 0.1), radialSegments: 14, verticalSegments: 14, geometryType: MDLGeometryType.triangles, inwardNormals: false, hemisphere: false, allocator: nil)
        let shape = PHASEShape(engine: phaseEngine, mesh: sphere)
        soundSource = PHASESource(engine: phaseEngine, shapes: [shape])
        soundSource.transform = soundSourcePosition
        print(soundSourcePosition)
        do {
            try phaseEngine.rootObject.addChild(soundSource)
        } catch {
            print("Failed to add a child object to the scene.")
        }
        // soundPipeline (presumably a PHASESpatialMixerDefinition) and rolloffFactor are defined elsewhere (omitted here)
        let simpleModel = PHASEGeometricSpreadingDistanceModelParameters()
        simpleModel.rolloffFactor = rolloffFactor
        soundPipeline.distanceModelParameters = simpleModel

        let samplerNode = PHASESamplerNodeDefinition(
            soundAssetIdentifier: audioAsset.identifier,
            mixerDefinition: soundPipeline,
            identifier: audioAsset.identifier + "_SamplerNode")
        samplerNode.playbackMode = .looping
        do {
            soundEventAsset = try phaseEngine.assetRegistry.registerSoundEventAsset(
                rootNode: samplerNode,
                identifier: audioAsset.identifier + "_SoundEventAsset")
        } catch {
            print("Failed to register a sound event asset.")
            soundEventAsset = nil
        }
    }

    // Playing sound
    func playSound() {
        // Fire a new sound event with the currently set properties
        guard let soundEventAsset else { return }

        params.addSpatialMixerParameters(
            identifier: soundPipeline.identifier,
            source: soundSource,
            listener: phaseListener)
        let soundEvent = try! PHASESoundEvent(engine: phaseEngine,
                                              assetIdentifier: soundEventAsset.identifier,
                                              mixerParameters: params)
        soundEvent.start(completion: nil)
    }
    ...
}
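For completeness, the sliders drive the source position roughly like the sketch below (inside PHASEAudioController); updateSourcePosition(x:y:z:) is just an illustrative name, not my exact code:

// Sketch: move the PHASE source by writing a translation into its transform.
// updateSourcePosition(x:y:z:) is a hypothetical helper for illustration.
func updateSourcePosition(x: Float, y: Float, z: Float) {
    var transform = matrix_identity_float4x4
    transform.columns.3 = simd_float4(x, y, z, 1) // translation lives in column 3
    soundSourcePosition = transform
    soundSource.transform = soundSourcePosition
}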

I’ve also experimented with RealityKit, but I’d like to find a solution that doesn’t require an AR view. PHASE seems to be the best option, if only it worked as intended.

What I expect
I expect my app to position audio correctly in 3D space along all axes, like other spatial audio apps available on iOS, and to be recognized as a spatial-audio-enabled app by, for example, AirPods.
