Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic


Audio Volume increases by itself
I am using an AVAudioPlayer to play a "tick" sound once per second in a SwiftUI app. When running the app on an iPhone 16 (iOS 18.2.1), the tick sounds increase in volume after a few seconds. This does not happen in the Simulator or on an iPhone SE 2020 (iOS 18.1.1).
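For reference, a minimal sketch of the setup described above (the class, the timer-driven approach, and the tick.wav resource name are illustrative assumptions, not taken from the post):

import AVFoundation

final class TickPlayer {
    private var player: AVAudioPlayer?
    private var timer: Timer?

    func start() throws {
        // Hypothetical resource name; the original post does not name the file.
        guard let url = Bundle.main.url(forResource: "tick", withExtension: "wav") else { return }
        player = try AVAudioPlayer(contentsOf: url)
        player?.volume = 1.0          // the volume is set once and never changed afterwards
        player?.prepareToPlay()
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            self?.player?.currentTime = 0
            self?.player?.play()      // replay the same short sound once per second
        }
    }

    func stop() {
        timer?.invalidate()
        player?.stop()
    }
}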
Replies: 2 · Boosts: 0 · Views: 277 · Created: Jan ’25
App crashes when opening camera from file input in WKWebView (iOS)
Hello everyone, I have a SwiftUI app using WKWebView to load a website that includes a form with a file input (). The issue is: 📌 When a user taps “Browse” and selects “Take Photo” (camera option), the app crashes before the camera opens. Setup Details: • App Uses SwiftUI with WKWebView • The crash occurs only when selecting “Take Photo”, but selecting an image from the library works fine. 📌 Full Code (WKWebView in SwiftUI) import SwiftUI import WebKit struct WebViewRepresentable: UIViewRepresentable { var urlString: String func makeUIView(context: Context) -> WKWebView { let webView = WKWebView() webView.configuration.allowsInlineMediaPlayback = true webView.configuration.mediaTypesRequiringUserActionForPlayback = [] loadURL(in: webView) return webView } func updateUIView(_ uiView: WKWebView, context: Context) { loadURL(in: uiView) } private func loadURL(in webView: WKWebView) { if let url = URL(string: urlString) { webView.load(URLRequest(url: url)) } } } struct ContentView: View { @State private var currentURL: String = "https://fv-wohlensee.ch" var body: some View { VStack(spacing: 0) { // Oberer Bereich in Grün Color(red: 0, green: 0.4, blue: 0) .frame(height: 50) // WebView with white background WebViewRepresentable(urlString: currentURL) .background(Color.white) Divider() // Navigation buttons HStack(spacing: 10) { Button { currentURL = "https://fv-wohlensee.ch/vereinshaus-eymatt/" } label: { VStack { Image(systemName: "house") .font(.system(size: 18)) Text("Klubhaus") .font(.system(size: 12)) .minimumScaleFactor(0.7) .lineLimit(1) } .padding(8) } .foregroundColor(.white) .frame(maxWidth: .infinity) Button { currentURL = "https://fv-wohlensee.ch/vereinsboot/" } label: { VStack { Image(systemName: "ferry.fill") .font(.system(size: 18)) Text("Boot") .font(.system(size: 12)) .minimumScaleFactor(0.7) .lineLimit(1) } .padding(8) } .foregroundColor(.white) .frame(maxWidth: .infinity) Button { currentURL = "https://fv-wohlensee.ch/aktivitaeten/" } label: { VStack { Image(systemName: "calendar") .font(.system(size: 18)) Text("Aktivitäten") .font(.system(size: 12)) .minimumScaleFactor(0.7) .lineLimit(1) } .padding(8) } .foregroundColor(.white) .frame(maxWidth: .infinity) Button { currentURL = "https://fv-wohlensee.ch/mitglied-werden/" } label: { VStack { Image(systemName: "person.badge.plus") .font(.system(size: 18)) Text("Mitglied") .font(.system(size: 12)) .minimumScaleFactor(0.7) .lineLimit(1) } .padding(8) } .foregroundColor(.white) .frame(maxWidth: .infinity) } .padding(.horizontal, 15) .padding(.vertical, 10) .background(Color(red: 0, green: 0.4, blue: 0)) } .frame(maxWidth: .infinity, maxHeight: .infinity) .background(Color(red: 0, green: 0.4, blue: 0)) .ignoresSafeArea() } } struct ContentView_Previews: PreviewProvider { static var previews: some View { ContentView() } } What I’ve Tried: 1️⃣ Checked Info.plist: Added permissions for camera and photo library: <key>NSCameraUsageDescription</key> <string>This app requires access to the camera to upload photos.</string> <key>NSPhotoLibraryUsageDescription</key> <string>This app requires access to your photo library.</string> 2️⃣ Enabled Media Capture in WKWebView: webView.configuration.allowsInlineMediaPlayback = true webView.configuration.mediaTypesRequiringUserActionForPlayback = [] 3️⃣ Tested in Safari: The same form works fine when opened in Safari. Questions: ❓ Does WKWebView need additional permissions to open the camera? ❓ Do I need to implement a delegate to handle file uploads in SwiftUI? 
❓ Has anyone faced this issue and found a fix? Any guidance would be greatly appreciated! 🚀 Thanks in advance! 😊
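One detail worth checking (an observation about the code above, not a confirmed fix): WKWebView takes a copy of its configuration at initialization, so mutating webView.configuration after creating the view may not have the intended effect. A hedged sketch of the documented pattern, passing the configuration to the initializer:

import WebKit

func makeUIView(context: Context) -> WKWebView {
    let configuration = WKWebViewConfiguration()
    configuration.allowsInlineMediaPlayback = true
    configuration.mediaTypesRequiringUserActionForPlayback = []

    // Pass the configuration up front instead of mutating webView.configuration afterwards.
    let webView = WKWebView(frame: .zero, configuration: configuration)
    loadURL(in: webView)
    return webView
}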
Replies: 1 · Boosts: 1 · Views: 467 · Created: Jan ’25
Slow decoding of animated AVIF images
Safari is supposed to support animated AVIF images since version 16, but the ones I've tested perform very poorly, even on an M4 Mac mini running Sequoia 15.1.1. I believe Safari delegates decoding to the operating system itself, so the issue also happens in the Finder's preview when I try to preview the file. Sample file here: https://s3.us-west-2.amazonaws.com/cdn.paintera.org/test/sample.avif (322 KB, 5 seconds long, 12 fps). It plays perfectly in Chrome on macOS, but is slow and laggy in Safari and in the Finder preview (it takes about 6.5 seconds to finish the 5-second animation). Does anyone know how to fix or work around this issue?
Replies: 2 · Boosts: 1 · Views: 569 · Created: Jan ’25
On iOS 18, Mandarin is read aloud as Cantonese
Please include the line below in follow-up emails for this request. Case-ID: 11089799
When using AVSpeechUtterance with a Mandarin voice, the text is spoken in Cantonese if Siri is set to Cantonese on iOS 18. There is no such issue on iOS 17 or 16.
Steps to reproduce:
1. Create the utterance with a Mandarin voice:
let utterance = AVSpeechUtterance(string: textView.text)
let voice = AVSpeechSynthesisVoice(language: "zh-CN")
utterance.voice = voice
2. In the phone settings, set Siri to Cantonese.
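A possible workaround to try (an assumption, not a confirmed fix): select a concrete Mandarin voice by identifier instead of relying only on the language code, so the Siri language setting cannot redirect the selection.

import AVFoundation

let synthesizer = AVSpeechSynthesizer()   // keep a strong reference for the duration of speech

func speakMandarin(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    // Pick a specific zh-CN voice by identifier rather than asking for the language generically.
    if let mandarinVoice = AVSpeechSynthesisVoice.speechVoices().first(where: { $0.language == "zh-CN" }) {
        utterance.voice = AVSpeechSynthesisVoice(identifier: mandarinVoice.identifier)
    }
    synthesizer.speak(utterance)
}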
Replies: 3 · Boosts: 1 · Views: 537 · Created: Jan ’25
ScreenCaptureKit crash
We are having issues with ScreenCaptureKit. Our use case is the following: we have multiple applications that each start a stream capture using ScreenCaptureKit. It works fine when just one application is streaming, but when multiple streams are started continuously, all streams stop or crash without ScreenCaptureKit reporting an error back. Restarting replayd for the user lets us start streaming again, provided the streaming applications are restarted too. We have built a small test program and run it on different Macs with different versions of macOS, with identical results. The test program just calls the 'getShareableContentExcludingDesktopWindows' function multiple times, since this was the simplest way to show the problem.
Our test setup:
Mac Mini M2, macOS 13.6
MacBook Pro M3, macOS 14.4
MacBook Pro M3, macOS 15.1
Code (main.m):

#import <Foundation/Foundation.h>
#import <ScreenCaptureKit/ScreenCaptureKit.h>

@interface Runner : NSObject
@property(atomic, assign) BOOL keepRunning;
@property(atomic, retain) SCShareableContent* availableWindows;
-(void) updateAvailableWindows;
-(void) notif:(NSNotification*) aNotif;
@end

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSRunLoop* loo = [NSRunLoop mainRunLoop];
        Runner* r = [[Runner alloc] init];
        [r updateAvailableWindows];
        while (r.keepRunning)
            [loo runUntilDate:[NSDate distantFuture]];
        NSLog(@"Program exit");
    }
    return 0;
}

@implementation Runner
-(instancetype) init {
    self = [super init];
    if(self){
        _keepRunning = YES;
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasNotFound" object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasFound" object:nil];
    }
    return self;
}

-(void) updateAvailableWindows {
    @autoreleasepool {
        NSLog(@"begin");
        [SCShareableContent getShareableContentExcludingDesktopWindows:NO
                                                   onScreenWindowsOnly:YES
                                                     completionHandler:^(SCShareableContent* content, NSError* error){
            NSLog(@"running");
            if(!error){
                self.availableWindows = content;
                for (__unused SCWindow* aWin in self.availableWindows.windows) {
                    [NSThread sleepForTimeInterval:0.01];
                }
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasFound" object:self.availableWindows];
            }
            else{
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasNotFound" object:nil];
            }
        }];
    }
}

-(void) notif:(NSNotification*) aNotif {
    if(aNotif.object)
        [self performSelectorOnMainThread:@selector(updateAvailableWindows) withObject:nil waitUntilDone:NO];
    else{
        NSLog(@"Not Found");
        self.keepRunning = NO;
    }
}
@end

How to replicate: compile and run the program from multiple Terminal windows (Terminal must be granted screen-recording permission) and notice that the output stops when replayd stops responding (our assumption). Restarting the application does nothing until replayd is also restarted. Running the application in Xcode gives the following error:
[ERROR] -[RPDaemonProxy fetchShareableContentWithOption:windowID:currentProcess:withCompletionHandler:]_block_invoke:902 error: 4097
We have not been able to detect this error in our applications, and since the only workaround is restarting replayd and the applications, being able to catch the error would help us.
Replies: 2 · Boosts: 0 · Views: 675 · Created: Jan ’25
Handling AVAudioEngine Configuration Change
Hi all, I have been quite stumped by this behavior for a little while now, so I thought it best to share here and see if someone more experienced with AVAudioEngine / AVAudioSession can weigh in. Right now I have an AVAudioEngine that I am using for voice chat, handing it buffers to play. This works perfectly until route changes start to occur, which cause the AVAudioEngine to reset itself, which in turn stops all players attached to the engine. Once an AVPlayerNode gets stopped because of this (but also any other time), all samples that were scheduled to be played get purged. Where this becomes confusing for me is that the completion handler gets called every time, regardless of whether the sound was actually played. Is there a reliable way to know if a sample needs to be rescheduled after a player has been reset? I am not quite sure what my observer of AVAudioEngineConfigurationChange needs to be doing in my case, as this engine only handles output; all input goes through a separate engine for simplicity. Currently I store a queue of samples as they get sent to the AVPlayerNode for playback, and in the completion handler I check whether the player isPlaying or not. If it is playing, I assume the sound was actually played; if not, I leave it in the queue and assume that an observer of the route change or the configuration change will notice there are samples in the queue and reschedule them. Thanks for any feedback!
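A rough sketch of one way to structure this, under the assumption (not stated in the post) that rescheduling everything still in the pending queue after a configuration change is acceptable; names such as OutputEngine and pendingBuffers are illustrative.

import AVFoundation

final class OutputEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    var pendingBuffers: [AVAudioPCMBuffer] = []   // handed to the player but not yet confirmed played
    private var configObserver: NSObjectProtocol?

    init(format: AVAudioFormat) {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)

        configObserver = NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: engine,
            queue: .main
        ) { [weak self] _ in
            self?.handleConfigurationChange(format: format)
        }
    }

    private func handleConfigurationChange(format: AVAudioFormat) {
        // The engine has stopped after a configuration change: reconnect, restart,
        // then reschedule whatever was still queued.
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try? engine.start()
        for buffer in pendingBuffers {
            player.scheduleBuffer(buffer, completionHandler: nil)
        }
        player.play()
    }

    func schedule(_ buffer: AVAudioPCMBuffer) {
        pendingBuffers.append(buffer)
        player.scheduleBuffer(buffer) { [weak self] in
            DispatchQueue.main.async {
                // Completion fires even if the buffer was purged; treating it as consumed
                // here is a simplification of the bookkeeping the post describes.
                if self?.pendingBuffers.isEmpty == false { self?.pendingBuffers.removeFirst() }
            }
        }
    }
}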
Replies: 3 · Boosts: 0 · Views: 635 · Created: Jan ’25
How to start AVPictureInPicture when the video is paused
I have AVPlayer with AVPictureInPictureController. Play video in app and picture In Picture works except one situation. Issue is: I pause video in application and during switch to background is not PiP activate. What do I wrong? import UIKit import AVKit import AVFoundation class ViewControllerSec: UIViewController,AVPictureInPictureControllerDelegate { var pipPlayer: AVPlayer! var avCanvas : UIView! var pipCanvas: AVPlayerLayer? var pipController: AVPictureInPictureController! var mainViewControler : UIViewController! var playerItem : AVPlayerItem! var videoAvasset : AVAsset! public func link(to parentViewController : UIViewController) { mainViewControler = parentViewController setup() } @objc func appWillResignActiveNotification(application: UIApplication) { guard let pipController = pipController else { print("PiP not supported") return } print("PIP isSuspend: \(pipController.isPictureInPictureSuspended)") print("PIP isPossible: \(pipController.isPictureInPicturePossible)" if playerItem.status == .readyToPlay { if pipPlayer.rate == 0 { pipPlayer.play() } pipController.startPictureInPicture(). ---> Errorin log: Failed to start picture in picture. } else { print("Player not ready for PiP.") } } private func setupAudio() { do { let session = AVAudioSession.sharedInstance() try session.setCategory(.playback, mode: .moviePlayback) try session.setActive(true) } catch { print("Audio session setup failed: \(error.localizedDescription)") } } @objc func playerItemDidFailToPlayToEnd(_ notification: Notification) { if let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error { print("Failed to play to end: \(error.localizedDescription)") } } func setup() { setupAudio() guard let videoURL = URL(string: "https://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.mp4/.m3u8") else { return } videoAvasset = AVAsset(url: videoURL) playerItem = AVPlayerItem(asset: videoAvasset) addPlayerObservers() pipPlayer = AVPlayer(playerItem: playerItem) avCanvas = UIView(frame: view.bounds) pipCanvas = AVPlayerLayer(player: pipPlayer) guard let pipCanvas else { return } pipCanvas.frame = avCanvas.bounds //pipCanvas.videoGravity = .resizeAspectFill mainViewControler.view.addSubview(avCanvas) avCanvas.layer.addSublayer(pipCanvas) if AVPictureInPictureController.isPictureInPictureSupported() { pipController = AVPictureInPictureController(playerLayer: pipCanvas) pipController?.delegate = self pipController?.canStartPictureInPictureAutomaticallyFromInline = true } let playButton = UIButton(frame: CGRect(x: 20, y: 50, width: 100, height: 50)) playButton.setTitle("Play", for: .normal) playButton.backgroundColor = .blue playButton.addTarget(self, action: #selector(playTapped), for: .touchUpInside) mainViewControler.view.addSubview(playButton) let pauseButton = UIButton(frame: CGRect(x: 140, y: 50, width: 100, height: 50)) pauseButton.setTitle("Pause", for: .normal) pauseButton.backgroundColor = .red pauseButton.addTarget(self, action: #selector(pauseTapped), for: .touchUpInside) mainViewControler.view.addSubview(pauseButton) let pipButton = UIButton(frame: CGRect(x: 260, y: 50, width: 150, height: 50)) pipButton.setTitle("Start PiP", for: .normal) pipButton.backgroundColor = .green pipButton.addTarget(self, action: #selector(startPictureInPicture), for: .touchUpInside) mainViewControler.view.addSubview(pipButton) print("Error:\(String(describing: pipPlayer.error?.localizedDescription))") NotificationCenter.default.addObserver(forName: 
UIApplication.didEnterBackgroundNotification, object: nil, queue: nil) { [weak self] _ in guard let self = self else { return } if self.pipPlayer.rate == 0 { self.pipPlayer.play() pipController?.startPictureInPicture() } } func addPlayerObservers() { playerItem?.addObserver(self, forKeyPath: "status", options: [.old, .new], context: nil) NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying(_:)), name: .AVPlayerItemDidPlayToEndTime, object: playerItem) } override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) { if keyPath == "status" { if let statusNumber = change?[.newKey] as? NSNumber { let status = AVPlayer.Status(rawValue: statusNumber.intValue)! switch status { case .readyToPlay: print("Player is ready to play") case .failed: print("Player failed: \(String(describing: playerItem?.error))") case .unknown: print("Player status is unknown") @unknown default: fatalError() } } } } @objc func playerDidFinishPlaying(_ notification: Notification) { print("Video finished playing.") } deinit { playerItem?.removeObserver(self, forKeyPath: "status") NotificationCenter.default.removeObserver(self) } @objc func playTapped() { pipPlayer.play() } @objc func pauseTapped() { pipPlayer.pause() } @objc func startPictureInPicture() { if let pipController = pipController, !pipController.isPictureInPictureActive { pipController.startPictureInPicture() } } @objc func stopPictureInPicture() { if let pipController = pipController, pipController.isPictureInPictureActive { pipController.stopPictureInPicture() } } func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: Error) { print("Failed to start PiP: \(error.localizedDescription)") if let underlyingError = (error as NSError).userInfo[NSUnderlyingErrorKey] { print("Underlying error: \(underlyingError)") } } }
Replies: 1 · Boosts: 0 · Views: 546 · Created: Jan ’25
ProRAW to CIRAWFilter to HEIF producing borked HDR results
Following WWDC 2023 "Support HDR images in your app", I'm trying to save 48-megapixel ProRAWs (taken on an iPhone 14 Pro Max) as HDR HEICs to the Photo Library. After processing the ProRAW file using CIRAWFilter, whether I use CIContext.heif10Representation() or convert to a CGImage, then UIImage, and use UIImage.heicData(), I get photos that behave oddly in the Photo Library. They appear too dark, and visibly brighten when first viewed, but more problematic is that the photos brighten a great deal more when you edit them with the Photos editor. This is the behavior when using the itur_2100_PQ color space, but itur_2100_HLG behaves similarly, except that it gets dramatically darker when edited. This behavior occurs whether CIRAWFilter.extendedDynamicRangeAmount is set to 0.0, or 2.0, or not set at all. So what am I doing wrong? Here is a minimal iOS app -- well, just the ContentView -- that demonstrates the issue. You also need a .dng ProRAW file included in the project directory named test.dng. I'd love to include such a file, but I can't. Be prepared for a multi-second wait when you save the photo. import SwiftUI import Photos struct ContentView: View { let context = CIContext() let hdrColorSpace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)! var body: some View { VStack(spacing: 100) { Button("Save Photo From CGImage/UIImage") { savePhotoFromUIImage() } Button("Save Photo From CIImage") { savePhotoDirectFromCIImage() } }.padding(60) } //convert RAW with CIRAWFilter to CIImage, then convert to CGImage, then UIImage, then HEIF private func savePhotoFromUIImage() { if let ciImage = processRAW(url: Bundle.main.url(forResource:"test", withExtension: "dng")!) { guard let outputCGImage = context.createCGImage(ciImage, from: ciImage.extent, format: .RGB10, colorSpace: hdrColorSpace) else { return } let uiImage = UIImage(cgImage: outputCGImage) if let heicData = uiImage.heicData() { saveHEIFPhotoToLibrary(imageData: heicData) } else { print("Failed to convert UIImage to HEIC") } } } //convert RAW with CIRAWFilter to CIImage, then to HEIF private func savePhotoDirectFromCIImage() { if let ciImage = processRAW(url: Bundle.main.url(forResource:"test", withExtension: "dng")!) { do { let heif = try context.heif10Representation(of: ciImage, colorSpace: hdrColorSpace) saveHEIFPhotoToLibrary(imageData: heif) } catch { print("Failed to get HEIF representation from CIContext") } } } private func processRAW(url: URL) -> CIImage? { guard let coreRawFilter = CIRAWFilter(imageURL: url) else { return nil } coreRawFilter.extendedDynamicRangeAmount = 2.0 //the issue persists whether this is not set, or set to 0, or set to, say, 2.0 guard let ciImage = coreRawFilter.outputImage else { return nil } return ciImage } private func saveHEIFPhotoToLibrary(imageData: Data) { PHPhotoLibrary.shared().performChanges({ let creationRequest = PHAssetCreationRequest.forAsset() let options = PHAssetResourceCreationOptions() creationRequest.addResource(with: .photo, data: imageData, options: options) }) { success, error in if let error = error { print("Error saving photo: \(error.localizedDescription)") } else { print("Photo saved.") } } } }
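Not a confirmed fix, but one variation that may help narrow things down: write the 10-bit HEIF to a temporary file and add it to the library by file URL instead of in-memory data, so any difference between the two save paths becomes visible. This sketch reuses the context and hdrColorSpace properties from the ContentView above.

private func savePhotoViaTemporaryFile(ciImage: CIImage) {
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("test.heic")
    do {
        // Same CIContext and HDR color space as defined in the post's ContentView.
        try context.writeHEIF10Representation(of: ciImage, to: url, colorSpace: hdrColorSpace, options: [:])
    } catch {
        print("Failed to write HEIF: \(error)")
        return
    }
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, fileURL: url, options: nil)
    }) { success, error in
        print(error.map { "Error saving photo: \($0.localizedDescription)" } ?? "Photo saved.")
    }
}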
Replies: 0 · Boosts: 1 · Views: 584 · Created: Jan ’25
Garageband displaying error 100001 when loading up some AU plugins
I recently got some plugins from Universal Audio and have licensed them properly through both UA and iLok manager. Whenever I try to load the plugins (specifically the UA ones) in GarageBand, it first shows an "NSCreateObjectFileImageFromMemory-p47UEwps" message because the developer cannot be verified. After clicking either 'Show in Finder' or 'OK', the plugin opens without its GUI and reports that it is not licensed (even though it is). It also displays error code 100001. I have only tried some basic troubleshooting, like restarting the DAW and my computer and reinstalling/relicensing the software. I don't know if the macOS version has anything to do with it, but for some reason I just can't get it to work.
Replies: 1 · Boosts: 0 · Views: 385 · Created: Jan ’25
Usage of colorCurves CIFilter
How can I use my RGB curve points:
let redCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.152), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let greenCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.247, y: 0.196), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let blueCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.184), CIVector(x: 0.466, y: 0.466), CIVector(x: 1, y: 1)]
in the colorCurves filter which I've found in the Apple docs:
func colorCurves(inputImage: CIImage) -> CIImage {
    let colorCurvesEffect = CIFilter.colorCurves()
    colorCurvesEffect.inputImage = inputImage
    colorCurvesEffect.curvesDomain = CIVector(x: 0, y: 1)
    colorCurvesEffect.curvesData = Data(
        bytes: [Float32]([
            0.0, 0.0, 0.0,
            0.8, 0.8, 0.8,
            1.0, 1.0, 1.0
        ]), count: 36)
    colorCurvesEffect.colorSpace = CGColorSpaceCreateDeviceRGB()
    return colorCurvesEffect.outputImage!
}
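Not an official recipe, but one way to bridge the gap, assuming curvesData is interpreted as evenly spaced RGB samples across curvesDomain: sample each channel's control points by linear interpolation and pack the result as Float32 triples. The helper names below are illustrative.

import CoreImage
import CoreImage.CIFilterBuiltins

// Linearly interpolate a channel's control points at `samples` evenly spaced x positions.
func sampleCurve(_ points: [CIVector], samples: Int) -> [Float] {
    (0..<samples).map { i in
        let x = CGFloat(i) / CGFloat(samples - 1)
        guard let upper = points.firstIndex(where: { $0.x >= x }) else { return Float(points.last!.y) }
        if upper == 0 { return Float(points[0].y) }
        let p0 = points[upper - 1], p1 = points[upper]
        let t = (x - p0.x) / (p1.x - p0.x)
        return Float(p0.y + t * (p1.y - p0.y))
    }
}

func colorCurves(inputImage: CIImage, red: [CIVector], green: [CIVector], blue: [CIVector]) -> CIImage {
    let samples = 32
    let r = sampleCurve(red, samples: samples)
    let g = sampleCurve(green, samples: samples)
    let b = sampleCurve(blue, samples: samples)
    var interleaved: [Float32] = []
    for i in 0..<samples { interleaved += [r[i], g[i], b[i]] }   // one RGB triple per sample

    let filter = CIFilter.colorCurves()
    filter.inputImage = inputImage
    filter.curvesDomain = CIVector(x: 0, y: 1)
    filter.curvesData = interleaved.withUnsafeBufferPointer { Data(buffer: $0) }
    filter.colorSpace = CGColorSpaceCreateDeviceRGB()
    return filter.outputImage ?? inputImage
}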
Replies: 0 · Boosts: 0 · Views: 372 · Created: Jan ’25
Logic Pro loads AUv3 when compiled in Swift 5 but not Swift 6
I have spent a long time refactoring a lot of older Swift code to compile without error in Swift 6. The app is a v3 audio unit host and audio unit. Having installed Sonoma and Xcode 16, I compile the code using Swift 6 and it compiles and runs without any warnings or errors. My host will load my AU no problem. Logic Pro is still the ONLY audio unit host that will load native Mac v3 audio units, so I like to test my code using Logic. In Sonoma with Xcode 16, my AU passes the most stringent auval tests, both in Terminal and in Logic Pro. If I compile the AU source in Swift 5, Logic will see the AU, load it, and run it without problems. But when I compile the AU in Swift 6, Logic sees the AU, scans it, and verifies that it passes the tests, but will not load the AU. In Xcode I see a log message that a "helper application failed to run", but the debugger never connects to the AU and I don't think Logic even gets as far as instantiating the AU. So what is causing this? I'm stumped. Developing AUv3 is a brain-aching maze of undocumented hurdles, and I'm hoping someone might have found a solution for this one. Meanwhile I guess my only option is to continue using the Swift 5 compiler. (Appending a little note just to mention that all the DSP code is written in C/C++; Swift is used mainly for the user interface and also does some offline thready work.)
Replies: 1 · Boosts: 0 · Views: 513 · Created: Jan ’25
Spatial Audio on iOS 18 doesn't work as intended
I’m facing a problem while trying to achieve spatial audio effects in my iOS 18 app. I have tried several approaches to get good 3D audio, but the effect never felt good enough or it didn’t work at all. Also what mostly troubles me is I noticed that AirPods I have doesn’t recognize my app as one having spatial audio (in audio settings it shows "Spatial Audio Not Playing"). So i guess my app doesn't use spatial audio potential. First approach uses AVAudioEnviromentNode with AVAudioEngine. Chaining position of player as well as changing listener’s doesn’t seem to change anything in how audio plays. Here's simple how i initialize AVAudioEngine import Foundation import AVFoundation class AudioManager: ObservableObject { // important class variables var audioEngine: AVAudioEngine! var environmentNode: AVAudioEnvironmentNode! var playerNode: AVAudioPlayerNode! var audioFile: AVAudioFile? ... //Sound set up func setupAudio() { do { let session = AVAudioSession.sharedInstance() try session.setCategory(.playback, mode: .default, options: []) try session.setActive(true) } catch { print("Failed to configure AVAudioSession: \(error.localizedDescription)") } audioEngine = AVAudioEngine() environmentNode = AVAudioEnvironmentNode() playerNode = AVAudioPlayerNode() audioEngine.attach(environmentNode) audioEngine.attach(playerNode) audioEngine.connect(playerNode, to: environmentNode, format: nil) audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil) environmentNode.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0) environmentNode.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: 0, pitch: 0, roll: 0) environmentNode.distanceAttenuationParameters.referenceDistance = 1.0 environmentNode.distanceAttenuationParameters.maximumDistance = 100.0 environmentNode.distanceAttenuationParameters.rolloffFactor = 2.0 // example.mp3 is mono sound guard let audioURL = Bundle.main.url(forResource: "example", withExtension: "mp3") else { print("Audio file not found") return } do { audioFile = try AVAudioFile(forReading: audioURL) } catch { print("Failed to load audio file: \(error)") } } ... //Playing sound func playSpatialAudio(pan: Float ) { guard let audioFile = audioFile else { return } // left side playerNode.position = AVAudio3DPoint(x: pan, y: 0, z: 0) playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil) do { try audioEngine.start() playerNode.play() } catch { print("Failed to start audio engine: \(error)") } ... } Second more complex approach using PHASE did better. I’ve made an exemplary app that allows players to move audio player in 3D space. I have added reverb, and sliders changing audio position up to 10 meters each direction from listener but audio seems to only really change left to right (x axis) - again I think it might be trouble with the app not being recognized as spatial. //Crucial class Variables: class PHASEAudioController: ObservableObject{ private var soundSourcePosition: simd_float4x4 = matrix_identity_float4x4 private var audioAsset: PHASESoundAsset! private let phaseEngine: PHASEEngine private let params = PHASEMixerParameters() private var soundSource: PHASESource private var phaseListener: PHASEListener! private var soundEventAsset: PHASESoundEventNodeAsset? 
// Initialization of PHASE init{ do { let session = AVAudioSession.sharedInstance() try session.setCategory(.playback, mode: .default, options: []) try session.setActive(true) } catch { print("Failed to configure AVAudioSession: \(error.localizedDescription)") } // Init PHASE Engine phaseEngine = PHASEEngine(updateMode: .automatic) phaseEngine.defaultReverbPreset = .mediumHall phaseEngine.outputSpatializationMode = .automatic //nothing helps // Set listener position to (0,0,0) in World space let origin: simd_float4x4 = matrix_identity_float4x4 phaseListener = PHASEListener(engine: phaseEngine) phaseListener.transform = origin phaseListener.automaticHeadTrackingFlags = .orientation try! self.phaseEngine.rootObject.addChild(self.phaseListener) do{ try self.phaseEngine.start(); } catch { print("Could not start PHASE engine") } audioAsset = loadAudioAsset() // Create sound Source // Sphere soundSourcePosition.translate(z:3.0) let sphere = MDLMesh.newEllipsoid(withRadii: vector_float3(0.1,0.1,0.1), radialSegments: 14, verticalSegments: 14, geometryType: MDLGeometryType.triangles, inwardNormals: false, hemisphere: false, allocator: nil) let shape = PHASEShape(engine: phaseEngine, mesh: sphere) soundSource = PHASESource(engine: phaseEngine, shapes: [shape]) soundSource.transform = soundSourcePosition print(soundSourcePosition) do { try phaseEngine.rootObject.addChild(soundSource) } catch { print ("Failed to add a child object to the scene.") } let simpleModel = PHASEGeometricSpreadingDistanceModelParameters() simpleModel.rolloffFactor = rolloffFactor soundPipeline.distanceModelParameters = simpleModel let samplerNode = PHASESamplerNodeDefinition( soundAssetIdentifier: audioAsset.identifier, mixerDefinition: soundPipeline, identifier: audioAsset.identifier + "_SamplerNode") samplerNode.playbackMode = .looping do {soundEventAsset = try phaseEngine.assetRegistry.registerSoundEventAsset( rootNode: samplerNode, identifier: audioAsset.identifier + "_SoundEventAsset") } catch { print("Failed to register a sound event asset.") soundEventAsset = nil } } //Playing sound func playSound(){ // Fire new sound event with currently set properties guard let soundEventAsset else { return } params.addSpatialMixerParameters( identifier: soundPipeline.identifier, source: soundSource, listener: phaseListener) let soundEvent = try! PHASESoundEvent(engine: phaseEngine, assetIdentifier: soundEventAsset.identifier, mixerParameters: params) soundEvent.start(completion: nil) } ... } Also worth mentioning might be that I only own personal team account
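Two details that might matter for the AVAudioEngine approach, offered as assumptions rather than a confirmed fix: the audio session can declare that the app renders multichannel/spatial content, and AVAudioEnvironmentNode only spatializes mono inputs, so the player should be connected with the mono file's own format rather than format: nil. A sketch reusing the audioEngine, environmentNode, and playerNode properties from the AudioManager above:

func setupSpatialOutput(for file: AVAudioFile) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default)
    try session.setSupportsMultichannelContent(true)   // declare spatial/multichannel content (iOS 15+)
    try session.setActive(true)

    audioEngine.attach(environmentNode)
    audioEngine.attach(playerNode)

    // The environment node only spatializes mono sources; stereo content is passed
    // through unspatialized, so connect the player using the mono file's own format.
    audioEngine.connect(playerNode, to: environmentNode, format: file.processingFormat)
    audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil)

    playerNode.renderingAlgorithm = .HRTFHQ   // per-source 3D rendering algorithm
}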
Replies: 4 · Boosts: 0 · Views: 950 · Created: Jan ’25
How to access HDRGainMap from AVCapturePhoto
Hey, I'm building a camera app and I want to use the captured HDRGainMap alongside the photo to do some processing with a CIFilter chain. How can this be done? I can't find any documentation anywhere on this, only on how to access the HDRGainMap from an existing HEIC file, which I have done successfully. For that I'm doing something like the following:
let gainmap = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap)
let gainDict = NSDictionary(dictionary: gainmap)
let gainData = gainDict[kCGImageAuxiliaryDataInfoData] as? Data
let gainDescription = gainDict[kCGImageAuxiliaryDataInfoDataDescription]
let gainMeta = gainDict[kCGImageAuxiliaryDataInfoMetadata]
However, I'm not sure what the approach is with an AVCapturePhoto output from an AVCaptureDevice. Thanks!
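One approach that may work (an assumption, not something confirmed by documentation): AVCapturePhoto can return its full container via fileDataRepresentation(), and that data can be fed to the same CGImageSource auxiliary-data call used for HEIC files above.

import AVFoundation
import ImageIO

func hdrGainMapInfo(from photo: AVCapturePhoto) -> NSDictionary? {
    // The full container (HEIC/DNG) as captured, including auxiliary images.
    guard let data = photo.fileDataRepresentation(),
          let source = CGImageSourceCreateWithData(data as CFData, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) else {
        return nil
    }
    return info as NSDictionary
}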
Replies: 2 · Boosts: 0 · Views: 696 · Created: Jan ’25
PIP Camera in iOS App
I am developing an iOS app with video call functionality and implementing Picture in Picture (PiP) mode for video calls. The issue I am facing is that the camera stops capturing video when the app goes to the background, even though the PiP view is still visible. I have noticed that some apps, like Telegram, manage to keep the camera working in PiP mode while the app is in the background. How can I achieve this in my app?
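Not a complete answer, but on iOS 16 and later AVCaptureSession exposes multitasking camera access, which is what allows capture to continue while the app is in the background on supported devices; video-call apps may additionally need the com.apple.developer.avfoundation.multitasking-camera-access entitlement (treat that detail as an assumption). A sketch:

import AVFoundation

func configureSessionForBackgroundCapture(_ session: AVCaptureSession) {
    session.beginConfiguration()
    if session.isMultitaskingCameraAccessSupported {
        // Lets the capture session keep running when the app moves to the background.
        session.isMultitaskingCameraAccessEnabled = true
    }
    session.commitConfiguration()
}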
Replies: 1 · Boosts: 0 · Views: 588 · Created: Jan ’25
Macro mode in AVCaptureDevice (custom camera)
Hi, I would like to use macro mode with a custom camera using AVCaptureDevice in my project. This feature can automatically adjust and switch between lenses to get a clear close-up image. It looks like this feature is not available and there are no open APIs from Apple to achieve macro mode. Is there a way to get this functionality in a custom camera without losing image quality? Please let me know if this is possible. Thank you, Adil Thamarasseri
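There is no public "macro mode" switch, but one hedged suggestion: select a virtual device (for example .builtInTripleCamera or .builtInDualWideCamera) so the system handles switching between constituent cameras, which is what enables the close-up behavior on supported iPhones; whether this matches the Camera app's macro behavior exactly is an assumption.

import AVFoundation

func makeVirtualCameraInput() throws -> AVCaptureDeviceInput? {
    // Virtual devices let AVFoundation switch between constituent cameras
    // (wide / ultra wide / tele) automatically, including for close-up subjects.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTripleCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    guard let device = discovery.devices.first else { return nil }

    // Keep automatic switching between constituent devices enabled (the default behavior).
    if device.isVirtualDevice {
        try device.lockForConfiguration()
        device.setPrimaryConstituentDeviceSwitchingBehavior(.auto, restrictedSwitchingBehaviorConditions: [])
        device.unlockForConfiguration()
    }
    return try AVCaptureDeviceInput(device: device)
}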
Replies: 7 · Boosts: 1 · Views: 1.3k · Created: Jan ’25
Ambisonic B-Format Playback Issues on Vision Pro
I'm trying to implement Ambisonic B-Format audio playback on Vision Pro with head tracking. So far audio plays, head tracking works, and the sound appears to be stereo. The problem is that it is not a proper binaural playback when compared to playing back the audiofile with a DAW. Has anyone successfully implemented B-Format playback on Vision Pro? Any suggestions on my current implementation: func playAmbiAudioForum() async { do { try AVAudioSession.sharedInstance().setCategory(.playback) try AVAudioSession.sharedInstance().setActive(true) // AudioFile laoding/preperation guard let testFileURL = Bundle.main.url(forResource: "audiofile", withExtension: "wav") else { print("Test file not found") return } let audioFile = try AVAudioFile(forReading: testFileURL) let audioFileFormat = audioFile.fileFormat // create AVAudioFormat with Ambisonics B Format guard let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format) else { print("layout failed") return } let format = AVAudioFormat( commonFormat: audioFile.processingFormat.commonFormat, sampleRate: audioFile.fileFormat.sampleRate, interleaved: false, channelLayout: layout ) // write audiofile to buffer guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: UInt32(audioFile.length)) else { print("buffer failed") return } try audioFile.read(into: buffer) playerNode.renderingAlgorithm = .HRTF // connecting nodes audioEngine.attach(playerNode) audioEngine.connect(playerNode, to: audioEngine.outputNode, format: format) audioEngine.prepare() playerNode.scheduleBuffer(buffer, at: nil) { print("File finished playing") } try audioEngine.start() playerNode.play() } catch { print("Setup error:", error) } }
Replies: 0 · Boosts: 0 · Views: 483 · Created: Jan ’25