Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics
Posts under Media Technologies topic

Post

Replies

Boosts

Views

Activity

FairPlay expiration issue
We noticed that FairPlay license expiration behaviour changed between iOS 16.x, iOS 17.x, the latest iOS 18.3.2, and Safari. On iOS 16.x, video playback stops when the license expires, but on iOS 17.x and later the video continues with no audio and no error fired. On the latest Safari both video and audio continue. Were there changes in the latest FairPlay, and how should we adapt to this on the license server side? Thanks
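One way to at least detect the silent failure client-side is to watch the player item's error-log and failure notifications. This is not from the original post; it is a minimal sketch, assuming an AVPlayerItem named item that is already attached to the player:

import AVFoundation

// Sketch: surface error-log entries and play-to-end failures so an expired
// FairPlay key that silently drops audio can be detected and reported.
final class PlaybackErrorMonitor {
    private var observers: [NSObjectProtocol] = []

    func attach(to item: AVPlayerItem) {
        let center = NotificationCenter.default
        observers.append(center.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                                            object: item, queue: .main) { _ in
            if let event = item.errorLog()?.events.last {
                print("Error log entry:", event.errorStatusCode, event.errorComment ?? "")
            }
        })
        observers.append(center.addObserver(forName: .AVPlayerItemFailedToPlayToEndTime,
                                            object: item, queue: .main) { note in
            let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
            print("Failed to play to end:", error?.localizedDescription ?? "unknown")
        })
    }

    deinit { observers.forEach { NotificationCenter.default.removeObserver($0) } }
}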
0
0
388
Mar ’25
Unknown error -12881 when using AVAssetResourceLoader
We need to change the cookie every 120 seconds while playing. Since AVPlayer does not let us modify cookies after initialisation, we followed the approach of using a resource loader delegate to pass the cookie as a header value. What I notice is that the playlist file (.m3u8) gets downloaded correctly, and some chunks of the media file (.m4a) are downloaded as well; I know the .ts file is downloaded because I can see the GET request completing on the web server with status 200. I also set a breakpoint at the line loadingRequest.dataRequest?.respond(with: data) and immediately got this error from the AVPlayer status: "The operation could not be completed. An unknown error occurred (-12881)" from Core Media. I need confirmation on why I am unable to load HLS using the resource loader, and whether it is possible to update the cookie value continuously while AVPlayer is playing.

override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.

    let urlString = "localhost://demo.unified-streaming.com/k8s/features/stable/video/tears-of-steel/tears-of-steel.ism/.m3u8"
    guard let url = URL(string: urlString) else {
        print("Invalid URL")
        return
    }

    // Create cookie to prepare for the player asset
    let cookie = HTTPCookie(properties: [
        .name: "dazn-token",
        .value: "cookie value",
        .domain: url.host() ?? "",
        .path: "/",
        .discard: true
    ])

    // Create cookie key to set on the AVURLAsset
    let options = [AVURLAssetHTTPCookiesKey: [cookie]]
    let asset = AVURLAsset(url: url, options: options)

    proxy = ReverseProxyResourceLoader()
    proxy?.cookie = "exampleCookie"

    // Set resource loader delegate to monitor the chunks
    asset.resourceLoader.setDelegate(proxy, queue: DispatchQueue.global())

    // Load asset keys asynchronously (e.g., "playable")
    let keys = ["playable"]

    // Initialize the AVPlayer with the asset
    let playerItem = AVPlayerItem(asset: asset)
    self.player = AVPlayer(playerItem: playerItem)

    playerItem.addObserver(self, forKeyPath: "status", options: [.new, .initial], context: nil)
    // Observe 'error' property (if needed)
    playerItem.addObserver(self, forKeyPath: "error", options: [.new], context: nil)

    let contentKeySessionDelegate = ContentKeyDelegate()

    // Initialize AVContentKeySession
    let contentKeySession = AVContentKeySession(keySystem: .clearKey)
    self.contentKeySession = contentKeySession
    contentKeySession.setDelegate(contentKeySessionDelegate, queue: DispatchQueue.main)

    // Associate the asset with the content key session
    contentKeySession.addContentKeyRecipient(asset)

    // Create a layer for the AVPlayer and add it to the view
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = view.bounds
    playerLayer?.videoGravity = .resizeAspect
    if let playerLayer = playerLayer {
        view.layer.addSublayer(playerLayer)
    }

    NotificationCenter.default.addObserver(
        self,
        selector: #selector(playerDidFinishPlaying),
        name: .AVPlayerItemDidPlayToEndTime,
        object: player?.currentItem
    )

    // Start playback
    player?.play()
}

// Update cookie whenever needed
func updateCookie() {
    proxy?.cookie = "update exampleCookie"
}

@objc private func playerDidFinishPlaying(notification: Notification) {
    print("Playback finished!")
    // Optionally, handle end-of-playback actions here
}

//
//  ReverseProxyResourceLoader.swift
//  HLSDemo
//
//  Created by Gajje.Venkatarao on 12/12/24.
//

import Foundation
import AVKit
import AVFoundation

class ReverseProxyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    var cookie = ""

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        resourceLoader.preloadsEligibleContentKeys = true

        guard let interceptedURL = loadingRequest.request.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1,
                                                       userInfo: [NSLocalizedDescriptionKey: "Invalid URL"]))
            return false
        }

        if interceptedURL.scheme == "skd" {
            print("Token updated Cookie:", interceptedURL)
            return false
        }

        var components = URLComponents(url: interceptedURL, resolvingAgainstBaseURL: false)
        components?.scheme = "https" // Replace with the original scheme

        guard let originalURL = components?.url else {
            loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1,
                                                       userInfo: [NSLocalizedDescriptionKey: "Failed to map URL"]))
            return false
        }

        var request = URLRequest(url: originalURL)
        request.httpMethod = "GET"

        if let storedCookie = HTTPCookie(properties: [
            .name: "dazn-token",
            .value: cookie,
            .domain: originalURL.host ?? "",
            .path: "/",
            .discard: true
        ]) {
            HTTPCookieStorage.shared.setCookie(storedCookie)
        }

        let headers = loadingRequest.request.allHTTPHeaderFields ?? [:]
        for (key, value) in headers {
            request.addValue(value, forHTTPHeaderField: key)
        }
        request.addValue(cookie, forHTTPHeaderField: "Cookie")

        URLSession.shared.configuration.httpShouldSetCookies = true
        request.httpShouldHandleCookies = true

        let task = URLSession.shared.dataTask(with: originalURL) { data, response, error in
            if let error = error {
                print("Error received:", error)
                loadingRequest.finishLoading(with: error)
                return
            }
            print(originalURL)
            guard let data = data, let url = response?.url else {
                loadingRequest.finishLoading(with: NSError(domain: "ReverseProxy", code: -1,
                                                           userInfo: [NSLocalizedDescriptionKey: "No data received"]))
                return
            }
            loadingRequest.dataRequest?.respond(with: data)
            loadingRequest.finishLoading()
        }
        task.resume()
        return true
    }
}

Example project
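One thing worth checking, which is not from the original post: when a resource loader delegate serves the data itself, -12881 is often reported when the request's contentInformationRequest is left unfilled. A minimal sketch of a helper to call from the dataTask completion handler before respond(with:); the content-type UTI shown is an assumption for an HLS playlist and media segments would need their own type:

import AVFoundation

// Sketch: fill in content information before responding with data.
func fillContentInformation(for loadingRequest: AVAssetResourceLoadingRequest,
                            from response: URLResponse?) {
    guard let infoRequest = loadingRequest.contentInformationRequest,
          let http = response as? HTTPURLResponse else { return }
    infoRequest.contentType = "public.m3u-playlist"        // assumed UTI for the playlist
    infoRequest.contentLength = http.expectedContentLength
    infoRequest.isByteRangeAccessSupported = true
}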
0
0
603
Dec ’24
How to disable the built-in speakers and microphone on a Mac
I need to implement a solution, through an API or a custom driver, to completely block the Mac's built-in speakers and microphone, because I need other apps to use specified external devices for audio input and output. Is there a way to achieve this? What I mean is that even in System Preferences it should not be possible to choose the built-in microphone and speakers; only my external device can be used.
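Not part of the original post, but a hedged partial approach: you can programmatically set the system default output device with Core Audio, which steers apps that follow the system default. It does not remove the built-in devices from System Preferences; hiding them entirely would likely require a custom audio driver or HAL plug-in. A minimal sketch, assuming the external device's AudioDeviceID is already known:

import CoreAudio

// Sketch: make a known external device the system default output.
func setDefaultOutputDevice(_ deviceID: AudioDeviceID) -> OSStatus {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultOutputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var device = deviceID
    return AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                      &address,
                                      0, nil,
                                      UInt32(MemoryLayout<AudioDeviceID>.size),
                                      &device)
}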
0
0
83
Apr ’25
AVSpeechSynthesizer & Bluetooth Issues
Hello, I have a CarPlay navigation app and use AVSpeechSynthesizer to speak directions to the user. Everything works great on my CarPlay simulator, as well as when plugged into my GMC truck. However, I found out yesterday that for one of my users with a Ford truck the audio cuts in and out. After much troubleshooting, I was able to replicate this in my own truck when using Bluetooth to connect to CarPlay; my user was also using Bluetooth. Has anyone else experienced this? Is there a fix to the problem?

import SwiftUI
import AVFoundation

class TextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
    private var speechSynthesizer = AVSpeechSynthesizer()
    static let shared = TextToSpeechService()

    override init() {
        super.init()
        speechSynthesizer.delegate = self
    }

    func configureAudioSession() {
        speechSynthesizer.delegate = self
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt,
                                                            options: [.mixWithOthers, .allowBluetooth])
        } catch {
            print("Failed to set audio session category: \(error.localizedDescription)")
        }
    }

    func speak(_ text: String) {
        Task(priority: .high) {
            let speechUtterance = AVSpeechUtterance(string: text)
            speechUtterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
            try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
            speechSynthesizer.speak(speechUtterance)
        }
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        Task {
            stopSpeech()
            try AVAudioSession.sharedInstance().setActive(false)
        }
    }

    func stopSpeech() {
        speechSynthesizer.stopSpeaking(at: .immediate)
    }
}
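Not from the original post, but one hedged thing to try: the .allowBluetooth option requests the hands-free (HFP) route, which some head units handle poorly for prompt audio, and for a playback-only category Bluetooth A2DP is available without that option. A minimal sketch of the alternative session setup; whether it resolves the Ford drop-outs is an assumption, not a confirmed fix:

import AVFoundation

// Sketch: playback-only voice prompts without forcing the HFP route.
func configurePromptAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers])
    try session.setActive(true, options: .notifyOthersOnDeactivation)
}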
0
0
427
Feb ’25
iPhone 16 Camera Control and AVCaptureSlider – Is there a way to detect which slider is active?
I am following the Apple sample code and trying to add a manual focus (lens position) slider:

@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}

There are also these AVCaptureSessionControlsDelegate methods:

final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}

When self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user through the app UI. Is there a way to see whether self.cameraControlFocusSlider is active or being used? Please note that I will have more than one AVCaptureSlider in the final code.
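Not part of the original question, and only a hedged workaround sketch: the delegate callbacks report when the Camera Control overlay as a whole becomes active, not which slider is in use, so one option is to record which slider's action fired last and mirror autofocus-driven lensPosition changes back into the slider yourself. The names below follow the post; whether AVCaptureSlider.value may be written while the control is displayed is an assumption:

import AVFoundation

// Sketch: remember the last-used slider and push lensPosition changes into the focus slider.
@available(iOS 18.0, *)
final class SliderActivityTracker {
    private(set) weak var lastUsedSlider: AVCaptureSlider?
    private(set) var lastUserInteraction: Date = .distantPast
    private var lensObservation: NSKeyValueObservation?

    func attach(focusSlider: AVCaptureSlider, device: AVCaptureDevice, queue: DispatchQueue) {
        focusSlider.setActionQueue(queue) { [weak self, weak focusSlider] value in
            self?.lastUsedSlider = focusSlider
            self?.lastUserInteraction = Date()
            // ... perform the manual focus with `value` here ...
        }
        lensObservation = device.observe(\.lensPosition, options: [.new]) { [weak self] _, change in
            guard let position = change.newValue else { return }
            // Skip updates right after a user interaction so we don't fight the user's drag.
            if Date().timeIntervalSince(self?.lastUserInteraction ?? .distantPast) > 0.5 {
                focusSlider.value = position   // assumption: `value` is settable at runtime
            }
        }
    }
}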
0
0
428
Mar ’25
AVPlayerItem.externalMetadata not available
According to the documentation (https://developer.apple.com/documentation/avfoundation/avplayeritem/externalmetadata), AVPlayerItem should have an externalMetadata property. However, it does not appear to be visible to my app. When I try, I get: Value of type 'AVPlayerItem' has no member 'externalMetadata'. The documentation states iOS 12.2+; I am building with a minimum deployment target of iOS 18. Code snippet:

import Foundation
import AVFoundation

/// ... in function ...
// Create metadata as described in https://developer.apple.com/videos/play/wwdc2022/110338
var title = AVMutableMetadataItem()
title.identifier = .commonIdentifierAlbumName
title.value = "My Title" as NSString?
title.extendedLanguageTag = "und"

var playerItem = await AVPlayerItem(asset: composition)
playerItem.externalMetadata = [title]
0
0
72
Apr ’25
Media Library Access not working in my App since iOS 18.2
Hi there, I have an app which accesses the media library to save and load files. Since iOS 18.2, access to the media library stopped working. I've noticed that our app no longer shows in the list of apps with access to Files (Privacy & Security -> Files & Folders). The weird behaviour is that one iPhone on iOS 18.3.1 can access the files while others on the same iOS 18.3.1 cannot; tests on Simulators (Mac) work fine as well. My Info.plist file has had the keys to access the media library for a long time and hasn't changed (at least in the last four years), including the key "Privacy - Media Library Usage Description". I've also noticed that the popup requesting access to the media library when using the app for the first time doesn't show up anymore. We request access to the network (Wi-Fi) and that message still shows up, but not the media library one. I'm using Visual Studio with Xamarin on a Mac. I'd really appreciate any help, because this is very odd behaviour and it started with iOS 18.2.
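Not part of the original post, and shown in Swift rather than Xamarin/C#: a hedged diagnostic is to check and request media-library authorization explicitly at launch, so the affected devices can report whether the status is .notDetermined (the prompt should appear) or already .denied/.restricted:

import MediaPlayer

// Sketch: log the current media-library authorization state and request it if needed.
func checkMediaLibraryAccess() {
    let status = MPMediaLibrary.authorizationStatus()
    print("Media library authorization status:", status.rawValue)

    if status == .notDetermined {
        MPMediaLibrary.requestAuthorization { newStatus in
            print("Authorization result:", newStatus.rawValue)
        }
    }
}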
0
0
382
Feb ’25
How to override the default USB video
According to the docs, I put together a simple demo to verify. My environment:
ProductName: macOS
ProductVersion: 15.5
BuildVersion: 24F74
2.4 GHz quad-core Intel Core i5

Info.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>IOKitPersonalities</key>
    <dict>
        <key>UVCamera</key>
        <dict>
            <key>CFBundleIdentifierKernel</key>
            <string>com.apple.kpi.iokit</string>
            <key>IOClass</key>
            <string>IOUserService</string>
            <key>IOMatchCategory</key>
            <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
            <key>IOProviderClass</key>
            <string>IOUserResources</string>
            <key>IOResourceMatch</key>
            <string>IOKit</string>
            <key>IOUserClass</key>
            <string>UVCamera</string>
            <key>IOUserServerName</key>
            <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
            <key>IOProbeScore</key>
            <integer>100000</integer>
            <key>idVendor</key>
            <integer>1452</integer>
            <key>idProduct</key>
            <integer>34068</integer>
        </dict>
    </dict>
    <key>OSBundleUsageDescription</key>
    <string></string>
</dict>
</plist>

UVCamera.cpp:

//
//  UVCamera.cpp
//  UVCamera
//
//  Created by DTEN on 2025/6/12.
//

#include <os/log.h>
#include <DriverKit/IOUserServer.h>
#include <DriverKit/IOLib.h>
#include "UVCamera.h"

kern_return_t IMPL(UVCamera, Start)
{
    kern_return_t ret;
    ret = Start(provider, SUPERDISPATCH);
    os_log(OS_LOG_DEFAULT, "Hello World");
    return ret;
}

UVCamera.iig:

//
//  UVCamera.iig
//  UVCamera
//
//  Created by DTEN on 2025/6/12.
//

#ifndef UVCamera_h
#define UVCamera_h

#include <Availability.h>
#include <DriverKit/IOService.iig>

class UVCamera : public IOService
{
public:
    virtual kern_return_t Start(IOService * provider) override;
};

#endif /* UVCamera_h */

Then I build with Xcode and move it to /Library/DriverExtensions:

sudo mv com.lqs.MyVirtualCam.UVCamera.dext /Library/DriverExtensions
sudo kmutil install -R / -r /Library/DriverExtensions
kmutil rebuild done

However, the dext can't be loaded:

kmutil showloaded --list-only | grep UVCamera
No variant specified, falling back to release

What's the problem? Can anyone help me?
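Not part of the original post, but one hedged observation: DriverKit extensions are normally embedded inside a host app and activated through the SystemExtensions framework rather than copied to /Library/DriverExtensions and installed with kmutil. A minimal activation sketch, assuming the dext's bundle identifier is com.lqs.MyVirtualCam.UVCamera and the app has the required DriverKit entitlements:

import SystemExtensions

// Sketch: activate an embedded dext from the host app via SystemExtensions.
final class DextActivator: NSObject, OSSystemExtensionRequestDelegate {
    func activate() {
        let request = OSSystemExtensionRequest.activationRequest(
            forExtensionWithIdentifier: "com.lqs.MyVirtualCam.UVCamera",  // assumed identifier
            queue: .main)
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
    }

    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties) -> OSSystemExtensionRequest.ReplacementAction {
        return .replace
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {
        print("Waiting for user approval in System Settings")
    }

    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) {
        print("Activation finished:", result.rawValue)
    }

    func request(_ request: OSSystemExtensionRequest, didFailWithError error: Error) {
        print("Activation failed:", error)
    }
}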
0
0
108
Jun ’25
Camera USB
I am new to using Swift for a Mac application. I am trying to control an external UVC-compliant camera's focus and other capabilities. However, I'm having trouble with this and don't know where to start. I have downloaded an application from the App Store and it can control the focus and other capabilities. I've tried IOKit, but this seems complicated and does not return any capabilities or control the camera. I also tried AVFoundation and was able to open the camera, but the following code did not work for me, as device.isFocusPointOfInterestSupported returns false, and without that check the app crashes.

@IBAction func focusChanged(_ sender: NSSlider) {
    do {
        guard let device = videoDevice else { return }
        try device.lockForConfiguration()

        // Check if focus mode and point of interest are supported
        if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }
        if device.isFocusPointOfInterestSupported {
            // Map the slider value (0.0 to 1.0) to the focus point's X coordinate
            let focusX = CGFloat(sender.doubleValue)
            let focusPoint = CGPoint(x: focusX, y: 0.5) // Y coordinate is typically 0.5 (centered vertically)
            device.focusPointOfInterest = focusPoint
        } else {
            print("Focus point of interest is not supported on this device.")
        }

        device.unlockForConfiguration()

        // Log focus settings
        print("Focus point: \(device.focusPointOfInterest)")
        print("Focus mode: \(device.focusMode.rawValue)")
    } catch {
        print("Error adjusting focus: \(error)")
    }
}

Any help or advice is much appreciated.
0
0
471
Jan ’25
Can VideoToolbox properly decode HEVC bitstreams when a single frame is split into multiple slice NALUs?
I am currently developing an HEVC player using VideoToolbox on an iOS device. I have successfully created an HEVC decoder that receives HEVC streams from our custom image capture and encoding device, and it can decode and display images properly. However, when my capture and encoding device configures the encoder to output HEVC streams with fragmented NALUs (i.e., an I-frame or P-frame is split and stored across multiple slice NALUs), the iOS decoder can be initialized successfully but fails to decode and output images. Can VideoToolbox properly decode HEVC bitstreams when a single frame is split into multiple slice NALUs? Key observations:
1. Single-NALU frames work fine.
2. Multi-NALU frames (sliced I/P-frames) cause decoding failure.
3. The decoder session is created successfully (VTDecompressionSessionCreate returns no error).
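Not part of the original post: in my understanding VideoToolbox can decode multi-slice frames, but all slice NALUs belonging to one frame generally have to be packed into a single CMSampleBuffer, each prefixed with its length (the hvcC convention) rather than Annex-B start codes. A hedged packing sketch, assuming slices holds the raw slice NALUs (without start codes) for one frame and formatDesc was created from the VPS/SPS/PPS with 4-byte NAL length headers:

import CoreMedia

// Sketch: merge all slice NALUs of one frame into one length-prefixed sample buffer.
func makeSampleBuffer(slices: [Data], formatDesc: CMVideoFormatDescription) -> CMSampleBuffer? {
    // 4-byte big-endian length prefix before each NALU (AVCC/HVCC style).
    var frameData = Data()
    for nalu in slices {
        var length = UInt32(nalu.count).bigEndian
        frameData.append(Data(bytes: &length, count: 4))
        frameData.append(nalu)
    }

    var blockBuffer: CMBlockBuffer?
    guard CMBlockBufferCreateWithMemoryBlock(allocator: kCFAllocatorDefault,
                                             memoryBlock: nil,
                                             blockLength: frameData.count,
                                             blockAllocator: nil,
                                             customBlockSource: nil,
                                             offsetToData: 0,
                                             dataLength: frameData.count,
                                             flags: 0,
                                             blockBufferOut: &blockBuffer) == kCMBlockBufferNoErr,
          let block = blockBuffer,
          CMBlockBufferAssureBlockMemory(block) == kCMBlockBufferNoErr else { return nil }

    frameData.withUnsafeBytes { raw in
        _ = CMBlockBufferReplaceDataBytes(with: raw.baseAddress!,
                                          blockBuffer: block,
                                          offsetIntoDestination: 0,
                                          dataLength: frameData.count)
    }

    var sampleBuffer: CMSampleBuffer?
    var sampleSize = frameData.count
    CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                         dataBuffer: block,
                         dataReady: true,
                         makeDataReadyCallback: nil,
                         refcon: nil,
                         formatDescription: formatDesc,
                         sampleCount: 1,
                         sampleTimingEntryCount: 0,
                         sampleTimingArray: nil,
                         sampleSizeEntryCount: 1,
                         sampleSizeArray: &sampleSize,
                         sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}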
0
0
131
May ’25
AVPlayer error: Too many open files
For some users in production, there's a high probability that after launching the app, using AVPlayer to play any local audio resource results in the following error, and restarting the app doesn't help:

error: Error Domain=AVFoundationErrorDomain Code=-11800 "这项操作无法完成" UserInfo={NSLocalizedFailureReason=发生未知错误(24), NSLocalizedDescription=这项操作无法完成, NSUnderlyingError=0x30311f270 {Error Domain=NSPOSIXErrorDomain Code=24 "Too many open files"}}

(The localized strings translate to "This operation could not be completed" / "An unknown error occurred (24)".)

I've checked the code, and there aren't actually multiple AVPlayers playing simultaneously. What could be causing this?
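Not from the original post: POSIX error 24 means the process has exhausted its file-descriptor table, which usually points to a descriptor leak elsewhere (files, sockets, pipes) rather than AVPlayer itself. A small diagnostic sketch, assuming you just want to log how many descriptors are open when the error appears:

import Darwin

// Sketch: count the process's currently open file descriptors.
func openFileDescriptorCount() -> Int {
    var count = 0
    var info = stat()
    for fd in 0..<getdtablesize() {
        if fstat(fd, &info) == 0 {
            count += 1
        }
    }
    return count
}

// Example: log this periodically; a steadily climbing number indicates a leak.
print("Open file descriptors:", openFileDescriptorCount())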
0
0
416
Feb ’25
Lightning to HDMI mirroring
I am developing a VOD playback app. When I stream video to an external monitor connected via a Lightning-to-HDMI adapter on iOS 18 or later, the screen goes dark and I cannot confirm playback. The app I am developing does not detect the HDMI display and present the player separately; it simply mirrors the video. We have confirmed that the same phenomenon occurs with other services, but we were able to confirm playback with some services such as Apple TV. Please let us know if there are any other necessary settings, such as video certificates, required for video playback. We would also like to know whether the problem is specific to iOS 18 and later.
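Not part of the original post, but one hedged avenue to check: services that keep playing over HDMI typically rely on AVPlayer's external-playback mode instead of plain screen mirroring, since mirrored protected (FairPlay/HDCP) content is blanked on the external display. A minimal sketch of the relevant AVPlayer flags:

import AVFoundation

// Sketch: let AVPlayer present video on the external display instead of mirroring it.
func configureExternalPlayback(for player: AVPlayer) {
    player.allowsExternalPlayback = true
    // When an external screen is connected, play there rather than in the mirrored UI.
    player.usesExternalPlaybackWhileExternalScreenIsActive = true
}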
0
0
268
Mar ’25
AVFoundationErrorDomain Code=-11819 in AVPlayer – Causes & Fixes?
We are encountering an issue where AVPlayer throws the error:

Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action"
> Underlying Error Domain[(null)]:Code[0]:Desc[(null)]

This error seems to occur intermittently during video playback, especially after extended usage or when switching between different streams. We observe error -11819 (AVFoundationErrorDomain) in the Conviva platform; some of our users experience it, but we couldn't reproduce it so far, and we need support to determine the root cause and/or best practices to prevent it. Some questions we have: What typically triggers this error? Could it be related to memory/resource constraints, network instability, or backgrounding? Are there any recommended ways to handle or recover from this error gracefully? Any insights or guidance would be greatly appreciated. Thanks!
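Not from the original post: -11819 usually corresponds to AVError.mediaServicesWereReset, i.e. the system media services daemon restarted underneath the app, and the common guidance is to listen for the media-services-reset notification and rebuild the player stack. A hedged recovery sketch, assuming a rebuildPlayer() closure that recreates your AVPlayer and AVPlayerItems:

import AVFoundation

// Sketch: recover from a media services reset by tearing down and rebuilding the player.
final class MediaServicesResetHandler {
    private var observer: NSObjectProtocol?

    func start(rebuildPlayer: @escaping () -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.mediaServicesWereResetNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main) { _ in
            // Existing AVPlayer/AVPlayerItem objects are unusable after a reset.
            rebuildPlayer()
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}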
0
0
146
Apr ’25
[FairPlay Streaming] Time Manipulation Detection for DRM License Servers
Our team conducted security testing and found one vulnerability in FairPlay license acquisition. Our QA engineer manually changed the device's system date and time (setting it four days into the future) and was still able to obtain a license response and initiate playback on an iOS device. However, on an Android device, the license acquisition failed. Can you please tell us whether time-manipulation detection is available in the FairPlay SDK?
0
0
422
Feb ’25
Can't hear audio via the speaker on a real iOS device
This is my native module implementation. I'm getting a base64-encoded string from the server and passing it to my native PCM player module to play audio.

App.tsx:

PcmPlayer.writeChunk(e.data);

PcmPlayer.swift:

import AVFoundation

@objc(PcmPlayer)
class PcmPlayer: RCTEventEmitter {
    private var engine: AVAudioEngine?
    private var playerNode: AVAudioPlayerNode?
    private var format: AVAudioFormat?
    private var bufferQueue = [Data]()
    private var isPlaying = false
    private var hasEnded = false
    private var scheduledBufferCount = 0
    private let minBufferBytes = 50000
    private let pcmQueue = DispatchQueue(label: "pcm.queue")

    override init() {
        super.init()
    }

    override func supportedEvents() -> [String]! {
        return ["onStatus", "onMessage"]
    }

    @objc(initPlayer:channels:bitsPerSample:)
    func initPlayer(_ sampleRate: NSNumber, channels: NSNumber, bitsPerSample: NSNumber) {
        pcmQueue.async {
            self.stopInternal()

            let session = AVAudioSession.sharedInstance()
            do {
                try session.setCategory(.playback, mode: .default, options: [])
                try session.setActive(true, options: .notifyOthersOnDeactivation)
                try session.setMode(.default)
                print("🔈 Audio session active. Output route:", session.currentRoute.outputs)
            } catch {
                print("❌ Audio session setup failed:", error)
                return
            }

            self.engine = AVAudioEngine()
            self.playerNode = AVAudioPlayerNode()

            guard let engine = self.engine, let playerNode = self.playerNode else {
                print("❌ Engine or playerNode is nil")
                return
            }

            engine.attach(playerNode)

            self.format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                        sampleRate: sampleRate.doubleValue,
                                        channels: AVAudioChannelCount(channels.uintValue),
                                        interleaved: false)

            guard let format = self.format else {
                print("❌ Failed to create AVAudioFormat")
                return
            }

            engine.connect(playerNode, to: engine.mainMixerNode, format: format)

            do {
                try engine.start()
                playerNode.play()
                engine.mainMixerNode.outputVolume = 1.0
                print("✅ AVAudioEngine started with format:", format)
            } catch {
                print("❌ Engine start failed:", error)
            }

            self.hasEnded = false
        }
    }

    @objc(writeChunk:)
    func writeChunk(_ base64Pcm: String) {
        pcmQueue.async {
            guard base64Pcm.count >= 10 else {
                print("⚠️ Skipping short base64 string")
                return
            }

            var padded = base64Pcm
            let mod4 = base64Pcm.count % 4
            if mod4 > 0 {
                padded += String(repeating: "=", count: 4 - mod4)
            }

            guard let data = Data(base64Encoded: padded, options: .ignoreUnknownCharacters) else {
                print("❌ Failed to decode base64")
                return
            }

            self.bufferQueue.append(data)
            print("📥 Received PCM chunk (\(data.count) bytes)")
            print("📥 writeChunk called. isPlaying=\(self.isPlaying), bufferQueue.count=\(self.bufferQueue.count)")

            if !self.isPlaying {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            } else if self.scheduledBufferCount == 0 {
                self.isPlaying = true
                self.waitForBufferAndStartPlayback()
            }
        }
    }

    private func waitForBufferAndStartPlayback() {
        DispatchQueue.global().async {
            while self.queueSize() < self.minBufferBytes && !self.hasEnded {
                Thread.sleep(forTimeInterval: 0.01)
            }
            self.writeLoop()
        }
    }

    private func writeLoop() {
        DispatchQueue.global().async {
            writeLoop: while self.isPlaying {
                if self.bufferQueue.isEmpty {
                    for _ in 0..<100 {
                        Thread.sleep(forTimeInterval: 0.01)
                        if !self.bufferQueue.isEmpty { break }
                    }
                    if self.bufferQueue.isEmpty {
                        print("🔇 No more data to play after waiting")
                        self.isPlaying = false
                        break writeLoop
                    }
                }

                var data: Data?
                self.pcmQueue.sync {
                    if !self.bufferQueue.isEmpty {
                        data = self.bufferQueue.removeFirst()
                    }
                }

                guard let chunk = data else {
                    print("⚠️ No data to process")
                    continue
                }

                if let buffer = self.pcmBufferFromData(chunk) {
                    self.scheduledBufferCount += 1
                    self.playerNode?.scheduleBuffer(buffer, completionHandler: {
                        self.pcmQueue.async {
                            self.scheduledBufferCount -= 1
                            if self.bufferQueue.isEmpty && self.scheduledBufferCount == 0 {
                                print("ℹ️ Playback idle - waiting for more data")
                                self.isPlaying = false
                            }
                        }
                    })
                }
            }
        }
    }

    private func pcmBufferFromData(_ data: Data) -> AVAudioPCMBuffer? {
        guard let format = self.format else { return nil }

        let frameCount = UInt32(data.count / 2)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
            print("❌ Failed to create AVAudioPCMBuffer")
            return nil
        }
        buffer.frameLength = frameCount

        guard let floatChannelData = buffer.floatChannelData?[0] else {
            print("❌ floatChannelData is nil")
            return nil
        }

        data.withUnsafeBytes { (rawBuffer: UnsafeRawBufferPointer) in
            let int16Buffer = rawBuffer.bindMemory(to: Int16.self)
            let count = min(int16Buffer.count, Int(frameCount))
            for i in 0..<count {
                floatChannelData[i] = Float32(int16Buffer[i]) / Float32(Int16.max)
            }
        }

        return buffer
    }

    @objc(stopPlayer)
    func stopPlayer() {
        pcmQueue.async {
            self.stopInternal()
        }
    }

    private func stopInternal() {
        print("🛑 stopInternal called")
        self.playerNode?.stop()
        self.engine?.stop()
        self.engine?.reset()
        self.playerNode = nil
        self.engine = nil
        self.format = nil
        self.bufferQueue.removeAll()
        self.isPlaying = false
        self.hasEnded = true
        self.scheduledBufferCount = 0
    }

    @objc(canWrite:rejecter:)
    func canWrite(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            resolve(self.bufferQueue.count < 20)
        }
    }

    @objc(flushPlayer:rejecter:)
    func flushPlayer(_ resolve: @escaping RCTPromiseResolveBlock, rejecter reject: RCTPromiseRejectBlock) {
        pcmQueue.async {
            self.bufferQueue.removeAll()
            resolve(nil)
        }
    }

    @objc static override func requiresMainQueueSetup() -> Bool {
        return false
    }

    private func queueSize() -> Int {
        return pcmQueue.sync {
            return self.bufferQueue.reduce(0) { $0 + $1.count }
        }
    }
}

I can't hear any audio on a real iOS device, while the same code works fine on the simulator.
0
0
192
Jul ’25
Slow performance decoding large images with Core Image.
I'm building a camera app that does some post-processing after the photo has been taken. With 12MP images the processing is reasonably fast, but larger 24MP images are very slow. I created a very simple example to demonstrate the issue: load an image and render it to JPEG data.

let context = CIContext()
let imageUrl = Bundle.main.url(forResource: "12mp", withExtension: "jpg")!
let data = try! Data(contentsOf: imageUrl)
let ciImage = CIImage(data: data)!

let start = CFAbsoluteTimeGetCurrent()
let jpegData = context.jpegRepresentation(of: ciImage, colorSpace: context.workingColorSpace!)
print(jpegData?.count)
print("Resize Completed: " + String(CFAbsoluteTimeGetCurrent() - start))

Running this code on an iPhone 16 Pro with different images produces these benchmarks:
12MP => 0.03s
24MP => 1.22s
48MP => 2.98s

I understand that processing time will increase with resolution, but it doesn't seem linear. I have tried setting different CIContext options such as .useSoftwareRenderer: false, but it has made no difference. From profiling the process (the attached trace was for a 48MP image), it looks like the JPEG decoding is the bottleneck. Is there any way this can be improved?
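Not part of the original post, but one hedged thing to try: decode the JPEG once up front with ImageIO into a CGImage and build the CIImage from that, so the expensive decode is not redone inside Core Image's render. Whether this closes the gap at 24/48MP is an assumption:

import ImageIO
import CoreImage

// Sketch: pre-decode with ImageIO, then hand Core Image an already-decoded bitmap.
func decodedCIImage(from url: URL) -> CIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: true] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions),
          let cgImage = CGImageSourceCreateImageAtIndex(source, 0, [
              kCGImageSourceShouldCacheImmediately: true   // force the decode to happen here
          ] as CFDictionary)
    else { return nil }
    return CIImage(cgImage: cgImage)
}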
0
0
614
Dec ’24
Capture session interrupted randomly
We are facing a strange issue where a small portion of our large user base cannot start the capture session in our app, as it gets interrupted with the following reason: AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps. Our users are all on iPhones; no one is using an iPad. Just to be sure, we have set session.isMultitaskingCameraAccessEnabled = true, but it does not seem to make any difference. Another weird scenario we are seeing for an even smaller number of users is that the following call: AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) returns nil. A quick look at our error reports shows this happening on iPhone XR, 13 and 14 models, which should all support this device type. Any help investigating these issues would be greatly appreciated!
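Not from the original report, but a hedged diagnostic sketch: subscribe to the session's interruption notifications and log the reason code, so affected installs can report whether it really is the multiple-foreground-apps case or something else:

import AVFoundation

// Sketch: log capture-session interruptions and when they end.
final class CaptureInterruptionLogger {
    private var tokens: [NSObjectProtocol] = []

    func attach(to session: AVCaptureSession) {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(forName: .AVCaptureSessionWasInterrupted,
                                         object: session, queue: .main) { note in
            if let rawReason = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               let reason = AVCaptureSession.InterruptionReason(rawValue: rawReason) {
                print("Capture session interrupted, reason:", reason)
            }
        })
        tokens.append(center.addObserver(forName: .AVCaptureSessionInterruptionEnded,
                                         object: session, queue: .main) { _ in
            print("Capture session interruption ended")
        })
    }

    deinit { tokens.forEach { NotificationCenter.default.removeObserver($0) } }
}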
0
0
108
Jun ’25
Changing instrument with AVMIDIControlChangeEvent bankSelect
I've been trying to use AVMIDIControlChangeEvent with a bankSelect message type to change the instrument the sequencer uses on an AVMusicTrack, with no luck. I started with the Apple AVAEMixerSample, converting the initial setup/loading and the portions dealing with the sequencer to Swift. I got that working and playing the "bluesyRiff", and then modified it to play individual notes. So my createAndSetupSequencer looked like:

func createAndSetupSequencer() {
    sequencer = AVAudioSequencer(audioEngine: engine)
    // guard let midiFileURL = Bundle.main.url(forResource: "bluesyRiff", withExtension: "mid") else {
    //     print("failed guard trying to get URL for bluesyRiff")
    //     return
    // }
    let track = sequencer.createAndAppendTrack()
    var currTime = 1.0
    for i: UInt32 in 0...8 {
        let newNoteEvent = AVMIDINoteEvent(channel: 0, key: 60 + i, velocity: 64, duration: 2.0)
        track.addEvent(newNoteEvent, at: AVMusicTimeStamp(currTime))
        currTime += 2.0
    }
}

The notes played, so then I also replaced the gs_instruments sound bank with GeneralUser GS MuseScore v1.442, first by trying:

guard let soundBankURL = Bundle.main.url(forResource: "GeneralUser GS MuseScore v1.442", withExtension: "sf2") else { return }
do {
    try sampler.loadSoundBankInstrument(at: soundBankURL, program: 0x001C, bankMSB: 0x79, bankLSB: 0x08)
} catch {
    // ...
}

This appears to work: the instrument (8, which is "Funk Guitar") plays. If I change to bankLSB: 0x00 I get the "Palm Muted Guitar", so I know that the soundfont has these instruments.

Stuff goes off the rails when I try to change the instruments in createAndSetupSequencer. Putting

let programChange = AVMIDIProgramChangeEvent(channel: 0, programNumber: 0x001C)
let bankChange = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.bankSelect, value: 0x00)
track.addEvent(programChange, at: AVMusicTimeStamp(1.0))
track.addEvent(bankChange, at: AVMusicTimeStamp(1.0))

just before my add-note loop doesn't produce any change. Loading bankLSB 8 (Funk) in sampler.loadSoundBankInstrument and trying to change with bankSelect 0 (Palm Muted) in createAndSetupSequencer results in instrument 8 (Funk) playing, not Palm Muted. Loading bankLSB 0 (Palm Muted) and trying to change with bankSelect 8 (Funk) doesn't work either; 0 (Palm Muted) plays. I also tried sampler.loadInstrument(at: soundBankURL), and then I always get the first instrument in the sound font file (piano), no matter what values I put in my programChange/bankChange. I've also changed the time in track.addEvent to 0, 1.0, 3.0, etc., with no success.

sampler.loadSoundBankInstrument specifies two UInt8 parameters, bankMSB and bankLSB, while the AVMIDIControlChangeEvent bankSelect value is a UInt32, suggesting it might be some combination of bankMSB and bankLSB, but the documentation makes no mention of what this should look like. I tried various combinations such as 0x7908 and 0x0879 to no avail.

I will also point out that I am able to successfully execute other control change events. For example, adding

if i == 1 {
    let portamentoOnEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamento, value: 0xFF)
    track.addEvent(portamentoOnEvent, at: AVMusicTimeStamp(currTime))
    let portamentoRateEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamentoTime, value: 64)
    track.addEvent(portamentoRateEvent, at: AVMusicTimeStamp(currTime))
}

does produce a change in the sound. (As an aside, a definition of what portamento time is, other than "the rate of portamento", would be welcome. Is it notes/second? Freq/minute? Beats/hour?)

I was able to get the instrument to change in a different program using MusicPlayer and a series of MusicTrackNewMIDIChannelEvent calls on a track, but those operate on a MusicTrack, not the AVMusicTrack the sequencer uses. Has anyone been successful in switching instruments through an AVMIDIControlChangeEvent, or have any feedback on how to do this?
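For context, and not from the original question: standard MIDI bank select is a pair of controller messages (CC0 for the MSB and CC32 for the LSB) followed by a program change. If AVMIDIControlChangeEvent's single bankSelect message type doesn't get both halves through to the sampler, one hedged workaround is the route the poster already found, writing the raw channel messages with the Core Audio MusicTrack API. A sketch, assuming musicTrack is a MusicTrack and using the bank/program values from the post:

import AudioToolbox

// Sketch: explicit bank select (CC0 + CC32) followed by a program change on channel 0.
func addBankAndProgramChange(to musicTrack: MusicTrack, at time: MusicTimeStamp) {
    let channel: UInt8 = 0
    var bankMSB = MIDIChannelMessage(status: 0xB0 | channel, data1: 0x00, data2: 0x79, reserved: 0) // CC0  = 0x79
    var bankLSB = MIDIChannelMessage(status: 0xB0 | channel, data1: 0x20, data2: 0x08, reserved: 0) // CC32 = 8 (Funk)
    var program = MIDIChannelMessage(status: 0xC0 | channel, data1: 0x1C, data2: 0, reserved: 0)    // program 28

    MusicTrackNewMIDIChannelEvent(musicTrack, time, &bankMSB)
    MusicTrackNewMIDIChannelEvent(musicTrack, time, &bankLSB)
    MusicTrackNewMIDIChannelEvent(musicTrack, time + 0.01, &program) // program change slightly after the bank select
}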
0
0
346
Mar ’25
Apple Music API High Error Rate
I am getting high error rates from the Apple Music API. This has been happening for months now, and it is quite frustrating. It is a mix of 404, 504, and random 500 errors. I hit these endpoints all of the time, so it is not like I am hitting a resource that doesn't exist. Why is this happening? Is this a known issue that is getting worked on?
0
0
84
Jun ’25
Playing audio live from Bluetooth headset on iPhone speaker
Hi guys, I am having an issue live-streaming audio from a Bluetooth headset and playing it live on the iPhone speaker. I am able to redirect audio back to the headset, but this is not what I want. The issue happens when I try to override the output: the iPhone switches to the speaker but also switches the microphone. This is an example of the code:

import AVFoundation

class AudioRecorder {
    let player: AVAudioPlayerNode
    let engine: AVAudioEngine
    let audioSession: AVAudioSession
    let audioSessionOutput: AVAudioSession

    init() {
        self.player = AVAudioPlayerNode()
        self.engine = AVAudioEngine()
        self.audioSession = AVAudioSession.sharedInstance()
        self.audioSessionOutput = AVAudioSession()

        do {
            try self.audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: [.defaultToSpeaker])
            try self.audioSessionOutput.setCategory(AVAudioSession.Category.playAndRecord, options: [.allowBluetooth]) // enables Bluetooth HFP profile
            try self.audioSession.setMode(AVAudioSession.Mode.default)
            try self.audioSession.setActive(true)
            // try self.audioSession.overrideOutputAudioPort(.speaker) // doesn't work
        } catch {
            print(error)
        }

        let input = self.engine.inputNode
        self.engine.attach(self.player)

        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        self.engine.connect(self.player, to: engine.mainMixerNode, format: inputFormat)

        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            self.player.scheduleBuffer(buffer)
            print(buffer)
        }
    }

    public func start() {
        try! self.engine.start()
        self.player.play()
    }

    public func stop() {
        self.player.stop()
        self.engine.stop()
    }
}

I am not sure if this is a bug or not. Can somebody point me in the right direction? Is there a way to design custom audio routing? I would also appreciate some good documentation besides the AVFoundation docs.
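Not part of the original post, but a hedged routing sketch: configure one shared session for play-and-record with Bluetooth allowed, pick the Bluetooth HFP port explicitly as the preferred input, and only then override the output to the built-in speaker. Whether a given headset keeps its microphone active while the phone plays through the speaker is hardware-dependent, so treat this as something to try rather than a guaranteed fix:

import AVFoundation

// Sketch: keep the Bluetooth headset as the input while forcing output to the speaker.
func routeHeadsetMicToSpeaker() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)

    // Prefer the Bluetooth HFP input if one is available.
    if let bluetoothInput = session.availableInputs?.first(where: { $0.portType == .bluetoothHFP }) {
        try session.setPreferredInput(bluetoothInput)
    }

    // Force the output side to the built-in speaker.
    try session.overrideOutputAudioPort(.speaker)
}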
0
0
309
Mar ’25