Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Each entry below shows the post title and question, followed by its reply count, boost count, view count, and most recent activity.

iPad app on macOS not asking for microphone permission
Hello, I have an iOS app that records audio and works fine on iPads and iPhones: it asks for microphone permission, and after that recording works. I installed the same app on my M3 MacBook via TestFlight, since iPad apps are supposed to run there without changes. The app starts fine, but it never asks for microphone permission, so I can't record. Do I need to do something to make this happen? (This is not Mac Catalyst; it's the arm64 iPhone binary running on macOS.) Thanks. (A minimal permission-check sketch follows this entry.)
2 replies · 1 boost · 825 views · last activity Mar ’25
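For the iPad-on-Mac permission question above, a minimal sketch (not from the thread) of an explicit authorization check. It assumes the target's Info.plist contains an NSMicrophoneUsageDescription entry, which is required for the system prompt to appear.

import AVFoundation

// Minimal microphone-permission check. Requires an NSMicrophoneUsageDescription
// entry in Info.plist; without it the system will not show the permission prompt.
func ensureMicrophoneAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Triggers the system prompt on iOS, iPadOS, and (for iPad apps) macOS.
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    default:
        // .denied or .restricted: direct the user to Settings / System Settings.
        completion(false)
    }
}
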
MPMusicPlayerController.applicationMusicPlayer.currentPlaybackRate no longer working in iOS 26.0 (Tahoe)?
I'm wondering whether anyone has seen issues with currentPlaybackRate in the released version of iOS 26.0. There is no issue on iOS 18.5. On iOS 26.0, when I change currentPlaybackRate, the value seems to be forced back to 0 or 1.0, for both DRM and non-DRM content. Playback also becomes abnormal and unstable, with noise, whenever currentPlaybackRate is not 1.0, and the player frequently switches between the stopped and playing states. (A small repro sketch follows this entry.)
2 replies · 2 boosts · 621 views · last activity Oct ’25
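A small repro sketch for the report above (not from the thread); the store ID is a placeholder and the rate value is arbitrary.

import MediaPlayer

let player = MPMusicPlayerController.applicationMusicPlayer

// Placeholder catalog store ID; substitute any playable track.
player.setQueue(with: ["placeholder-store-id"])
player.play()

// Request a non-default rate. The post reports that on iOS 26.0 the value
// snaps back to 0 or 1.0 and playback becomes unstable, while iOS 18.5 honors it.
player.currentPlaybackRate = 1.25
print("Requested 1.25, player reports \(player.currentPlaybackRate)")
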
How can third-party iOS apps obtain real-time waveform / spectrogram data for Apple Music tracks (similar to djay & other DJ apps)?
Hi everyone, I’m working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback. To line the clicks up perfectly I’d like access to low-level audio analysis data, ideally a waveform / spectrogram or beat grid, while the track is playing. I’ve noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop or time-stretch those tracks in real time
That implies they receive decoded PCM frames, or at least high-resolution analysis data, from Apple Music under a special entitlement. My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for, similar to the “DJ with Apple Music” initiative, to gain that deeper access?
3. Where can I find official documentation or a point of contact for this kind of request?
I’ve searched the docs and forums but only see standard MusicKit playback APIs, which don’t appear to expose raw audio for DRM-protected songs. Any guidance, links, or insider tips on the proper application process would be hugely appreciated! Thanks in advance.
2 replies · 2 boosts · 384 views · last activity Oct ’25
WKWebView Crashes on iOS During YouTube Playlist Playback
I’m encountering a consistent crash in WebKit when using WKWebView to play a YouTube playlist in my iOS app. Playback starts successfully, but the web process terminates during the second video in the playlist. This only occurs on physical devices, not in the simulator. Here’s a simplified Swift example of my setup:

import SwiftUI
import WebKit

struct ContentView: View {
    private let playlistID = "PLig2mjpwQBZnghraUKGhCqc9eAy0UbpDN"

    var body: some View {
        YouTubeWebView(playlistID: playlistID)
            .edgesIgnoringSafeArea(.all)
    }
}

struct YouTubeWebView: UIViewRepresentable {
    let playlistID: String

    func makeUIView(context: Context) -> WKWebView {
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true

        let webView = WKWebView(frame: .zero, configuration: config)
        webView.scrollView.isScrollEnabled = true

        let html = """
        <!doctype html>
        <html>
        <head>
            <meta name="viewport" content="initial-scale=1.0, maximum-scale=1.0">
            <style>body,html{height:100%;margin:0;background:#000}iframe{width:100%;height:100%;border:0}</style>
        </head>
        <body>
            <iframe
                src="https://www.youtube-nocookie.com/embed/videoseries?list=\(playlistID)&controls=1&rel=0&playsinline=1&iv_load_policy=3"
                frameborder="0"
                allow="encrypted-media; picture-in-picture; fullscreen"
                webkit-playsinline
                allowfullscreen
            ></iframe>
        </body>
        </html>
        """
        webView.loadHTMLString(html, baseURL: nil)
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {}
}

#Preview {
    ContentView()
}

Observed behavior:
First video plays without issue.
Web process crashes when the second video in the playlist starts.
Console logs show WebProcessProxy::didClose and repeated memory status messages.
Using ProcessAssertion or background activity does not prevent the crash.
Only occurs on physical devices; simulators do not reproduce the issue.

Questions:
Is there something I should change or add in my WKWebView setup or HTML/iframe to prevent the crash when playing the second video in a playlist on physical iOS devices?
Is there an officially supported way to limit memory or prevent WebKit from terminating the web process during multi-video playback?
Are there recommended patterns for playing YouTube playlists in a WKWebView on iOS without risking crashes?
Any tips for debugging or configuring WKWebView to make it more stable for continuous playlist playback?

Thanks in advance for any guidance! (A process-recovery sketch follows this entry.)
2 replies · 0 boosts · 478 views · last activity Oct ’25
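One mitigation worth trying for the crash described above (a sketch, not a confirmed fix): implement WKNavigationDelegate's process-termination callback so the view can recover by reloading instead of staying blank. This assumes the coordinator object is assigned as the web view's navigationDelegate in makeUIView.

import WebKit

// Sketch only, not from the post: a coordinator that reloads the page when
// WebKit kills the web content process (e.g. under memory pressure), so
// playback can recover instead of leaving a blank view.
final class WebViewCoordinator: NSObject, WKNavigationDelegate {
    func webViewWebContentProcessDidTerminate(_ webView: WKWebView) {
        // Called whenever the web content process crashes or is terminated.
        webView.reload()
    }
}
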
After iOS 18.5, the AVFoundation AVCaptureSessionInterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps error has caused a significant increase in camera black screen issues.
Issue: After the iOS 18.5 release, our app is experiencing a significant increase in AVCaptureSessionInterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps errors.

Details:
Our camera-related code has not been updated recently. However, we've observed that the error rate has increased significantly starting from May 2025.
The error rate has risen from approximately 0.02% (2 in 10,000 users) to 0.1% (1 in 1,000 users), a 5x increase in error occurrence.
The frequency has increased noticeably since iOS 18.5.
This is affecting our app's camera functionality and user experience.

Questions:
Are there any known changes in iOS 18.5 regarding camera access management?
What are the recommended best practices to handle this interruption reason?
Are there any API changes we should be aware of?

Best, Shay (An interruption-logging sketch follows this entry.)
2 replies · 1 boost · 472 views · last activity Oct ’25
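For the interruption report above, a minimal observation sketch (not from the post) that logs the interruption reason, so occurrences of videoDeviceNotAvailableWithMultipleForegroundApps can be measured in the field and the preview UI updated.

import AVFoundation

// Sketch only: log capture-session interruptions and their reasons.
final class InterruptionLogger {
    private var observers: [NSObjectProtocol] = []

    func observe(_ session: AVCaptureSession) {
        let center = NotificationCenter.default
        observers.append(center.addObserver(
            forName: .AVCaptureSessionWasInterrupted,
            object: session, queue: .main) { note in
                if let raw = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
                   let reason = AVCaptureSession.InterruptionReason(rawValue: raw) {
                    // e.g. .videoDeviceNotAvailableWithMultipleForegroundApps in Split View
                    print("Capture interrupted: \(reason)")
                }
            })
        observers.append(center.addObserver(
            forName: .AVCaptureSessionInterruptionEnded,
            object: session, queue: .main) { _ in
                print("Capture interruption ended")
            })
    }

    deinit {
        observers.forEach { NotificationCenter.default.removeObserver($0) }
    }
}
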
Torch Freezes Ultra-Wide Camera When Switching Between Wide & Ultra-Wide Lenses (AVFoundation Bug?)
I'm developing an iOS app using AVFoundation for real-time video capture and object detection. While implementing torch functionality with camera switching (between Wide and Ultra-Wide lenses), I encountered a critical issue where the camera freezes when toggling the torch while the Ultra-Wide camera is active.

Issue:
If the torch is ON and I switch from Wide to Ultra-Wide, the camera freezes.
If the Ultra-Wide camera is active and I try to turn the torch ON, the camera freezes.
The iPhone Camera app allows using the torch while recording video with the Ultra-Wide lens, so this should be possible via AVFoundation as well.

Code snippet:

DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    guard let self = self else { return }

    let isSwitchingToUltraWide = !self.isUsingFisheyeCamera
    let cameraType: AVCaptureDevice.DeviceType = isSwitchingToUltraWide ? .builtInUltraWideCamera : .builtInWideAngleCamera
    let cameraName = isSwitchingToUltraWide ? "Ultra Wide" : "Wide"

    guard let selectedCamera = AVCaptureDevice.default(cameraType, for: .video, position: .back) else {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "\(cameraName) camera is not available on this device.")
        }
        return
    }

    do {
        let currentInput = self.videoCapture.captureSession.inputs.first as? AVCaptureDeviceInput
        self.videoCapture.captureSession.beginConfiguration()

        if isSwitchingToUltraWide && self.isFlashlightOn {
            self.forceEnableTorchThroughWide()
        }

        if let currentInput = currentInput {
            self.videoCapture.captureSession.removeInput(currentInput)
        }

        let videoInput = try AVCaptureDeviceInput(device: selectedCamera)
        self.videoCapture.captureSession.addInput(videoInput)
        self.videoCapture.captureSession.commitConfiguration()
        self.videoCapture.updateVideoOrientation()

        DispatchQueue.main.async {
            if let barButton = sender as? UIBarButtonItem {
                barButton.title = isSwitchingToUltraWide ? "Wide" : "Ultra Wide"
                barButton.tintColor = isSwitchingToUltraWide ? UIColor.systemGreen : UIColor.white
            }
            print("Switched to \(cameraName) camera.")
        }

        self.isUsingFisheyeCamera.toggle()
    } catch {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "Failed to switch to \(cameraName) camera: \(error.localizedDescription)")
        }
    }
}

Expected Behavior:
The torch should work when Ultra-Wide is active, just like in the iPhone Camera app.
The camera should not freeze when switching between Wide and Ultra-Wide with the torch ON.
AVCaptureSession should not crash when toggling the torch while Ultra-Wide is active.

Questions & Help Needed:
Is this a known issue with AVFoundation?
How does the iPhone Camera app allow using the torch while recording in Ultra-Wide?
What’s the correct way to switch between Wide and Ultra-Wide cameras without freezing when the torch is active?

Info:
Devices tested: iPhone 13 Pro / iPhone 15 Pro / iPhone 15
iOS version: iOS 17.3 / iOS 18.0
Xcode version: 16.2

(A torch re-apply sketch follows this entry.)
2 replies · 1 boost · 692 views · last activity Oct ’25
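Not the poster's fix, just a sketch of one ordering that tends to matter here: re-apply the torch on the newly added device after commitConfiguration(), and only when that device actually reports torch support (some ultra-wide modules may not).

import AVFoundation

// Sketch only: configure the torch on the device that is now driving the session,
// guarded by its own capability flags, instead of leaving the torch state tied
// to the previously active camera.
func applyTorchIfSupported(on device: AVCaptureDevice, enabled: Bool) {
    guard device.hasTorch, device.isTorchAvailable else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = enabled ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}
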
save audio file in iOS 18 instead of iOS 12
I'm able to get text-to-speech output into an audio file using the following code on iOS 12 (iPhone 8) to create a .caf file:

audioFile = try AVAudioFile(
    forWriting: saveToURL,
    settings: pcmBuffer.format.settings,
    commonFormat: .pcmFormatInt16,
    interleaved: false)

where pcmBuffer.format.settings is:

[AVAudioFileTypeKey: kAudioFileMP3Type,
 AVSampleRateKey: 48000,
 AVEncoderBitRateKey: 128000,
 AVNumberOfChannelsKey: 2,
 AVFormatIDKey: kAudioFormatLinearPCM]

However, this code does not work when I run the app on iOS 18 on an iPhone 13 Pro Max. The audio file is created, but it doesn't sound right: it has a lot of static and the speech sounds very low-pitched. Can anyone give me a hint or an answer? (A small sketch follows this entry.)
2 replies · 0 boosts · 133 views · last activity Mar ’25
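A minimal sketch for the post above (an assumption, not the confirmed cause): derive the file settings entirely from the buffer being written, so sample rate, channel count, and sample format cannot disagree with the data.

import AVFoundation

// Sketch only, assuming `pcmBuffer` is an AVAudioPCMBuffer produced elsewhere
// (e.g. by AVSpeechSynthesizer.write). Creating the file from the buffer's own
// format keeps the written file consistent with the incoming samples, which is
// one way to avoid pitch/static artifacts caused by mismatched settings.
func write(_ pcmBuffer: AVAudioPCMBuffer, to url: URL) throws {
    let file = try AVAudioFile(
        forWriting: url,
        settings: pcmBuffer.format.settings,          // no manually overridden keys
        commonFormat: pcmBuffer.format.commonFormat,
        interleaved: pcmBuffer.format.isInterleaved)
    try file.write(from: pcmBuffer)
}
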
Control system video effects support for CMIO extension
We're distributing a virtual camera with our app that does not profit in the slightest from automatically applied system video effects, both on the video going in (physical camera device) and on the video going out (virtual camera device). I'm aware of setting NSCameraReactionEffectGesturesEnabledDefault in Info.plist and of determining active video effects via the AVCaptureDevice API. Those are obviously crutches, because having to tell users to go look for and click around in menu bar apps is the opposite of a great UX. To make our product's video output more deterministic, I'm looking for a way to tell the CMIO subsystem that our virtual camera does not support any of the system video effects. I'm seeing properties like AVCaptureDevice.Format.isPortraitEffectSupported and AVCaptureDevice.Format.isStudioLightSupported whose documentation refers to the format's ability to support these effects. Since we're setting a CMFormatDescription via CMIOExtensionStreamSource.formats I was hoping to find something in the extensions, but I haven't been successful so far. Can this be done? (A small detection sketch follows this entry.)
2 replies · 0 boosts · 225 views · last activity Nov ’25
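A sketch of the "crutch" mentioned above, i.e. reading the system-wide effect toggles and per-format support flags from AVCaptureDevice; it does not answer the question of opting a CMIO virtual camera out of the effects.

import AVFoundation

// Sketch only: report the current system video-effect state for a device.
func logSystemVideoEffectState(for device: AVCaptureDevice) {
    // Class-level, user-controlled toggles (Control Center / menu bar camera UI).
    print("Portrait effect enabled system-wide: \(AVCaptureDevice.isPortraitEffectEnabled)")
    print("Studio Light enabled system-wide: \(AVCaptureDevice.isStudioLightEnabled)")

    // Per-format capability flags on the device's active format.
    let format = device.activeFormat
    print("Format supports Portrait effect: \(format.isPortraitEffectSupported)")
    print("Format supports Studio Light: \(format.isStudioLightSupported)")
}
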
How to display artwork images from MusicKit with UIKit?
In SwiftUI there is a built-in component for displaying album artwork, ArtworkImage, but there is no equivalent for UIKit. My current approach is to use the .url() method to read the image's URL and then download the image or read it from disk, but the performance is much worse than it previously was with MPMediaItem's artwork image.

let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer { artworkSemaphore.signal() }

            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )

            // I hate doing this as it might very well break in the future
            guard let imageURL, imageURL.scheme == "musicKit" else {
                return completion(nil)
            }

            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }

            completion(image)
        }
    }
}

I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for: without it I noticed that MusicKit was choking after reading too many artworks at once. Can someone from Apple please provide an example of how to use MusicKit with UIKit properly? Ideally (IMO) we would have a method defined on Song and the other MusicKit structures that returns the image for us, just like MPMediaItem has its artwork property. It would make our lives so much easier.
2 replies · 0 boosts · 1k views · last activity Mar ’25
Problems recording audio on Tahoe 26.0 (Intel only)
I have some tried-and-tested code that records and plays back audio via AUHAL which breaks on Tahoe on Intel. The same code works fine on Sequoia, and it also works on Tahoe on Apple Silicon. To start with something simple, the following code to request access to the microphone doesn't work as it should:

bool RequestMicrophoneAccess ()
{
    __block AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType: AVMediaTypeAudio];
    if (status == AVAuthorizationStatusAuthorized)
        return true;

    __block bool done = false;
    [AVCaptureDevice requestAccessForMediaType: AVMediaTypeAudio completionHandler: ^ (BOOL granted)
    {
        status = (granted) ? AVAuthorizationStatusAuthorized : AVAuthorizationStatusDenied;
        done = true;
    }];

    while (!done)
        CFRunLoopRunInMode (kCFRunLoopDefaultMode, 2.0, true);

    return status == AVAuthorizationStatusAuthorized;
}

On Tahoe on Intel, the code runs to completion but granted is always returned as NO. Tellingly, the popup asking the user to grant microphone access is never displayed, and the app is not present in the Privacy pane and never appears there. On Apple Silicon, everything works fine. There are some other problems, but I'm hoping they have a common underlying cause and that the Apple engineers can figure out what's wrong from the information in this post. I'd be happy to test any potential fix. Thanks.
2 replies · 0 boosts · 447 views · last activity Oct ’25
PTTFramework w/ AVAudioSession
Hi all, I have spent a lot of time reading the tech note and watching the WWDC video that introduce the PTTFramework on iOS. I currently have a custom setup where I am using AVAudioEngine to schedule and play buffers that are being streamed through a call. I am looking to use the PTTFramework to let a user trigger this push-to-talk behavior from the lock screen and the various places within the system UI that it provides.

However, I am unsure what the correct behavior is regarding handling of the audio session. Right now I am using the .playback category when there is no active voice transmission, so that devices such as AirPods can stay in A2DP mode where applicable, and transitioning to the .playAndRecord category only when the mic input should become active. Following this change, my AVAudioEngine manager manually activates and deactivates the audio session when the engine is either playing/recording or idle. The documentation states that you should not attempt to activate or deactivate your audio session directly, but should let the framework handle it. Does that mean I need to either call the request-to-transmit delegate function or set an active participant on the channel manager first, and then wait for the didBecomeActive delegate method to fire before I actually attempt to play or record any audio? (I am currently using fullDuplex mode.) I noticed that the delegate method only fires if the audio session wasn't already active before doing one of the above (setting an active participant, requesting transmit). (A small audio-session gating sketch follows this entry.)

Lastly, the PTTFramework also mentions support for PTT accessories, and I notice the didBeginTransmittingFrom parameter has a handsfreeButton case. Is there any documentation or resource describing what is actually supported out of the box here? I am currently handling a lot of the push-to-talk behavior over Bluetooth LE, and I wanted to make sure there isn't overlap with what the system provides. Thank you!
2 replies · 0 boosts · 620 views · last activity Feb ’25
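A minimal sketch (not from the thread) of the pattern the documentation describes: let the PushToTalk framework own session activation, and start or stop the engine from the channel-manager delegate's audio-session callbacks. The class below is what those callbacks would forward into.

import AVFoundation

// Sketch only: gate AVAudioEngine on the PushToTalk audio-session callbacks
// (channelManager(_:didActivate:) / channelManager(_:didDeactivate:)) instead
// of calling AVAudioSession.setActive directly.
final class PTTAudioEngineGate {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    init() {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
    }

    // Forward channelManager(_:didActivate:) here: the session is active now.
    func audioSessionDidActivate() {
        do {
            try engine.start()
        } catch {
            print("Engine failed to start: \(error)")
        }
    }

    // Forward channelManager(_:didDeactivate:) here: stop before the session goes away.
    func audioSessionDidDeactivate() {
        engine.stop()
    }
}
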
Memory Leak in AVAudioPlayer in Simulator only
I have a memory leak when using AVAudioPlayer. I managed to narrow the issue down to a very simple app, whose code I paste in at the end. The memory leak starts immediately when I start playing sound, but only in the Simulator; on a real iPhone there is no memory leak. The memory leak in the Simulator looks like this: [screenshot not reproduced in this listing]

import SwiftUI
import AVFoundation

struct ContentView_Audio: View {
    var sound: AVAudioPlayer?

    init() {
        guard let path = Bundle.main.path(forResource: "cd201", ofType: "mp3") else { return }
        let url = URL(fileURLWithPath: path)

        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers])
        } catch {
            return
        }

        do {
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            return
        }

        do {
            sound = try AVAudioPlayer(contentsOf: url)
        } catch {
            return
        }
    }

    var body: some View {
        HStack {
            Button {
                playSound()
            } label: {
                ZStack {
                    Circle()
                        .fill(.mint.opacity(0.3))
                        .frame(width: 44, height: 44)
                        .shadow(radius: 8)
                    Image(systemName: "play.fill")
                        .resizable()
                        .frame(width: 20, height: 20)
                }
            }
            .padding()

            Button {
                stopSound()
            } label: {
                ZStack {
                    Circle()
                        .fill(.mint.opacity(0.3))
                        .frame(width: 44, height: 44)
                        .shadow(radius: 8)
                    Image(systemName: "stop.fill")
                        .resizable()
                        .frame(width: 20, height: 20)
                }
            }
            .padding()
        }
    }

    private func playSound() {
        guard sound != nil else { return }
        sound?.volume = 1
        // sound?.numberOfLoops = -1
        sound?.play()
    }

    func stopSound() {
        sound?.stop()
    }
}
2 replies · 0 boosts · 124 views · last activity Apr ’25
Lock screen media controls for MusicKit / ApplicationMusicPlayer
Hi, when using ApplicationMusicPlayer from MusicKit, my app automatically gets the media controls on the lock screen: Play/Pause, skip buttons, playback position, etc. I would like to customize these. I've tried a bunch of things, e.g. using MPRemoteCommandCenter, but so far I haven't had any success. Does anyone know how I can customize the media controls of ApplicationMusicPlayer? Thank you. (A remote-command sketch follows this entry.)
2 replies · 0 boosts · 513 views · last activity Sep ’25
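A sketch of the MPRemoteCommandCenter route the post mentions (not a confirmed fix for ApplicationMusicPlayer, which manages the lock-screen controls itself and may ignore these overrides). It shows how a player whose transport the app controls would enable, handle, and hide commands.

import MediaPlayer

// Sketch only: customize which remote commands appear and what they do.
// onToggle is a hypothetical callback supplied by the caller.
func configureRemoteCommands(onToggle: @escaping () -> Void) {
    let center = MPRemoteCommandCenter.shared()

    center.togglePlayPauseCommand.isEnabled = true
    center.togglePlayPauseCommand.addTarget { _ in
        onToggle()
        return .success
    }

    // Example: hide the skip controls by disabling the commands.
    center.nextTrackCommand.isEnabled = false
    center.previousTrackCommand.isEnabled = false
}
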
How to disable/hide Audio Controls on lock screen from WkWebView
Hi, I am trying to remove the audio controls for my app from the lock screen. Since I use WKWebView, there are 3 audio tags in my HTML, and I play and pause them via JS. If I don't play any sound after app launch, there are no audio controls on the lock screen. But as soon as I play one of those 3 files (they are sound effects of less than 3 seconds, e.g. for buttons), the audio controls appear on the lock screen. Note that even when the sounds are paused or not playing, they remain listed on the lock screen.

What I have tried so far without success:

MPNowPlayingInfoCenter.default().nowPlayingInfo = [:]

try audioSession.setCategory(.playback, mode: .default, options: [])
try audioSession.setActive(false, options: .notifyOthersOnDeactivation)

UIApplication.shared.endReceivingRemoteControlEvents()

Another problem is that the app scales with the iOS system setting "Display Zoom". Is there a way to opt out of that? I'm on the latest Xcode version 16.3 and iOS 18, and I have no background modes in my Capabilities. Nothing has worked so far. Has anyone an idea? Greetings
2 replies · 0 boosts · 125 views · last activity May ’25
PHPickerFilter doesn't always apply in Collections tab when using PHPickerViewController
Hi everyone, I’m running into an issue with PHPickerFilter when using PHPickerViewController. When I configure the picker with a .videos and .livePhotos filter, it seems to work correctly in the Photos tab. However, when I switch to the Collections tab, the filter doesn’t always apply: users can still see and select static image assets in certain collections (e.g. from one of the People & Pets sections). Here’s a simplified snippet of my setup:

var configuration = PHPickerConfiguration(photoLibrary: .shared())
configuration.selectionLimit = 1

var filters = [PHPickerFilter]()
filters.append(.videos)
filters.append(.livePhotos)
configuration.filter = PHPickerFilter.any(of: filters)
configuration.preferredAssetRepresentationMode = .current

let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
present(picker, animated: true)

Expected behavior: The picker should consistently respect the filter across both the Photos and Collections tabs, only showing assets that match the filter.

Actual behavior: The filter applies correctly in the Photos tab, but in the Collections tab other asset types are still visible and selectable.

Has anyone else encountered this behavior? Is this expected, a known issue, or am I missing something in the configuration? Thanks in advance! (A defensive filtering sketch follows this entry.)
2 replies · 0 boosts · 445 views · last activity Oct ’25
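Not a confirmed workaround, but a defensive sketch for the behavior above: verify each returned item in the delegate and skip anything that is neither a video nor a Live Photo. PickerDelegate is a hypothetical stand-alone delegate object.

import Photos
import PhotosUI
import UniformTypeIdentifiers

// Sketch only: ignore assets that slipped past the .videos / .livePhotos filter.
final class PickerDelegate: NSObject, PHPickerViewControllerDelegate {
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)

        for result in results {
            let provider = result.itemProvider
            let isVideo = provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier)
            let isLivePhoto = provider.canLoadObject(ofClass: PHLivePhoto.self)

            guard isVideo || isLivePhoto else {
                print("Skipping asset that does not match the requested filter")
                continue
            }
            // Hand the matching video / Live Photo off to the app's normal handling here.
        }
    }
}
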
How to synchronize the clock sources of two audio devices
I created a virtual audio device to capture system audio with a sample rate of 44.1 kHz. After capturing the audio, I forward it to the hardware sound card using AVAudioEngine, also with a sample rate of 44.1 kHz. However, because the two devices' clock sources are not synchronized, drift builds up and problems occur after a period of playback. How can I retrieve the clock source of the hardware device and set it for the virtual device? (A clock-domain sketch follows this entry.)
2 replies · 0 boosts · 230 views · last activity May ’25
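A Core Audio sketch related to the question above (an illustration, not a full answer): reading a device's clock domain with kAudioDevicePropertyClockDomain. Devices that report the same non-zero domain are treated as sharing a clock; a virtual device can only publish its domain from its own driver or extension, so matching the hardware's domain has to happen there.

import CoreAudio

// Sketch only: read the clock domain of a HAL audio device.
func clockDomain(of deviceID: AudioObjectID) -> UInt32? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyClockDomain,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var domain: UInt32 = 0
    var size = UInt32(MemoryLayout<UInt32>.size)
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &domain)
    return status == noErr ? domain : nil
}
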