Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Post

Replies

Boosts

Views

Activity

Media Library Access not working in my App since iOS 18.2
Hi there, I have an app that accesses the media library to save and load files. Since iOS 18.2, access to the media library has stopped working. I've also noticed that our app no longer shows in the list of apps with access to Files (Privacy & Security -> Files & Folders). Oddly, one iPhone on iOS 18.3.1 can access the files while others on the same iOS 18.3.1 cannot. Testing on the Simulator (Mac) also works fine. My Info.plist has contained the media library access keys for a long time and hasn't changed (at least in the last 4 years), including the key "Privacy - Media Library Usage Description". I've also noticed that the popup requesting access to the media library on first launch no longer appears; the network (Wi-Fi) access prompt still shows up, but not the media library one. I'm using Visual Studio with Xamarin on a Mac. I'd really appreciate any help, because this is very odd behavior and it started with iOS 18.2.
0
0
382
Feb ’25
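For questions like the one above, a quick first check is whether the system still considers the app authorized for the media library at all. A minimal diagnostic sketch in Swift (the post uses Xamarin, so this only illustrates the underlying MediaPlayer API; the bindings differ, and it covers the media library prompt, not the Files & Folders entry):

import MediaPlayer

// Log the current media library authorization state and, if it has never been
// asked, trigger the system prompt. If the status is .denied or .restricted,
// the prompt will not reappear and access must be re-enabled in Settings.
func checkMediaLibraryAccess() {
    let status = MPMediaLibrary.authorizationStatus()
    print("Media library authorization status: \(status.rawValue)")
    if status == .notDetermined {
        MPMediaLibrary.requestAuthorization { newStatus in
            print("User responded with status: \(newStatus.rawValue)")
        }
    }
}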
Capacitor app on iOS inline video stops
Hello, I'm experiencing an issue with video playback in my JavaScript (SvelteKit) application using Capacitor. The video plays and loops correctly on Android and in web browsers (including Safari), but stops unexpectedly after a few iterations in the native iOS app. <video src={videoPath} autoplay muted loop playsinline class="h-auto w-full max-w-full object-cover"></video> Has anyone encountered a similar issue or have insights into what might be causing this behavior on iOS? Any suggestions or workarounds would be greatly appreciated. Maybe it has something to do with the iOS power-saving policy? Thank you in advance for your help!
0
0
497
Jan ’25
CarPlay rate button always displays "0x".
Hi. I am working on an audio app for iOS. I have added the CPNowPlayingPlaybackRateButton to my CPNowPlayingTemplate. When the button is tapped, my handler changes the rate on the AVPlayer and updates the MPNowPlayingInfoCenter to the new rate, for example 2.0. Throughout, the CarPlay button always displays "0x". I am wondering how to get this UI to accurately reflect the playback rate the user has selected, as always displaying 0x is a poor user experience. You may suggest MPChangePlaybackRateCommand is relevant here, but I have not been able to get that to work either, and judging by posts online, not many other people have either. I have made a post about that here: https://developer.apple.com/forums/thread/773099 Is this a known Apple bug? Is there a way to get the UI to accurately reflect the playback rate of my audio? Kind regards.
0
0
326
Jan ’25
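One thing worth double-checking for the CarPlay rate question above is that the now-playing info dictionary carries the rate key every time the rate changes. A minimal sketch, assuming an AVPlayer-based setup (the helper name is illustrative, not from the post), which is not a confirmed fix:

import AVFoundation
import MediaPlayer

// Apply a new rate to the player and mirror it into MPNowPlayingInfoCenter
// so the system UI (including CarPlay) has a value to display.
func apply(rate: Float, to player: AVPlayer) {
    player.rate = rate
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyPlaybackRate] = rate
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = player.currentTime().seconds
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}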
Capturing multiple screens no longer works with macOS Sequoia
Capturing more than one display is no longer working with macOS Sequoia. We have a product that allows users to capture up to 2 displays/screens. Our application is using gstreamer, which in turn is based on AVFoundation. I found a quick way to replicate the issue by just running 2 captures from separate terminals. Assuming display 1 has device index 0 and display 2 has device index 1, here are the steps: install gstreamer with
brew install gstreamer
Then open 2 terminal windows and launch the following processes:
terminal 1 (device-index:0):
gst-launch-1.0 avfvideosrc -e device-index=0 capture-screen=true ! queue ! videoscale ! video/x-raw,width=640,height=360 ! videoconvert ! osxvideosink
terminal 2 (device-index:1):
gst-launch-1.0 avfvideosrc -e device-index=1 capture-screen=true ! queue ! videoscale ! video/x-raw,width=640,height=360 ! videoconvert ! osxvideosink
The first process that is launched will show the screen; the second process launched will not. Testing this on macOS Ventura and Sonoma works as expected, showing both screens. I submitted the same issue on Feedback Assistant: FB15900976
2
0
369
Apr ’25
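As a point of comparison for the report above: AVFoundation-based screen capture (which gstreamer's avfvideosrc wraps) has been superseded by ScreenCaptureKit on recent macOS, so capturing both displays with ScreenCaptureKit can help tell whether the regression is specific to the AVFoundation path. This is a swapped-in technique, not what the gstreamer pipeline uses; a minimal sketch:

import ScreenCaptureKit

// Start one SCStream per attached display and receive frames via SCStreamOutput.
final class DisplayCapture: NSObject, SCStreamOutput, SCStreamDelegate {
    private var streams: [SCStream] = []

    func startCapturingAllDisplays() async throws {
        let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
        for display in content.displays {
            let filter = SCContentFilter(display: display, excludingWindows: [])
            let config = SCStreamConfiguration()
            config.width = 640
            config.height = 360
            let stream = SCStream(filter: filter, configuration: config, delegate: self)
            try stream.addStreamOutput(self, type: .screen, sampleHandlerQueue: .global())
            try await stream.startCapture()
            streams.append(stream)
        }
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
        // Each display delivers frames here; route them into your pipeline as needed.
    }
}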
Has Dext perfectly replaced Kext functionality?
I found some documentation about kexts, but I heard Apple has now moved to dexts. So I was wondering whether a dext can completely replicate what the previous kexts could do. https://developer.apple.com/documentation/kernel/implementing_drivers_system_extensions_and_kexts That page says: "Important: In macOS 11 and later, the kernel doesn't load a kext if an equivalent DriverKit solution exists. You may continue to use kexts in macOS 10.15 and earlier."
0
0
256
Mar ’25
Changing playback rate of AVQueuePlayer not working for some AirPlay devices.
Hi. I am working on an audio app for iOS. I have implemented UI and handling which allows the user to change the playback rate of audio. When the user selects a different rate, I update the rate property on my AVQueuePlayer. This works well on device. When I use AirPlay, it works for some devices and not for others. Some devices won't change playback rate and will always play at 1x speed. Is this possibly a limitation of those 3rd-party devices? Or is there something I'm missing or should check? Would love to get playback rate changes working across all AirPlay devices with our app. Kind regards.
0
0
428
Jan ’25
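For the AirPlay rate question above, one way to narrow it down is to watch whether the receiver silently resets the player's effective rate. A small observation sketch with illustrative names, not a fix:

import AVFoundation
import Combine

var rateCancellables = Set<AnyCancellable>()

// Log every change of the effective playback rate; some AirPlay receivers
// appear to clamp it back to 1.0, which this would make visible.
func observeRate(of player: AVQueuePlayer) {
    player.publisher(for: \.rate)
        .removeDuplicates()
        .sink { rate in
            print("Effective rate: \(rate), external playback active: \(player.isExternalPlaybackActive)")
        }
        .store(in: &rateCancellables)
}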
Custom Share Destination stopped working in FCP X 11
We integrate with FCP X using a custom share destination and the AppleScript interface. This had been working fine until the recent version 11 update of FCP X. With this update we are no longer receiving the open event when the export has completed. We get the Apple event to create the asset, and the file is exported to the location we set in the response. There is just no open event after that. I suspect something is wrong with our scripting support, but I have no idea what or how to troubleshoot it. This works fine in 10.8.1 and below.
0
0
378
Dec ’24
TTS Audio Unit Extension: File Write Access in App Group Container Denied Despite Proper Entitlements
I'm developing a TTS Audio Unit Extension that needs to write trace/log files to a shared App Group container. While the main app can successfully create and write files to the container, the extension gets sandbox denied errors despite having proper App Group entitlements configured.
Setup:
Main app (Flutter) and TTS Audio Unit Extension share the same App Group
App Group is properly configured in the developer portal and entitlements
Main app successfully creates and uses files in the container
Container structure shows existing directories (config/, dictionary/) with populated files
Both targets have the App Group capability enabled and entitlements set
Current behavior:
Extension can access/read the App Group container
Extension can see existing directories and files
All write attempts are blocked with "sandbox deny(1) file-write-create" errors
Code example:
const char* createSharedGroupPathWithComponent(const char* groupId, const char* component) {
    NSString* groupIdStr = [NSString stringWithUTF8String:groupId];
    NSString* componentStr = [NSString stringWithUTF8String:component];
    NSURL* url = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:groupIdStr];
    NSURL* fullPath = [url URLByAppendingPathComponent:componentStr];
    NSError *error = nil;
    if (![[NSFileManager defaultManager] createDirectoryAtPath:fullPath.path withIntermediateDirectories:YES attributes:nil error:&error]) {
        NSLog(@"Unable to create directory %@", error.localizedDescription);
    }
    return [[fullPath path] UTF8String];
}
Error output:
Sandbox: simaromur-extension(996) deny(1) file-write-create /private/var/mobile/Containers/Shared/AppGroup/36CAFE9C-BD82-43DD-A962-2B4424E60043/trace
Key questions:
Are there additional entitlements required for TTS Audio Unit Extensions to write to App Group containers?
Is this a known limitation of TTS Audio Unit Extensions?
What is the recommended way to handle logging/tracing in TTS Audio Unit Extensions?
If writing to App Group containers is not supported, what alternatives are available?
Current entitlements:
<dict>
    <key>com.apple.security.application-groups</key>
    <array>
        <string>group.com.<company>.<appname></string>
    </array>
</dict>
0
0
99
Apr ’25
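As an aside on the logging question above (this does not address the App Group write denial itself), the unified logging system works from app extensions without any file-write access, so it is one possible fallback for tracing. A minimal Swift sketch; the subsystem and category names are illustrative:

import os

// Unified logging does not need the App Group container; messages can be read
// later with Console.app or `log collect`.
let ttsLogger = Logger(subsystem: "com.example.app.tts-extension", category: "trace")

func traceSynthesis(step: String) {
    ttsLogger.debug("TTS trace: \(step, privacy: .public)")
}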
Changing instrument with AVMIDIControlChangeEvent bankSelect
I've been trying to use AVMIDIControlChangeEvent with a bankSelect message type to change the instrument the sequencer uses on an AVMusicTrack, with no luck. I started with the Apple AVAEMixerSample, converting the initial setup/loading and the portions dealing with the sequencer to Swift. I got that working and playing the "bluesyRiff", and then modified it to play individual notes. So my createAndSetupSequencer looked like

func createAndSetupSequencer() {
    sequencer = AVAudioSequencer(audioEngine: engine)
    // guard let midiFileURL = Bundle.main.url(forResource: "bluesyRiff", withExtension: "mid") else {
    //     print(" failed guard trying to get URL for bluesyRiff")
    //     return
    // }
    let track = sequencer.createAndAppendTrack()
    var currTime = 1.0
    for i: UInt32 in 0...8 {
        let newNoteEvent = AVMIDINoteEvent(channel: 0, key: 60+i, velocity: 64, duration: 2.0)
        track.addEvent(newNoteEvent, at: AVMusicTimeStamp(currTime))
        currTime += 2.0
    }

The notes played, so then I also replaced the gs_instruments sound bank with GeneralUser GS MuseScore v1.442, first by trying

guard let soundBankURL = Bundle.main.url(forResource: "GeneralUser GS MuseScore v1.442", withExtension: "sf2") else { return }
do {
    try sampler.loadSoundBankInstrument(at: soundBankURL, program: 0x001C, bankMSB: 0x79, bankLSB: 0x08)
} catch {
    ....
}

This appears to work: the instrument (8, which is "Funk Guitar") plays. If I change to bankLSB: 0x00 I get the "Palm Muted Guitar". So I know that the soundfont has these instruments.

Stuff goes off the rails when I try to change the instruments in createAndSetupSequencer. Putting

let programChange = AVMIDIProgramChangeEvent(channel: 0, programNumber: 0x001C)
let bankChange = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.bankSelect, value: 0x00)
track.addEvent(programChange, at: AVMusicTimeStamp(1.0))
track.addEvent(bankChange, at: AVMusicTimeStamp(1.0))

just before my add-note loop doesn't produce any change. Loading bankLSB 8 (Funk) in sampler.loadSoundBankInstrument and trying to change with bankSelect 0 (Palm Muted) in createAndSetupSequencer results in instrument 8 (Funk) playing, not Palm Muted. Loading bankLSB 0 (Palm Muted) and trying to change with bankSelect 8 (Funk) doesn't work either; 0 (Palm Muted) plays.

I also tried sampler.loadInstrument(at: soundBankURL), and then I always get the first instrument in the sound font file (piano) no matter what values I put in my programChange/bankChange. I've also changed the time in the track.addEvent to 0, 1.0, 3.0, etc., to no success.

The sampler.loadSoundBankInstrument specifies two UInt8 parameters, bankMSB and bankLSB, while the AVMIDIControlChangeEvent bankSelect value is UInt32, suggesting it might be some combination of bankMSB and bankLSB. But the documentation makes no mention of what this should look like. I tried various combinations of 0x7908, 0x0879, etc., to no avail.

I will also point out that I am able to successfully execute other control change events. For example, adding

if i == 1 {
    let portamentoOnEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamento, value: 0xFF)
    track.addEvent(portamentoOnEvent, at: AVMusicTimeStamp(currTime))
    let portamentoRateEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamentoTime, value: 64)
    track.addEvent(portamentoRateEvent, at: AVMusicTimeStamp(currTime))
}

does produce a change in the sound.
(As an aside, a definition of what portamento time is, other than "the rate of portamento", would be welcome. Is it notes/second? freq/minute? beats/hour?) I was able to get the instrument to change in a different program using MusicPlayer and a series of MusicTrackNewMIDIChannelEvent on a track, but those operate on a MusicTrack, not the AVMusicTrack which the sequencer uses. Has anyone been successful in switching instruments through an AVMIDIControlChangeEvent, or have any feedback on how to do this?
0
0
347
Mar ’25
ApplicationMusicPlayer / MediaPlayer Refuses to Play
We use BassDSDPlayer / SFBAudioEngine to play just about any file, but playing Apple Music is failing. All subscriptions are up to date. We stop the SFBAudioEngine and the BassDSDPlayer before playing Apple Music, to no avail.

PRINTS:
Supported files in /Users/dorian/Music/Music/Media.localized/Music/4: 28364
Apple Music is authorized and can play catalog.
Resetting default output device...
Releasing BassDSDPlayer audio device...
BassDSDPlayer: Audio device released.
STOPPED sfbAudioDevice
Default output device is ID: 76
applicationQueuePlayer _establishConnectionIfNeeded timeout [ping did not pong]
applicationQueuePlayer _establishConnectionIfNeeded timeout [ping did not pong]
Player State - After resetting output:
Playback Status: stopped
Queue Count: 0
No track is playing.
Music player reset successfully.
BassDSDPlayer: Audio device released.
Default output device set successfully: 76
Default output device is ID: 76
Default output device set successfully: 76
Default output device ID: 76
Validated PlayParameters for track: squabble up
PlayParameters: PlayParameters(id: 1781270321, kind: "song", isLibrary: nil, catalogID: nil, libraryID: nil, deviceLocalID: nil, rawValues: [:])
Starting playback...
Player State - After playback:
Playback Status: stopped
Queue Count: 1
No track is playing.
Notification BASS DSD NSConcreteNotification 0x600007ce2b00 {name = kUpdateSongInfo; object = { AlbumTitle = GNX; ArtistName = "Kendrick Lamar"; SongArtwork = "<NSImage 0x6000041b7ca0 Size={300, 300} RepProvider=<NSImageArrayRepProvider: 0x600003518770, reps:(\n "NSBitmapImageRep 0x600009ed9dc0 Size={300, 300} ColorSpace=(not yet loaded) BPS=8 BPP=(not yet loaded) Pixels=300x300 Alpha=NO Planar=NO Format=(not yet loaded) CurrentBacking=nil (faulting) CGImageSource=0x600007ce15c0"\n)>>"; SongLength = "157.992"; SongTitle = "squabble up"; Source = AppleMusic; }}
Apple Music track loaded: squabble up by Kendrick Lamar
Player State - Before play:
Playback Status: stopped
Queue Count: 1
No track is playing.
prepareToPlay failed [no target descriptor]
NSError Code: 1, Domain: MPMusicPlayerControllerErrorDomain
Player State - After play:
Playback Status: stopped
Queue Count: 1
No track is playing.

func playAppleMusicTracks(tracks: [Track]) {
    AppleMusicManager.shared.isAuthorizedAndReadyForPlayback { isAuthorized in
        guard isAuthorized else {
            print("Apple Music authorization or capabilities insufficient for playback.")
            return
        }
        print("Resetting default output device...")
        self.stopSFBAudioDevice()
        self.resetMusicPlayer()
        self.resetAudioSystem()
        self.ensureOutputDeviceReady()
        Task {
            for track in tracks {
                guard self.validatePlayParameters(for: track) else { continue }
                do {
                    try await ApplicationMusicPlayer.shared.queue.insert(track, position: .afterCurrentEntry)
                    guard !ApplicationMusicPlayer.shared.queue.entries.isEmpty else {
                        print("Queue is empty after queuing. Playback cannot proceed.")
                        return
                    }
                    self.notifyAppleMusicTrackInfo(track)
                } catch {
                    print("Error starting playback: \(error)")
                    if let nsError = error as NSError? {
                        print("NSError Code: \(nsError.code), Domain: \(nsError.domain)")
                    }
                }
            }
            MusicKitWrapper.shared.logPlayerState(message: "After playback")
        }
    }
}

@objc public class MusicKitWrapper: NSObject {
    @objc public static let shared = MusicKitWrapper()
    private let player = ApplicationMusicPlayer.shared

    // Play the current track
    @objc public func play() {
        guard !player.queue.entries.isEmpty else {
            print("Queue is empty. Cannot start playback.")
            return
        }
        logPlayerState(message: "Before play")
        Task {
            do {
                try await player.prepareToPlay()
                try await player.play()
                print("Playback started successfully.")
            } catch {
                if let nsError = error as NSError? {
                    print("NSError Code: \(nsError.code), Domain: \(nsError.domain)")
                }
            }
            logPlayerState(message: "After play")
        }
    }
}

Any help would be appreciated. Thanks!
5
0
602
Jan ’25
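For comparison with the setup above, the minimal MusicKit flow assigns a queue to ApplicationMusicPlayer and then plays it; the "prepareToPlay failed [no target descriptor]" error is often about authorization or queue state rather than the audio output devices, though the post does not confirm that. A hedged sketch of that baseline (the catalog search shown here is only illustrative):

import MusicKit

// Minimal baseline: request authorization, build a queue from a catalog search,
// and start playback with ApplicationMusicPlayer.
func playFirstMatch(for term: String) async throws {
    guard await MusicAuthorization.request() == .authorized else { return }

    var request = MusicCatalogSearchRequest(term: term, types: [Song.self])
    request.limit = 1
    let response = try await request.response()
    guard let song = response.songs.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()
}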
Usage of colorCurvesFilter
How can I use my RGB curve points:

let redCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.152), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let greenCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.247, y: 0.196), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let blueCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.184), CIVector(x: 0.466, y: 0.466), CIVector(x: 1, y: 1)]

in the colorCurves filter which I've found in the Apple docs:

func colorCurves(inputImage: CIImage) -> CIImage {
    let colorCurvesEffect = CIFilter.colorCurves()
    colorCurvesEffect.inputImage = inputImage
    colorCurvesEffect.curvesDomain = CIVector(x: 0, y: 1)
    colorCurvesEffect.curvesData = Data(
        bytes: [Float32]([
            0.0, 0.0, 0.0,
            0.8, 0.8, 0.8,
            1.0, 1.0, 1.0
        ]), count: 36)
    colorCurvesEffect.colorSpace = CGColorSpaceCreateDeviceRGB()
    return colorCurvesEffect.outputImage!
}
0
0
365
Jan ’25
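One way to bridge the two representations in the post above is to sample each channel's control points into the packed float array that curvesData expects (interleaved R, G, B values at evenly spaced positions across curvesDomain, matching the 3-samples-times-3-channels layout of the docs example). A sketch assuming simple linear interpolation between the control points, which may not match the curve shape the points were authored with:

import CoreImage
import CoreImage.CIFilterBuiltins

// Linearly interpolate a curve defined by CIVector control points at position x (0...1).
func sample(_ points: [CIVector], at x: CGFloat) -> Float {
    guard let upperIndex = points.firstIndex(where: { $0.x >= x }) else { return Float(points.last?.y ?? 0) }
    if upperIndex == 0 { return Float(points[0].y) }
    let lower = points[upperIndex - 1], upper = points[upperIndex]
    let t = (x - lower.x) / max(upper.x - lower.x, .ulpOfOne)
    return Float(lower.y + t * (upper.y - lower.y))
}

// Build curvesData by sampling the three channels at `count` evenly spaced positions.
func curvesData(red: [CIVector], green: [CIVector], blue: [CIVector], count: Int = 32) -> Data {
    var values: [Float32] = []
    for i in 0..<count {
        let x = CGFloat(i) / CGFloat(count - 1)
        values.append(sample(red, at: x))
        values.append(sample(green, at: x))
        values.append(sample(blue, at: x))
    }
    return values.withUnsafeBufferPointer { Data(buffer: $0) }
}

With this, the filter from the post could be fed curvesData(red: redCurve, green: greenCurve, blue: blueCurve), keeping curvesDomain at CIVector(x: 0, y: 1).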
Audio / Video sync issue on iOS using AVSampleBufferRenderSynchronizer
My current app implements a custom video player, based on a AVSampleBufferRenderSynchronizer synchronising two renderers: an AVSampleBufferDisplayLayer receiving decoded CVPixelBuffer-based video CMSampleBuffers, and an AVSampleBufferAudioRenderer receiving decoded lpcm-based audio CMSampleBuffers. The AVSampleBufferRenderSynchronizer is started when the first image (in presentation order) is decoded and enqueued, using avSynchronizer.setRate(_ rate: Float, time: CMTime), with rate = 1 and time the presentation timestamp of the first decoded image. Presentation timestamps of video and audio sample buffers are consistent, and on most streams, the audio and video are correctly synchronized. However on some network streams, on iOS, the audio and video aren't synchronized, with a time difference that seems to increase with time. On the other hand, with the same player code and network streams on macOS, the synchronization always works fine. This reminds me of something I've read, about cases where an AVSampleBufferRenderSynchronizer could not synchronize audio and video, causing them to run with independent and potentially drifting clocks, but I cannot find it again. So, any help / hints on this sync problem will be greatly appreciated! :)
2
0
1.2k
Apr ’25
Use Metal to convert HDR PixelBuffer to SDR PixelBuffer
I have seen some demos that convert HDR video to SDR pixel buffers using AVAssetReader, AVVideoComposition, AVComposition, and AVFoundation. But in some cases I want to render HDR pixel buffers and record video.

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;
AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([videoDevice isVideoHDRSupported]) {
    NSError *error = nil;
    if ([videoDevice lockForConfiguration:&error]) {
        videoDevice.automaticallyAdjustsVideoHDREnabled = NO;
        videoDevice.videoHDREnabled = YES; // enable HDR
        [videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Error: %@", error.localizedDescription);
    }
}

Real-time processing of the HDR data requires processing the video frame data (for example, applying filters) while ensuring that the processing chain supports 10-bit color depth and HDR metadata, and also using the image buffers for object tracking, etc. How can I solve this problem?
1
0
464
Jan ’25
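For the question above, one commonly used path is Core Image backed by Metal rather than a hand-written Metal shader: render the HDR pixel buffer into an 8-bit buffer with an SDR color space. A sketch with assumed buffer formats; proper tone mapping for HDR10/HLG content usually needs more care than a plain color-space conversion:

import CoreImage
import CoreVideo

let ciContext = CIContext() // Metal-backed by default on modern devices

// Render an HDR CVPixelBuffer into a pre-allocated 8-bit BGRA buffer using a BT.709 color space.
func convertToSDR(hdrBuffer: CVPixelBuffer, into sdrBuffer: CVPixelBuffer) {
    let image = CIImage(cvPixelBuffer: hdrBuffer)
    let sdrColorSpace = CGColorSpace(name: CGColorSpace.itur_709)!
    ciContext.render(image,
                     to: sdrBuffer,
                     bounds: image.extent,
                     colorSpace: sdrColorSpace)
}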
Issue with Audio Sample Rate Conversion in Video Calls
Hey everyone, I'm encountering an issue with audio sample rate conversion that I'm hoping someone can help with. Here's the breakdown: Issue Description: I've installed a tap on an input device to convert audio to an optimal sample rate. There's a converter node added on top of this setup. The problem arises when joining Zoom or FaceTime calls—the converter gets deallocated from memory, causing the program to crash. Symptoms: The converter node is being deallocated during video calls. The program crashes entirely when this happens. Traditional methods of monitoring sample rate changes (tracking nominal or actual sample rates) aren't working as expected. The Big Challenge: I can't figure out how to properly monitor sample rate changes. Listeners set up to track these changes don't trigger when the device joins a Zoom or FaceTime call. Please, if anyone has experience with this or knows a solution, I'd really appreciate your help. Thanks in advance! ⁠
0
0
66
Apr ’25
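For the sample-rate question above, the canonical way to watch a device's nominal sample rate on macOS is a Core Audio property listener; comparing against a registration like the one below may help confirm whether the listener itself is the problem or whether the rate genuinely does not change during the call. The device ID and queue are placeholders:

import CoreAudio
import Foundation

// Register a listener for nominal sample rate changes on a given audio device.
func observeSampleRate(of deviceID: AudioObjectID) {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyNominalSampleRate,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)

    let status = AudioObjectAddPropertyListenerBlock(deviceID, &address, DispatchQueue.main) { _, _ in
        var rateAddress = AudioObjectPropertyAddress(
            mSelector: kAudioDevicePropertyNominalSampleRate,
            mScope: kAudioObjectPropertyScopeGlobal,
            mElement: kAudioObjectPropertyElementMain)
        var rate: Float64 = 0
        var size = UInt32(MemoryLayout<Float64>.size)
        AudioObjectGetPropertyData(deviceID, &rateAddress, 0, nil, &size, &rate)
        print("Nominal sample rate is now \(rate)")
    }
    if status != noErr { print("Listener registration failed: \(status)") }
}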
AVPlayer periodic time observer stops notifying when switching back from AirPlay to local playback
The app registers a periodic time observer to the AVPlayer when the playback starts and it works fine. When switching to AirPlay during playback, the periodic time observation continues working as expected. However, when switching back to local playback, the periodic time observer does not fire anymore until a seek is performed. The app removes the periodic time observer only when the playback stops. I can see that when switching back to local playback, the timeControlStatus successively changes to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate) then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls) and finally to .playing But the time observation does not work anymore. Also, the issue is systematic with Live and VOD streams providing a program date (with HLS property #EXT-X-PROGRAM-DATE-TIME), with or without any DRM, and is never reproduced with other VOD streams.
2
0
107
Apr ’25
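A workaround sometimes tried for the behavior above (not a confirmed fix) is to tear down and re-register the periodic observer when playback returns to the local route, for example once timeControlStatus settles back to .playing. A sketch with illustrative names:

import AVFoundation

final class ProgressObserver {
    private var token: Any?
    private let player: AVPlayer

    init(player: AVPlayer) { self.player = player }

    // Remove any existing observer and register a fresh one.
    func restart() {
        if let token { player.removeTimeObserver(token) }
        token = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
            queue: .main) { time in
                print("Playback position: \(time.seconds)")
            }
    }
}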
Error codes -1102 and 1852797029 when entering the background playing HLS assets
In our logging tools (Firebase) I see a lot of errors reported when users are playing content and the app transitions to the background. A AVPlayerItemFailedToPlayToEndTime notification is fired with an error containing error codes like -1102 and 1852797029 which seem to correspond to NSURLErrorNoPermissionsToReadFile and kCMIOHardwareIllegalOperationError respectively. To me, it looks like these might have something to do with caching logic. The items being played are HLS streams and we make use of AVAssetDownloadTask to make any streamed content offline available. Our setup is similar to the sample provided here: https://developer.apple.com/documentation/avfoundation/using-avfoundation-to-play-and-persist-http-live-streams. Whenever an item is selected for playback the app will check if a cached version is available and if so gets the url to the stored file like the "localAssetForStream()" method in the example, or get the asset from a currently running AVAssetDownloadTask for the item, or else, starts a new AVAssetDownloadTask and returns a AVAsset from that task to play. This seems to work fine, and I can't reproduce the issues our users and our logging tools are reporting. Is there some case I am missing where AVAssetDownloadTask and associated AVAssets might become unreadable when the app transitions to the background? Or do these errors indicate a different problem entirely?
0
0
215
Mar ’25
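When failures like the ones above only show up in telemetry, logging the full error chain from the notification can help tell a file-permission problem apart from a network or hardware one. A small diagnostic sketch (it does not explain the -1102 error itself):

import AVFoundation

// Log the error and any underlying error carried by the failed-to-play notification.
let failureObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemFailedToPlayToEndTime,
    object: nil,
    queue: .main) { notification in
        guard let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError else { return }
        print("Failed to play to end: \(error.domain) \(error.code)")
        if let underlying = error.userInfo[NSUnderlyingErrorKey] as? NSError {
            print("Underlying error: \(underlying.domain) \(underlying.code)")
        }
    }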
Inconsistent FPS (20 FPS Issue) While Recording Video Using AVCaptureSession.
Hi, I am recording video in my app and setting up the FPS using the code below. But sometimes the video is recorded at 20 FPS. Can someone please let me know what I am doing wrong?

private func eightBitVariantOfFormat() -> AVCaptureDevice.Format? {
    let activeFormat = self.videoDeviceInput.device.activeFormat
    let fpsToBeSupported: Int = 60
    debugPrint("fpsToBeSupported - \(fpsToBeSupported)" as AnyObject)

    let allSupportedFormats = self.videoDeviceInput.device.formats
    debugPrint("all formats - \(allSupportedFormats)" as AnyObject)

    let activeDimensions = CMVideoFormatDescriptionGetDimensions(activeFormat.formatDescription)
    debugPrint("activeDimensions - \(activeDimensions)" as AnyObject)

    let filterBasedOnDimensions = allSupportedFormats.filter({
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).width == activeDimensions.width) &&
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).height == activeDimensions.height)
    })
    if filterBasedOnDimensions.isEmpty {
        // Dimension not found. Required format not found to handle.
        debugPrint("Dimension not found" as AnyObject)
        return activeFormat
    }
    debugPrint("filterBasedOnDimensions - \(filterBasedOnDimensions)" as AnyObject)

    let filterBasedOnMaxFrameRate = filterBasedOnDimensions.compactMap({ format in
        let videoSupportedFrameRateRanges = format.videoSupportedFrameRateRanges
        if !videoSupportedFrameRateRanges.isEmpty {
            let contains = videoSupportedFrameRateRanges.contains(where: { Int($0.maxFrameRate) >= fpsToBeSupported })
            if contains {
                return format
            } else {
                return nil
            }
        } else {
            return nil
        }
    })
    debugPrint("allFormatsToBeSupported - \(filterBasedOnMaxFrameRate)" as AnyObject)

    guard !filterBasedOnMaxFrameRate.isEmpty else {
        debugPrint("Taking default active format as nothing found when filtered using desired FPS" as AnyObject)
        return activeFormat
    }

    var formatToBeUsed: AVCaptureDevice.Format!
    if let four_two_zero_v = filterBasedOnMaxFrameRate.first(where: {
        CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    }) {
        // 'vide'/'420v'
        formatToBeUsed = four_two_zero_v
    } else {
        // Take the first one from the above array.
        formatToBeUsed = filterBasedOnMaxFrameRate.first
    }

    do {
        try self.videoDeviceInput.device.lockForConfiguration()
        self.videoDeviceInput.device.activeFormat = formatToBeUsed
        self.videoDeviceInput.device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        self.videoDeviceInput.device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        if videoDeviceInput.device.isFocusModeSupported(.continuousAutoFocus) {
            self.videoDeviceInput.device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
        }
        self.videoDeviceInput.device.unlockForConfiguration()
    } catch let error {
        debugPrint("\(error)" as AnyObject)
    }
    return formatToBeUsed
}
1
0
398
Mar ’25
Memory leak AVAudioPlayer
Let's consider the following code. I've created an actor that loads a list of .mp3 files from a Bundle and then makes them available for audio playback. Unfortunately, I'm experiencing a memory leak at the play method's player.play() call. From Instruments I get:
_malloc_type_malloc_outlined libsystem_malloc.dylib
start_wqthread libsystem_pthread.dylib

private actor AudioActor {
    enum Failure: Error {
        case soundsNotLoaded([AudioPlayerClient.Sound: Error])
    }

    enum Player {
        case music(AVAudioPlayer)
    }

    var players: [Sound: Player] = [:]
    let bundles: [Bundle]

    init(bundles: UncheckedSendable<[Bundle]>) {
        self.bundles = bundles.wrappedValue
    }

    func load(sounds: [Sound]) throws {
        try AVAudioSession.sharedInstance().setActive(true, options: [])
        var errors: [Sound: Error] = [:]
        for sound in sounds {
            // Look up the resource across all provided bundles.
            guard let url = bundles.compactMap({ $0.url(forResource: sound.name, withExtension: "mp3") }).first else { continue }
            do {
                self.players[sound] = try .music(AVAudioPlayer(contentsOf: url))
            } catch {
                errors[sound] = error
            }
        }
        guard errors.isEmpty else {
            throw Failure.soundsNotLoaded(errors)
        }
    }

    func play(sound: Sound, loops: Int?) throws {
        guard let player = self.players[sound] else { return }
        switch player {
        case let .music(player):
            player.numberOfLoops = loops ?? -1
            player.play()
        }
    }

    func stop(sound: Sound) throws {
        guard let player = self.players[sound] else {
            throw Failure.soundsNotLoaded([:])
        }
        switch player {
        case let .music(player):
            player.stop()
        }
    }
}
0
0
101
Mar ’25