Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic


WWDC25 Camera & Photos group lab summary (Part 3 of 3)
(Note: this is part 3 of a 3-part posting. See Part 1 or Part 2.) At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Camera & Photos, which ran for one hour at 6 PM PST on Tuesday, June 10th, 2025.

Question 24: What's the best approach to optimizing barcode scanning with AVFoundation or Vision in low-light or angled scenarios?
- Turn on the flash in low-light scenarios.
- Lower the frame rate to improve exposure and reduce noise.
- Wait until the capture is in focus, and notify your user that they need to get closer.

Question 25: Recent iPhone models introduced a macro mode that automatically switches between lenses to account for the difference in focal distance. Is there an official API for this, or should I implement it myself using LiDAR values?
- Using builtInTripleCamera or builtInDualWideCamera will automatically switch to macro when available.

Question 26: Is there a way to quickly create a thumbnail after the user selects an image with PhotosPicker?
- Use the File Provider API.

Additional questions from the WWDC25 in-person labs that occurred later in the WWDC week:

Question 1: When should I build my own photo picker instead of using the system one?
- Always start with the system picker, then try the embeddable customization APIs, and fall back to a custom picker only for very special needs.

Question 2: I'm building a new camera app for pros and I want to give my users the most unprocessed image possible and the most control over the capture possible. How can I do that with AVCapture?
- For stills: Bayer RAW capture, or ProRAW if you want Apple's processing and dynamic range.
- For video: ProRes Log. Custom exposure settings are available through the APIs, and global/local tonemapping is also worth considering.
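A hedged sketch pulling Questions 24 and 25 together: select a virtual device so macro switching is automatic, and lower the frame rate for low light. The 15 fps floor and the chosen symbologies are illustrative assumptions, not lab recommendations.

    import AVFoundation

    // Sketch only. A virtual device such as builtInDualWideCamera switches
    // to macro automatically when the subject gets close (Question 25).
    func makeLowLightBarcodeSession() -> AVCaptureSession? {
        let session = AVCaptureSession()
        guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return nil }
        session.addInput(input)

        // Lowering the frame rate lengthens the maximum per-frame exposure,
        // reducing noise in low light (Question 24). 15 fps is a hypothetical
        // choice; check the format's videoSupportedFrameRateRanges first.
        do {
            try device.lockForConfiguration()
            device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 15)
            device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 15)
            device.unlockForConfiguration()
        } catch {
            return nil
        }

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return nil }
        session.addOutput(output)
        output.metadataObjectTypes = [.qr, .ean13, .code128]
        return session
    }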
Replies: 0 · Boosts: 0 · Views: 366 · Activity: Jul ’25
In Speech framework is SFTranscriptionSegment timing supposed to be off and speechRecognitionMetadata nil until isFinal?
I'm working in Swift/SwiftUI, running Xcode 16.3 on macOS 15.4, and I've seen this when running in the iOS Simulator and in a macOS app run from Xcode. I've also seen this behaviour with 3 different audio files. Nothing in the documentation says that the speechRecognitionMetadata property on an SFSpeechRecognitionResult will be nil until isFinal, but that's the behaviour I'm seeing. I've stripped my class down to the following:

    private var isAuthed = false

    // I call this in a .task {} in my SwiftUI View
    public func requestSpeechRecognizerPermission() {
        SFSpeechRecognizer.requestAuthorization { authStatus in
            Task {
                self.isAuthed = authStatus == .authorized
            }
        }
    }

    public func transcribe(from url: URL) {
        guard isAuthed else { return }
        let locale = Locale(identifier: "en-US")
        let recognizer = SFSpeechRecognizer(locale: locale)
        let recognitionRequest = SFSpeechURLRecognitionRequest(url: url)
        // the behaviour occurs whether I set this to true or not; I recently
        // set it to true to see if it made a difference
        recognizer?.supportsOnDeviceRecognition = true
        recognitionRequest.shouldReportPartialResults = true
        recognitionRequest.addsPunctuation = true
        recognizer?.recognitionTask(with: recognitionRequest) { (result, error) in
            guard let result else { return }
            if result.isFinal {
                // speechRecognitionMetadata is not nil
            } else {
                // speechRecognitionMetadata is nil
            }
        }
    }

Further, and this isn't documented either, the SFTranscriptionSegment values don't have correct timestamp and duration values until isFinal. The values aren't all zero, but they don't align with the timing in the audio, and they change to accurate values when isFinal is true. The transcription otherwise "works", in that I get transcription text before isFinal, and if I wait for isFinal the segments are correct and speechRecognitionMetadata is filled with values. The context here is that I'm trying to generate a transcription whose spoken sections I can highlight as the audio plays, and I'm thinking I must be trying to use the Speech framework in a way it does not work. I got my concept working if I pre-process the audio (i.e. run it through until isFinal and save the results I need to JSON), but being able to do even a rougher version of it 'on the fly' - which requires segments to have the right timestamp/duration before isFinal - is perhaps impossible?
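For what it's worth, here is a minimal sketch of the pre-processing workaround mentioned at the end of the post: run recognition to completion and capture the per-segment timings once isFinal fires. The SegmentTiming type is purely illustrative.

    import Foundation
    import Speech

    struct SegmentTiming: Codable {
        let text: String
        let timestamp: TimeInterval
        let duration: TimeInterval
    }

    // Runs recognition to completion and returns per-segment timings, which
    // can then be encoded to JSON and replayed during playback.
    func preprocessTranscript(url: URL, completion: @escaping ([SegmentTiming]) -> Void) {
        let request = SFSpeechURLRecognitionRequest(url: url)
        request.shouldReportPartialResults = false
        let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
        recognizer?.recognitionTask(with: request) { result, _ in
            guard let result, result.isFinal else { return }
            let segments = result.bestTranscription.segments.map {
                SegmentTiming(text: $0.substring,
                              timestamp: $0.timestamp,
                              duration: $0.duration)
            }
            completion(segments)
        }
    }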
Replies: 1 · Boosts: 0 · Views: 169 · Activity: Jul ’25
VTDecompressionSession consistently drops frames after sync frame
I'm seeking to a specific sync frame in a video file (HEVC, recorded on an iPad). When I feed the buffers from that sync frame into VTDecompressionSession, it consistently drops the 2nd, 3rd, and 4th buffers with a kVTVideoDecoderReferenceMissingErr (or no error but no buffer on the Simulator). If I feed all the buffers from the penultimate sync frame prior to the desired frame, the buffers come out fine, but always doing that would create massive overhead. I've tried multiple OS versions, devices, etc.; it seems to be a consistent problem. Here's a sample project with the offending video (disregard memory handling etc.): https://github.com/marcuseckert/vtSample. I've filed a radar, FB18228296, but would appreciate any feedback on circumventing or at least detecting this behavior prior to decoding.
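No fix to offer, but for detecting the drops at decode time, here is a sketch that surfaces kVTVideoDecoderReferenceMissingErr per frame so the caller can fall back to re-feeding from the earlier sync frame only when it actually happens. It assumes the session was created without an output callback record, which is what makes the per-frame output handler variant available.

    import CoreMedia
    import VideoToolbox

    // Decodes one sample buffer and reports reference-missing drops.
    func decode(_ sample: CMSampleBuffer,
                in session: VTDecompressionSession,
                onDrop: @escaping () -> Void,
                onFrame: @escaping (CVImageBuffer, CMTime) -> Void) {
        let flags: VTDecodeFrameFlags = [._EnableAsynchronousDecompression]
        VTDecompressionSessionDecodeFrame(session,
                                          sampleBuffer: sample,
                                          flags: flags,
                                          infoFlagsOut: nil) { status, _, imageBuffer, pts, _ in
            if status == kVTVideoDecoderReferenceMissingErr || imageBuffer == nil {
                onDrop()   // caller can re-feed from the earlier sync frame
            } else if let imageBuffer {
                onFrame(imageBuffer, pts)
            }
        }
    }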
Replies: 0 · Boosts: 0 · Views: 144 · Activity: Jun ’25
Is it possible to query the TV's current refresh rate?
I'm working on a media app that would like to be able to tell if the TV connected to tvOS is running at 59.94 Hz or 60.00 Hz, so it can optimize a video stream. It looks like the best I can currently do is to check whether the user has Match Content Rate enabled and, based on that, when calling displayManager.preferredDisplayCriteria to change video modes, guess which rate their TV might be in. That's not ideal, because not all TVs support both of these rates, and my request for 59.94 might end up as 60 and vice versa. I dug around and can't find any available method in UIScreen to get this info. The odd thing is, the data is right there in currentMode when I look in the debugger, but it seems to be in a private or undocumented class. Is there any way to get at it?
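In the absence of a public query, one hedged alternative is to stop guessing and hand the 59.94-vs-60.00 decision to tvOS: when the user has Match Content Rate enabled, setting the asset's own preferred criteria lets the system pick the mode.

    import AVFoundation
    import AVKit
    import UIKit

    // Sketch: let tvOS match the display to the content instead of guessing
    // the panel's exact refresh rate.
    func applyPreferredRate(for asset: AVAsset, in window: UIWindow) {
        let displayManager = window.avDisplayManager
        guard displayManager.isDisplayCriteriaMatchingEnabled else { return }
        displayManager.preferredDisplayCriteria = asset.preferredDisplayCriteria
    }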
Replies: 0 · Boosts: 0 · Views: 238 · Activity: Jun ’25
How do I find the catalogID for songs in an Apple Music playlist? (MusicKit & Apple Music API)
Hello! How do I find the Apple Music catalog ID for songs in an Apple Music playlist? I'm building an iOS app that uses MusicKit and the Apple Music API. For this example, you can assume that my iOS app simply allows users to upload their Apple Music playlists, and when users open a specific playlist, I want to find the catalog ID for each song. Currently all the songs are returning song IDs in this format: "i.PkdJvPXI2AJgm8". I believe these are library IDs, not catalog IDs. I'd like to find a front-end solution using MusicKit, but perhaps a back-end solution using the Apple Music REST API is required. Any recommendations would be appreciated!
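One approach that stays inside MusicKit is to resolve the library song's catalog relationship with a MusicDataRequest. This is a hedged sketch, assuming the /v1/me/library/songs/{id}/catalog endpoint behaves as documented for your storefront; the returned song's id should be the catalog ID.

    import Foundation
    import MusicKit

    // Resolves a library song ID (e.g. "i.PkdJvPXI2AJgm8") to its catalog
    // counterpart via the library song's `catalog` relationship.
    func catalogSong(forLibrarySongID id: String) async throws -> Song? {
        let url = URL(string: "https://api.music.apple.com/v1/me/library/songs/\(id)/catalog")!
        let request = MusicDataRequest(urlRequest: URLRequest(url: url))
        let response = try await request.response()
        let songs = try JSONDecoder().decode(MusicItemCollection<Song>.self,
                                             from: response.data)
        return songs.first
    }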
Replies: 1 · Boosts: 0 · Views: 131 · Activity: Jun ’25
UIDocumentPickerViewController in AudioUnit Extension unable to receive touches
Hello, I have an existing AUv3 instrument plug-in. In the plug-in, users can access files (audio files, song projects) via a UIDocumentPickerViewController. In Logic Pro (and some other hosts, but not all), the document picker is unable to receive touches while a keyboard case is attached to the iPad. Removing the case (this is an Apple-brand iPad case) allows the interactions to resume and allows me to pick files in the usual way. One of my users reports this non-responsive behavior occurs even after disconnecting their keyboard. I have fiddled with entitlements all day and have determined that is not the issue, since the keyboard disconnection appears to fix it every time for me. Here is my, very boilerplate, presentation code:

    guard let type = UTType("com.my.type") else { return }
    let fileBrowser = UIDocumentPickerViewController(forOpeningContentTypes: [type])
    fileBrowser.overrideUserInterfaceStyle = .dark
    fileBrowser.delegate = self
    fileBrowser.directoryURL = myFileFolderURL()
    self.present(fileBrowser, animated: true) {
        // ...
    }
Replies: 2 · Boosts: 0 · Views: 568 · Activity: Jul ’25
Usage of Apple Music Feed leads to error 500
Hello, I'm trying to receive parquet files using the example provided in the documentation. I've done all the required steps but constantly receive error 500 with "Upstream Service Error". Looking at the issues list, it seems this error has existed for months. Is it possible to get this working?
Replies: 2 · Boosts: 0 · Views: 157 · Activity: May ’25
Memory Leak in AVAudioPlayer in Simulator only
I have a memory leak when using AVAudioPlayer. I managed to narrow down the issue into a very simple app, whose code I paste in below. The memory leak starts immediately when I start playing sound, but only in the Simulator; on a real iPhone there is no memory leak.

    import SwiftUI
    import AVFoundation

    struct ContentView_Audio: View {
        var sound: AVAudioPlayer?

        init() {
            guard let path = Bundle.main.path(forResource: "cd201", ofType: "mp3") else { return }
            let url = URL(fileURLWithPath: path)
            do {
                try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers])
            } catch {
                return
            }
            do {
                try AVAudioSession.sharedInstance().setActive(true)
            } catch {
                return
            }
            do {
                sound = try AVAudioPlayer(contentsOf: url)
            } catch {
                return
            }
        }

        var body: some View {
            HStack {
                Button {
                    playSound()
                } label: {
                    ZStack {
                        Circle()
                            .fill(.mint.opacity(0.3))
                            .frame(width: 44, height: 44)
                            .shadow(radius: 8)
                        Image(systemName: "play.fill")
                            .resizable()
                            .frame(width: 20, height: 20)
                    }
                }
                .padding()

                Button {
                    stopSound()
                } label: {
                    ZStack {
                        Circle()
                            .fill(.mint.opacity(0.3))
                            .frame(width: 44, height: 44)
                            .shadow(radius: 8)
                        Image(systemName: "stop.fill")
                            .resizable()
                            .frame(width: 20, height: 20)
                    }
                }
                .padding()
            }
        }

        private func playSound() {
            guard sound != nil else { return }
            sound?.volume = 1
            // sound?.numberOfLoops = -1
            sound?.play()
        }

        func stopSound() {
            sound?.stop()
        }
    }
Replies: 2 · Boosts: 0 · Views: 133 · Activity: Apr ’25
401 Authorization error for Music Feed API
I've established proper authorization for general Apple Music API calls, but when I use that same authorization to retrieve metadata for the latest music feeds (see https://developer.apple.com/documentation/applemusicfeed/requesting-a-feed-export), I get a 401 Unauthorized error. As per the documentation, I'm simply issuing a GET against https://api.media.apple.com/v1/feed/album/latest. Are there different entitlements needed for the Music Feed API?
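For reference, a minimal sketch of the request described above, assuming developerToken is a JWT whose key has been granted Apple Music Feed access (which, as I understand it, is provisioned separately from regular Apple Music API access):

    import Foundation

    // Fetches the latest album feed export metadata; a 401 here usually
    // points at the token's key lacking feed access rather than at the code.
    func latestAlbumFeedExport(developerToken: String) async throws -> Data {
        var request = URLRequest(url: URL(string: "https://api.media.apple.com/v1/feed/album/latest")!)
        request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
        let (data, _) = try await URLSession.shared.data(for: request)
        return data
    }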
Replies: 2 · Boosts: 0 · Views: 107 · Activity: May ’25
How can I find the user's "Favorite Songs" playlist?
It sounds simple, but searching for the name "Favorite Songs" is a non-starter because the playlist is called different names in different countries, even if I specify "&l=en_us" on the query. So is there another property, relationship, or combination thereof that I can use to tell when I've found the right playlist? Properties I've looked at so far:

- canEdit: always false, so it narrows things down a little
- inFavorites: not helpful, as it depends on whether the user has favourited the Favorites playlist itself, so not relevant
- hasCatalog: seems to always be true, so again it may narrow things down a bit
- isPublic: doesn't help

Adding the catalog relationship doesn't seem to show anything immediately useful either. Can anyone help? Ideally I'd like to see this exposed as a "kind" or "type", as this playlist has different properties to other playlists, but frankly I'll take anything at this point.
Replies: 0 · Boosts: 0 · Views: 293 · Activity: Jul ’25
How to enumerate built-in media devices?
Hello, I need to enumerate built-in media devices (cameras, microphones, etc.). For this purpose, I am using the CoreAudio and CoreMediaIO frameworks. According to the table 'Daemon-Safe Frameworks' in Apple’s TN2083, CoreAudio is daemon-safe. However, the documentation does not mention CoreMediaIO. Can CoreMediaIO be used in a daemon? If not, are there any documented alternatives to detect built-in cameras in a daemon (e.g., via device classes in IOKit)? Thank you in advance, Pavel
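For concreteness, here is a sketch of the CoreMediaIO enumeration in question. Whether this call path works from a daemon is exactly the open question, so treat it as illustrative rather than a supported recipe.

    import CoreMediaIO

    // Enumerates CoreMediaIO device object IDs from the system object.
    func cmioDeviceIDs() -> [CMIOObjectID] {
        var address = CMIOObjectPropertyAddress(
            mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyDevices),
            mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
            mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMain))
        let systemObject = CMIOObjectID(kCMIOObjectSystemObject)

        var dataSize: UInt32 = 0
        guard CMIOObjectGetPropertyDataSize(systemObject, &address, 0, nil, &dataSize) == 0,
              dataSize > 0 else { return [] }

        var deviceIDs = [CMIOObjectID](repeating: 0,
                                       count: Int(dataSize) / MemoryLayout<CMIOObjectID>.size)
        var dataUsed: UInt32 = 0
        guard CMIOObjectGetPropertyData(systemObject, &address, 0, nil,
                                        dataSize, &dataUsed, &deviceIDs) == 0 else { return [] }
        return deviceIDs
    }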
Replies: 0 · Boosts: 0 · Views: 169 · Activity: May ’25
AVPlayerItem.externalMetadata not available
According to the documentation (https://developer.apple.com/documentation/avfoundation/avplayeritem/externalmetadata), AVPlayerItem should have an externalMetadata property. However, it does not appear to be visible to my app. When I try, I get:

    Value of type 'AVPlayerItem' has no member 'externalMetadata'

The documentation states iOS 12.2+; I am building with a minimum deployment target of iOS 18. Code snippet:

    import Foundation
    import AVFoundation

    // ... in function ...

    // create metadata as described in https://developer.apple.com/videos/play/wwdc2022/110338
    var title = AVMutableMetadataItem()
    title.identifier = .commonIdentifierAlbumName
    title.value = "My Title" as NSString?
    title.extendedLanguageTag = "und"

    var playerItem = await AVPlayerItem(asset: composition)
    playerItem.externalMetadata = [ title ]
Replies: 0 · Boosts: 0 · Views: 104 · Activity: Apr ’25
How to Get Device Orientation in Background (PiP) Mode?
My app is a camera app that supports Picture-in-Picture (PiP) mode. Normally, when the device rotates, I get the device orientation from iOS and use it to rotate the camera feed so that the preview stays correctly aligned. However, when the app enters PiP mode, it is considered to be in the background, and I can no longer receive orientation updates from the system. As a result, I can’t apply rotation corrections to the camera video in PiP mode. Is there any way to retrieve device orientation while the app is in the background (specifically during PiP mode)? Any guidance would be greatly appreciated. Thank you!
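One workaround sometimes suggested (not an official API, and worth verifying on-device) is deriving orientation from gravity with Core Motion, since motion delivery is not tied to foreground state the way UIDevice orientation notifications are. The axis mapping below is a rough assumption and may need adjusting for your preview transform.

    import CoreMotion
    import UIKit

    // Derives a coarse device orientation from the gravity vector.
    final class GravityOrientationSource {
        private let motion = CMMotionManager()

        func start(handler: @escaping (UIDeviceOrientation) -> Void) {
            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 0.1
            motion.startDeviceMotionUpdates(to: .main) { data, _ in
                guard let g = data?.gravity else { return }
                let orientation: UIDeviceOrientation
                if abs(g.x) > abs(g.y) {
                    orientation = g.x > 0 ? .landscapeLeft : .landscapeRight
                } else {
                    orientation = g.y > 0 ? .portraitUpsideDown : .portrait
                }
                handler(orientation)
            }
        }

        func stop() { motion.stopDeviceMotionUpdates() }
    }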
Replies: 0 · Boosts: 0 · Views: 76 · Activity: May ’25
React website working properly on Android but not on iPhone
We have a React website built to scan QR codes. The website works properly on Android devices, but on iPhone we see a camera glitch causing an unexpected delay in scanning. Website URL: https://react-qr-code-scanner-app.vercel.app/
Replies: 0 · Boosts: 0 · Views: 49 · Activity: Jul ’25
Capturing multiple screens no longer works with macOS Sequoia
Capturing more than one display no longer works with macOS Sequoia. We have a product that allows users to capture up to 2 displays/screens. Our application uses gstreamer, which in turn is based on AVFoundation. I found a quick way to replicate the issue by just running 2 captures from separate terminals. Assuming display 1 has device index 0 and display 2 has device index 1, here are the steps. First, install gstreamer with:

    brew install gstreamer

Then open 2 terminal windows and launch the following processes.

Terminal 1 (device-index 0):

    gst-launch-1.0 avfvideosrc -e device-index=0 capture-screen=true ! queue ! videoscale ! video/x-raw,width=640,height=360 ! videoconvert ! osxvideosink

Terminal 2 (device-index 1):

    gst-launch-1.0 avfvideosrc -e device-index=1 capture-screen=true ! queue ! videoscale ! video/x-raw,width=640,height=360 ! videoconvert ! osxvideosink

The first process launched will show the screen; the second process launched will not. Testing this on macOS Ventura and Sonoma works as expected, showing both screens. I submitted the same issue on Feedback Assistant: FB15900976
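As a possible path forward rather than a fix: ScreenCaptureKit is Apple's current screen-capture framework on macOS and supports one SCStream per display, which may sidestep the AVFoundation regression. A hedged Swift sketch:

    import Foundation
    import ScreenCaptureKit

    // Starts a 640x360 capture stream for up to two displays; `output`
    // receives the sample buffers (it must implement SCStreamOutput).
    func startCaptures(output: any SCStreamOutput) async throws -> [SCStream] {
        let content = try await SCShareableContent.current
        var streams: [SCStream] = []
        for display in content.displays.prefix(2) {
            let filter = SCContentFilter(display: display, excludingWindows: [])
            let config = SCStreamConfiguration()
            config.width = 640
            config.height = 360
            let stream = SCStream(filter: filter, configuration: config, delegate: nil)
            try stream.addStreamOutput(output, type: .screen,
                                       sampleHandlerQueue: DispatchQueue(label: "capture"))
            try await stream.startCapture()
            streams.append(stream)
        }
        return streams
    }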
Replies: 2 · Boosts: 0 · Views: 393 · Activity: Apr ’25
Can a Location-Based Audio AR Experience Run in the Background on iOS?
Hi everyone! I've developed a location-based audio AR app in Unity with FMOD & Resonance Audio and AirPods Pro head tracking to create a ubiquitous augmented-soundscape experience. Think of it as an audio version of Pokémon Go, but with a more precise location requirement to ensure spatial audio is placed correctly. I want this experience to run in the background on iOS, but from what I've gathered, Unity doesn't support this well, so I'm considering developing a Swift version instead. Since this is primarily for research purposes, privacy concerns are not a major issue in my case. However, I've come across some potential challenges (see the sketch below):

- Real-time precise location updates: can iOS provide fully instantaneous, high-accuracy location updates in the background?
- Continuous real-time data processing: can an app continuously process spatial audio, head tracking, and location data while running in the background?

I'm not sure if newer iOS versions have improved in these areas or if there are workarounds to achieve this. Would this kind of experience be feasible to run in the background on iOS? Any insights or pointers would be greatly appreciated! I'm very new to iOS development, so apologies if this is a basic question. Thanks in advance!
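On the two challenges above, a sketch of the relevant first-party plumbing: background location (requires the "location" background mode and Always authorization) plus AirPods head tracking via CMHeadphoneMotionManager. Whether delivery is fast and continuous enough for a precise soundscape is something to verify on-device.

    import CoreLocation
    import CoreMotion

    final class BackgroundTracker: NSObject, CLLocationManagerDelegate {
        private let location = CLLocationManager()
        private let headMotion = CMHeadphoneMotionManager()

        func start() {
            location.delegate = self
            location.desiredAccuracy = kCLLocationAccuracyBest
            location.requestAlwaysAuthorization()
            location.allowsBackgroundLocationUpdates = true   // needs background mode
            location.startUpdatingLocation()

            // AirPods head tracking; pipe attitude into the spatializer.
            headMotion.startDeviceMotionUpdates(to: .main) { motion, _ in
                guard let attitude = motion?.attitude else { return }
                _ = attitude   // yaw/pitch/roll for the audio engine
            }
        }

        func locationManager(_ manager: CLLocationManager,
                             didUpdateLocations locations: [CLLocation]) {
            // feed locations.last into the soundscape engine
        }
    }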
Replies: 0 · Boosts: 0 · Views: 110 · Activity: Apr ’25
How to get a callback once a requested frameDuration change has been applied?
When changing a camera's exposure, AVFoundation provides a callback which offers the timestamp of the first frame captured with the new exposure duration: AVCaptureDevice.setExposureModeCustom(duration:iso:completionHandler:). I want a similar callback when changing the frame duration. After setting AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration to a new value, how can I compute the index or the timestamp of the first camera frame captured using the newly set frame duration?
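As far as I can tell there is no completion handler for frame-duration changes, so one workaround is to watch presentation-timestamp deltas in the video data output: the first frame whose delta from its predecessor matches the requested duration marks the transition. A sketch with a hypothetical 15 fps target:

    import AVFoundation

    final class FrameDurationWatcher: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        var targetDuration = CMTime(value: 1, timescale: 15)   // hypothetical new duration
        var onApplied: ((CMTime) -> Void)?
        private var lastPTS: CMTime = .invalid

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            defer { lastPTS = pts }
            guard lastPTS.isValid, let onApplied else { return }
            let delta = CMTimeSubtract(pts, lastPTS)
            // Allow a small tolerance; capture clocks are not exact.
            if abs(CMTimeGetSeconds(delta) - CMTimeGetSeconds(targetDuration)) < 0.001 {
                onApplied(pts)
                self.onApplied = nil
            }
        }
    }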
Replies: 0 · Boosts: 0 · Views: 529 · Activity: Aug ’25
Apple Music web player will not work in a WKWebView browser or an Electron Chromium browser
Hello, I'm trying to create a web browser, but currently, when signed into the Apple Music web player, I get the following message when I attempt to play on any version of my browser: "Not available on the web. You can listen to this in the Apple Music app." Is there a way to set up DRM (assuming this is the issue) with Apple to allow my browser to play this content? I believe Apple TV is also affected. Thank you ahead of time.
Replies: 0 · Boosts: 0 · Views: 165 · Activity: May ’25
Number of songs in the Apple Music Feed
Hello, I'm evaluating the Apple Music Feed dataset and I noticed that the total number of songs available in the feed is too small. As of today, the number of objects returned in each feed is:

- 51,198,712 albums
- 23,093,698 artists
- 173,235,315 songs

This gives an average of 3.38 songs per album, which is quite low. Also, iterating over the data I see that there are albums referencing songs that don't exist in the songs feed. I would like to know: is the feed data incomplete? If so, in what situations may an object be missing from the feed? Thank you in advance!
Replies: 0 · Boosts: 0 · Views: 303 · Activity: Aug ’25
iOS 18 CarPlay: “There was a problem loading this content” error after playback
In iOS 18, CarPlay shows an error: “There was a problem loading this content” after playback starts. Audio works fine, but the Now Playing screen doesn’t load. I’m using MPPlayableContentManager. This worked fine in iOS 17. Anyone else seeing this error in iOS 18?
Replies: 0 · Boosts: 0 · Views: 114 · Activity: May ’25