Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic


APMP & Photography?
Hi, I'm a fan of the gallery on Vision Pro, which holds video as well as still photography, but I'm wondering if Apple has considered adding the projected media tags to HEIC so that we can go the next step from Spatial photos to Immersive photos. I have a device that can give me 12K x 6K fisheye images in HDR, but it can't do it at a frame rate or resolution that's good enough for video, so I want to cut my losses and show off immersive photos instead. Is there something Apple is already working on for APMP stills, or should I create my own app that reads metadata inside a HEIC, inferring it in a similar way to what the "ProjectedMediaConversion" demo does for video? It would be great to have 180VR photos, which could show as Spatial in a gallery view, but going immersive would half-surround you instead of floating in the blurred view. I think that would be a pretty amazing effect.
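In the meantime, rolling your own inference is feasible: ImageIO will surface everything in the container, so an app can look for a tag it wrote itself at export time. A minimal sketch, where the "ProjectionKind" key is a hypothetical placeholder (as far as I know there is no public APMP tag for stills yet):

import Foundation
import ImageIO

// Minimal sketch: read a HEIC's container metadata with ImageIO and look
// for a self-defined projection tag. "ProjectionKind" is a hypothetical
// placeholder; there is no public APMP tag for still photos.
func projectionHint(for url: URL) -> String? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
    else { return nil }

    // An app could stash its own hint in EXIF UserComment when exporting.
    if let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any],
       let comment = exif[kCGImagePropertyExifUserComment] as? String,
       comment.contains("ProjectionKind=halfEquirect") { // hypothetical tag
        return "halfEquirect"
    }
    return nil
}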
2 replies · 0 boosts · 298 views · Oct ’25
AVFoundation Custom Video Compositor Skipping Frames During AVPlayer Playback Despite 60 FPS Frame Duration
I'm building a Swift video editor with AVFoundation and a custom compositor. Despite setting AVVideoComposition.frameDuration to 60 FPS, I'm seeing significant frame skipping during playback.

Console output shows the frame skipping:

Frame #0 at 0.0 ms (fps: 60.0)
Frame #2 at 33.333333333333336 ms (fps: 60.0)
Frame #6 at 100.0 ms (fps: 60.0)
Frame #10 at 166.66666666666666 ms (fps: 60.0)
Frame #32 at 533.3333333333334 ms (fps: 60.0)
Frame #62 at 1033.3333333333335 ms (fps: 60.0)
Frame #96 at 1600.0 ms (fps: 60.0)

Instead of frames every ~16.67 ms (60 FPS), I'm getting irregular intervals, sometimes 33 ms, 67 ms, or hundreds of milliseconds apart.

Renderer.swift (key parts):

@MainActor
class Renderer: ObservableObject {
    @Published var playerItem: AVPlayerItem?
    private let assetManager: ProjectAssetManager?
    private let compositorId: String

    func buildComposition() async throws {
        // ... load mouse moves/clicks data ...
        let composition = AVMutableComposition()
        guard let videoTrack = composition.addMutableTrack(
            withMediaType: .video,
            preferredTrackID: kCMPersistentTrackID_Invalid
        ) else { return }
        var currentTime = CMTime.zero
        var layerInstructions: [AVMutableVideoCompositionLayerInstruction] = []

        // Insert video segments back to back
        for videoURL in videoURLs {
            let asset = AVAsset(url: videoURL)
            guard let assetVideoTrack = try await asset.loadTracks(withMediaType: .video).first else { continue }
            let duration = try await asset.load(.duration)
            try videoTrack.insertTimeRange(
                CMTimeRange(start: .zero, duration: duration),
                of: assetVideoTrack,
                at: currentTime
            )
            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
            let transform = try await assetVideoTrack.load(.preferredTransform)
            layerInstruction.setTransform(transform, at: currentTime)
            layerInstructions.append(layerInstruction)
            currentTime = CMTimeAdd(currentTime, duration)
        }

        let videoComposition = AVMutableVideoComposition()
        videoComposition.frameDuration = CMTime(value: 1, timescale: 60) // 60 FPS

        // Set render size from the first video
        if let firstURL = videoURLs.first {
            let firstAsset = AVAsset(url: firstURL)
            if let firstTrack = try await firstAsset.loadTracks(withMediaType: .video).first {
                let naturalSize = try await firstTrack.load(.naturalSize)
                let transform = try await firstTrack.load(.preferredTransform)
                videoComposition.renderSize = CGSize(
                    width: abs(naturalSize.applying(transform).width),
                    height: abs(naturalSize.applying(transform).height)
                )
            }
        }

        let instruction = CompositorInstruction()
        instruction.timeRange = CMTimeRange(start: .zero, duration: currentTime)
        instruction.layerInstructions = layerInstructions
        instruction.compositorId = compositorId
        videoComposition.instructions = [instruction]
        videoComposition.customVideoCompositorClass = CustomVideoCompositor.self

        let playerItem = AVPlayerItem(asset: composition)
        playerItem.videoComposition = videoComposition
        self.playerItem = playerItem
    }
}

class CompositorInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    var timeRange: CMTimeRange = .zero
    var enablePostProcessing: Bool = false
    var containsTweening: Bool = false
    var requiredSourceTrackIDs: [NSValue]?
    var passthroughTrackID: CMPersistentTrackID = kCMPersistentTrackID_Invalid
    var layerInstructions: [AVVideoCompositionLayerInstruction] = []
    var compositorId: String = ""
}

class CustomVideoCompositor: NSObject, AVVideoCompositing {
    var sourcePixelBufferAttributes: [String: Any]? = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ]
    var requiredPixelBufferAttributesForRenderContext: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ asyncVideoCompositionRequest: AVAsynchronousVideoCompositionRequest) {
        guard let sourceTrackID = asyncVideoCompositionRequest.sourceTrackIDs.first?.int32Value,
              let sourcePixelBuffer = asyncVideoCompositionRequest.sourceFrame(byTrackID: sourceTrackID),
              let outputBuffer = asyncVideoCompositionRequest.renderContext.newPixelBuffer() else {
            asyncVideoCompositionRequest.finish(with: NSError(domain: "VideoCompositor", code: -1))
            return
        }
        // Log which composition time the compositor was actually asked to render
        let videoComposition = asyncVideoCompositionRequest.renderContext.videoComposition
        let frameDuration = videoComposition.frameDuration
        let fps = Double(frameDuration.timescale) / Double(frameDuration.value)
        let compositionTime = asyncVideoCompositionRequest.compositionTime
        let seconds = CMTimeGetSeconds(compositionTime)
        let frameInMilliseconds = seconds * 1000
        let frameNumber = Int(round(seconds * fps))
        print("Frame #\(frameNumber) at \(frameInMilliseconds) ms (fps: \(fps))")
        asyncVideoCompositionRequest.finish(withComposedVideoFrame: outputBuffer)
    }

    func cancelAllPendingVideoCompositionRequests() {}
}

VideoPlayerViewModel:

@MainActor
class VideoPlayerViewModel: ObservableObject {
    let player = AVPlayer()
    private let renderer: Renderer

    func loadVideo() async {
        try? await renderer.buildComposition()
        if let playerItem = renderer.playerItem {
            player.replaceCurrentItem(with: playerItem)
        }
    }
}

What I've tried:
- The frame skipping is consistent: the exact same timestamps on every playback.
- The issue persists even with minimal processing (just passing buffers through).
- It occurs regardless of compositor complexity.

Please note that I need every frame at exact millisecond intervals for my application. Frame loss or inconsistent frameInMilliseconds values are not acceptable.
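One avenue worth noting: during real-time playback, AVPlayer drives the compositor at display rate and can skip composition times under load, whereas an offline pull with AVAssetReader evaluates the video composition frame by frame. A minimal sketch, assuming the composition and videoComposition objects built above:

import AVFoundation

// Sketch: pull every composed frame offline instead of relying on
// real-time playback. Assumes the composition and videoComposition
// objects built in buildComposition() above.
func readAllComposedFrames(composition: AVComposition,
                           videoComposition: AVVideoComposition) throws {
    let reader = try AVAssetReader(asset: composition)
    let output = AVAssetReaderVideoCompositionOutput(
        videoTracks: composition.tracks(withMediaType: .video),
        videoSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    output.videoComposition = videoComposition // runs the custom compositor
    reader.add(output)
    guard reader.startReading() else { throw reader.error ?? CocoaError(.fileReadUnknown) }

    while let sample = output.copyNextSampleBuffer() {
        let ms = CMTimeGetSeconds(sample.presentationTimeStamp) * 1000
        print("composed frame at \(ms) ms")
    }
}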
1 reply · 0 boosts · 288 views · Oct ’25
Mute behavior of Volume button on AVPlayerViewController iOS 26
On older iOS versions, when the user taps the mute/volume button on AVPlayerViewController to unmute, the system restores the device volume to the level it had before the user muted. On iOS 26, when the user taps the unmute button on screen, the volume starts from 0 instead of being restored (it is still restored if the user unmutes with the physical volume buttons). As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView, which I cannot control, so this is system behavior. But I have received complaints that this is a bug, and I cannot find any documentation describing this change in mute button behavior. I need some basis to explain the situation. Thank you.
0 replies · 1 boost · 124 views · Oct ’25
No audio in screen recordings when using AVAudioEngine Voice Processing
Hello, We are developing a real-time speech recognition application and are utilizing AVAudioEngine with voice processing enabled on the input node. However, we have observed that enabling this mode interferes with the built-in iOS screen recording feature - specifically, the recorded video does not capture any audio when this mode is active. Since we want users to be able to record their experience within our app, this issue significantly impacts our functionality. Is there a known workaround or recommended approach to ensure that both voice processing and screen recording can function simultaneously? Any guidance would be greatly appreciated. Thank you!
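For anyone reproducing this, the configuration in question is roughly the following minimal sketch; the session setup is illustrative, and the setVoiceProcessingEnabled(true) call is the one that coincides with silent screen recordings:

import AVFoundation

// Minimal sketch of the setup described above: voice processing
// (echo cancellation) enabled on the engine's input node.
func startVoiceProcessingCapture() throws -> AVAudioEngine {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.defaultToSpeaker])
    try session.setActive(true)

    let engine = AVAudioEngine()
    try engine.inputNode.setVoiceProcessingEnabled(true)

    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // forward buffer to the speech recognizer here
    }
    try engine.start()
    return engine
}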
2 replies · 1 boost · 345 views · Oct ’25
PushToTalk
Using the PushToTalk framework, I call requestBeginTransmitting(channelUUID:) from a Bluetooth device, and then start recording audio inside the PTChannelManagerDelegate method channelManager:didActivateAudioSession:. The recording completes.
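For context, the flow being described looks roughly like this; a partial sketch, with the remaining PTChannelManagerDelegate requirements (join/leave, push handling, and so on) elided:

import AVFAudio
import PushToTalk

// Partial sketch of the flow described above. Only the transmit and
// audio-session callbacks are shown; the other PTChannelManagerDelegate
// requirements are elided.
final class PTTAudioHandler: NSObject {
    var channelManager: PTChannelManager?
    let engine = AVAudioEngine()

    func beginTalking(on channelUUID: UUID) {
        // Ask the system to start a transmission, e.g. from a Bluetooth accessory event.
        channelManager?.requestBeginTransmitting(channelUUID: channelUUID)
    }

    // PTChannelManagerDelegate: start capturing only once the system
    // has activated the audio session.
    func channelManager(_ channelManager: PTChannelManager,
                        didActivate audioSession: AVAudioSession) {
        try? engine.start()
    }

    func channelManager(_ channelManager: PTChannelManager,
                        didDeactivate audioSession: AVAudioSession) {
        engine.stop()
    }
}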
11 replies · 0 boosts · 902 views · Oct ’25
Email "Updates to FairPlay Streaming Server SDK" makes no sense
Hi, while I don't normally use FairPlay, I got this email that is so strangely worded I'm wondering whether it makes sense to people who do know it, or whether it has a typo. You can only generate certificates for SDK 4, yet SDK 4 is also no longer supported? (Also, "will not expire" is imprecise phrasing; the certificates presumably will expire. I think it meant to say they are still valid / are not invalidated.)
3 replies · 0 boosts · 320 views · Oct ’25
CoreMIDI driver - flow control
Hi, when a CoreMIDI driver controls physical hardware, it is probably quite common to have to limit the amount of MIDI data received from the system. What comes to mind is to simply delay returning control from the MIDIDriverInterface::Send() callback to the calling process. While the application trying to send MIDI really does stall until the callback returns, that seems to be only a side effect of a generally stalled CoreMIDI server. Between the callbacks, the application can send as much MIDI data as it wants to CoreMIDI; its buffering seems to be endless. However, the hardware might not be able to play out all of that data. There seems to be no way to indicate an overflow/full-buffer situation back to the application or to CoreMIDI. How is this supposed to work? Thanks, any hints or pointers are highly appreciated! Hagen.
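Absent a documented back-pressure mechanism, the only lever really does seem to be blocking inside Send(). Conceptually (sketched here in Swift for brevity, though a real driver implements the C MIDIDriverInterface), that amounts to a bounded FIFO:

import Dispatch

// Conceptual sketch of the flow-control workaround described above:
// block the driver's Send() path until the hardware FIFO has room.
final class HardwareFIFO {
    private let freeSlots: DispatchSemaphore
    init(capacity: Int) { freeSlots = DispatchSemaphore(value: capacity) }

    // Called from the Send() path: blocks until space is available, which
    // stalls the CoreMIDI server process, the side effect the post describes.
    func enqueue(_ packet: [UInt8]) {
        freeSlots.wait()
        // ... write packet to the device ...
    }

    // Called from the device completion handler when a packet drains.
    func packetDrained() {
        freeSlots.signal()
    }
}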
0 replies · 0 boosts · 148 views · Oct ’25
New API to control front camera orientation like native Camera app on iPhone 17 with iOS 26?
The iPhone 17’s front camera with the new 18MP square sensor and iOS 26’s Center Stage feature can auto-rotate between portrait and landscape like the native iOS Camera app. Is there a Swift or AVFoundation API that allows developers to manually control front camera orientation in the same way the native Camera app does? Or is this auto-rotation strictly handled by the system without public API access?
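The closest public hook I know of is AVCaptureDevice.RotationCoordinator (iOS 17 and later), which reports the rotation the system camera would apply; whether it covers the new square-sensor auto-rotation is exactly the open question. A minimal sketch:

import AVFoundation

// Minimal sketch: observe the rotation the system would apply to the
// front camera. Whether this matches the iPhone 17 / iOS 26 behavior
// asked about above is the open question.
func makeFrontCameraRotationCoordinator() -> AVCaptureDevice.RotationCoordinator? {
    guard let front = AVCaptureDevice.default(.builtInWideAngleCamera,
                                              for: .video,
                                              position: .front) else { return nil }
    let coordinator = AVCaptureDevice.RotationCoordinator(device: front, previewLayer: nil)
    // KVO-observe this property and copy it onto
    // AVCaptureConnection.videoRotationAngle before each capture.
    print(coordinator.videoRotationAngleForHorizonLevelCapture)
    return coordinator
}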
0 replies · 0 boosts · 193 views · Oct ’25
Process to request the restricted entitlement behind “DJ with Apple Music” (tempo control / time-stretch on Apple Music streams)?
Hi, I’m an iOS developer building an app with a use case that needs advanced playback of Apple Music subscription streams, specifically:
• Real-time tempo change (BPM) during playback, i.e., time-stretch with key-lock, not just crossfade.
• Beat-matched transitions between tracks.
From what I can tell, this capability seems to exist only for approved partners and isn’t available through public MusicKit. Question: what’s the official request path to be evaluated for that restricted partner entitlement (application form, questionnaire, NDA, or internal team/BD contact)? If the entitlement identifier is internal, how can I get my account routed to the right Apple Music team? For reference, publicly announced partners include Algoriddim djay, Serato DJ Pro, rekordbox (AlphaTheta), and Engine DJ, all of which appear to implement mixing features that imply advanced playback (tempo/beat-matching) of Apple Music content. I’d prefer not to share product details publicly for the moment and can provide specifics privately if needed. Thanks in advance!
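For reference, the non-DRM version of the DSP itself is a single node, which suggests the gap really is the entitlement rather than the processing; a minimal sketch on a local file:

import AVFoundation

// Sketch: tempo change with key-lock on a local, non-DRM file. This is
// the piece that cannot be applied to Apple Music subscription streams
// without the restricted partner entitlement discussed above.
func playStretched(url: URL, rate: Float) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.rate = rate      // e.g. 1.06 for +6% tempo
    timePitch.pitch = 0        // key-lock: tempo changes, pitch does not

    engine.attach(player)
    engine.attach(timePitch)
    engine.connect(player, to: timePitch, format: nil)
    engine.connect(timePitch, to: engine.mainMixerNode, format: nil)

    let file = try AVAudioFile(forReading: url)
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}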
0 replies · 1 boost · 227 views · Oct ’25
watchOS 26: Audio Playback Interrupted by Fitness Notifications Across Multiple Apps
After upgrading to watchOS 26, users report that when they are playing music on Apple Watch and a fitness reminder arrives, the music automatically pauses and they need to manually tap the play button to resume playback. This happens across multiple music and podcast apps, and the issue did not exist before the upgrade. We would like to know whether this is an Apple bug or whether any special development configuration is needed.
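A possible client-side mitigation, assuming the interruption ends with the shouldResume hint (whether watchOS 26 actually delivers it after fitness notifications is the open question here), is the standard interruption observer:

import AVFAudio

// Sketch: resume playback after an interruption if the system says we may.
// Whether watchOS 26 sends .shouldResume after fitness notifications is
// exactly what this thread is asking about.
func installInterruptionObserver(resume: @escaping () -> Void) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(),
        queue: .main
    ) { note in
        guard let info = note.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue),
              type == .ended,
              let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt
        else { return }
        if AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
            resume() // e.g. player.play()
        }
    }
}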
1 reply · 0 boosts · 159 views · Oct ’25
Apple Music treats asian artists with romanized names as two different names
Hello, I'm trying to write a shortcut using Toolbox Pro that gets triggered by an accessibility trigger and then favorites the currently playing song. It's working pretty well, but I noticed that for some artists, especially Asian ones, it simply doesn't work. While debugging, I saw that the tool uses the same song ID, artist ID, everything as it should to search for the song and favorite it. However, Apple Music treats artists with romanized names as two separate artists!
https://music.apple.com/br/artist/王菲/41760704
https://music.apple.com/br/artist/faye-wong/41760704?l=en-GB
You can see that the ID is the same (41760704). It seems that when I search for the artist, the first artist (王菲) is returned, so when I open the artist's pages on the web I can see a star next to the song name, meaning it got a like. However, the same song under the romanized artist (faye-wong) doesn't have a like. This is very weird, right?
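A sketch of one way to check this from MusicKit: fetching by the shared catalog ID should return a single artist record, with the name depending on the storefront locale:

import MusicKit

// Sketch: fetch the artist by the catalog ID shared by both URLs above.
// One record should come back, its name depending on storefront locale.
func fetchSharedArtist() async throws {
    let request = MusicCatalogResourceRequest<Artist>(
        matching: \.id,
        equalTo: MusicItemID("41760704")
    )
    let response = try await request.response()
    print(response.items.map(\.name))
}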
0 replies · 0 boosts · 64 views · Oct ’25
ALAssetsLibrary build error on iOS 26
macOS version: 26.0 (25A354). Xcode version: 26.0 (17A324). The project fails to build with the following errors:

SwiftExplicitDependencyCompileModuleFromInterface arm64 /Users/zhz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/AssetsLibrary-HTIJ05N58KN3.swiftmodule

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:10:25: error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
 8 | public import _StringProcessing
 9 | public import _SwiftConcurrencyShims
10 | extension AssetsLibrary.ALAssetsLibrary {
   |                         `- error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
11 | #if compiler(>=5.3) && $NonescapableTypes
12 | @available(iOS, introduced: 9.0, deprecated: 9.0, obsoleted: 26.0)

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/System/Library/Frameworks/AssetsLibrary.framework/Headers/ALAssetsLibrary.h:80:12: note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
78 |
79 | OS_EXPORT AL_DEPRECATED(4, "Use PHPhotoLibrary from the Photos framework instead")
80 | @interface ALAssetsLibrary : NSObject {
   |            `- note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
81 | @package
82 | id _internal;

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:1:1: error: failed to build module 'AssetsLibrary'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.17.14 clang-1700.3.17.1)', while this compiler is 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.19.9 clang-1700.3.19.1)'). Please select a toolchain which matches the SDK.
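For the migration itself, the Photos-framework replacement that the deprecation message points to looks roughly like this (a minimal sketch):

import Photos

// Sketch: the Photos-framework replacement for ALAssetsLibrary
// enumeration, per the deprecation message in the build error above.
func fetchAllPhotos() {
    PHPhotoLibrary.requestAuthorization(for: .readOnly) { status in
        guard status == .authorized || status == .limited else { return }
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        let assets = PHAsset.fetchAssets(with: .image, options: options)
        assets.enumerateObjects { asset, _, _ in
            print(asset.localIdentifier, asset.creationDate ?? "")
        }
    }
}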
3 replies · 4 boosts · 1.3k views · Oct ’25
Displaying and working with Favorites in iOS app
New to iOS development, and I've been trying to make heads or tails of the documentation. I know there is a difference between the data fields returned for songs from the user library and from the catalog, but whenever I search the Apple site I can't find a list of each. For example, I'm trying to get the releaseDate of a song in my library, but it seems I'll have to cross-query the catalog entry using song.catalogID or song.isrc, and I can't find a cross-reference between the two. I'm totally turned around. I'm also trying to determine whether a song in my library has been favorited; isFavorited (or something similar) doesn't seem to be a thing. Using the code below, I'm trying to display a solid star if the song has been favorited or an empty one if it's not. It seems like a basic request, but I can't find anything on how to do it. I've searched docs, googled, tried. Does Apple want us to query the user's Favorited Songs playlist or something? How do I know which playlist that is? I know isFavorited isn't a thing; I'm just using it here so you can see what my intention is:

HStack(spacing: 10) {
    Image(systemName: song.isFavorited ? "star.fill" : "star")
        .foregroundColor(song.isFavorited ? .yellow : .gray)
    Image(systemName: "magnifyingglass")
}
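On the releaseDate half of the question, one pattern that should work is cross-querying the catalog by ISRC, sketched below; as far as I know, MusicKit exposes no public favorited flag, which matches the difficulty described:

import MusicKit

// Sketch: cross-query the catalog entry for a library song via its ISRC
// to read catalog-only fields like releaseDate. Assumes the library
// entry exposes its ISRC.
func releaseDate(forISRC isrc: String) async throws -> Date? {
    let request = MusicCatalogResourceRequest<Song>(
        matching: \.isrc,
        equalTo: isrc
    )
    let response = try await request.response()
    return response.items.first?.releaseDate
}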
1 reply · 0 boosts · 173 views · Oct ’25