Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics
Posts under Media Technologies topic

Playing audio live from Bluetooth headset on iPhone speaker
Hi guys, I am having an issue live-streaming audio from a Bluetooth headset and playing it live on the iPhone speaker. I am able to redirect audio back to the headset, but this is not what I want. The issue happens when I try to override the output: the iPhone switches to the speaker but also switches the microphone. Here is an example of the code:

import AVFoundation

class AudioRecorder {
    let player: AVAudioPlayerNode
    let engine: AVAudioEngine
    let audioSession: AVAudioSession
    let audioSessionOutput: AVAudioSession

    init() {
        self.player = AVAudioPlayerNode()
        self.engine = AVAudioEngine()
        self.audioSession = AVAudioSession.sharedInstance()
        self.audioSessionOutput = AVAudioSession()

        do {
            try self.audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: [.defaultToSpeaker])
            try self.audioSessionOutput.setCategory(AVAudioSession.Category.playAndRecord, options: [.allowBluetooth]) // enables Bluetooth HFP profile
            try self.audioSession.setMode(AVAudioSession.Mode.default)
            try self.audioSession.setActive(true)
            // try self.audioSession.overrideOutputAudioPort(.speaker) // doesn't work
        } catch {
            print(error)
        }

        let input = self.engine.inputNode
        self.engine.attach(self.player)

        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        self.engine.connect(self.player, to: engine.mainMixerNode, format: inputFormat)

        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            self.player.scheduleBuffer(buffer)
            print(buffer)
        }
    }

    public func start() {
        try! self.engine.start()
        self.player.play()
    }

    public func stop() {
        self.player.stop()
        self.engine.stop()
    }
}

I am not sure if this is a bug or not. Can somebody point me in the right direction? Is there a way to design a custom audio routing? I would also appreciate some good documentation besides the AVFoundation docs.
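One detail worth checking, offered as a sketch rather than a confirmed fix: AVAudioSession is a process-wide singleton, so configuring a second AVAudioSession() instance with .allowBluetooth has no effect on the shared session the engine actually uses. Combining both options on the shared instance and overriding the output port after activation is a common thing to try; whether the built-in speaker and the Bluetooth microphone can actually be used at the same time still depends on the route iOS picks.

import AVFoundation

func configureSpeakerOutputWithBluetoothInput() throws {
    let session = AVAudioSession.sharedInstance()
    // Request both Bluetooth (HFP) input and speaker output on the single shared session.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)
    // Explicitly prefer the headset's microphone as the input, if it is listed.
    if let bluetoothInput = session.availableInputs?.first(where: { $0.portType == .bluetoothHFP }) {
        try session.setPreferredInput(bluetoothInput)
    }
    // Force playback to the built-in speaker for the current route.
    try session.overrideOutputAudioPort(.speaker)
}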
Replies: 0 · Boosts: 0 · Views: 341 · Created: Mar ’25
AVPlayer: Significant Delays and Asset Loss When Playing Partially Downloaded HLS Content Offline
We're experiencing significant issues with AVPlayer when attempting to play partially downloaded HLS content in offline mode. Our app downloads HLS video content for offline viewing, but users encounter the following problems:

Excessive Loading Delay: When offline, AVPlayer attempts to load resources for up to 60 seconds before playing the locally available segments
Asset Loss: Sometimes AVPlayer completely loses the asset reference and fails to play the video on subsequent attempts
Inconsistent Behavior: The same partially downloaded asset might play immediately in one session but take 30+ seconds in another
Network Activity Despite Offline Settings: Despite configuring options to prevent network usage, AVPlayer still appears to be attempting network connections

These issues severely impact our offline user experience, especially for users with intermittent connectivity.

Technical Details

Implementation Context: Our app downloads HLS videos for offline viewing using AVAssetDownloadTask. We store the downloaded content locally and maintain a dictionary mapping of file identifiers to local paths. When attempting to play these videos offline, we experience the described issues.

Current Implementation: Here's our current implementation for playing the videos:

- (void)presentNativeAvplayerForVideo:(Video *)video navContext:(NavContext *)context {
    NSString *localPath = video.localHlsPath;
    if (localPath) {
        NSURL *videoURL = [NSURL URLWithString:localPath];
        NSDictionary *options = @{
            AVURLAssetPreferPreciseDurationAndTimingKey: @YES,
            AVURLAssetAllowsCellularAccessKey: @NO,
            AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
            AVURLAssetAllowsConstrainedNetworkAccessKey: @NO
        };
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:options];
        AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
        NSArray *keys = @[@"duration", @"tracks"];
        [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                playerViewController.player = player;
                [player play];
            });
        }];
        playerViewController.modalPresentationStyle = UIModalPresentationFullScreen;
        [context presentViewController:playerViewController animated:YES completion:nil];
    }
}

Attempted Solutions: We've tried several approaches to mitigate these issues.

Modified asset options:

NSDictionary *options = @{
    AVURLAssetPreferPreciseDurationAndTimingKey: @NO, // Changed to NO
    AVURLAssetAllowsCellularAccessKey: @NO,
    AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
    AVURLAssetAllowsConstrainedNetworkAccessKey: @NO,
    AVAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidRemoteReferenceToLocal)
};

Skipped asynchronous key loading:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset automaticallyLoadedAssetKeys:nil];

Modified player settings:

player.automaticallyWaitsToMinimizeStalling = NO;
[playerItem setPreferredForwardBufferDuration:2.0];

Added network resource restrictions:

playerItem.canUseNetworkResourcesForLiveStreamingWhilePaused = NO;

Used file URLs instead of HTTP URLs where possible.

Despite these attempts, the issues persist.

Expected vs. Actual Behavior

Expected behavior:
AVPlayer should immediately begin playback of locally available HLS segments
When offline, it should not attempt to load from the network for more than a few seconds
Once an asset is successfully played, it should be reliably available for future playback

Actual behavior:
AVPlayer waits 10-60 seconds before playing locally available segments
Network activity is observed despite all network-restricting options
Sometimes the player fails completely to play a previously available asset
Behavior is inconsistent between playback attempts with the same asset

Questions:
What is the recommended approach for playing partially downloaded HLS content offline with minimal delay?
Is there a way to force AVPlayer to immediately use available local segments without attempting to load from the network?
Are there any known issues with AVPlayer losing references to locally stored HLS assets?
What diagnostic steps would you recommend to track down the specific cause of these delays?
Does AVFoundation have specific timeouts for offline HLS playback that could be configured?

Any guidance would be greatly appreciated as this issue is significantly impacting our user experience.

Device Information
iOS Versions Tested: 14.5 - 18.1
Device Models: iPhone 12, iPhone 13, iPhone 14, iPhone 15
Xcode Version: 15.3 - 16.2.1
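One thing that stands out in the snippet above, offered only as a sketch under the assumption that localHlsPath is a plain filesystem path: NSURL URLWithString: on a bare path produces a URL with no file:// scheme, which can push AVFoundation toward treating the asset as remote. Downloaded HLS packages are usually opened from the .movpkg location reported by the download task's delegate, stored as a path relative to the home directory because the absolute container path can change between launches.

import AVFoundation

// Hypothetical helper: `relativePath` is the path we saved from the
// AVAssetDownloadTask delegate's didFinishDownloadingTo location,
// made relative to the home directory.
func makeOfflineAsset(relativePath: String) -> AVURLAsset {
    let homeURL = URL(fileURLWithPath: NSHomeDirectory())
    let localURL = homeURL.appendingPathComponent(relativePath) // e.g. a ".movpkg" package
    // A file URL keeps AVFoundation pointed at the downloaded package instead of the network.
    return AVURLAsset(url: localURL)
}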
Replies: 1 · Boosts: 0 · Views: 526 · Created: Mar ’25
Background recording app getting killed by the watchdog. How to avoid it?
We have the necessary background recording entitlements, and many users do not run into any issues. However, there is a subset of users whose recordings routinely end early. We have narrowed this down and believe it to be the work of the watchdog.

First we removed the entire view hierarchy when the app is backgrounded; there is just Text("Recording"). This got the CPU usage in the profiler down to 0%. We saw massive improvements to the recording success rate and walked away assuming that was enough. However, we are still seeing the same sort of crashes, all in the background.

We're using Observation to drive audio state changes to a Live Activity. Are those Observations causing the problem? Why doesn't Apple provide a better API for background audio? The internet is full of weird issues:
https://stackoverflow.com/questions/76010213/why-is-my-react-native-app-sometimes-terminated-in-the-background-while-tracking
https://stackoverflow.com/questions/71656047/why-is-my-react-native-app-terminating-in-the-background-while-recording-ios-r
https://github.com/expo/expo/issues/16807

This is such a terrible user experience, and we have very little visibility into what is happening and why. Nowhere does Apple's documentation state that, for background recording to work, the app can only be Text("Recording"). It does not outline a CPU or memory threshold. It just kills us.
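For reference, a minimal sketch of the "strip the UI when backgrounded" approach described above, using SwiftUI's scenePhase. FullRecorderUI is a hypothetical stand-in for the app's real interface, and this by itself does not rule out watchdog terminations caused by other background work.

import SwiftUI

struct FullRecorderUI: View {
    var body: some View { Text("Full recording interface") } // stand-in for the real UI
}

struct RecorderRootView: View {
    @Environment(\.scenePhase) private var scenePhase
    @State private var isBackgrounded = false

    var body: some View {
        Group {
            if isBackgrounded {
                Text("Recording") // minimal hierarchy while backgrounded
            } else {
                FullRecorderUI()
            }
        }
        .onChange(of: scenePhase) { _, phase in
            // Drop all non-essential UI and observation work when leaving the foreground.
            isBackgrounded = (phase != .active)
        }
    }
}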
Replies: 2 · Boosts: 0 · Views: 410 · Created: Mar ’25
iPhone 16 Camera Control and AVCaptureSlider – Is there a way to detect which slider is active?
I am following the Apple sample code and trying to add a manual focus lens position slider:

@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}

There are these AVCaptureSessionControlsDelegate methods:

final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}

When self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user in the app UI. Is there a way to see if self.cameraControlFocusSlider is active or being used? Please note that I will have more than one AVCaptureSlider in the final code.
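In the absence of a per-control "is active" callback, one workaround to sketch (an assumption, not a documented pattern) is to treat the slider's own action handler as the activity signal: mark the slider as being adjusted whenever its handler fires, clear the flag after a short quiet period, and only push autofocus-driven lens-position changes into the UI while the flag is clear.

import AVFoundation

final class FocusSliderCoordinator {
    private(set) var isUserAdjustingFocus = false
    private var lensPositionObservation: NSKeyValueObservation?
    private var quietTimer: Timer?

    func attach(slider: AVCaptureSlider, device: AVCaptureDevice, queue: DispatchQueue) {
        // The action handler fires only while the user drags this particular slider,
        // so it doubles as an "active" signal for it.
        slider.setActionQueue(queue) { [weak self] focusValue in
            self?.isUserAdjustingFocus = true
            // ... lock focus at focusValue here ...
            self?.scheduleQuietPeriod()
        }
        // Mirror autofocus-driven changes back to the UI while the user is not dragging.
        lensPositionObservation = device.observe(\.lensPosition, options: [.new]) { [weak self] _, change in
            guard let self, !self.isUserAdjustingFocus, let position = change.newValue else { return }
            print("Lens position changed externally to \(position)")
        }
    }

    private func scheduleQuietPeriod() {
        DispatchQueue.main.async {
            self.quietTimer?.invalidate()
            self.quietTimer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: false) { [weak self] _ in
                self?.isUserAdjustingFocus = false
            }
        }
    }
}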
Replies: 0 · Boosts: 0 · Views: 455 · Created: Mar ’25
Making sense of AVAudioSession interruption notifications
I have an app under development - demo here: https://youtu.be/VbAfUk_eYl0?si=s6EDBx-4G6P_QbZO - which is sort of an audio player for airdropped files - something useful to musicians who dump work in progress to their phone, make notes, revise and update.

I've been testing my handling of audio session interruption notifications, but there seems to be a lot of inconsistency in how, when and why iOS delivers them, and I'm wondering if there is some rhyme or reason to it that I'm just not detecting.

For example, I am playing a song in my app. I switch to Apple Music and start playing a song there. My app gets an interruption-began notification - this is consistent. I switch back to my app, and about half the time I will get an interruption-ended notification (often coupled with a blast of the tail of whatever audio buffer was partially played when the interruption started, even though the engine was stopped, and followed by a call to my AVAudioPlayerNodeCompletionCallback - is there some way to avoid this?). Half the time I don't get an interruption-ended notification; my app can (as expected) end the interruption by activating the AVAudioSession and playing something.

I have not been able to determine any pattern to this behavior, other than that if my app started playing using AVAudioPlayerNode.scheduleSegment rather than scheduleFile, I think the notification will be consistently delivered on app activation rather than when I activate the session programmatically.

I would like my app to behave deterministically, and would appreciate any help in deciphering what causes the inconsistent behavior in notifications from iOS.
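For reference, a minimal sketch of the notification handling being discussed, including the shouldResume option that iOS supplies only on some .ended deliveries; the inconsistency described above is consistent with the .ended notification simply not being guaranteed.

import AVFoundation

final class InterruptionObserver {
    private var token: NSObjectProtocol?

    func start() {
        token = NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { notification in
            guard let info = notification.userInfo,
                  let rawType = info[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }
            switch type {
            case .began:
                // Stop the engine and remember the playback position.
                print("Interruption began")
            case .ended:
                // The .ended notification is not guaranteed to arrive; when it does,
                // the options say whether resuming playback is appropriate.
                let rawOptions = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
                let options = AVAudioSession.InterruptionOptions(rawValue: rawOptions)
                print("Interruption ended, shouldResume = \(options.contains(.shouldResume))")
            @unknown default:
                break
            }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}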
Replies: 2 · Boosts: 0 · Views: 391 · Created: Mar ’25
Lightning to HDMI mirrors
I am developing a VOD playback app. When I stream video to an external monitor connected via a Lightning-to-HDMI adapter on iOS 18 or later, the screen goes dark and I cannot confirm playback. The app I am developing does not detect the HDMI connection and present the player separately; it simply mirrors the video. We have confirmed that the same phenomenon occurs with other services, but we were able to confirm playback with some services such as Apple TV. Please let us know if there are any other necessary settings, such as video certificates, required for video playback. We would also like to know whether the problem is specific to iOS 18 or later.
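If the content is DRM-protected, one diagnostic worth sketching (an assumption about the cause, not a confirmed one) is to check whether AVFoundation is deliberately blanking the video because the HDMI path does not satisfy the stream's content-protection requirements.

import AVFoundation

// Periodically check whether output is being obscured on the external display.
func monitorExternalProtection(for player: AVPlayer) -> Any {
    player.addPeriodicTimeObserver(forInterval: CMTime(seconds: 1, preferredTimescale: 1),
                                   queue: .main) { _ in
        guard let item = player.currentItem else { return }
        if item.isOutputObscuredDueToInsufficientExternalProtection {
            // The player is blanking video because the external connection
            // does not meet the stream's protection requirements.
            print("Output obscured: external display lacks required content protection")
        }
    }
}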
Replies: 0 · Boosts: 0 · Views: 278 · Created: Mar ’25
Visual isTranslatable: NO; reason: observation failure: noObservations, when trying to play custom compositor video with AVPlayer
I am trying to achieve an animated gradient effect that changes values over time based on the current seconds. I am using AVPlayer and AVMutableVideoComposition along with a custom instruction and compositor class to generate the effect. I didn't want to load any video file, but rather generate a custom video with my own set of instructions. I used Metal compute shaders to generate the effect and made the video 20 seconds long. However, when I run the code, I get a frozen player with the gradient applied, and when I try to play the video, I get this warning in the console:

Visual isTranslatable: NO; reason: observation failure: noObservations

My entire code:

import AVFoundation
import Metal

class GradientVideoCompositorTest: NSObject, AVVideoCompositing {
    var sourcePixelBufferAttributes: [String: Any]? = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    var requiredPixelBufferAttributesForRenderContext: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]

    private var renderContext: AVVideoCompositionRenderContext?
    private var metalDevice: MTLDevice!
    private var metalCommandQueue: MTLCommandQueue!
    private var metalLibrary: MTLLibrary!
    private var metalPipeline: MTLComputePipelineState!

    override init() {
        super.init()
        setupMetal()
    }

    func setupMetal() {
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue(),
              let library = try? device.makeDefaultLibrary(),
              let function = library.makeFunction(name: "gradientShader") else {
            fatalError("Metal setup failed")
        }
        self.metalDevice = device
        self.metalCommandQueue = queue
        self.metalLibrary = library
        self.metalPipeline = try? device.makeComputePipelineState(function: function)
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        renderContext = newRenderContext
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let outputPixelBuffer = renderContext?.newPixelBuffer(),
              let metalTexture = createMetalTexture(from: outputPixelBuffer) else {
            request.finish(with: NSError(domain: "com.example.gradient", code: -1, userInfo: nil))
            return
        }
        var time = Float(request.compositionTime.seconds)
        renderGradient(to: metalTexture, time: time)
        request.finish(withComposedVideoFrame: outputPixelBuffer)
    }

    private func createMetalTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
        var texture: MTLTexture?
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .bgra8Unorm,
            width: width,
            height: height,
            mipmapped: false
        )
        textureDescriptor.usage = [.shaderWrite, .shaderRead]
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        if let textureCache = createTextureCache(),
           let cvTexture = createCVMetalTexture(from: pixelBuffer, cache: textureCache) {
            texture = CVMetalTextureGetTexture(cvTexture)
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
        return texture
    }

    private func renderGradient(to texture: MTLTexture, time: Float) {
        guard let commandBuffer = metalCommandQueue.makeCommandBuffer(),
              let commandEncoder = commandBuffer.makeComputeCommandEncoder() else { return }
        commandEncoder.setComputePipelineState(metalPipeline)
        commandEncoder.setTexture(texture, index: 0)
        var mutableTime = time
        commandEncoder.setBytes(&mutableTime, length: MemoryLayout<Float>.size, index: 0)
        let threadsPerGroup = MTLSize(width: 16, height: 16, depth: 1)
        let threadGroups = MTLSize(
            width: (texture.width + 15) / 16,
            height: (texture.height + 15) / 16,
            depth: 1
        )
        commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadsPerGroup)
        commandEncoder.endEncoding()
        commandBuffer.commit()
    }

    private func createTextureCache() -> CVMetalTextureCache? {
        var cache: CVMetalTextureCache?
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, metalDevice, nil, &cache)
        return cache
    }

    private func createCVMetalTexture(from pixelBuffer: CVPixelBuffer, cache: CVMetalTextureCache) -> CVMetalTexture? {
        var cvTexture: CVMetalTexture?
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            cache,
            pixelBuffer,
            nil,
            .bgra8Unorm,
            width,
            height,
            0,
            &cvTexture
        )
        return cvTexture
    }
}

class GradientCompositionInstructionTest: NSObject, AVVideoCompositionInstructionProtocol {
    var timeRange: CMTimeRange
    var enablePostProcessing: Bool = true
    var containsTweening: Bool = true
    var requiredSourceTrackIDs: [NSValue]? = nil
    var passthroughTrackID: CMPersistentTrackID = kCMPersistentTrackID_Invalid

    init(timeRange: CMTimeRange) {
        self.timeRange = timeRange
    }
}

func createGradientVideoComposition(duration: CMTime, size: CGSize) -> AVMutableVideoComposition {
    let composition = AVMutableComposition()
    let instruction = GradientCompositionInstructionTest(timeRange: CMTimeRange(start: .zero, duration: duration))
    let videoComposition = AVMutableVideoComposition()
    videoComposition.customVideoCompositorClass = GradientVideoCompositorTest.self
    videoComposition.renderSize = size
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30) // 30 FPS
    videoComposition.instructions = [instruction]
    return videoComposition
}

#include <metal_stdlib>
using namespace metal;

kernel void gradientShader(texture2d<float, access::write> output [[texture(0)]],
                           constant float &time [[buffer(0)]],
                           uint2 id [[thread_position_in_grid]]) {
    float2 uv = float2(id) / float2(output.get_width(), output.get_height());
    // Animated colors based on time
    float3 color1 = float3(sin(time) * 0.8 + 0.1, 0.6, 1.0);
    float3 color2 = float3(0.12, 0.99, cos(time) * 0.9 + 0.3);
    // Linear interpolation for gradient
    float3 gradientColor = mix(color1, color2, uv.y);
    output.write(float4(gradientColor, 1.0), id);
}
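For context, a sketch of how such a composition is typically handed to AVPlayer. Note, as an assumption based on how custom compositors are usually driven rather than something this post confirms, that the player item needs an asset whose timeline actually spans the 20 seconds; a composition with no track of that duration gives the compositor nothing to schedule frame requests against.

import AVFoundation

// Hypothetical wiring: `composition` is expected to contain a (possibly blank) video track
// covering `videoComposition`'s intended duration, so the item has a timeline to render.
func makeGradientPlayerItem(composition: AVComposition,
                            videoComposition: AVMutableVideoComposition) -> AVPlayerItem {
    let item = AVPlayerItem(asset: composition)
    item.videoComposition = videoComposition // activates the custom compositor
    return item
}

// Usage sketch:
// let item = makeGradientPlayerItem(composition: composition, videoComposition: videoComposition)
// let player = AVPlayer(playerItem: item)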
Replies: 1 · Boosts: 0 · Views: 365 · Created: Mar ’25
AVContentKeySession key renewal on Airplay
Our streaming app uses FairPlay-protected video streams, which previously worked fine when using AVAssetResourceLoaderDelegate to provide CKCs. Recently, we migrated to AVContentKeySession, and while everything works as expected during regular playback, we encountered an issue with AirPlay.

Our CKC has a 120-second expiry, so we renew it by calling renewExpiringResponseData. This triggers the didProvideRenewingContentKeyRequest delegate method, and we respond with an updated CKC. However, when streaming via AirPlay, both video and audio freeze exactly after 120 seconds.

To validate the issue, I tested with AVAssetResourceLoaderDelegate and found that I can reproduce the same freeze if I do not renew the key. This suggests that AirPlay is not accepting the renewed CKC when using AVContentKeySession.

Additional details:
This issue occurs across different iOS versions and various AirPlay devices.
The same content plays without issues when played directly on the device.
The renewal process is successful, and segments continue to load, but playback remains frozen.
We tried renewing the CKC a bit early (100 s). I also tried setting player.usesExternalPlaybackWhileExternalScreenIsActive = true, but the issue persists.
We don't use a persistent key.

Is there anything else that needs to be considered for proper key renewal when AirPlaying? Any help on how to fix this, or confirmation that this is a known issue, would be greatly appreciated.
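For reference, a minimal sketch of the renewal path being described; requestCKCFromKeyServer and the empty application certificate are hypothetical placeholders for the app's own key-server round trip, and this sketch does not by itself explain the AirPlay freeze.

import AVFoundation

final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        respond(to: keyRequest)
    }

    // Called after renewExpiringResponseData(for:) is invoked for an expiring key.
    func contentKeySession(_ session: AVContentKeySession,
                           didProvideRenewingContentKeyRequest keyRequest: AVContentKeyRequest) {
        respond(to: keyRequest)
    }

    private func respond(to keyRequest: AVContentKeyRequest) {
        guard let identifier = keyRequest.identifier as? String,
              let assetIDData = identifier.data(using: .utf8) else { return }
        let appCertificate = Data() // placeholder for the FairPlay application certificate
        keyRequest.makeStreamingContentKeyRequestData(forApp: appCertificate,
                                                      contentIdentifier: assetIDData,
                                                      options: nil) { spcData, error in
            guard let spcData, error == nil else { return }
            // Hypothetical helper that exchanges the SPC for a fresh CKC with the key server.
            let ckcData = requestCKCFromKeyServer(spc: spcData)
            let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
            keyRequest.processContentKeyResponse(response)
        }
    }
}

// Hypothetical placeholder for the app's key-server round trip.
func requestCKCFromKeyServer(spc: Data) -> Data { Data() }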
Replies: 4 · Boosts: 2 · Views: 648 · Created: Mar ’25
Error -50 writing to AVAudioFile
I'm trying to write 16-bit interleaved 2-channel data captured from a LiveSwitch audio source to an AVAudioFile. The buffer and file formats match, but I get a bad parameter error from the API. Does this API not support the specified format, or is there some other issue?

Here is the debugger output:

(lldb) po audioFile.url
▿ file:///private/var/mobile/Containers/Data/Application/1EB14379-0CF2-41B6-B742-4C9A80728DB3/tmp/Heart%20Sounds%201
  - _url : file:///private/var/mobile/Containers/Data/Application/1EB14379-0CF2-41B6-B742-4C9A80728DB3/tmp/Heart%20Sounds%201
  - _parseInfo : nil
  - _baseParseInfo : nil
(lldb) po error
Error Domain=com.apple.coreaudio.avfaudio Code=-50 "(null)" UserInfo={failed call=ExtAudioFileWrite(_impl->_extAudioFile, buffer.frameLength, buffer.audioBufferList)}
(lldb) po buffer.format
<AVAudioFormat 0x302a12b20: 2 ch, 44100 Hz, Int16, interleaved>
(lldb) po audioFile.fileFormat
<AVAudioFormat 0x302a515e0: 2 ch, 44100 Hz, Int16, interleaved>
(lldb) po buffer.frameLength
882
(lldb) po buffer.audioBufferList
▿ 0x0000000300941e60
  - pointerValue : 12894608992

This code handles the details of converting the LiveSwitch frame into an AVAudioPCMBuffer:

extension FMLiveSwitchAudioFrame {
    func convertedToPCMBuffer() -> AVAudioPCMBuffer {
        Self.convertToAVAudioPCMBuffer(from: self)!
    }

    static func convertToAVAudioPCMBuffer(from frame: FMLiveSwitchAudioFrame) -> AVAudioPCMBuffer? {
        // Retrieve the audio buffer and format details from the FMLiveSwitchAudioFrame
        guard let buffer = frame.buffer(),
              let format = buffer.format() as? FMLiveSwitchAudioFormat else {
            return nil
        }

        // Extract PCM format details from FMLiveSwitchAudioFormat
        let sampleRate = Double(format.clockRate())
        let channelCount = AVAudioChannelCount(format.channelCount())

        // Determine bytes per sample based on bit depth
        let bitsPerSample = 16
        let bytesPerSample = bitsPerSample / 8
        let bytesPerFrame = bytesPerSample * Int(channelCount)
        let frameLength = AVAudioFrameCount(Int(buffer.dataBuffer().length()) / bytesPerFrame)

        // Create an AVAudioFormat from the FMLiveSwitchAudioFormat
        guard let avAudioFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: sampleRate, channels: channelCount, interleaved: true) else {
            return nil
        }

        // Create an AudioBufferList to wrap the existing buffer
        let audioBufferList = UnsafeMutablePointer<AudioBufferList>.allocate(capacity: 1)
        audioBufferList.pointee.mNumberBuffers = 1
        audioBufferList.pointee.mBuffers.mNumberChannels = channelCount
        audioBufferList.pointee.mBuffers.mDataByteSize = UInt32(buffer.dataBuffer().length())
        audioBufferList.pointee.mBuffers.mData = buffer.dataBuffer().data().mutableBytes // Directly use LiveSwitch buffer

        // Transfer ownership of the buffer to AVAudioPCMBuffer
        let pcmBuffer = AVAudioPCMBuffer(pcmFormat: avAudioFormat, bufferListNoCopy: audioBufferList) /* { buffer in
            // Ensure the buffer is freed when AVAudioPCMBuffer is deallocated
            buffer.deallocate() // Only call this if LiveSwitch allows manual deallocation
        } */
        pcmBuffer?.frameLength = frameLength
        return pcmBuffer
    }
}

This is the handler that is invoked with every frame in order to convert it for use with AVAudioFile and, optionally, update a scrolling signal display on the screen:

private func onRaisedFrame(obj: Any!) -> Void {
    // Bail out early if no one is interested in the data.
    guard isMonitoring else { return }

    // Convert LS frame to AVAudioPCMBuffer (no-copy)
    let frame = obj as! FMLiveSwitchAudioFrame
    let buffer = frame.convertedToPCMBuffer()

    // Hand subscribers a reference to the buffer for rendering to display.
    bufferPublisher?.send(buffer)

    // If we have an output file, store the data there, as well.
    guard let audioFile = self.audioFile else { return }
    do {
        try audioFile.write(from: buffer) // FIXME: This call is throwing error -50
    } catch {
        FMLiveSwitchLog.error(withMessage: "Failed to write buffer to audio file at \(audioFile.url): \(error)")
        self.audioFile = nil
    }
}

This is how the audio file is being set up:

static var recordingFormat: AVAudioFormat = {
    AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 44_100, channels: 2, interleaved: true)!
}()

let audioFile = try AVAudioFile(forWriting: outputURL, settings: Self.recordingFormat.settings)
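A hunch to sketch here, not a confirmed diagnosis: AVAudioFile(forWriting:settings:) creates a file whose in-memory processingFormat defaults to deinterleaved Float32, and write(from:) expects buffers in that processing format (fileFormat only describes what lands on disk). If that is the cause, asking for an Int16 interleaved processing format up front should make the existing buffers acceptable.

import AVFoundation

// Hypothetical outputURL; the settings mirror the post's recordingFormat.
func makeInt16InterleavedAudioFile(outputURL: URL) throws -> AVAudioFile {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 2,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    // The extra parameters pin the *processing* format to Int16 interleaved,
    // so write(from:) accepts buffers in <2 ch, 44100 Hz, Int16, interleaved>.
    return try AVAudioFile(forWriting: outputURL,
                           settings: settings,
                           commonFormat: .pcmFormatInt16,
                           interleaved: true)
}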
Replies: 0 · Boosts: 0 · Views: 430 · Created: Mar ’25
FairPlay expiration issue
We noticed that the expiration behaviour of FairPlay licenses changed between iOS 16.x and some version of iOS 17, and again in the latest iOS 18.3.2 and Safari. On iOS 16.x, video playback stops when the license expires, but on iOS 17.x and later the video continues with no audio and no error fired. On the latest Safari, both video and audio continue. Were there any changes in the latest FairPlay, and how should we adapt to this on the license server side? Thanks
Replies: 0 · Boosts: 0 · Views: 397 · Created: Mar ’25
Prevent iOS from Switching Between Back Camera Lenses in getUserMedia (Safari/WebView, iOS 18)
I’m developing a hybrid app (WebView / Turbo Native) that uses getUserMedia to access the back camera for a PPG/heart-rate measurement feature (the user places their finger on the camera).

Problem: Even when I specify constraints like

{
  video: {
    deviceId: '...',
    facingMode: { exact: 'environment' },
    advanced: [{ zoom: 1.0 }]
  },
  audio: false
}

on iPhone 15 (iOS 18), iOS unexpectedly switches between the wide, ultra-wide, and telephoto lenses during the measurement. This breaks the heart-rate detection and forces the user to move their finger in the middle of the measurement.

Question: Is there any way, via getUserMedia/WebRTC, to force iOS to use only the wide-angle lens and prevent automatic lens switching? I know that with AVFoundation (Swift) you can pick .builtInWideAngleCamera, but I'm hoping to avoid building a custom native layer and would prefer to stick with WebView/JavaScript if possible to save time and complexity. Any suggestions, workarounds, or updates from Apple would be greatly appreciated! Thanks a lot!
Replies: 1 · Boosts: 0 · Views: 428 · Created: Mar ’25
Capturing & Processing ProRaw 48MP images is slow
Hey, I have a camera app that captures a ProRaw photo and then runs a few Core Image filters before saving it to the device as a HEIC. However, I'm finding that capturing at 48MP is rather slow. Testing a minimal pipeline on an iPhone 16 Pro:

Shutter press => file received in output: 1.2 ~ 1.6 s
CIRawFilter created using the photo's file representation, then rendered to context without any filters: 0.8 ~ 1 s
Saving to device: ~0.15 s

Is this the expected time for capture and processing? The native camera app seems to save the images within half a second. I'm using QualityPrioritization.balanced and the highest resolution available, which is 48MP.

Would using the CIRawFilter with the pixelBuffer from the photo output be faster? I tried it but couldn't get it to output an image. Are there any other things I could try to speed this up? Is it possible to capture at 24MP instead?

Thanks, Alex
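On the last question, a sketch of one avenue to explore; whether a roughly 24MP option is actually offered depends on the device, format, and capture mode, so the dimension check below is an assumption rather than a guarantee.

import AVFoundation

// Pick the smallest supported capture dimension that is at least ~24MP, if one exists.
func configureApprox24MPCapture(device: AVCaptureDevice,
                                output: AVCapturePhotoOutput) -> CMVideoDimensions? {
    let supported = device.activeFormat.supportedMaxPhotoDimensions
    let candidates = supported
        .filter { Int($0.width) * Int($0.height) >= 24_000_000 }
        .sorted { Int($0.width) * Int($0.height) < Int($1.width) * Int($1.height) }
    guard let dimensions = candidates.first else { return nil }
    output.maxPhotoDimensions = dimensions

    // Each capture then opts in to the same ceiling.
    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = dimensions
    // output.capturePhoto(with: settings, delegate: ...)
    return dimensions
}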
Replies: 1 · Boosts: 1 · Views: 448 · Created: Mar ’25
AVPlayer handle 403 error
Hello, is there a way to handle a 403 error returned by the server, e.g. when a token has expired? I cannot find any information about this, and everything I tried didn't work (addObserver, NotificationCenter with .AVPlayerItemNewErrorLogEntry, AVPlayerItemPlaybackStalled, ...). Thank you very much.
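For reference, a sketch of the two places such failures usually surface. Whether a given 403 reaches either one depends on when in the HLS session it happens, so this is a starting point rather than a guaranteed catch-all.

import AVFoundation

final class PlayerErrorWatcher {
    private var statusObservation: NSKeyValueObservation?
    private var logToken: NSObjectProtocol?

    func watch(_ item: AVPlayerItem) {
        // Fatal errors (the item gives up) arrive through the status/error properties.
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            if item.status == .failed {
                print("Playback failed: \(String(describing: item.error))")
            }
        }
        // Non-fatal HTTP errors (such as a 403 on a segment or key request) show up in the error log.
        logToken = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemNewErrorLogEntry,
            object: item,
            queue: .main
        ) { notification in
            guard let item = notification.object as? AVPlayerItem,
                  let event = item.errorLog()?.events.last else { return }
            print("Error log entry: status \(event.errorStatusCode), domain \(event.errorDomain)")
        }
    }
}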
Replies: 0 · Boosts: 0 · Views: 144 · Created: Mar ’25
Create live photo error
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds). Two permissions have been added in Xcode:

Privacy - Photo Library Usage Description
Privacy - Photo Library Additions Usage Description

Simulator: iPhone 16, iOS 18.3. The code in ContentView.swift:

private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}

guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        xxxxx
    } else {
        xxxxx
    }
}

Really need help, thanks
Replies: 0 · Boosts: 0 · Views: 229 · Created: Mar ’25
Create Live Photo throws Optional(Error Domain=PHPhotosErrorDomain Code=-1 "(null)")
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds); I am sure they are correct. Two permissions have been added in Xcode:

Privacy - Photo Library Usage Description
Privacy - Photo Library Additions Usage Description

Simulator: iPhone 16, iOS 18.3. The code in ContentView.swift:

private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}

guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        xxxxx
    } else {
        xxxxx
    }
}

Really need help, thanks
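One frequently reported cause of PHPhotosErrorDomain failures when pairing a photo with a .pairedVideo resource is that the two files do not carry a matching Live Photo asset identifier. This is community knowledge rather than documented behaviour, so the key name below ("17" in the Apple maker note, mirrored by a "com.apple.quicktime.content.identifier" metadata item in the video) is an assumption to verify. A sketch of stamping the still image:

import Foundation
import ImageIO

// Write a copy of the JPEG with the Live Photo asset identifier embedded in the Apple maker note.
// The paired .mov must carry the same identifier ("com.apple.quicktime.content.identifier").
func stampAssetIdentifier(jpegURL: URL, outputURL: URL, identifier: String) -> Bool {
    guard let source = CGImageSourceCreateWithURL(jpegURL as CFURL, nil),
          let type = CGImageSourceGetType(source),
          let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, type, 1, nil) else {
        return false
    }
    let properties: [String: Any] = [
        kCGImagePropertyMakerAppleDictionary as String: ["17": identifier] // "17" = asset identifier (assumed key)
    ]
    CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
    return CGImageDestinationFinalize(destination)
}

// Usage sketch: generate one UUID().uuidString, stamp both files with it,
// then add them with PHAssetCreationRequest as in the code above.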
Replies: 1 · Boosts: 1 · Views: 292 · Created: Mar ’25
Access transport state in Logic Pro
Hi, I need to read whether the transport is playing or stopped, but my current method, which works for VST, does not work for AU. Is there a Logic Pro (LPX) resource available for developers anywhere?

if (auto* playHead = processor->getPlayHead()) {
    juce::AudioPlayHead::CurrentPositionInfo posInfo;
    if (playHead->getCurrentPosition(posInfo)) {
        bool isCurrentlyPlaying = posInfo.isPlaying;
        if (isCurrentlyPlaying != wasTransportPlaying) {
            if (isCurrentlyPlaying) {
                wasTransportPlaying = isCurrentlyPlaying;
                startAllTimers();
            } else {
                wasTransportPlaying = isCurrentlyPlaying;
                stopAllTimers();
            }
        }
    }
}

Thanks :)
Replies: 0 · Boosts: 0 · Views: 282 · Created: Mar ’25
Apple FPS Package Request – No Response Received. How Long Does It Usually Take?
It has been quite some time since I requested the Apple FPS package, yet I haven't received it. I haven't received any email either. Is there a developer support inquiry center where I can check the status of the process? Alternatively, could you share approximately how long it took for you to receive a response email?
Replies: 0 · Boosts: 0 · Views: 364 · Created: Mar ’25