I have tried everything. The songs load onto the playlists and show up in searches, but when prompted to play, they just won't play.
I have a wrapper because my main player (which carries the play/rewind/forward buttons, etc.) is in Objective-C.
//
// ApplePlayerWrapper.swift
// UniversallyMac
//
// Created by Dorian Mattar on 11/10/24.
//
import Foundation
import MusicKit
import MediaPlayer
@objc public class MusicKitWrapper: NSObject {
    @objc public static let shared = MusicKitWrapper()
    private let player = ApplicationMusicPlayer.shared

    // Play the current track
    @objc public func play() {
        guard !player.queue.entries.isEmpty else {
            print("Queue is empty. Cannot start playback.")
            return
        }
        logPlayerState(message: "Before play")
        Task {
            do {
                try await player.prepareToPlay()
                try await player.play()
                print("Playback started successfully.")
            } catch {
                if let nsError = error as NSError? {
                    print("NSError Code: \(nsError.code), Domain: \(nsError.domain)")
                }
            }
            logPlayerState(message: "After play")
        }
    }

    // Log the current player state
    @objc public func logPlayerState(message: String = "") {
        print("Player State - \(message):")
        print("Playback Status: \(player.state.playbackStatus)")
        print("Queue Count: \(player.queue.entries.count)")
        // Only log current track details if the player is playing
        if player.state.playbackStatus == .playing {
            if let currentEntry = player.queue.currentEntry {
                print("Current Track: \(currentEntry.title)")
                print("Current Position: \(player.playbackTime) seconds")
                print("Track Length: \(currentEntry.endTime ?? 0.0) seconds")
            } else {
                print("No current track.")
            }
        } else {
            print("No track is playing.")
        }
        print("----------")
    }

    // Debug the queue
    @objc public func debugQueue() {
        print("Debugging Queue:")
        for (index, entry) in player.queue.entries.enumerated() {
            print("\(index): \(entry.title)")
        }
    }

    // Ensure track availability in the queue
    public func queueTracks(_ tracks: [Track]) {
        Task {
            do {
                for track in tracks {
                    // Validate Play Parameters
                    guard let playParameters = track.playParameters else {
                        print("Track \(track.title) has no Play Parameters.")
                        continue
                    }
                    // Log the Play Parameters
                    print("Track Title: \(track.title)")
                    print("Play Parameters: \(playParameters)")
                    print("Raw Values: \(track.id.rawValue)")
                    // Ensure the ID is valid
                    if track.id.rawValue.isEmpty {
                        print("Track \(track.title) has an invalid or empty ID in Play Parameters.")
                        continue
                    }
                    // Queue the track
                    try await player.queue.insert(track, position: .afterCurrentEntry)
                    print("Queued track: \(track.title)")
                }
                print("Tracks successfully added to the queue.")
            } catch {
                print("Error queuing tracks: \(error)")
            }
            debugQueue()
        }
    }

    // Clear the current queue
    @objc public func resetMusicPlayer() {
        Task {
            player.stop()
            player.queue.entries.removeAll()
            print("Queue cleared.")
            print("Apple Music player reset successfully.")
        }
    }
}
I opened an Apple Dev. ticket, but I'm trying here as well. Thanks!
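In case it helps anyone hitting the same wall, here is a minimal sketch of two things worth ruling out before calling play(). It assumes MusicKit authorization might not have been requested yet, and that tracks is the same [Track] array passed to queueTracks(_:); it is only a sketch of an alternative setup, not a confirmed fix.

import MusicKit

func playTracks(_ tracks: [Track]) async throws {
    // 1. Playback requires user authorization (and an active Apple Music subscription).
    guard await MusicAuthorization.request() == .authorized else {
        print("MusicKit authorization was not granted.")
        return
    }
    // 2. Assigning the queue directly is an alternative to inserting into an empty queue.
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: tracks)
    try await player.play()
}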
It seems that 5G Network Slicing is currently only available for URLRequest / URLSession / low-level networking and not for AVPlayer. Is it possible to have the app use the default slice when it starts and only enable Network Slicing when streaming video via AVPlayer?
Does an artist similarity station broaden selection variety compared to a song similarity station?
You don't have to answer if it is against nondisclosure terms.
I set maxPhotoDimensions = .init(width: 8064, height: 6048) on both the AVCapturePhotoOutput and the AVCapturePhotoSettings, but I still get a 12 MP photo.
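For comparison, here is a minimal sketch of the setup I would check against (it assumes an already-configured AVCaptureSession; the names output, device, and delegate are placeholders). One thing worth verifying is that the requested dimensions actually appear in the active format's supportedMaxPhotoDimensions, and that the output's own maxPhotoDimensions is raised before creating the settings.

import AVFoundation

func captureMaxResolutionPhoto(with output: AVCapturePhotoOutput, device: AVCaptureDevice, delegate: AVCapturePhotoCaptureDelegate) {
    // Pick the largest dimensions the current active format actually supports
    // (supportedMaxPhotoDimensions is assumed here to be ordered smallest to largest).
    guard let largest = device.activeFormat.supportedMaxPhotoDimensions.last else { return }
    output.maxPhotoDimensions = largest

    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = largest
    output.capturePhoto(with: settings, delegate: delegate)
}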
Topic: Media Technologies
SubTopic: Photos & Camera
In our Apple TV application, we are using the native AVPlayer for live playback functionality. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.
Error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:
Scenario:
The live event is scheduled from 7:00 AM to 8:00 AM.
Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real-time.
As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays an error, and playback continues in the background.
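For anyone debugging the same failure, here is a small diagnostic sketch (assuming access to the AVPlayerItem, named item here) that surfaces the HLS error log and the failure notification; the idea is to see whether -16839 originates from the playlist request itself.

import AVFoundation

func observePlaybackErrors(for item: AVPlayerItem) -> [NSObjectProtocol] {
    let center = NotificationCenter.default
    // Log each new HLS error-log entry (status code, comment, and failing URI).
    let errorLogObserver = center.addObserver(forName: AVPlayerItem.newErrorLogEntryNotification, object: item, queue: .main) { _ in
        if let event = item.errorLog()?.events.last {
            print("HLS error \(event.errorStatusCode): \(event.errorComment ?? "no comment"), URI: \(event.uri ?? "-")")
        }
    }
    // Log the terminal failure, if playback gives up entirely.
    let failureObserver = center.addObserver(forName: AVPlayerItem.failedToPlayToEndTimeNotification, object: item, queue: .main) { note in
        let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? Error
        print("Failed to play to end: \(String(describing: error))")
    }
    return [errorLogObserver, failureObserver]
}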
Topic: Media Technologies
SubTopic: Streaming
Tags: FairPlay Streaming, Media Player, HTTP Live Streaming
I have spent a long time refactoring lots of older Swift code to compile without error in Swift 6.
The app is a v3 audio unit host and audio unit.
Having installed Sonoma and Xcode 16, I compile the code using Swift 6 and it compiles and runs without any warnings or errors.
My host will load my AU no problem.
LOGIC PRO is still the ONLY audio unit host that will load native Mac V3 audio units and so I like to test my code using Logic.
In Sonoma with Xcode 16...
My AU passes the most stringent auval tests, both in Terminal and in Logic Pro.
If I compile the AU source with Swift 5, Logic will see the AU, load it, and run it without problems.
But when I compile the AU with Swift 6, Logic sees the AU, scans it, and verifies that it passes the tests, but will not load it. In Xcode I see a log message that a "helper application failed to run", but the debugger never connects to the AU, and I don't think Logic even gets as far as instantiating the AU.
So... what is causing this? I'm stumped..
Developing AUv3 is a brain-aching maze of undocumented hurdles and I'm hoping someone might have found a solution for this one. Meanwhile I guess my only option is to continue using the Swift 5 compiler.
(Appending a little note just to mention that all the DSP code is written in C/C++; Swift is used mainly for the user interface and also does some offline threaded work.)
Hello All,
It seems that it's "very easy" (😬) to add a little Swift code inside the generated AU using Xcode 16.2 on Sequoia 15.1.1 and a Mac Studio M1 Ultra, but my issue is that I simply don't know... where.
The documentation says that I have to find the AudioUnitViewController.swift file and then modify the render block:
audioUnit.renderBlock = { (numFrames, ioData) in
    // Process audio here
}
in the automatically generated Xcode project, but I can't find such a file...
If somebody can show me where the file to modify is, I'll be very grateful!
Thank you very much.
J
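(An aside that may or may not apply: in Apple's AUv3 app-extension template, the real-time processing typically lives on the AUAudioUnit subclass rather than in AudioUnitViewController.swift, usually via the internalRenderBlock property. A minimal Swift sketch, assuming a custom AUAudioUnit subclass, is shown below; treat it as an illustration, not the template's exact file layout.)

import AudioToolbox

class MyAudioUnit: AUAudioUnit {
    override var internalRenderBlock: AUInternalRenderBlock {
        return { actionFlags, timestamp, frameCount, outputBusNumber, outputData, realtimeEventListHead, pullInputBlock in
            // Fill outputData with frameCount frames of audio here (real-time-safe code only).
            return noErr
        }
    }
}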
How can I use my RGB Curve points:
let redCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.152), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let greenCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.247, y: 0.196), CIVector(x: 0.5, y: 0.5), CIVector(x: 1, y: 1)]
let blueCurve = [CIVector(x: 0, y: 0), CIVector(x: 0.235, y: 0.184), CIVector(x: 0.466, y: 0.466), CIVector(x: 1, y: 1)]
in the colorCurves filter, which I found in the Apple docs:
func colorCurves(inputImage: CIImage) -> CIImage {
    let colorCurvesEffect = CIFilter.colorCurves()
    colorCurvesEffect.inputImage = inputImage
    colorCurvesEffect.curvesDomain = CIVector(x: 0, y: 1)
    colorCurvesEffect.curvesData = Data(
        bytes: [Float32]([
            0.0, 0.0, 0.0,
            0.8, 0.8, 0.8,
            1.0, 1.0, 1.0
        ]), count: 36)
    colorCurvesEffect.colorSpace = CGColorSpaceCreateDeviceRGB()
    return colorCurvesEffect.outputImage!
}
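One possible route (a sketch, under the assumption that curvesData expects interleaved R,G,B Float32 triplets sampled across curvesDomain, as in the Apple example above): sample each channel's CIVector control points into a lookup table and hand that table to the filter. Linear interpolation is used here purely for illustration.

import CoreImage
import CoreImage.CIFilterBuiltins

func curvesData(red: [CIVector], green: [CIVector], blue: [CIVector], samples: Int = 64) -> Data {
    // Evaluate a piecewise-linear curve defined by (x, y) control points sorted by x.
    func value(at x: CGFloat, in points: [CIVector]) -> Float {
        guard let first = points.first, let last = points.last else { return Float(x) }
        if x <= first.x { return Float(first.y) }
        if x >= last.x { return Float(last.y) }
        for i in 1..<points.count where x <= points[i].x {
            let p0 = points[i - 1], p1 = points[i]
            let t = (x - p0.x) / (p1.x - p0.x)
            return Float(p0.y + t * (p1.y - p0.y))
        }
        return Float(last.y)
    }

    // Build an interleaved R,G,B table across the 0...1 domain.
    var table = [Float]()
    for i in 0..<samples {
        let x = CGFloat(i) / CGFloat(samples - 1)
        table.append(value(at: x, in: red))
        table.append(value(at: x, in: green))
        table.append(value(at: x, in: blue))
    }
    return table.withUnsafeBufferPointer { Data(buffer: $0) }
}

// Usage with the filter from the question:
// colorCurvesEffect.curvesDomain = CIVector(x: 0, y: 1)
// colorCurvesEffect.curvesData = curvesData(red: redCurve, green: greenCurve, blue: blueCurve)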
I recently installed a rear-view camera in my car, and ever since, I've been experiencing a frustrating issue with my CarPlay. After about 15 seconds of playing audio via Bluetooth, the sound stops coming out of the speakers, even though the song continues to run in the background.
For context, my stereo system is an aftermarket unit that I installed to enable CarPlay functionality. Everything worked perfectly before adding the rear-view camera. Unfortunately, my unit does not have a port for a wired connection, so I can't test the audio using a cable.
Has anyone experienced a similar issue? Could the camera installation be interfering with the Bluetooth or audio system somehow? Any advice or troubleshooting tips would be greatly appreciated!
No video is playing, and the same error keeps appearing. I have tried everything, including resetting. Please provide an update as soon as possible.
Error: The operation couldn't be completed. (CoreMediaErrorDomain error -42709.)
I would appreciate help with the code, or an explanation of what to use in Swift, for an app that can capture LiDAR scans and RGB data from photos, generate a 3D mesh, and create an .OBJ, .MTL, and .JPEG file set for further manipulation of the 3D model.
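Not a full answer, but a hedged sketch of the reconstruction half using RealityKit's Object Capture API (PhotogrammetrySession). It assumes a folder of captured images at imagesURL and writes a textured model to outputURL; exporting an .OBJ/.MTL/.JPEG set from that model would be a separate conversion step, and all names here are placeholders.

import RealityKit

func reconstructModel(imagesURL: URL, outputURL: URL) async throws {
    // Create a session from a folder of captured photos (embedded depth data, when present, helps with scale).
    let session = try PhotogrammetrySession(input: imagesURL)

    // Listen for progress and completion on the session's output stream.
    let waiter = Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(fraction)")
            case .processingComplete:
                print("Model written to \(outputURL)")
            case .requestError(_, let error):
                print("Reconstruction failed: \(error)")
            default:
                break
            }
        }
    }

    // Ask for a textured mesh at medium detail.
    try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])
    _ = try await waiter.value
}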
Topic: Media Technologies
SubTopic: Photos & Camera
Tags: 3D Graphics, Swift Playground, Object Capture
I’m experiencing a crash at runtime when trying to extract audio from a video. This issue occurs on both iOS 18 and earlier versions. The crash is caused by the following error:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetExportSession exportAsynchronouslyWithCompletionHandler:] Cannot call exportAsynchronouslyWithCompletionHandler: more than once.'
*** First throw call stack:
(0x1875475ec 0x184ae1244 0x1994c49c0 0x217193358 0x217199899 0x192e208b9 0x217192fd9 0x30204c88d 0x3019e5155 0x301e5fb41 0x301af7add 0x301aff97d 0x301af888d 0x301aff27d 0x301ab5fa5 0x301ab6101 0x192e5ee39)
libc++abi: terminating due to uncaught exception of type NSException
My previous code worked fine, but it's crashing with Swift 6.
Does anyone know a solution for this?
Previous code:
func extractAudioFromVideo(from videoURL: URL, exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil, completion: @escaping (Swift.Result<URL, Error>) -> Void) {
    let asset = AVAsset(url: videoURL)

    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }

    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    exportSession.outputFileType = .m4a
    exportSession.outputURL = outputURL

    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler = exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }

    // Periodically check the progress of the export session
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        audioExportProgressPublisher.send(exportSession.progress)
    }

    // Export the audio track asynchronously
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
        case .cancelled:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
        default:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
        }
        // Invalidate the timer when the export session completes or is cancelled
        timer.invalidate()
    }
}
New code:
func extractAudioFromVideo(from videoURL: URL, exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil, completion: @escaping (Swift.Result<URL, Error>) -> Void) async {
    let asset = AVAsset(url: videoURL)

    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }

    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)

    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }

    if #available(iOS 18.0, *) {
        do {
            try await exportSession.export(to: outputURL, as: .m4a)
            let states = exportSession.states(updateInterval: 0.1)
            for await state in states {
                switch state {
                case .pending, .waiting:
                    break
                case .exporting(progress: let progress):
                    print("Exporting: \(progress.fractionCompleted)")
                    if progress.isFinished {
                        completion(.success(outputURL))
                    } else if progress.isCancelled {
                        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
                    } else {
                        audioExportProgressPublisher.send(Float(progress.fractionCompleted))
                    }
                }
            }
        } catch let error {
            print(error.localizedDescription)
        }
    } else {
        // Periodically check the progress of the export session
        let publishTimer = Timer.publish(every: 0.1, on: .main, in: .common)
            .autoconnect()
            .sink { [weak exportSession] _ in
                guard let exportSession else { return }
                audioExportProgressPublisher.send(exportSession.progress)
            }
        exportSession.outputFileType = .m4a
        exportSession.outputURL = outputURL
        await exportSession.export()
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
        case .cancelled:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
        default:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
        }
        // Invalidate the timer when the export session completes or is cancelled
        publishTimer.cancel()
    }
}
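A hedged sketch of an alternative ordering for the iOS 18 branch, reusing the same names as above: starting the states loop in a separate Task before awaiting export(to:as:) lets progress updates arrive while the export is still running, and reports completion exactly once. This is only a suggestion, not a confirmed fix for the exportAsynchronouslyWithCompletionHandler crash.

if #available(iOS 18.0, *) {
    // Observe progress concurrently with the export rather than after it finishes.
    let monitor = Task {
        for await state in exportSession.states(updateInterval: 0.1) {
            if case .exporting(let progress) = state {
                audioExportProgressPublisher.send(Float(progress.fractionCompleted))
            }
        }
    }
    do {
        try await exportSession.export(to: outputURL, as: .m4a)
        completion(.success(outputURL))
    } catch {
        completion(.failure(error))
    }
    monitor.cancel()
}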
I am creating an app that decodes H.265 elementary streams on iOS.
I use VideoToolbox to decode from H.265 to NV12.
The decoded data is enqueued into an AVSampleBufferDisplayLayer as a CMSampleBuffer.
However, nothing is displayed in the VideoPlayerView. It remains black.
The decoding in VideoToolbox is successful. I confirmed this by saving the NV12 data in the CMSampleBuffer to a file and displaying it using a tool.
Why is nothing displayed in the VideoPlayerView?
I can provide other source code as well.
//
// VideoPlayerView.swift
// H265Decoder
//
// Created by Kohshin Tokunaga on 2025/02/15.
//
import SwiftUI
import AVFoundation
struct VideoPlayerView: UIViewRepresentable {
    // Return an H265Player as the coordinator, and start playback there.
    func makeCoordinator() -> H265Player {
        H265Player()
    }

    func makeUIView(context: Context) -> UIView {
        let uiView = UIView(frame: .zero)

        // Base layer for attaching sublayers
        uiView.backgroundColor = .black // Screen background color (for iOS)

        // Create the display layer and add it to uiView.layer
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.bounds
        displayLayer.backgroundColor = UIColor.clear.cgColor
        uiView.layer.addSublayer(displayLayer)

        // Start playback
        context.coordinator.startPlayback()

        return uiView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Reset the frame of the AVSampleBufferDisplayLayer when the view's size changes.
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.layer.bounds

        // Optionally update the layer's background color, etc.
        uiView.backgroundColor = .black
        displayLayer.backgroundColor = UIColor.clear.cgColor

        // Flush transactions if necessary
        CATransaction.flush()
    }
}
//
// H265Player.swift
// H265Decoder
//
// Created by Kohshin Tokunaga on 2025/02/15.
//
import Foundation
import AVFoundation
import CoreMedia
class H265Player: NSObject, VideoDecoderDelegate {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var decoder: H265Decoder?

    override init() {
        super.init()
        // Initial configuration for the display layer
        displayLayer.videoGravity = .resizeAspect
        // Initialize the decoder (delegate = self)
        decoder = H265Decoder(delegate: self)
        // For simple playback, set isBaseline to true
        decoder?.isBaseline = true
    }

    func startPlayback() {
        // Load the H.265 elementary stream ("temp2.h265") from the app bundle
        guard let url = Bundle.main.url(forResource: "temp2", withExtension: "h265") else {
            print("File not found")
            return
        }
        do {
            let data = try Data(contentsOf: url)
            // Set FPS and video size as needed
            let packet = VideoPacket(data: data,
                                     type: .h265,
                                     fps: 30,
                                     videoSize: CGSize(width: 1080, height: 1920))
            // Decode as a single packet
            decoder?.decodeOnePacket(packet)
        } catch {
            print("Failed to load file: \(error)")
        }
    }

    // MARK: - VideoDecoderDelegate
    func decodeOutput(video: CMSampleBuffer) {
        // When decoding is complete, send the output to AVSampleBufferDisplayLayer
        displayLayer.enqueue(video)
    }

    func decodeOutput(error: DecodeError) {
        print("Decoding error: \(error)")
    }
}
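One hedged thing to try (an assumption about a common cause, not a confirmed diagnosis): if the decoded CMSampleBuffers carry no usable presentation timing, AVSampleBufferDisplayLayer may never display them; marking each buffer to display immediately before enqueueing it rules that out.

import CoreMedia

func markForImmediateDisplay(_ sampleBuffer: CMSampleBuffer) {
    // Set kCMSampleAttachmentKey_DisplayImmediately on the buffer's first attachment dictionary.
    guard let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: true),
          CFArrayGetCount(attachments) > 0 else { return }
    let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0), to: CFMutableDictionary.self)
    CFDictionarySetValue(dict,
                         Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                         Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
}

// For a quick test, call markForImmediateDisplay(video) in decodeOutput(video:) before displayLayer.enqueue(video).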
Topic: Media Technologies
SubTopic: Video
I use startCaptureWithHandler to record the screen and AVAssetWriter appendSampleBuffer: to save the audio and video, but when the saved file is played back, the audio and video are out of sync.
I don't know if it's an AVAssetWriterInput setup problem; here is my code:
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
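One hedged thing to check (sketched in Swift for brevity; writer and input are placeholders): A/V drift of this kind is often down to when the writer session is started relative to the first sample's presentation timestamp, so anchoring startSession(atSourceTime:) to the first buffer's PTS is worth verifying. This is only a sketch of that pattern, not a confirmed diagnosis of the code above.

import AVFoundation

func append(_ sampleBuffer: CMSampleBuffer, to input: AVAssetWriterInput, writer: AVAssetWriter) {
    // Start the session exactly at the first sample's presentation time so audio and video share one timeline.
    if writer.status == .unknown {
        writer.startWriting()
        writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    if input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}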
After updating to 18.3, my 7th-generation iPad has no sound and will not play videos.
Hi,
On macOS Sonoma and earlier versions, I used my script along with a LaunchDaemon to continuously record my screen.
However, after upgrading to macOS Sequoia, the LaunchDaemon no longer works for screen recording. It only works when I use a LaunchAgent.
For my workflow, using a LaunchDaemon and running the ffmpeg process as root is much more efficient than running it as a regular user. Does anyone know how to run a script in the background using a LaunchDaemon on macOS Sequoia?
Here are my LaunchDaemon plist and script:
LaunchDaemon plist:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key> <string>local.ScreenRecord</string>
<key>Disabled</key> <false/>
<key>RunAtLoad</key> <true/>
<key>KeepAlive</key> <true/>
<key>Nice</key>
<integer>-20</integer>
<key>ProgramArguments</key>
<array>
<string>/bin/bash</string>
<string>/usr/local/distrib/record.bash</string>
</array>
</dict>
</plist>
Script
## !!! START CONFIGURATION !!!
#
FFMPEG_LOC="/usr/local/distrib/ffmpeg"
FFMPEG_INPUT="-hide_banner -f avfoundation -capture_cursor 1 -pixel_format uyvy422"
FFMPEG_VSET="-codec libx264 -r 22 -crf 26 -preset veryfast -b:v 5M -maxrate 7M -bufsize 14M"
TIME_REC="-t 600"
FOLDER_REC="/usr/local/distrib/REC/"
#
## !!! END CONFIGURATION !!!
#
# Find number id monitor
MON_ID=$($FFMPEG_LOC -hide_banner -f avfoundation -list_devices true -i "" 2>&1 | awk -F'[]|[]' '/Capture\ screen/ {print $4}')
#
if [ -n "$MON_ID" ]
then
    # Time for name
    DATE_REC=$(date +"%m-%d-%Y_%H-%M-%S")
    # Number of output video file
    OUTPUT_NUM=0
    # Full command to record
    FFMPEG_FULL_COMMAND=""
    for MON_NUM in $MON_ID
    do
        FFMPEG_FULL_COMMAND=""$FFMPEG_FULL_COMMAND" "$FFMPEG_INPUT" -i "$MON_NUM" "$FFMPEG_VSET" "$TIME_REC" -map "$OUTPUT_NUM" "$FOLDER_REC""$DATE_REC"_Mon"$MON_NUM".mkv"
        let OUTPUT_NUM=$OUTPUT_NUM+1
    done
    $FFMPEG_LOC $FFMPEG_FULL_COMMAND
else
    echo "No monitors"
    sleep 5
fi
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via a USB-C or Lightning connector) on an iPad or iPhone?
I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad Gen 10 (18.3.1), and both return false for:
// returns false on iPhone 14, iPad gen 10
print(AVExternalStorageDeviceDiscoverySession.isSupported)
The following code returns nil when I try to access the external storage discovery session.
// returns null on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)
The following returns false, without displaying a permission dialog:
AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})
What type of iOS devices are supported by AVExternalStorageDeviceDiscoverySession?
What situations is it intended for (e.g., connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, accessing photos from a USB drive)?
Is there any sample code for using the AVExternalStorageDevice API?
I have an AUv3 that passes all validation and can be loaded into Logic Pro without issue. The UI for the plug-in can be any aspect ratio, but Logic insists on presenting it in a view with a fixed aspect ratio; that is, when resizing, both the height and width are resized together. I have never managed to work out what I need to specify so that Logic allows the user to resize width and height independently of each other.
Can anyone tell me what I need to specify in the AU code that will inform Logic that the view can be resized from any side of the window/panel?
I am playing protected HLS streams whose authorization token expires every 3 minutes. I am trying to handle this with AVAssetResourceLoaderDelegate. I can refresh the token and keep playing, but the problem is that mid-session the player stalls for a short time, around 1 second.
Here's my code:
class APLCustomAVARLDelegate: NSObject, AVAssetResourceLoaderDelegate {
    static let httpsScheme = "https"
    static let redirectErrorCode = 302
    static let badRequestErrorCode = 400

    private var token: String?
    private var retryDictionary = [String: Int]()
    private let maxRetries = 3

    private func schemeSupported(_ scheme: String) -> Bool {
        let supported = isHttpsSchemeValid(scheme)
        print("Scheme '\(scheme)' supported: \(supported)")
        return supported
    }

    private func reportError(loadingRequest: AVAssetResourceLoadingRequest, error: Int) {
        let nsError = NSError(domain: NSURLErrorDomain, code: error, userInfo: nil)
        print("Reporting error: \(nsError)")
        loadingRequest.finishLoading(with: nsError)
    }

    // Handle token renewal requests to prevent playback stalls
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForRenewalOfRequestedResource renewalRequest: AVAssetResourceRenewalRequest) -> Bool {
        print("Resource renewal requested for URL: \(renewalRequest.request.url?.absoluteString ?? "unknown URL")")
        // Handle renewal the same way we handle initial requests
        guard let scheme = renewalRequest.request.url?.scheme else {
            print("No scheme found in the renewal URL.")
            return false
        }
        if isHttpsSchemeValid(scheme) {
            return handleHttpsRequest(renewalRequest)
        }
        print("Scheme not supported for renewal.")
        return false
    }

    private func isHttpsSchemeValid(_ scheme: String) -> Bool {
        let isValid = scheme == APLCustomAVARLDelegate.httpsScheme
        print("httpsScheme scheme '\(scheme)' valid: \(isValid)")
        return isValid
    }

    private func generateHttpsURL(sourceURL: URL) -> URL? {
        // If you need to modify the URL, do it here
        // Currently this just returns the same URL
        let urlString = sourceURL.absoluteString
        print("Generated HTTPS URL: \(urlString)")
        return URL(string: urlString)
    }

    private func handleHttpsRequest(_ loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        print("Handling HTTPS request.")
        guard let sourceURL = loadingRequest.request.url,
              var redirectURL = generateHttpsURL(sourceURL: sourceURL) else {
            print("Failed to generate HTTPS URL.")
            reportError(loadingRequest: loadingRequest, error: APLCustomAVARLDelegate.badRequestErrorCode)
            return true
        }

        // Track retry attempts with a dictionary keyed by request URL
        let urlString = sourceURL.absoluteString
        let currentRetries = retryDictionary[urlString] ?? 0
        if currentRetries < maxRetries {
            retryDictionary[urlString] = currentRetries + 1
        } else {
            // Too many retries, report a more specific error
            reportError(loadingRequest: loadingRequest, error: NSURLErrorTimedOut)
            retryDictionary.removeValue(forKey: urlString)
            return true
        }

        if var urlComponents = URLComponents(url: redirectURL, resolvingAgainstBaseURL: false) {
            var queryItems = urlComponents.queryItems ?? []
            // Generate a fresh token each time
            let freshToken = AESTimeBaseEncription.secureEncryptSecretText()
            // Check if the token already exists
            if let existingTokenIndex = queryItems.firstIndex(where: { $0.name == "token" }) {
                // Update the existing token
                queryItems[existingTokenIndex].value = freshToken
            } else {
                // Add the token if it doesn't exist
                queryItems.append(URLQueryItem(name: "token", value: freshToken))
            }
            urlComponents.queryItems = queryItems
            redirectURL = urlComponents.url!
        }

        let redirectRequest = URLRequest(url: redirectURL)
        let response = HTTPURLResponse(url: redirectURL, statusCode: APLCustomAVARLDelegate.redirectErrorCode, httpVersion: nil, headerFields: nil)
        print("Redirecting HTTPS to URL: \(redirectURL)")
        loadingRequest.redirect = redirectRequest
        loadingRequest.response = response
        loadingRequest.finishLoading()

        // If successful, reset the retry counter
        if retryDictionary[urlString] == maxRetries {
            retryDictionary.removeValue(forKey: urlString)
        }
        return true
    }
}
Hi. I am working on an audio app for iOS. I have added the CPNowPlayingPlaybackRateButton to my CPNowPlayingTemplate.
When the button is clicked, my handler changes the rate in the AVPlayer and updates the MPNowPlayingInfoCenter to the new rate, for example, 2.0.
Throughout, the CarPlay button always displays "0x". I am wondering how to get this UI to accurately reflect the playback rate the user has selected, since always displaying 0x is a poor user experience.
You may suggest MPChangePlaybackRateCommand is relevant here, but I have not been able to get that to work either, and judging by posts online, not many other people have either. I have made a post about that here: https://developer.apple.com/forums/thread/773099
Is this a known Apple bug? Is there a way to get the UI to accurately reflect the playback rate of my audio?
Kind regards.
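For what it's worth, here is a hedged sketch of the MPChangePlaybackRateCommand wiring I would compare against (player is a placeholder AVPlayer); the system now-playing UI reads both the command's supportedPlaybackRates and MPNowPlayingInfoPropertyPlaybackRate, so keeping the two in sync is the part worth double-checking. Whether this fixes the CarPlay "0x" display specifically, I can't confirm.

import AVFoundation
import MediaPlayer

func configurePlaybackRateCommand(for player: AVPlayer) {
    let command = MPRemoteCommandCenter.shared().changePlaybackRateCommand
    command.isEnabled = true
    command.supportedPlaybackRates = [0.5, 1.0, 1.5, 2.0]

    _ = command.addTarget { event in
        guard let event = event as? MPChangePlaybackRateCommandEvent else { return .commandFailed }
        player.rate = event.playbackRate

        // Mirror the new rate into now-playing info so the system UI can display it.
        var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
        info[MPNowPlayingInfoPropertyPlaybackRate] = event.playbackRate
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info
        return .success
    }
}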
Topic: Media Technologies
SubTopic: Audio