Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.

Streaming Documentation

Posts under Streaming subtopic

Post

Replies

Boosts

Views

Activity

ScreenCaptureKit stops capturing after ~10–15 minutes unexpectedly
When using the built-in macOS screen recording feature, the recording stops automatically after approximately 10–15 minutes without any warning or error message. No manual stop action is performed. The recording simply ends silently. The same issue also occurs when using ScreenCaptureKit in a custom application, which suggests this may be a system-level issue related to screen capture rather than an app-specific problem. This issue is reproducible and happens consistently after running for a period of time.
0
0
178
Apr ’26
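Since the capture ends without any error being surfaced, one pragmatic mitigation (a sketch only, not a fix for the underlying system behavior) is a watchdog that timestamps each incoming sample buffer and flags the stream as stalled when frames stop arriving, so the app can tear the stream down and recreate it. The 5-second threshold and the type name here are illustrative assumptions:

```swift
import Foundation

// Hypothetical watchdog for silent capture stops: record the arrival
// time of each sample buffer, and treat the stream as stalled when no
// frame has arrived within `timeout` seconds. The threshold is an
// illustrative assumption, not a documented value.
struct CaptureWatchdog {
    private(set) var lastFrameTime: Date?
    let timeout: TimeInterval

    init(timeout: TimeInterval = 5.0) {
        self.timeout = timeout
    }

    // Call this from your SCStreamOutput callback whenever a buffer arrives.
    mutating func recordFrame(at time: Date = Date()) {
        lastFrameTime = time
    }

    // Poll this periodically; a true result means capture has likely
    // stopped and the stream should be torn down and recreated.
    func isStalled(now: Date = Date()) -> Bool {
        guard let last = lastFrameTime else { return false }
        return now.timeIntervalSince(last) > timeout
    }
}
```

This does not explain why capture stops, but it at least converts a silent failure into one the app can observe and react to.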
AVMetricMediaResourceRequestEvent returns error but no URLSession metrics for failed HLS playlist/segment requests
Hello, I am using AVMetrics to monitor HLS playback requests from AVPlayer, specifically AVMetricHLSPlaylistRequestEvent and AVMetricHLSMediaSegmentRequestEvent. These events provide an AVMetricMediaResourceRequestEvent. For successful requests, I can read URLSession metrics. However, when a request fails, the event contains an error but no URLSession metrics.

I reproduced this by intercepting HLS playlist and segment requests with Charles Proxy and forcing failures on both the simulator and a physical device. Is this expected behavior? If so, is there any supported way to get timing details for failed HLS requests?

I am using code like this:

for try await event in playerItem.metrics(forType: AVMetricHLSPlaylistRequestEvent.self) {
    // ...
}

for try await event in playerItem.metrics(forType: AVMetricHLSMediaSegmentRequestEvent.self) {
    // ...
}

Also, the example shown in the WWDC session does not compile for me (Xcode 26.2). I get the following error:

Pack expansion requires that '' and 'AVMetricEvent' have the same shape

let playerItem: AVPlayerItem = ...
let ltkuMetrics = playerItem.metrics(forType: AVMetricPlayerItemLikelyToKeepUpEvent.self)
let summaryMetrics = playerItem.metrics(forType: AVMetricPlayerItemPlaybackSummaryEvent.self)

for await (metricEvent, publisher) in ltkuMetrics.chronologicalMerge(with: summaryMetrics) {
    // send metricEvent to server
}
2
1
250
3w
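If the failed-request events do still carry their own request start and end timestamps (an assumption; this may not hold for every failure mode), a coarse duration can be derived from those dates as a fallback when URLSessionTaskMetrics are missing. A minimal sketch using plain Foundation types; the helper name is hypothetical:

```swift
import Foundation

// Hypothetical fallback: derive a coarse request duration from the
// event's own start/end dates when URLSessionTaskMetrics are missing.
// Returns nil when either timestamp is unavailable or inconsistent.
func fallbackDuration(requestStart: Date?, requestEnd: Date?) -> TimeInterval? {
    guard let start = requestStart, let end = requestEnd, end >= start else {
        return nil
    }
    return end.timeIntervalSince(start)
}
```

This gives only total elapsed time, not the DNS/connect/TLS breakdown that URLSessionTaskMetrics would provide.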
FairPlay SPC with an invalid device type
Hi, I received an SPC without a Device Identity TLLV and with an invalid value of device type (i.e. a value that is not specified in the FairPlay programming guide) in the Device Info TLLV. The info I got is the following:

Apple Device Type: Type:0x555ea482e2ef0a7c, OS version:189.121.178

Does anyone know what device type it is, and why it does not conform to the Apple spec? Also, should I accept such an SPC, or is it not valid? Thanks.
1
0
235
2w
FairPlay SPC v3 documentation mismatch: payload length field size vs sample code
Hi, I’ve identified a discrepancy between the FairPlay Streaming SPC v3 documentation and the provided Swift reference implementation regarding the SPC payload length field.

The documentation states that the SPC V3 structure defines:

SPC payload length: 4 bytes

However, Apple's Swift sample implementation does this:

// Move local offset by 12 to adjust for padding
localOffset += 12
spcContainer.spcDataSize = Int(try readBigEndianU32(spc, localOffset))

This indicates a 16-byte field (12 bytes of padding + a 4-byte length). This behavior also matches the SPC sample provided in the FairPlay Streaming SDK (sample_spc_v3.b64):

00000003                         // spc version
00000000                         // reserved
....
00000000000000000000000000000f40 // spc payload length (16 bytes)

Could you please confirm the correct implementation? Thanks
1
0
130
2w
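For anyone comparing the two readings, the sample code's behavior reduces to: skip 12 bytes of padding, then read a 4-byte big-endian value. A self-contained sketch of that parse (the field layout here is taken from the snippet quoted above, not independently confirmed against the spec):

```swift
import Foundation

// Reads a big-endian UInt32 from `data` at `offset`.
func readBigEndianU32(_ data: [UInt8], _ offset: Int) throws -> UInt32 {
    struct OutOfBounds: Error {}
    guard offset >= 0, offset + 4 <= data.count else { throw OutOfBounds() }
    return (UInt32(data[offset]) << 24)
        | (UInt32(data[offset + 1]) << 16)
        | (UInt32(data[offset + 2]) << 8)
        | UInt32(data[offset + 3])
}

// Mirrors the sample implementation's behavior: treat the payload
// length as the last 4 bytes of a 16-byte field (12 bytes padding
// followed by a 4-byte big-endian length).
func readPayloadLength(_ data: [UInt8], fieldOffset: Int) throws -> UInt32 {
    try readBigEndianU32(data, fieldOffset + 12)
}
```

On the sample_spc_v3.b64 field shown above (…0f40), this yields 0x0F40 (3904 bytes), consistent with the 16-byte-field interpretation.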
HLS Tools - hlsreport critical error cause
Hi, I'm currently experiencing issues with HLS streams created by FFmpeg running on Safari. When I pass the stream to the mediastreamvalidator tool and then run hlsreport on the output, I get a critical error reported:

Media Entry discontinuity value does not match previous playlist for MEDIA-SEQUENCE 1

If I let the stream finish (it's a live stream from an IoT device) and then perform the stream validation again, I no longer receive the critical error. My assumption is that this critical error is contributing to the HLS stall on iOS. I have also noticed that if I let the stream continue and then re-load the video control in Safari, the stream starts.

Is there a resource with explanations or remediation paths relevant to the possible output of hlsreport?

My m3u8 output looks like this (I have redacted the server host):

#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:1
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-DISCONTINUITY
#EXTINF:2.000000,
https://redacted.com/segment-00001.ts
#EXTINF:2.000011,
https://redacted.com/segment-00002.ts
#EXTINF:2.000011,
https://redacted.com/segment-00003.ts
#EXTINF:2.000011,
https://redacted.com/segment-00004.ts
#EXTINF:2.000011,
#EXT-X-ENDLIST

Thanks for any advice or guidance possible - if I can provide isolated code snippets I will do so. Andy
1
0
608
1w
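Separate from the discontinuity error, note that the playlist above ends with an #EXTINF tag that has no URI line before #EXT-X-ENDLIST, which validators may also flag. A quick structural sanity check for that pattern can be scripted; this sketch only pairs #EXTINF tags with following URI lines and is not a full HLS playlist parser:

```swift
import Foundation

// Returns true when every #EXTINF tag in the playlist is followed by a
// URI line (a non-empty line that is not another tag). Not a full HLS
// parser; just a quick structural sanity check.
func everyExtinfHasURI(_ playlist: String) -> Bool {
    let lines = playlist
        .split(separator: "\n", omittingEmptySubsequences: true)
        .map { $0.trimmingCharacters(in: .whitespaces) }
    for (i, line) in lines.enumerated() where line.hasPrefix("#EXTINF") {
        // The next line must exist and must be a URI, not another tag.
        guard i + 1 < lines.count, !lines[i + 1].hasPrefix("#") else {
            return false
        }
    }
    return true
}
```

Run against the playlist above, this would return false because of the dangling final #EXTINF; whether that dangling entry is related to the iOS stall is a separate question.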
AVMutableComposition audio silently drops on iOS 26 when streaming over HTTP/2 (FB22696516)
We've discovered a regression in iOS 26 where AVMutableComposition silently drops audio when the source asset is streamed over HTTP/2. The same file served over HTTP/1.1 plays audio correctly through the same composition code. Direct AVPlayer playback (without composition) works fine on HTTP/2. This did not occur on iOS 18.x. It happens on physical devices only; it does not reproduce on a simulator or on macOS.

Tested conditions (same MP4 file, different CDNs):

CloudFront (HTTP/2) + Composition → ❌ Audio silent
Cloudflare (HTTP/2) + Composition → ❌ Audio silent
Akamai (HTTP/1.1) + Composition → ✅ Audio works
Apple TS (HTTP/1.1) + Composition → ✅ Audio works
Downloaded locally, then composed → ✅ Audio works
Direct playback, no composition (HTTP/2) → ✅ Audio works

The CloudFront and Akamai URLs serve the identical file (same S3 object, different CDN edge). CDN vendor doesn't matter; any HTTP/2 source triggers it.

Minimal reproduction:

let asset = AVURLAsset(url: http2URL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first!
let audioTrack = try await asset.loadTracks(withMediaType: .audio).first!
let duration = try await asset.load(.duration)

let composition = AVMutableComposition()
let fullRange = CMTimeRange(start: .zero, end: duration)

let compVideo = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
try compVideo.insertTimeRange(fullRange, of: videoTrack, at: .zero)

let compAudio = composition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)!
try compAudio.insertTimeRange(fullRange, of: audioTrack, at: .zero)

let item = AVPlayerItem(asset: composition.copy() as! AVComposition)
player.replaceCurrentItem(with: item)
player.play() // Video plays, audio goes silent after a while

Playing the same asset directly works fine:

player.replaceCurrentItem(with: AVPlayerItem(asset: asset))
player.play() // Both video and audio work

Filed as FB22696516. Sample project: https://github.com/karlingen/AVCompositionBug
2
9
223
11h
How to Monitor Any USB Audio or Video Device on macOS
USB cameras, microphones, HDMI capture cards, and audio interfaces are supposed to "just work" on macOS. In reality, it's often difficult to quickly access or monitor them without opening large and complicated software. Sometimes you simply want to see whether a USB camera is active. Sometimes you want to check an HDMI source connected through a capture card. And in other cases, you may want to use a Mac mini without a dedicated monitor by viewing its HDMI output through a USB capture device directly on another Mac.

macOS supports many modern USB AV devices out of the box, but it surprisingly lacks a simple built-in utility for live monitoring and recording. Most users end up using oversized streaming or editing applications just to preview a video signal or monitor audio input. That becomes especially noticeable with:

USB webcams
HDMI capture adapters
USB microphones
audio interfaces
secondary computers
headless Mac mini setups

A lightweight monitor utility is often much more practical when you only need real-time access to a device, want to record a stream, or quickly switch between multiple AV inputs.

That's one of the reasons I built AV Monitor Pro - a native macOS app designed for monitoring and recording connected audio/video devices in real time. It can preview USB cameras, capture cards, microphones, and HDMI sources with minimal setup, and it's especially useful for workflows like running a Mac mini without a monitor, monitoring external devices, or recording live AV input directly on macOS.
0
0
169
5d