Media Playback Programming for iOS and macOS (Step-by-Step, with Runnable Code)

Version banner (verified): Tested with Swift 6.2, Xcode 26.0.1, minimum targets iOS 17 / iPadOS 17, macOS 14 (to use SwiftUI @Observable and the modern SwiftUI/AVKit stack). Generated Oct 16, 2025 (Asia/Seoul).

Tip: If you must support older OS versions, you can replace @Observable with ObservableObject + @Published. The rest of the APIs used below work back to iOS 14–16, but the syntax here assumes iOS 17+/macOS 14+.
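For reference, here is a minimal sketch of that fallback — the same shape as the PlayerStore used later in this guide, but written as ObservableObject + @Published for pre-iOS 17 targets:

```swift
import Combine
import AVFoundation

// Pre-iOS 17 fallback: ObservableObject + @Published instead of @Observable.
final class LegacyPlayerStore: ObservableObject {
    let player: AVPlayer
    @Published var current: Double = 0   // seconds

    private var timeObserver: Any?

    init(url: URL) {
        player = AVPlayer(url: url)
        timeObserver = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
            queue: .main
        ) { [weak self] time in
            self?.current = time.seconds
        }
    }

    deinit {
        if let obs = timeObserver { player.removeTimeObserver(obs) }
    }
}
```

Views then hold it with `@StateObject private var store = LegacyPlayerStore(url: …)` instead of `@State`.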


1) Overview & Architecture

You will learn: what “media playback” means on Apple platforms, when to use AVFoundation, AVKit, MediaPlayer, and CoreMedia, and how data flows for local files vs. streaming (HLS).

Plain definition: Media playback means loading audio/video from a file or URL, decoding it, and rendering sound/pixels while reacting to user controls (play/pause/seek) and system events (interruptions, route changes).

Which framework does what (short):

- AVFoundation — the playback engine: AVAsset, AVPlayerItem, AVPlayer, AVQueuePlayer, AVPlayerLayer.
- AVKit — ready-made playback UI: VideoPlayer (SwiftUI), AVPlayerViewController (iOS), AVPlayerView (macOS), Picture in Picture.
- MediaPlayer — system integration: Now Playing info (lock screen / Control Center) and remote transport commands.
- CoreMedia — low-level time and sample types: CMTime, CMTimeRange, sample buffers.

Data flow (simplified): URL or file → AVAsset (what to play) → AVPlayerItem (playback state for one asset) → AVPlayer (transport control) → AVPlayerLayer or an AVKit view (rendering).
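That flow can be sketched in a few lines (the URL is a placeholder):

```swift
import AVFoundation

// Hypothetical URL; the chain below mirrors the data flow above.
let url = URL(string: "https://example.com/video.m3u8")!
let asset = AVURLAsset(url: url)         // what to play
let item = AVPlayerItem(asset: asset)    // playback state for one asset
let player = AVPlayer(playerItem: item)  // transport control
player.play()                            // AVPlayerLayer or an AVKit view renders it
```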

Quick Test: In your head, map: “lock screen artwork” → which framework? (MediaPlayer). “inline playback UI” → (AVKit). “read current time” → (AVFoundation + CoreMedia).

Gotchas:

References:


2) Project Setup (Latest Swift & Xcode)

You will learn: exact tool versions, targets, SwiftUI vs. UIKit/AppKit project options, and required entitlements/Info.plist keys.

New Project choices:

Capabilities & keys you often need:

Quick Test: Which key enables background audio? (UIBackgroundModes = audio). Gotchas: Adding audio without actually playing background audio can cause App Review rejection. (Apple Developer)

References:


3) The Basics: Play a Video

You will learn: minimal, runnable players for SwiftUI and UIKit/AppKit, with play/pause/seek/rate/mute.

3.1 SwiftUI (iOS & macOS) — VideoPlayer + AVPlayer

import SwiftUI
import AVKit
import Observation

@Observable
final class PlayerStore {
    let player: AVPlayer
    var isMuted = false
    var rate: Float = 1.0
    var duration: Double = 0
    var current: Double = 0    // seconds
    
    private var timeObserver: Any?
    
    init(url: URL) {
        self.player = AVPlayer(url: url)
        // Update slider as playback progresses (every 0.5s)
        timeObserver = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
            queue: .main
        ) { [weak self] time in
            guard let self else { return }
            self.current = time.seconds
            if let item = self.player.currentItem {
                let d = item.duration.seconds
                if d.isFinite { self.duration = d }
            }
        }
    }
    
    deinit {
        if let obs = timeObserver { player.removeTimeObserver(obs) }
    }
}

struct ContentView: View {
    @State private var store = PlayerStore(
        url: URL(string:"https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_4x3/gear1/prog_index.m3u8")!
    )
    @State private var seeking = false
    @State private var tempValue: Double = 0
    
    var body: some View {
        VStack(spacing: 16) {
            VideoPlayer(player: store.player)
                .frame(height: 240)
            
            Slider(
                value: Binding(
                    get: { seeking ? tempValue : store.current },
                    set: { tempValue = $0 }
                ),
                in: 0...(store.duration > 0 ? store.duration : 1),
                onEditingChanged: { editing in
                    seeking = editing
                    if !editing {
                        // Commit the seek when the user releases the thumb.
                        let time = CMTime(seconds: tempValue, preferredTimescale: 600)
                        store.player.seek(to: time)
                    }
                }
            )
            
            HStack {
                Button("⏯ Play/Pause") {
                    if store.player.timeControlStatus == .playing { store.player.pause() }
                    else { store.player.play() }
                }
                Button(store.isMuted ? "🔈 Unmute" : "🔇 Mute") {
                    store.isMuted.toggle()
                    store.player.isMuted = store.isMuted
                }
                Menu("Rate \(String(format: "%.1fx", store.rate))") {
                    ForEach([0.5, 1.0, 1.5, 2.0], id: \.self) { r in
                        Button("\(r)x") { store.rate = Float(r); store.player.rate = Float(r) }
                    }
                }
            }
        }
        .padding()
        .onAppear { store.player.play() }
    }
}

Expected UI: a player, a scrubber slider, and simple controls.

3.2 UIKit (iOS) — AVPlayerViewController

import UIKit
import AVKit

final class PlayerVC: UIViewController {
    private let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_4x3/gear1/prog_index.m3u8")!
    private let playerVC = AVPlayerViewController()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        playerVC.player = AVPlayer(url: url)
        playerVC.canStartPictureInPictureAutomaticallyFromInline = true
        
        // Embed as a child view controller.
        addChild(playerVC)
        playerVC.view.frame = view.bounds
        playerVC.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(playerVC.view)
        playerVC.didMove(toParent: self)
        
        playerVC.player?.play()
    }
}

3.3 AppKit (macOS) — AVPlayerView

import Cocoa
import AVKit

final class ViewController: NSViewController {
    @IBOutlet weak var playerView: AVPlayerView!
    let player = AVPlayer(url: URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_4x3/gear1/prog_index.m3u8")!)
    override func viewDidLoad() {
        super.viewDidLoad()
        playerView.player = player
        player.play()
    }
}

3.4 Custom view — AVPlayerLayer (iOS shown; on macOS, mirror this with NSView + NSViewRepresentable)

import SwiftUI
import AVFoundation

final class PlayerLayerView: UIView {
    override static var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
    var player: AVPlayer? {
        get { playerLayer.player }
        set { playerLayer.player = newValue }
    }
}

struct PlayerLayerRepresentable: UIViewRepresentable {
    let player: AVPlayer
    func makeUIView(context: Context) -> PlayerLayerView { let v = PlayerLayerView(); v.player = player; return v }
    func updateUIView(_ uiView: PlayerLayerView, context: Context) { uiView.player = player }
}
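A minimal sketch of using the representable from SwiftUI (assumes the PlayerLayerView/PlayerLayerRepresentable types defined above; the stream URL is Apple's public test stream, as elsewhere in this guide):

```swift
import SwiftUI
import AVFoundation

struct CustomPlayerScreen: View {
    @State private var player = AVPlayer(
        url: URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_4x3/gear1/prog_index.m3u8")!
    )

    var body: some View {
        PlayerLayerRepresentable(player: player)
            .frame(height: 240)
            .onAppear { player.play() }
            .onDisappear { player.pause() }
    }
}
```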

Quick Test: Which control automatically brings PiP on iOS when inline? (AVPlayerViewController with canStartPictureInPictureAutomaticallyFromInline). (Apple Developer) Gotchas: VideoPlayer is SwiftUI-only UI; logic still lives in AVPlayer underneath.

References:


4) Streaming with HLS

You will learn: how to play remote .m3u8, tune buffering/bitrate, attach headers via a resource loader, and retry on errors.

4.1 Minimal HLS player with sane defaults

import AVFoundation

func makeHLSPlayer(url: URL) -> AVPlayer {
    let asset = AVURLAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    // Reduce rebuffering; AVPlayer manages ABR automatically.
    let player = AVPlayer(playerItem: item)
    player.automaticallyWaitsToMinimizeStalling = true
    // Optional: limit network usage (bits per second)
    item.preferredPeakBitRate = 2_000_000
    return player
}

4.2 Custom headers/tokens (read-only example with AVAssetResourceLoader)

import AVFoundation

final class HeaderInjectingLoader: NSObject, AVAssetResourceLoaderDelegate {
    private let realURL: URL
    private let headers: [String:String]
    private let session = URLSession(configuration: .default)
    
    init(realURL: URL, headers: [String:String]) {
        self.realURL = realURL; self.headers = headers
    }
    
    func asset(withFakeSchemeURL fake: URL) -> AVURLAsset {
        let asset = AVURLAsset(url: fake)
        asset.resourceLoader.setDelegate(self, queue: .main)
        return asset
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource req: AVAssetResourceLoadingRequest) -> Bool {
        // Redirect to the real URL with your custom headers.
        var request = URLRequest(url: realURL)
        headers.forEach { request.addValue($0.value, forHTTPHeaderField: $0.key) }
        // Range requests are common; forward them if present.
        if let range = req.dataRequest {
            let lower = range.requestedOffset
            let upper = lower + Int64(range.requestedLength) - 1
            request.addValue("bytes=\(lower)-\(upper)", forHTTPHeaderField: "Range")
        }
        let task = session.dataTask(with: request) { data, response, error in
            if let r = response as? HTTPURLResponse {
                req.response = r
                if let info = req.contentInformationRequest {
                    info.isByteRangeAccessSupported = (r.value(forHTTPHeaderField: "Accept-Ranges") == "bytes")
                    info.contentType = r.value(forHTTPHeaderField: "Content-Type")
                    if let len = r.value(forHTTPHeaderField: "Content-Length"), let n = Int64(len) {
                        info.contentLength = n
                    }
                }
            }
            if let data { req.dataRequest?.respond(with: data) }
            if let error { req.finishLoading(with: error) } else { req.finishLoading() }
        }
        task.resume()
        return true
    }
}

Usage: build a fake URL with a custom scheme (e.g. custom+https://…), create the asset via asset(withFakeSchemeURL:), and pass your real HTTPS URL + headers into HeaderInjectingLoader. Note that AVAssetResourceLoader holds its delegate weakly, so keep a strong reference to the loader for the asset’s lifetime. Apple’s resource loader gives you a delegate to fulfill requests. (Apple Developer)
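A usage sketch (hypothetical URLs and token; assumes the HeaderInjectingLoader class above):

```swift
import AVFoundation

// The custom scheme makes AVFoundation route requests to our loader delegate.
let real = URL(string: "https://example.com/protected/master.m3u8")!
let fake = URL(string: "custom+https://example.com/protected/master.m3u8")!

let loader = HeaderInjectingLoader(realURL: real,
                                   headers: ["Authorization": "Bearer <token>"])
let asset = loader.asset(withFakeSchemeURL: fake)
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
// Keep `loader` alive for as long as the asset plays (the delegate is held weakly).
```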

4.3 Simple retry with backoff

import AVFoundation

actor ResilientStreamer {
    private(set) var player: AVPlayer?
    
    func play(_ url: URL) async {
        var delay: UInt64 = 1_000_000_000 // 1s
        for attempt in 1...3 {
            let item = AVPlayerItem(url: url)
            let p = AVPlayer(playerItem: item); p.automaticallyWaitsToMinimizeStalling = true
            player = p; p.play()
            // Wait for status or error
            try? await Task.sleep(nanoseconds: 2_000_000_000)
            if item.status == .readyToPlay { return }
            if let err = item.error as NSError? { print("Attempt \(attempt) failed: \(err)") }
            p.pause()
            try? await Task.sleep(nanoseconds: delay)
            delay *= 2
        }
    }
}

Quick Test: Where do you set a bitrate cap? (AVPlayerItem.preferredPeakBitRate). Gotchas: If you need DRM (FairPlay), you’ll use the resource loader to supply keys; that’s out of scope here. See Apple’s FairPlay docs. (Apple Developer)

References:


5) Audio Playback & Sessions (iOS)

You will learn: how to set AVAudioSession, handle interruptions (calls) and route changes (headphones/Bluetooth), and run audio in the background.

5.1 Configure the audio session

import AVFAudio

@MainActor
func configurePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    // AirPlay and Bluetooth output already work with .playback by default;
    // .allowAirPlay/.allowBluetooth are .playAndRecord options and can make setCategory throw here.
    try session.setCategory(.playback, mode: .moviePlayback)
    try session.setActive(true)
}

5.2 Interruptions & route changes

import AVFAudio

final class AudioSessionObserver {
    private var tokens: [NSObjectProtocol] = []
    init() {
        let nc = NotificationCenter.default
        tokens.append(nc.addObserver(forName: AVAudioSession.interruptionNotification, object: nil, queue: .main) { note in
            guard let info = note.userInfo,
                  let typeValue = info[AVAudioSession.interruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }
            switch type {
            case .began:
                // Pause your player
                break
            case .ended:
                let optValue = info[AVAudioSession.interruptionOptionsKey] as? UInt ?? 0
                if AVAudioSession.InterruptionOptions(rawValue: optValue).contains(.shouldResume) {
                    // Resume playback if appropriate
                }
            @unknown default:
                break
            }
        })
        tokens.append(nc.addObserver(forName: AVAudioSession.routeChangeNotification, object: nil, queue: .main) { note in
            guard let reasonValue = note.userInfo?[AVAudioSession.routeChangeReasonKey] as? UInt,
                  let reason = AVAudioSession.RouteChangeReason(rawValue: reasonValue) else { return }
            if reason == .oldDeviceUnavailable {
                // Headphones/Bluetooth route went away: pause playback
            }
        })
    }
    deinit { tokens.forEach(NotificationCenter.default.removeObserver) }
}

Quick Test: Which category should a video streaming app use? (.playback). Gotchas: Don’t forget to activate the session; background audio also requires UIBackgroundModes=audio. (Apple Developer)

References:


6) Now Playing & Remote Controls

You will learn: how to show metadata on the lock screen and handle remote transport commands.

import UIKit
import MediaPlayer
import AVFoundation

final class NowPlaying {
    private let center = MPNowPlayingInfoCenter.default()
    private let player: AVPlayer

    init(player: AVPlayer) { self.player = player }

    func update(title: String, artist: String? = nil, artwork: UIImage? = nil) {
        let duration = player.currentItem?.duration.seconds ?? 0
        var info: [String : Any] = [
            MPMediaItemPropertyTitle: title,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentTime().seconds,
            // duration is NaN for indefinite items (e.g. live streams); report 0 instead.
            MPMediaItemPropertyPlaybackDuration: duration.isFinite ? duration : 0,
            MPNowPlayingInfoPropertyPlaybackRate: player.rate
        ]
        if let artist { info[MPMediaItemPropertyArtist] = artist }
        if let artwork { info[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: artwork.size) { _ in artwork } }
        center.nowPlayingInfo = info
    }

    func wireCommands() {
        let cmd = MPRemoteCommandCenter.shared()
        cmd.playCommand.addTarget { [weak player] _ in player?.play(); return .success }
        cmd.pauseCommand.addTarget { [weak player] _ in player?.pause(); return .success }
        cmd.togglePlayPauseCommand.addTarget { [weak player] _ in
            guard let p = player else { return .commandFailed }
            p.timeControlStatus == .playing ? p.pause() : p.play()
            return .success
        }
    }
}
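To keep the lock-screen progress in sync as time and rate change, one approach is to refresh the info from a periodic time observer — a sketch (the caller owns the returned token):

```swift
import AVFoundation
import MediaPlayer

// Sketch: refresh elapsed time and rate every second of playback.
func keepNowPlayingInSync(player: AVPlayer) -> Any {
    return player.addPeriodicTimeObserver(
        forInterval: CMTime(seconds: 1, preferredTimescale: 600),
        queue: .main
    ) { time in
        var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
        info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = time.seconds
        info[MPNowPlayingInfoPropertyPlaybackRate] = player.rate
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info
    }
    // Remove later with player.removeTimeObserver(token).
}
```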

Quick Test: Which property keeps the lock-screen progress in sync? (MPNowPlayingInfoPropertyElapsedPlaybackTime). Gotchas: Update elapsedPlaybackTime periodically if you manage rate/time manually.

References:


7) Subtitles, Closed Captions, and Audio Tracks

You will learn: how to discover media selection groups and switch tracks at runtime.

import AVFoundation

func availableSubtitleOptions(for item: AVPlayerItem) -> [AVMediaSelectionOption] {
    guard let group = item.asset.mediaSelectionGroup(forMediaCharacteristic: .legible) else { return [] }
    return group.options
}

func select(subtitle option: AVMediaSelectionOption?, for item: AVPlayerItem) {
    guard let group = item.asset.mediaSelectionGroup(forMediaCharacteristic: .legible) else { return }
    if let option { item.select(option, in: group) }
    else { item.select(nil, in: group) } // Off
}

func availableAudioOptions(for item: AVPlayerItem) -> [AVMediaSelectionOption] {
    item.asset.mediaSelectionGroup(forMediaCharacteristic: .audible)?.options ?? []
}

Quick Test: How do you turn off subtitles? (select(nil, in: group)). Gotchas: Not every asset has legible tracks; always check for nil.

References:


8) Picture in Picture (PiP)

You will learn: how to enable PiP using AVKit’s standard controller or the AVPictureInPictureController for custom UIs.

8.1 Easiest: AVPlayerViewController (iOS)

vc.canStartPictureInPictureAutomaticallyFromInline = true

8.2 Custom player with AVPictureInPictureController

import AVKit

final class PiPBridge: NSObject, AVPictureInPictureControllerDelegate {
    private var pip: AVPictureInPictureController?
    private weak var playerLayer: AVPlayerLayer?

    init(playerLayer: AVPlayerLayer) {
        self.playerLayer = playerLayer
        super.init()
        if AVPictureInPictureController.isPictureInPictureSupported() {
            let source = AVPictureInPictureController.ContentSource(playerLayer: playerLayer)
            pip = AVPictureInPictureController(contentSource: source)
            pip?.delegate = self
        }
    }
    func start() { pip?.startPictureInPicture() }
    func stop()  { pip?.stopPictureInPicture() }
    // Restore UI when PiP ends
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    restoreUserInterfaceForPictureInPictureStopWithCompletionHandler completionHandler: @escaping (Bool) -> Void) {
        completionHandler(true)
    }
}

Quick Test: Which API tells you if PiP is supported? (AVPictureInPictureController.isPictureInPictureSupported()). Gotchas: On iOS, PiP works best with background audio category set appropriately; see Section 5. (Apple Developer)

References:


9) Queues, Looping, and Playlists

You will learn: AVQueuePlayer for playlists and AVPlayerLooper for seamless loops.

import AVFoundation

final class LoopingExample {
    let player = AVQueuePlayer()
    private var looper: AVPlayerLooper?

    func startLooping(_ url: URL) {
        let item = AVPlayerItem(url: url)
        looper = AVPlayerLooper(player: player, templateItem: item)
        player.play()
    }
}
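For a plain playlist (as opposed to a seamless loop), you enqueue several items up front — a sketch with caller-supplied URLs:

```swift
import AVFoundation

func makePlaylistPlayer(urls: [URL]) -> AVQueuePlayer {
    let items = urls.map { AVPlayerItem(url: $0) }
    let player = AVQueuePlayer(items: items)
    // advanceToNextItem() skips forward; finished items leave the queue.
    return player
}
```

Usage: `let player = makePlaylistPlayer(urls: [introURL, mainURL])` (hypothetical URLs), then `player.play()`.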

Quick Test: Which class do you need to loop seamlessly? (AVPlayerLooper). Gotchas: Keep a strong reference to the AVPlayerLooper; otherwise looping stops.

References:


10) Downloads & Offline Playback (HLS)

You will learn: how to download HLS for offline use with AVAssetDownloadURLSession, track progress, and manage storage.

import Foundation
import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private lazy var session: AVAssetDownloadURLSession = {
        let cfg = URLSessionConfiguration.background(withIdentifier: "com.example.hlsdownloads")
        return AVAssetDownloadURLSession(configuration: cfg, assetDownloadDelegate: self, delegateQueue: .main)
    }()

    func start(url: URL, title: String) {
        let asset = AVURLAsset(url: url)
        let task = session.makeAssetDownloadTask(
            asset: asset, assetTitle: title, assetArtworkData: nil,
            options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 265_000]
        )
        task?.resume()
    }

    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Don’t move the on-disk package yourself; persist a bookmark to this
        // location and resolve it on future launches.
        print("Downloaded to:", location)
    }

    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didLoad timeRange: CMTimeRange, totalTimeRangesLoaded loaded: [CMTimeRange],
                    timeRangeExpectedToLoad: CMTimeRange) {
        let loadedSeconds = loaded.reduce(0) { $0 + $1.duration.seconds }
        let pct = loadedSeconds / timeRangeExpectedToLoad.duration.seconds
        print("Progress:", pct)
    }
}

Quick Test: Which class fires progress callbacks for HLS downloads? (AVAssetDownloadTask via delegate). Gotchas: DRM (FairPlay) requires a proper key/lease flow—see Apple’s FairPlay docs. (Apple Developer)

References:


11) Metrics, Logging, and Diagnostics

You will learn: how to read access/error logs from AVPlayerItem, and log key events using the Unified Logging system.

import AVFoundation
import OSLog

let log = Logger(subsystem: "com.example.player", category: "playback")

func logPlaybackStats(for item: AVPlayerItem) {
    if let ev = item.accessLog()?.events.last {
        log.info("throughput=\(ev.observedBitrate, privacy: .public)bps stalls=\(ev.numberOfStalls)")
    }
    if let e = item.errorLog()?.events.last {
        log.error("lastError=\(e.errorComment ?? "n/a", privacy: .public)")
    }
}
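Rather than polling, you can log whenever a new entry lands — a sketch using the access-log notification and the `log` Logger defined above:

```swift
import AVFoundation
import OSLog

// Sketch: observe new access-log entries as the player appends them.
func observeAccessLog(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewAccessLogEntry,
        object: item,
        queue: .main
    ) { note in
        guard let item = note.object as? AVPlayerItem,
              let ev = item.accessLog()?.events.last else { return }
        log.info("bitrate=\(ev.indicatedBitrate, privacy: .public) stalls=\(ev.numberOfStalls)")
    }
}
```

Keep the returned token and pass it to `NotificationCenter.default.removeObserver(_:)` when done.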

Instruments to know: Time Profiler, Allocations, Energy Log. (Apple Developer)

Quick Test: Which log stores stall counts? (AVPlayerItemAccessLog). Gotchas: Don’t spam logs; use appropriate levels and privacy annotations.

References:


12) Performance & Power Best Practices

You will learn: how to reduce rebuffering and energy drain.

import Network

final class NetworkWatcher {
    private let monitor = NWPathMonitor()
    func start() {
        monitor.pathUpdateHandler = { path in
            if path.status == .satisfied { print("Network OK") }
            else { print("Network lost") }
        }
        monitor.start(queue: DispatchQueue(label: "net"))
    }
}
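You can combine this with Section 4’s bitrate cap — a sketch that lowers preferredPeakBitRate on expensive or constrained paths (the cap value is arbitrary):

```swift
import Network
import AVFoundation

final class AdaptiveBitratePolicy {
    private let monitor = NWPathMonitor()

    func start(item: AVPlayerItem) {
        monitor.pathUpdateHandler = { path in
            // isConstrained ≈ Low Data Mode; isExpensive ≈ cellular/hotspot.
            if path.isConstrained || path.isExpensive {
                item.preferredPeakBitRate = 1_000_000   // arbitrary cap
            } else {
                item.preferredPeakBitRate = 0           // 0 = no cap
            }
        }
        monitor.start(queue: DispatchQueue(label: "net.policy"))
    }
}
```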

Quick Test: Which API should you use to observe connectivity? (NWPathMonitor). Gotchas: Don’t block the main thread when you parse logs or HLS playlists.

References:


13) macOS-Specific Notes

You will learn: macOS sandbox file access, security-scoped bookmarks, and AVPlayerView conveniences.

13.1 User-selected file access + security-scoped bookmarks

import AppKit

func pickAndPersistFileAccess() throws -> Data? {
    let panel = NSOpenPanel(); panel.canChooseFiles = true; panel.canChooseDirectories = false
    guard panel.runModal() == .OK, let url = panel.url else { return nil }
    guard url.startAccessingSecurityScopedResource() else { return nil }
    defer { url.stopAccessingSecurityScopedResource() }
    // Save bookmark data for next launches
    return try url.bookmarkData(options: .withSecurityScope,
                                includingResourceValuesForKeys: nil,
                                relativeTo: nil)
}
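On a later launch you resolve the saved bookmark back into a usable URL — a sketch with minimal error handling:

```swift
import AppKit

func resolveSavedAccess(bookmark: Data) throws -> URL? {
    var stale = false
    let url = try URL(resolvingBookmarkData: bookmark,
                      options: .withSecurityScope,
                      relativeTo: nil,
                      bookmarkDataIsStale: &stale)
    if stale { return nil } // re-prompt the user and save a fresh bookmark
    guard url.startAccessingSecurityScopedResource() else { return nil }
    // Caller must balance with url.stopAccessingSecurityScopedResource() when done.
    return url
}
```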

Quick Test: Which entitlement grants read-only access to user-selected files? (com.apple.security.files.user-selected.read-only). Gotchas: Always start/stop security-scoped access around file I/O.

References:


14) Testing

You will learn: deterministic unit tests and basic UI tests for transport controls and PiP availability.

14.1 Unit test: wait for AVPlayerItem.status == .readyToPlay

import XCTest
import AVFoundation

final class PlayerTests: XCTestCase {
    func testReadyToPlay() {
        let url = URL(string:"https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_4x3/gear1/prog_index.m3u8")!
        let item = AVPlayerItem(url: url)
        let exp = expectation(description: "ready")
        let obs = item.observe(\.status, options: [.new]) { item, _ in
            if item.status == .readyToPlay { exp.fulfill() }
        }
        let player = AVPlayer(playerItem: item); player.play()
        wait(for: [exp], timeout: 15)
        obs.invalidate()
    }
}
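With Swift concurrency you can also await asset properties directly instead of using KVO — a sketch using the async load API (iOS 15+/macOS 12+):

```swift
import XCTest
import AVFoundation

final class AssetTests: XCTestCase {
    func testAssetIsPlayable() async throws {
        let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_4x3/gear1/prog_index.m3u8")!
        let asset = AVURLAsset(url: url)
        // load(_:) suspends until the property resolves, or throws on failure.
        let playable = try await asset.load(.isPlayable)
        XCTAssertTrue(playable)
    }
}
```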

14.2 UI test: basic transport

import XCTest

final class UITests: XCTestCase {
    func testPlayPauseButton() {
        let app = XCUIApplication(); app.launch()
        app.buttons["⏯ Play/Pause"].tap()
        // Assert your label or state changed...
    }
}

Quick Test: Which KVO keypath do you observe for readiness? (status on AVPlayerItem). Gotchas: Use Apple’s HLS examples for stable, deterministic tests. (Apple Developer)

References:


15) Troubleshooting & FAQ

You will learn: common errors and network/ATS/HLS issues.

Quick Test: Which Info.plist key is related to ATS? (NSAppTransportSecurity). Gotchas: Some players fail if the server doesn’t support byte-range requests for segments.

References:


16) Appendix

A) Glossary (short & friendly)

- HLS — HTTP Live Streaming; Apple’s segmented streaming format (.m3u8 playlists).
- ABR — adaptive bitrate; the player switches quality tiers as bandwidth changes.
- AVAsset — a model of the media to play (file or stream).
- AVPlayerItem — the playable state of one asset (status, buffering, logs).
- CMTime — CoreMedia’s rational time value (value / timescale).
- PiP — Picture in Picture; a floating, system-managed video window.

B) Mini Checklists ✅

Basic Player
- Create an AVPlayer with a URL or AVPlayerItem
- Attach it to VideoPlayer / AVPlayerViewController / AVPlayerView
- Add a periodic time observer for the scrubber (and remove it in deinit)

Background Audio (iOS)
- Add UIBackgroundModes = audio
- Set the AVAudioSession category to .playback and activate the session
- Publish Now Playing info and wire remote commands

Picture in Picture
- Check AVPictureInPictureController.isPictureInPictureSupported()
- For AVPlayerViewController, set canStartPictureInPictureAutomaticallyFromInline
- Restore your UI in the PiP delegate callback

HLS Download
- Use AVAssetDownloadURLSession with a background configuration
- Set a minimum required media bitrate if you want to cap quality
- Persist a bookmark to the downloaded location

C) Sample project structure (if you build one)


— End of Guide —
