
AVFoundation Study Notes: Audio and Video Playback

AVFoundation is the framework for working with audio and video. It sits on top of the Core Audio, Core Video, Core Media, and Core Animation frameworks.
Core Audio is the framework that handles all audio events, providing interfaces for recording, playing, and processing audio.
Core Video is the framework for video processing, providing image buffer and buffer pool support to Core Media.
Core Media supplies the data types and interfaces needed to process audio samples and video frames. It also provides the CMTime-based timing model that AVFoundation uses.
Core Animation is the framework for animation.

AVSpeechSynthesizer

AVSpeechSynthesizer implements text-to-speech.

import AVFoundation

class SpeechSynthesizerViewController: UIViewController {

    private let speechSynthesizer = AVSpeechSynthesizer()
    private let voices: [AVSpeechSynthesisVoice] = [AVSpeechSynthesisVoice(language: "en-US")!, AVSpeechSynthesisVoice(language: "en-GB")!]
    private let speechString: [String] = ["The forum, which will bring together over 1,000 delegates, will be attended by special guests including the former French Prime Minister Jean-Pierre Raffarin, the 2011 Nobel Prize winner for Economics Thomas J.", "Sargent, and Zhu Yeyu, the vice president of the Hong Kong University of Science and Technology.", " Co-hosted by China Media Group and the People's Government of Guangdong Province, the forum will showcase the achievements of Guangdong Province in becoming a major gateway linking China with the world.", "The province also provides an example of the benefits of China's policy of Reform and Opening Up, which celebrates its 40th anniversary this year."]
    
    override func viewDidLoad() {
        super.viewDidLoad()
        // Print the list of all supported voices
        print(AVSpeechSynthesisVoice.speechVoices())
    }
    
    @IBAction func clickPlay(_ sender: UIButton) {
        for index in 0..<speechString.count {
            let utterance = AVSpeechUtterance(string: speechString[index])
            utterance.voice = voices[index % 2]
            utterance.rate = 0.4
            utterance.pitchMultiplier = 0.8
            utterance.postUtteranceDelay = 0.1
            speechSynthesizer.speak(utterance)
        }
    }
}

Playing and Recording Audio

Audio session categories:

| Category | Typical use | Mixing allowed | Audio input | Audio output |
| --- | --- | --- | --- | --- |
| Ambient | Games, productivity apps | Yes | No | Yes |
| Solo Ambient (default) | Games, productivity apps | No | No | Yes |
| Playback | Audio and video players | Optional | No | Yes |
| Record | Recorders, audio capture | No | Yes | No |
| Play and Record | VoIP, voice chat | Optional | Yes | Yes |
| Audio Processing | Offline session and processing | No | No | No |
| Multi-Route | Advanced A/V apps using external hardware | No | Yes | Yes |
  • Playing local audio with AVAudioPlayer

AVAudioPlayer supports playback, looping, audio metering, and more. Unless you need to play audio from a network stream, access raw audio samples, or achieve very low latency, AVAudioPlayer can handle the job.

What AVAudioPlayer can do:
1. Adjust the player's volume.
2. Adjust the player's pan value for stereo positioning, in the range -1.0 to 1.0.
3. Adjust the playback rate, in the range 0.5 to 2.0 (half speed to double speed).
4. Loop audio seamlessly via the numberOfLoops property: a value n greater than 0 loops n times; -1 loops indefinitely.
5. Meter the audio, reading the average and peak power of the playing audio.
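The five capabilities above can be sketched as follows. This is a minimal sketch, not code from the original post; note that `enableRate` must be set before `prepareToPlay()` for rate changes to take effect, and `isMeteringEnabled` must be set before reading power levels:

```swift
import AVFoundation

func configurePlayback(_ player: AVAudioPlayer) {
    player.volume = 0.8              // 0.0 (silent) to 1.0 (full volume)
    player.pan = -0.5                // -1.0 (full left) to 1.0 (full right)
    player.enableRate = true         // must be enabled before changing rate
    player.rate = 1.5                // 0.5 to 2.0
    player.numberOfLoops = -1        // -1 loops indefinitely
    player.isMeteringEnabled = true  // required before reading power levels
    player.prepareToPlay()
}

// While playing, refresh and read the meters periodically:
func logMeters(_ player: AVAudioPlayer) {
    player.updateMeters()
    print("average: \(player.averagePower(forChannel: 0)) dB")
    print("peak: \(player.peakPower(forChannel: 0)) dB")
}
```

`updateMeters()` must be called each time before reading `averagePower(forChannel:)` or `peakPower(forChannel:)`, typically from a timer or display link.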

private func settingSession() {
    // Configure the audio session
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayback)
        try audioSession.setActive(true)
    } catch let error {
        print("error = \(error)")
    }
    // Background playback: add "Required background modes" to Info.plist
    // with the item "App plays audio or streams audio/video using AirPlay"
}

// MARK: - AVAudioPlayer
    private func audioPlayer() {
        // Play audio with AVAudioPlayer
        let url = Bundle.main.url(forResource: "test", withExtension: "mp3")
        do {
            player = try AVAudioPlayer.init(contentsOf: url!)
            // Handle interruption events, e.g. an incoming phone call
            NotificationCenter.default.addObserver(self, selector: #selector(handleNotification(_:)), name: .AVAudioSessionInterruption, object: nil)
            // Audio route change notifications, e.g. plugging in headphones
            NotificationCenter.default.addObserver(self, selector: #selector(handleRouteNotification(_:)), name: .AVAudioSessionRouteChange, object: nil)
            player?.prepareToPlay()
        } catch let error {
            print("error = \(error)")
        }
    }
    
    @IBAction func clickPlayMp3(_ sender: Any) {
        player?.play()
    }
    
    @objc func handleNotification(_ sender: Notification) {
        let info = sender.userInfo!
        let type = info[AVAudioSessionInterruptionTypeKey] as! UInt
        if type == AVAudioSessionInterruptionType.began.rawValue {
            print("Interruption began")
            player?.pause()
        } else {
            print("Interruption ended")
            player?.play()
        }
    }
    
    @objc func handleRouteNotification(_ sender: Notification) {
        let info = sender.userInfo!
        let reasonKey = info[AVAudioSessionRouteChangeReasonKey] as! UInt
        if AVAudioSessionRouteChangeReason.oldDeviceUnavailable.rawValue == reasonKey  {
            print("Headphones removed; pausing playback")
            player?.pause()
        }
    }
    
  • Recording audio with AVAudioRecorder

AVAudioRecorder supports recordings of unlimited duration, and it can pause a recording and later resume from that same point.

Key steps for recording audio:
1. Provide a URL for the local file to write to.
2. Configure the recording session settings.
3. Handle errors.

private func settingSession() {
    // Configure the audio session
    // Recording requires a category that enables audio input
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try audioSession.setActive(true)
    } catch let error {
        print("error = \(error)")
    }
}

// MARK: - AVAudioRecorder
private func audioRecorderDemo() {
    // Microphone permission must be declared in Info.plist
    let path = NSSearchPathForDirectoriesInDomains(FileManager.SearchPathDirectory.documentDirectory, FileManager.SearchPathDomainMask.allDomainsMask, true).last! + "/voice.caf"
    let url = URL(fileURLWithPath: path)
    do {
        /*
         AVFormatIDKey          audio format
         AVSampleRateKey        sample rate
         AVNumberOfChannelsKey  number of channels
         */
        recoder = try AVAudioRecorder.init(url: url, settings: [AVFormatIDKey : kAudioFormatAppleIMA4, AVSampleRateKey: 44100.0, AVNumberOfChannelsKey: 1, AVEncoderBitDepthHintKey: 16, AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue])
        recoder?.delegate = self
        recoder?.prepareToRecord()

        // Enable audio metering
//            recoder?.isMeteringEnabled = true
        // Average power in decibels, in the range 0 to -160 dB
//            recoder?.averagePower(forChannel: 0)
        // Peak power in decibels
//            recoder?.peakPower(forChannel: 0)
    } catch let error {
        print("error = \(error)")
    }
}
    
@IBAction func clickRecoder(_ sender: Any) {
    print("Recording started")
    recoder?.record()
}
    
@IBAction func pauseRecoder(_ sender: Any) {
    recoder?.pause()
    print("Recording paused")
}
    
@IBAction func stopRecoder(_ sender: Any) {
    recoder?.stop()
    print("Recording stopped")
}

extension ViewController: AVAudioRecorderDelegate {
    func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder, successfully flag: Bool) {
        print("Recording finished; decide what to do with the file: keep it or delete it? (audioRecorderDidFinishRecording)")
    }
}
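The metering calls commented out in `audioRecorderDemo()` above have to be driven periodically while recording. A minimal sketch, assuming a configured `AVAudioRecorder` is passed in:

```swift
import AVFoundation

// Poll the recorder's meters a few times per second while recording.
// updateMeters() must be called before each read of the power values.
func startMetering(_ recorder: AVAudioRecorder) -> Timer {
    recorder.isMeteringEnabled = true
    return Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { _ in
        recorder.updateMeters()
        let average = recorder.averagePower(forChannel: 0) // 0 ... -160 dB
        let peak = recorder.peakPower(forChannel: 0)
        print("average: \(average) dB, peak: \(peak) dB")
    }
}
```

Invalidate the returned `Timer` when recording stops.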

Video Playback

  • Basics

AVPlayer is the core playback class in AVFoundation, but it is an invisible component: you need an AVPlayerLayer to display the video. AVPlayer plays a single asset; if you need to play multiple items in sequence, use AVQueuePlayer.

AVPlayerLayer is built on Core Animation and renders the video content.

AVPlayerItem builds a dynamic data model for a media asset, exposing dynamic properties of the playing video such as currentTime and presentationSize.

  • Sample code
private var avPlayer: AVPlayer!
private var playerItem: AVPlayerItem!
private var asset: AVAsset!
private var imageGenerator: AVAssetImageGenerator!
    
override func viewDidLoad() {
    super.viewDidLoad()
    let url = Bundle.main.url(forResource: "Test", withExtension: "mov")
    // For a network video:
    // let url = URL.init(string: "http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
    asset = AVAsset(url: url!)
    playerItem = AVPlayerItem(asset: asset)
    playerItem.addObserver(self, forKeyPath:"status", options: [NSKeyValueObservingOptions.old, NSKeyValueObservingOptions.new] , context: nil)
    avPlayer = AVPlayer(playerItem: playerItem)
    let playerView = PlayerView.init(UIScreen.main.bounds, avPlayer)
    view.addSubview(playerView)
    avPlayer.play()
    
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if playerItem.status == .readyToPlay {
        print("readyToPlay")
        // Observe playback time
        addPlayerItemTimeObserver()
        
        // Observe when playback finishes
        addItemEndObserverForPlayerItem()
        
        // Generate thumbnails at specified times in the video
        getGenerateThumbnails()
        
        // Load subtitle information
        loadMediaOptions()
    }
}
    
    
private func loadMediaOptions() {
    // Load the subtitle tracks contained in the video
    let group = asset.mediaSelectionGroup(forMediaCharacteristic: AVMediaCharacteristic.legible)
    if let group = group {
        var subtitles: [String] = Array()
        for item in group.options {
            print("displayName = \(item.displayName)")
            subtitles.append(item.displayName)
        }
    } else {
        print("group = nil, no subtitle information")
    }
}
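Listing the options only reads their names; to actually turn a subtitle track on, select one of the group's options on the player item. A minimal sketch, assuming the same asset and player item as above:

```swift
import AVFoundation

// Select the first legible (subtitle) option on the item, if one exists.
func enableFirstSubtitle(for playerItem: AVPlayerItem, in asset: AVAsset) {
    guard let group = asset.mediaSelectionGroup(forMediaCharacteristic: .legible),
          let option = group.options.first else { return }
    playerItem.select(option, in: group)
}
```

Passing `nil` to `select(_:in:)` turns the subtitles off again (when the group allows an empty selection).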
    
private func getGenerateThumbnails() {
    imageGenerator = AVAssetImageGenerator(asset: asset)
    // Constrain the generated image size (height 0 preserves the aspect ratio)
    imageGenerator.maximumSize = CGSize(width: 200, height: 0)
    
    // Build 20 evenly spaced times across the duration
    let duration = asset.duration
    var times: [NSValue] = Array()
    let increment = duration.value / 20
    var currentValue = kCMTimeZero.value
    while currentValue <= duration.value {
        let time = CMTimeMake(currentValue, duration.timescale)
        times.append(NSValue.init(time: time))
        currentValue += increment
    }
    
    var images: [UIImage] = Array()
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { (requestedTime, cgImage, actualTime, result, error) in
        if result == AVAssetImageGeneratorResult.succeeded {
            let image = UIImage(cgImage: cgImage!)
            images.append(image)
            // Deliver the images to the UI here
        } else {
            print("Failed to generate thumbnail")
        }
    }
}
    
private func addItemEndObserverForPlayerItem() {
    NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: OperationQueue.main) { (notification) in
        print("Playback finished")
    }
}
    
// Observe playback time
private func addPlayerItemTimeObserver() {
    /*
     Two ways to observe time on AVPlayer:
     1. Periodic observation: addPeriodicTimeObserver(forInterval:queue:using:)
     2. Boundary observation, with marker times you define: addBoundaryTimeObserver(forTimes:queue:using:)
     */
    
    // Update the time every 0.5 seconds
    let time = CMTimeMakeWithSeconds(0.5, Int32(NSEC_PER_SEC))
    avPlayer.addPeriodicTimeObserver(forInterval: time, queue: DispatchQueue.main) { [weak self] (time) in
        let currentTime = CMTimeGetSeconds(time)
        let duration = CMTimeGetSeconds((self?.playerItem.duration)!)
        print("current time = \(currentTime), total duration = \(duration)")
    }
}
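The boundary variant mentioned in the comment above fires when playback crosses specific times rather than at a fixed interval. A minimal sketch, not from the original post, firing at the quarter marks of the item's duration:

```swift
import AVFoundation

// Fire a callback each time playback crosses 25%, 50%, or 75% of the duration.
func addBoundaryObserver(to player: AVPlayer, duration: CMTime) -> Any {
    let times: [NSValue] = [0.25, 0.5, 0.75].map { fraction in
        NSValue(time: CMTimeMultiplyByFloat64(duration, fraction))
    }
    return player.addBoundaryTimeObserver(forTimes: times, queue: DispatchQueue.main) {
        print("Crossed a quarter boundary")
    }
}
```

Keep the returned token and pass it to `removeTimeObserver(_:)` when you no longer need the callback; the same applies to the periodic observer above.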
    
deinit {
    NotificationCenter.default.removeObserver(self)
}
  • Playing video with AVKit

AVKit is a framework introduced in iOS 8 that lets you build a simple player quickly. MPMoviePlayerViewController in the MediaPlayer framework offered similar functionality, but it was deprecated in iOS 9.0.

// Playing a video is this simple (requires import AVKit)
let avplayer = AVPlayerViewController()
// Whether to show the playback controls at the bottom
avplayer.showsPlaybackControls = false
let url = Bundle.main.url(forResource: "Test", withExtension: "mov")
avplayer.player = AVPlayer(url: url!)
avplayer.view.frame = UIScreen.main.bounds
view.addSubview(avplayer.view)
avplayer.player?.play()