An In-Depth Look at AudioToolbox.Framework

Author: MoShengLive | Published 2016-10-10 13:45

    What Is Core Audio?

    Core Audio is the digital audio infrastructure of iOS and OS X. It includes a set of software frameworks designed to handle the audio needs in your applications. Read this chapter to learn what you can do with Core Audio.

    Core Audio is tightly integrated into iOS and OS X for high performance and low latency.

    In OS X, the majority of Core Audio services are layered on top of the Hardware Abstraction Layer (HAL) as shown in Figure 1-1. Audio signals pass to and from hardware through the HAL. You can access the HAL using Audio Hardware Services in the Core Audio framework when you require real-time audio. The Core MIDI (Musical Instrument Digital Interface) framework provides similar interfaces for working with MIDI data and devices.

    You find Core Audio application-level services in the Audio Toolbox and Audio Unit frameworks.

    Use Audio Queue Services to record, play back, pause, loop, and synchronize audio.

    Use Audio File, Converter, and Codec Services to read and write from disk and to perform audio data format transformations. In OS X you can also create custom codecs.

    Use Audio Unit Services and Audio Processing Graph Services (represented in the figure as “Audio units”) to host audio units (audio plug-ins) in your application. In OS X you can also create custom audio units to use in your application or to provide for use in other applications.

    Use Music Sequencing Services to play MIDI-based control and music data (a short sketch follows this list).

    Use Core Audio Clock Services for audio and MIDI synchronization and time format management.

    Use System Sound Services (represented in the figure as “System sounds”) to play system sounds and user-interface sound effects.
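
    As a concrete illustration of Music Sequencing Services (a sketch added here, not part of Apple's text), the snippet below loads a standard MIDI file with the MusicPlayer API declared in MusicPlayer.h and starts playback; the function name and the midiURL parameter are placeholders.

        import AudioToolbox
        import Foundation

        // Minimal sketch: load a standard MIDI file and play it with
        // Music Sequencing Services. `midiURL` is a placeholder.
        func playMIDIFile(at midiURL: URL) {
            var sequence: MusicSequence?
            var player: MusicPlayer?
            NewMusicSequence(&sequence)            // empty sequence
            NewMusicPlayer(&player)                // player that drives it
            guard let seq = sequence, let mp = player else { return }

            MusicSequenceFileLoad(seq, midiURL as CFURL, .midiType, [])
            MusicPlayerSetSequence(mp, seq)
            MusicPlayerPreroll(mp)                 // prepare for playback
            MusicPlayerStart(mp)
        }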

    Core Audio in iOS is optimized for the computing resources available in a battery-powered mobile platform. There is no API for services that must be managed very tightly by the operating system—specifically, the HAL and the I/O Kit. However, there are additional services in iOS not present in OS X. For example, Audio Session Services lets you manage the audio behavior of your application in the context of a device that functions as a mobile telephone and an iPod. Figure 1-2 provides a high-level view of the audio architecture in iOS.

    Frameworks Available in iOS and OS X

    The frameworks listed in this section are available in iOS 2.0 and OS X v10.5.

    AudioToolbox.framework:

    The Audio Toolbox framework contains the APIs that provide application-level services. It includes these header files:

    AudioConverter.h: Audio Converter API. Defines the interface used to create and use audio converters.
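
    To make the converter idea concrete, here is a minimal sketch (not from the original text) that creates an AudioConverterRef from AAC to 16-bit interleaved linear PCM; the sample rate and channel count are arbitrary example values.

        import AudioToolbox

        // Source: 44.1 kHz stereo AAC (compressed, so bytes-per-frame fields are 0).
        var srcFormat = AudioStreamBasicDescription(
            mSampleRate: 44_100, mFormatID: kAudioFormatMPEG4AAC, mFormatFlags: 0,
            mBytesPerPacket: 0, mFramesPerPacket: 1024, mBytesPerFrame: 0,
            mChannelsPerFrame: 2, mBitsPerChannel: 0, mReserved: 0)

        // Destination: 16-bit signed interleaved linear PCM.
        var dstFormat = AudioStreamBasicDescription(
            mSampleRate: 44_100, mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
            mBytesPerPacket: 4, mFramesPerPacket: 1, mBytesPerFrame: 4,
            mChannelsPerFrame: 2, mBitsPerChannel: 16, mReserved: 0)

        var converter: AudioConverterRef?
        if AudioConverterNew(&srcFormat, &dstFormat, &converter) == 0 {   // 0 == noErr
            // `converter` is ready; the conversion itself would be driven with
            // AudioConverterFillComplexBuffer and an input-data callback.
        }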

    Audio File Services

    Audio File Services lets you read or write audio data to and from a file or buffer. You use it in conjunction with Audio Queue Services to record or play audio. In iOS and OS X, Audio File Services consists of the functions, data types, and constants declared in the AudioFile.h header file in AudioToolbox.framework.

    AudioFile.h: Defines an interface for reading and writing audio data in files.
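
    For example, a small sketch (added here as an illustration) opens a file with Audio File Services and reads its data format; the function name and fileURL parameter are placeholders.

        import AudioToolbox
        import Foundation

        // Open an audio file and return its AudioStreamBasicDescription.
        func readDataFormat(of fileURL: URL) -> AudioStreamBasicDescription? {
            var fileID: AudioFileID?
            guard AudioFileOpenURL(fileURL as CFURL, .readPermission, 0, &fileID) == 0,   // 0 == noErr
                  let file = fileID else { return nil }
            defer { AudioFileClose(file) }

            var format = AudioStreamBasicDescription()
            var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
            let status = AudioFileGetProperty(file, kAudioFilePropertyDataFormat, &size, &format)
            return status == 0 ? format : nil
        }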

    Audio File Stream Services

    Audio File Stream Services lets you parse audio file streams—that is, audio data for which you don’t necessarily have access to the entire file. You can also use it to parse file data from disk, although Audio File Services is designed for that purpose.

    Audio File Stream Services returns audio data and metadata to your application via callbacks, which you typically then play back using Audio Queue Services. In iOS and OS X, Audio File Stream Services consists of the functions, data types, and constants declared in the AudioFileStream.h header file in AudioToolbox.framework.

    AudioFileStream.h: Defines an interface for parsing audio file streams.
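
    A rough sketch of the parsing flow (an illustration, not Apple's sample code): open a stream parser with two callbacks, then feed it bytes as they arrive, for example from a network download. The empty callback bodies only mark where a real player would react to parsed properties and packets.

        import AudioToolbox

        var streamID: AudioFileStreamID?

        AudioFileStreamOpen(
            nil,                                    // client data (unused here)
            { _, _, propertyID, _ in
                // Property listener: fires when e.g. kAudioFileStreamProperty_DataFormat
                // has been parsed and can be read with AudioFileStreamGetProperty.
            },
            { _, numBytes, numPackets, bytes, packetDescriptions in
                // Packets callback: `bytes` holds `numPackets` parsed audio packets,
                // typically handed to an audio queue for playback.
            },
            kAudioFileMP3Type,                      // file type hint
            &streamID)

        // For each chunk of downloaded data:
        // AudioFileStreamParseBytes(streamID!, byteCount, rawBytesPointer, [])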

    Audio Format Services

    Audio Format Services lets you work with audio data format information. Other services, such as Audio File Services, have functions for this use as well. You use Audio Format Services when all you want to do is obtain audio data format information. In OS X, you can also use this service to get system characteristics such as the available sample rates for encoding. Audio Format Services consists of the functions, data types, and constants declared in the AudioFormat.h header file in AudioToolbox.framework.

    AudioFormat.h: Defines the interface used to assign and read audio format metadata in audio files.
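
    As an illustration (not from the original text), the sketch below asks kAudioFormatProperty_FormatInfo to fill in the blanks of a partially specified AAC format description; the sample rate and channel count are arbitrary example values.

        import AudioToolbox

        var asbd = AudioStreamBasicDescription()
        asbd.mFormatID = kAudioFormatMPEG4AAC
        asbd.mSampleRate = 44_100
        asbd.mChannelsPerFrame = 2

        var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
        if AudioFormatGetProperty(kAudioFormatProperty_FormatInfo,
                                  0, nil,               // no specifier (magic cookie) needed
                                  &size, &asbd) == 0 {  // 0 == noErr
            // Fields such as mFramesPerPacket have now been filled in by the system.
        }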

    Audio Queue Services

    Audio Queue Services lets you play or record audio. It also lets you pause and resume playback, perform looping, and synchronize multiple channels of audio. In iOS and OS X, Audio Queue Services consists of the functions, data types, and constants declared in the AudioQueue.h header file in AudioToolbox.framework.

    AudioQueue.h: Defines an interface for playing and recording audio.
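
    A minimal playback-side sketch (added as an illustration): create an output audio queue for 16-bit stereo linear PCM and start it. A real player would also allocate buffers with AudioQueueAllocateBuffer, fill them in the callback, and enqueue them with AudioQueueEnqueueBuffer.

        import AudioToolbox

        var pcmFormat = AudioStreamBasicDescription(
            mSampleRate: 44_100, mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
            mBytesPerPacket: 4, mFramesPerPacket: 1, mBytesPerFrame: 4,
            mChannelsPerFrame: 2, mBitsPerChannel: 16, mReserved: 0)

        var queue: AudioQueueRef?
        AudioQueueNewOutput(&pcmFormat,
                            { _, aq, buffer in
                                // Output callback: refill `buffer` with audio data and
                                // re-enqueue it, or stop the queue when no data remains.
                            },
                            nil,          // user data
                            nil, nil,     // run loop / mode (nil = queue's own thread)
                            0,
                            &queue)

        if let queue = queue {
            AudioQueueStart(queue, nil)   // nil start time = start as soon as possible
        }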

    AudioServices.h: Defines three interfaces. System Sound Services lets you play short sounds and alerts. Audio Hardware Services provides a lightweight interface for interacting with audio hardware. Audio Session Services lets iPhone and iPod touch applications manage audio sessions.
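
    For example, System Sound Services can register a short sound file and play it; in this sketch (an illustration, not from the original) the function name and soundURL parameter are placeholders for a short bundled sound.

        import AudioToolbox
        import Foundation

        func playShortSound(at soundURL: URL) {
            var soundID: SystemSoundID = 0
            AudioServicesCreateSystemSoundID(soundURL as CFURL, &soundID)
            AudioServicesPlaySystemSound(soundID)
            // When the sound is no longer needed:
            // AudioServicesDisposeSystemSoundID(soundID)
        }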

    Audio Processing Graph Services

    Audio Processing Graph Services lets you create and manipulate audio processing graphs in your application. In iOS and in OS X, it consists of the functions, data types, and constants declared in the AUGraph.h header file in AudioToolbox.framework.

    AudioToolbox.h: Top-level include file for the Audio Toolbox framework.

    AUGraph.h: Defines the interface used to create and use audio processing graphs.
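
    A small sketch of Audio Processing Graph Services (an illustration only): build a graph containing a single Remote I/O output node and start it. A real graph would add further nodes (mixers, effects) and connect them with AUGraphConnectNodeInput; the function name below is a placeholder.

        import AudioToolbox

        func startMinimalGraph() {
            var maybeGraph: AUGraph?
            NewAUGraph(&maybeGraph)
            guard let graph = maybeGraph else { return }

            // Description of the iOS output unit (OS X would use the default output subtype).
            var outputDesc = AudioComponentDescription(
                componentType: kAudioUnitType_Output,
                componentSubType: kAudioUnitSubType_RemoteIO,
                componentManufacturer: kAudioUnitManufacturer_Apple,
                componentFlags: 0, componentFlagsMask: 0)

            var outputNode = AUNode()
            AUGraphAddNode(graph, &outputDesc, &outputNode)

            AUGraphOpen(graph)          // instantiate the audio units behind the nodes
            AUGraphInitialize(graph)    // prepare them for rendering
            AUGraphStart(graph)         // begin pulling audio through the graph
        }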

    ExtendedAudioFile.h: Defines the interface used to translate audio data from files directly into linear PCM, and vice versa.
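
    As an illustration of Extended Audio File Services (not from the original text), the sketch below opens a file and sets a linear PCM client format so that subsequent reads are decoded transparently; the function name and the 32-bit float stereo format are example choices.

        import AudioToolbox
        import Foundation

        func openAsFloatPCM(_ fileURL: URL) -> ExtAudioFileRef? {
            var extFile: ExtAudioFileRef?
            guard ExtAudioFileOpenURL(fileURL as CFURL, &extFile) == 0,   // 0 == noErr
                  let file = extFile else { return nil }

            // 32-bit float, stereo, interleaved linear PCM (8 bytes per frame).
            var clientFormat = AudioStreamBasicDescription(
                mSampleRate: 44_100, mFormatID: kAudioFormatLinearPCM,
                mFormatFlags: kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked,
                mBytesPerPacket: 8, mFramesPerPacket: 1, mBytesPerFrame: 8,
                mChannelsPerFrame: 2, mBitsPerChannel: 32, mReserved: 0)

            // Every subsequent ExtAudioFileRead on `file` now delivers this format.
            ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                                    UInt32(MemoryLayout.size(ofValue: clientFormat)),
                                    &clientFormat)
            return file
        }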

    AVFoundation.framework

    The AV Foundation framework provides an Objective-C interface for playing back audio with the control needed by most applications. The AV Foundation framework in iOS includes one header file:

    AVAudioPlayer.h: Defines an interface for playing audio from a file or from memory.
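
    A minimal AVAudioPlayer sketch (an illustration only): the resource name "song.mp3" is a placeholder, and the player is kept in a stored property so it is not deallocated in the middle of playback.

        import AVFoundation

        var player: AVAudioPlayer?   // keep a strong reference while playing

        func playBundledSong() {
            guard let url = Bundle.main.url(forResource: "song", withExtension: "mp3") else { return }
            player = try? AVAudioPlayer(contentsOf: url)
            player?.numberOfLoops = 0    // play once; -1 would loop indefinitely
            player?.prepareToPlay()
            player?.play()
        }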

    OpenAL.framework

    The OpenAL framework provides an implementation of the OpenAL specification. This framework includes these two header files: al.h and alc.h.

    In iOS you have these additional header files:

    oalMacOSX_OALExtensions.h    oalStaticBufferExtension.h
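
    A rough sketch of the OpenAL C API as used from Swift (the Swift import details here are an assumption, not taken from the article): open the default device, create a context, and generate one source that could later be bound to a buffer of PCM data and positioned in 3-D space.

        import OpenAL

        let device = alcOpenDevice(nil)                 // default output device
        let context = alcCreateContext(device, nil)     // rendering context
        alcMakeContextCurrent(context)

        var source: ALuint = 0
        alGenSources(1, &source)                        // one positional sound source
        alSource3f(source, AL_POSITION, 1.0, 0.0, 0.0)  // place it to the listener's right
        // Next steps: alGenBuffers, alBufferData, alSourcei(source, AL_BUFFER, ...),
        // and alSourcePlay(source) to start playback.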

    Core Audio Frameworks

    You get another view of Core Audio by considering its API frameworks, located in /System/Library/Frameworks/. This section quickly lists them to give you a sense of where to find the pieces that make up the Core Audio layers.

    Take note that the Core Audio framework is not an umbrella to the other frameworks here, but rather a peer.

    The Audio Toolbox framework (AudioToolbox.framework) provides interfaces for the mid- and high-level services in Core Audio. In iOS, this framework includes Audio Session Services, the interface for managing your application’s audio behavior in the context of a device that functions as a mobile phone and iPod.

    The Audio Unit framework (AudioUnit.framework) lets applications work with audio plug-ins, including audio units and codecs.

    The AV Foundation framework (AVFoundation.framework), available in iOS, provides the AVAudioPlayer class, a streamlined and simple Objective-C interface for audio playback.

    The Core Audio framework (CoreAudio.framework) supplies data types used across Core Audio as well as interfaces for the low-level services.

    The Core Audio Kit framework (CoreAudioKit.framework) provides a small API for creating user interfaces for audio units. This framework is not available in iOS.

    The Core MIDI framework (CoreMIDI.framework) lets applications work with MIDI data and configure MIDI networks. This framework is not available in iOS.

    The Core MIDI Server framework (CoreMIDIServer.framework) lets MIDI drivers communicate with the OS X MIDI server. This framework is not available in iOS.

    The OpenAL framework (OpenAL.framework) provides the interfaces to work with OpenAL, an open source, positional audio technology.

    The appendix Core Audio Frameworks describes all these frameworks, as well as each of their included header files.
