1.2-ARAnchor
1.3-ARCamera
1.4-ARError
1.5-ARFrame
1.6-ARHitTestResult
1.7-ARLightEstimate
1.8-ARPlaneAnchor
1.9-ARPointCloud
1.10-ARSCNView
1.11-ARSession
1.12-ARSessionConfiguration
1.13-ARSKView
Looking at the source repository, the annotations so far are all for the Objective-C code; the Swift code has not yet been fully annotated.
1.5-ARFrame
ARFrame captures the camera's state at a given moment in time. This state includes not only the camera's position, but also the captured image frame, the timestamp, and other parameters.
/**
An object encapsulating the state of everything being tracked for a given moment in time.
@discussion The model provides a snapshot of all data needed to render a given frame.
*/
@available(iOS 11.0, *)
open class ARFrame : NSObject, NSCopying {
/**Timestamp.
A timestamp identifying the frame.
*/
open var timestamp: TimeInterval { get }
/**The captured image frame, as a pixel buffer.
The frame's captured image.
*/
open var capturedImage: CVPixelBuffer { get }
/**The camera (indicates which camera this ARFrame came from; the iPhone 7 Plus has two rear cameras).
The camera used to capture the frame's image.
@discussion The camera provides the device's position and orientation as well as camera parameters.
*/
@NSCopying open var camera: ARCamera { get }
/**The anchors currently tracked by the camera (when a 3D virtual model is added to ARKit, its anchor is the model's position in the AR world).
A list of anchors in the scene.
*/
open var anchors: [ARAnchor] { get }
/**The light estimate; see the ARLightEstimate section of this chapter for details (light intensity generally ranges from 0 to 2000, with a system default of 1000).
A light estimate representing the light in the scene.
@discussion Returns nil if there is no light estimation.
*/
@NSCopying open var lightEstimate: ARLightEstimate? { get }
/**Feature points (presumably used for detecting flat surfaces or faces; after all, Apple has built-in face recognition).
Feature points in the scene with respect to the frame's origin.
@discussion The feature points are only provided for configurations using world tracking.
*/
open var rawFeaturePoints: ARPointCloud? { get }
/**Searches for 3D objects from a 2D point. This method is typically used when the user taps a point on the screen: it finds the 3D content located at that point. Why does it return an array? The phone screen is a 2D rectangle, but the camera captures a frustum projected out from it; tapping a point on the screen effectively casts a ray into the scene, and that ray may intersect several 3D objects.
point: a 2D point (a location on the phone screen)
ARHitTestResult.ResultType: the type of result to search for (point or plane)
[ARHitTestResult]: the array of hit-test results; see the ARHitTestResult section of this chapter
Searches the frame for objects corresponding to a point in the captured image.
@discussion A 2D point in the captured image's coordinate space can refer to any point along a line segment
in the 3D coordinate space. Hit-testing is the process of finding objects in the world located along this line segment.
@param point A point in the image-space coordinate system of the captured image.
Values should range from (0,0) - upper left corner to (1,1) - lower right corner.
@param types The types of results to search for.
@return An array of all hit-test results sorted from nearest to farthest.
*/
open func hitTest(_ point: CGPoint, types: ARHitTestResult.ResultType) -> [ARHitTestResult]
/**The display transform for the camera viewport (useful for adapting the camera image to portrait/landscape rotation).
Returns a display transform for the provided viewport size and orientation.
@discussion The display transform can be used to convert normalized points in the image-space coordinate system
of the captured image to normalized points in the view's coordinate space. The transform provides an aspect-fill
and the correct rotation for presenting the captured image in the given size and orientation.
@param viewportSize The size of the viewport.
@param orientation The orientation of the viewport.
*/
open func displayTransform(withViewportSize viewportSize: CGSize, orientation: UIInterfaceOrientation) -> CGAffineTransform
}
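A minimal sketch of how hitTest(_:types:) might be used in practice; the helper name anchorAtImageCenter and the session passed in are assumptions for illustration, not part of the API:

```swift
import ARKit

// Hit-test the center of the captured image against detected planes,
// using the current frame of an already-running session.
func anchorAtImageCenter(in session: ARSession) -> ARHitTestResult? {
    guard let frame = session.currentFrame else { return nil }
    // Image-space coordinates are normalized: (0,0) is the upper-left
    // corner, (1,1) the lower-right corner.
    let center = CGPoint(x: 0.5, y: 0.5)
    let results = frame.hitTest(center,
                                types: [.existingPlane, .estimatedHorizontalPlane])
    // Results are sorted nearest to farthest; take the closest hit.
    return results.first
}
```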
1.12-ARSessionConfiguration
extension ARSessionConfiguration {
/**The session's world alignment; alignment here refers to the 3D world coordinate system. The enum cases are listed below.
Enum constants for indicating the world alignment.
*/
@available(iOS 11.0, *)
public enum WorldAlignment : Int {
/** Aligns the world with gravity that is defined by vector (0, -1, 0).
The coordinate system's y-axis is parallel to gravity, and its origin is the device's initial position.*/
case gravity
/** Aligns the world with gravity that is defined by the vector (0, -1, 0)
and heading (w.r.t. True North) that is given by the vector (0, 0, -1).
The coordinate system's y-axis is parallel to gravity, its x- and z-axes are oriented by compass heading, and its origin is the device's initial position.*/
case gravityAndHeading
/** Aligns the world with the camera's orientation.
The scene coordinate system is locked to match the camera's orientation.
*/
case camera
}
}
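A short sketch of choosing a world alignment before running a session, using the class names as they appear in this header dump:

```swift
import ARKit

// .gravity (the default) aligns -y with gravity; .gravityAndHeading
// additionally aligns -z with true north; .camera locks the coordinate
// system to the camera's orientation.
let configuration = ARWorldTrackingSessionConfiguration()
configuration.worldAlignment = .gravityAndHeading
```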
//World-tracking session configuration. Apple recommends using this class; the subclass adds only one property, which lets us track flat surfaces captured by the camera.
extension ARWorldTrackingSessionConfiguration {
/**
Option set indicating the type of planes to detect.
*/
@available(iOS 11.0, *)
public struct PlaneDetection : OptionSet {
public init(rawValue: UInt)
/** Plane detection determines horizontal planes in the scene. */
public static var horizontal: ARWorldTrackingSessionConfiguration.PlaneDetection { get }
}
}
/**An object that describes and configures the augmented-reality techniques used in an ARSession.
An object to describe and configure the Augmented Reality techniques to be used in an ARSession.
*/
@available(iOS 11.0, *)
open class ARSessionConfiguration : NSObject, NSCopying {
/**Whether the current device is supported; devices with chips older than the A9 generally are not.
Determines whether this device supports the ARSessionConfiguration.
*/
open class var isSupported: Bool { get }
/**Options for how ARKit builds the scene coordinate system from real-world device motion.
Determines how the session's coordinate system should be aligned with the world.
@discussion The default is ARWorldAlignmentGravity.
*/
open var worldAlignment: ARSessionConfiguration.WorldAlignment
/**Whether to enable adaptive light estimation; defaults to true.
Enable or disable light estimation.
@discussion Enabled by default.
*/
open var isLightEstimationEnabled: Bool
}
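A sketch of guarding on isSupported before starting a session. The session variable and the fallback path are assumptions for illustration:

```swift
import ARKit

// World tracking requires an A9 chip or newer; check before running.
if ARWorldTrackingSessionConfiguration.isSupported {
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.isLightEstimationEnabled = true  // already the default
    // session.run(configuration)  // `session` would be your ARSession
} else {
    // Fall back to a non-AR experience, or to the base
    // ARSessionConfiguration (orientation-only tracking) on older devices.
}
```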
/**
Every AR configuration establishes a correspondence between the real world the device inhabits and a virtual 3D coordinate space in which you can model content. When your app displays that content alongside the live camera image, the user experiences your virtual content as part of the real world.
Creating and maintaining this correspondence between spaces requires tracking the device's motion. The ARWorldTrackingSessionConfiguration class tracks the device's movement with six degrees of freedom (6DOF): the three rotation axes (roll, pitch, and yaw) and the three translation axes (movement along x, y, and z).
This level of tracking enables immersive AR experiences: a virtual object appears to stay in the same place relative to the real world, even as the user tilts the device to look above or below it, or moves the device around to see the object's sides and back.
A session configuration for world tracking.
@discussion World tracking provides 6 degrees of freedom tracking of the device.
By finding feature points in the scene, world tracking enables performing hit-tests against the frame.
Tracking can no longer be resumed once the session is paused.
*/
@available(iOS 11.0, *)
open class ARWorldTrackingSessionConfiguration : ARSessionConfiguration {
/**The plane-detection type; defaults to no detection (ARPlaneDetectionNone = 0).
Type of planes to detect in the scene.
@discussion If set, new planes will continue to be detected and updated over time. Detected planes will be added to the session as
ARPlaneAnchor objects. In the event that two planes are merged, the newer plane will be removed. Defaults to ARPlaneDetectionNone.
*/
//Plane detection
open var planeDetection: ARWorldTrackingSessionConfiguration.PlaneDetection
}
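Putting the pieces together, a sketch of a typical world-tracking setup with plane detection enabled. The sceneView outlet is an assumption (e.g. an ARSCNView created in a storyboard):

```swift
import ARKit

// Enable horizontal plane detection and run the session.
let configuration = ARWorldTrackingSessionConfiguration()
configuration.planeDetection = .horizontal  // default is no detection
sceneView.session.run(configuration)
// Detected planes are then delivered as ARPlaneAnchor objects through
// the session delegate's session(_:didAdd:) callback.
```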