Creating ARFrame
ARFrame is the frame data type consumed by the PoseTracker. Passing an ARFrame to the PoseTracker every frame is what drives VLSDK's logic. In ARFoundationPoseTracker, which is built on ARFoundation, the ARFrame is created as follows:
protected ARFrame CreateARFrame(ARCameraFrameEventArgs eventArgs)
{
    // Create an ARFrame instance.
    ARFrame frame = new ARFrame();

    // Camera preview.
    frame.texture = GetCameraPreviewTexture();

    // Camera model matrix.
    frame.localPosition = m_ARCamera.transform.localPosition;
    frame.localRotation = m_ARCamera.transform.localRotation;

    // Camera intrinsics.
    AquireCameraIntrinsic(out float fx, out float fy, out float cx, out float cy);
    frame.intrinsic = new ARIntrinsic(fx, fy, cx, cy);

    // Projection matrix.
    frame.projMatrix = eventArgs.projectionMatrix ?? Camera.main.projectionMatrix;

    // Display matrix.
    frame.displayMatrix = eventArgs.displayMatrix ?? Matrix4x4.identity;
    frame.displayMatrix = MakeDisplayMatrix(frame.displayMatrix);

    return frame;
}
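For context, here is a minimal sketch of how such a method might be driven from ARFoundation's frame loop. ARCameraManager.frameReceived is standard ARFoundation API; the m_PoseTracker field and its UpdateFrame method are hypothetical stand-ins for however your PoseTracker instance consumes frames.
// Sketch: feed every camera frame to the PoseTracker.
void OnEnable()
{
    // m_CameraManager: the scene's ARCameraManager.
    m_CameraManager.frameReceived += OnCameraFrameReceived;
}

void OnDisable()
{
    m_CameraManager.frameReceived -= OnCameraFrameReceived;
}

void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
{
    ARFrame frame = CreateARFrame(eventArgs);
    m_PoseTracker.UpdateFrame(frame); // hypothetical entry point
}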
How to Assign Data
After creating an ARFrame, you need to manually assign the frame data. Below are the methods for assigning data to each field.
texture
This field stores the camera image. You can create an ARFrame using either the Texture type or the UnityYuvCpuImage? type and run the loop. When using the Texture type, you can assign a Texture2D or a RenderTexture to the texture field.
ARFrame frame = new ARFrame();
frame.texture = GetCameraTexture();
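GetCameraTexture above is a placeholder. With ARFoundation, one common way to obtain the preview is to blit the camera background material into a RenderTexture; a rough sketch, assuming an m_CameraBackground (ARCameraBackground) field and a cached m_PreviewTexture field:
// Sketch: copy the AR camera background into a reusable RenderTexture.
Texture GetCameraTexture()
{
    if (m_PreviewTexture == null)
        m_PreviewTexture = new RenderTexture(Screen.width, Screen.height, 0);

    // ARCameraBackground.material samples the current camera image.
    Graphics.Blit(null, m_PreviewTexture, m_CameraBackground.material);
    return m_PreviewTexture;
}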
When using YUV images, do not assign any value to the texture field. Instead, assign values to the yuvBuffer and disposable fields: a UnityYuvCpuImage? object to yuvBuffer, and a UnityAction that releases the image memory to disposable. The YUV image feature supports only the AndroidYuv420_888 and IosYpCbCr420_8BiPlanarFullRange formats.
ARFrame frame = new ARFrame();

// Note: reading the plane pointers via GetUnsafePtr() requires an unsafe
// context and Unity.Collections.LowLevel.Unsafe.
UnityYuvCpuImage yuvBuffer = new UnityYuvCpuImage();

if (IsAndroidYuv420_888())
{
    // image: raw image containing YUV data
    var yPlane = image.GetPlane(0);
    var uPlane = image.GetPlane(1);
    var vPlane = image.GetPlane(2);

    // Set image size.
    yuvBuffer.width = image.width;
    yuvBuffer.height = image.height;
    yuvBuffer.format = image.format;
    yuvBuffer.numberOfPlanes = image.planeCount;

    // y plane
    yuvBuffer.yPixels = new IntPtr(yPlane.data.GetUnsafePtr());
    yuvBuffer.yLength = yPlane.data.Length;
    yuvBuffer.yRowStride = yPlane.rowStride;
    yuvBuffer.yPixelStride = yPlane.pixelStride;

    // u plane
    yuvBuffer.uPixels = new IntPtr(uPlane.data.GetUnsafePtr());
    yuvBuffer.uLength = uPlane.data.Length;
    yuvBuffer.uRowStride = uPlane.rowStride;
    yuvBuffer.uPixelStride = uPlane.pixelStride;

    // v plane
    yuvBuffer.vPixels = new IntPtr(vPlane.data.GetUnsafePtr());
    yuvBuffer.vLength = vPlane.data.Length;
    yuvBuffer.vRowStride = vPlane.rowStride;
    yuvBuffer.vPixelStride = vPlane.pixelStride;

    yuvBuffer.rotationMode = YuvRotationMode.YUV_ROTATION_90;
    frame.yuvBuffer = yuvBuffer;

    // Release the image memory once VLSDK has consumed the frame.
    frame.disposable = () =>
    {
        image.Dispose();
    };
}
else if (IsIosYpCbCr420_8BiPlanarFullRange())
{
    // image: raw image containing YUV data
    var yPlane = image.GetPlane(0);
    var uvPlane = image.GetPlane(1);

    // Set image size.
    yuvBuffer.width = image.width;
    yuvBuffer.height = image.height;
    yuvBuffer.format = image.format;
    yuvBuffer.numberOfPlanes = image.planeCount;

    // y plane
    yuvBuffer.yPixels = new IntPtr(yPlane.data.GetUnsafePtr());
    yuvBuffer.yLength = yPlane.data.Length;
    yuvBuffer.yRowStride = yPlane.rowStride;
    yuvBuffer.yPixelStride = yPlane.pixelStride;

    // uv plane (interleaved CbCr)
    yuvBuffer.uPixels = new IntPtr(uvPlane.data.GetUnsafePtr());
    yuvBuffer.uLength = uvPlane.data.Length;
    yuvBuffer.uRowStride = uvPlane.rowStride;
    yuvBuffer.uPixelStride = uvPlane.pixelStride;

    yuvBuffer.rotationMode = YuvRotationMode.YUV_ROTATION_90;
    frame.yuvBuffer = yuvBuffer;

    // Release the image memory once VLSDK has consumed the frame.
    frame.disposable = () =>
    {
        image.Dispose();
    };
}
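In the snippet above, image is assumed to have been acquired already. With ARFoundation it would typically be an XRCpuImage obtained from ARCameraManager, kept alive until the disposable action runs:
// Sketch: acquire the latest CPU image (ARFoundation).
if (m_CameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
{
    // Fill yuvBuffer from image.GetPlane(...) as shown above.
    // image.Dispose() must only run inside frame.disposable, after
    // VLSDK has finished reading the plane pointers.
}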
localPosition, localRotation
These fields store the pose of the device. Input the device's VIO pose information. If the AR framework automatically updates the transform of Camera.main, assign the localPosition and localRotation of m_ARCamera.
ARFrame frame = new ARFrame();
frame.localPosition = m_ARCamera.transform.localPosition;
frame.localRotation = m_ARCamera.transform.localRotation;
m_ARCamera is the same as Camera.main.
Make sure to assign localPosition and localRotation. If you assign position and rotation instead, VLSDK will not function properly.
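The distinction matters because the AR camera typically sits under a session-origin rig (for example XROrigin): localPosition and localRotation express the VIO pose in session space, while position and rotation are world-space values that also include the rig's own transform.
// Correct: session-space VIO pose.
frame.localPosition = m_ARCamera.transform.localPosition;
frame.localRotation = m_ARCamera.transform.localRotation;

// Wrong: world-space pose, which also bakes in the rig's transform.
// frame.localPosition = m_ARCamera.transform.position;
// frame.localRotation = m_ARCamera.transform.rotation;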
intrinsic
This field stores the intrinsic parameters of the device camera. Create an ARIntrinsic instance and assign it. ARIntrinsic takes fx, fy, cx, and cy as constructor parameters.
ARFrame frame = new ARFrame();
float fx = 900.0f;
float fy = 900.0f;
float cx = 640.0f;
float cy = 320.0f;
frame.intrinsic = new ARIntrinsic(fx, fy, cx, cy);
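The values above are illustrative. On device, the intrinsics can be read from the AR framework instead of being hard-coded; for example, ARFoundation exposes them through ARCameraManager.TryGetIntrinsics:
// Sketch: query real camera intrinsics from ARFoundation.
if (m_CameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
{
    frame.intrinsic = new ARIntrinsic(
        intrinsics.focalLength.x,     // fx
        intrinsics.focalLength.y,     // fy
        intrinsics.principalPoint.x,  // cx
        intrinsics.principalPoint.y); // cy
}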
projMatrix
This is a Matrix4x4 type projection matrix used for rendering 3D objects.
ARFrame frame = new ARFrame();
frame.projMatrix = eventArgs.projectionMatrix ?? Camera.main.projectionMatrix;
displayMatrix
This is a Matrix4x4 type matrix used for rendering the preview. The original image from the device is always delivered in the same orientation regardless of the device's orientation. The displayMatrix is used to rotate the preview to match the device's orientation.
ARFrame frame = new ARFrame();
frame.displayMatrix = eventArgs.displayMatrix ?? Matrix4x4.identity;
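If you render the preview yourself, the display matrix is typically applied to the preview texture's UV coordinates in the background shader. A hedged sketch of forwarding it to a material; m_PreviewMaterial and the _DisplayTransform property name are assumptions (ARFoundation's own background shaders use a similar _UnityDisplayTransform parameter):
// Sketch: pass the display matrix to the preview material so its shader
// can rotate the camera image to match the device orientation.
m_PreviewMaterial.SetTexture("_MainTex", frame.texture);
m_PreviewMaterial.SetMatrix("_DisplayTransform", frame.displayMatrix);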