# Dual-Channel Preview (ArkTS)

Before developing a camera application, request the required permissions by following the instructions in [Camera Development Preparations](camera-preparation.md).

Dual-channel preview means that an application uses two preview streams at the same time. One preview stream is displayed on the screen, and the other is used for operations such as image processing, which improves processing efficiency.

A camera application controls the camera device to implement basic operations such as image display (preview), photo saving (photo capture), and video recording. The camera framework works on the surface model, meaning that an application transfers data through surfaces. Specifically, it obtains the photo stream data through the surface of an ImageReceiver object and the preview stream data through the surface of an **XComponent**.

To implement dual-channel preview (that is, two preview streams instead of one preview stream plus one photo stream), create a **PreviewOutput** object through the surface of an ImageReceiver object. The other steps are the same as those for the preview stream and photo stream.

Read [Camera](../../reference/apis-camera-kit/js-apis-camera.md) for the API reference.

## Constraints

- Currently, streams cannot be added dynamically. In other words, you cannot call [addOutput](../../reference/apis-camera-kit/js-apis-camera.md#addoutput11) to add streams without calling [session.stop](../../reference/apis-camera-kit/js-apis-camera.md#stop11) first. To change the stream configuration at run time, stop and reconfigure the session, as shown in the sketch after this list.
- After an ImageReceiver object finishes processing the image data it has obtained, it must release the image buffer so that the buffer queue of the surface rotates properly.
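
If you need to change the stream configuration after the session has started, the following minimal sketch shows one way to work within this constraint. It assumes that the session and the preview outputs have already been created as described later in this topic; `previewOutput1` and `newPreviewOutput` are placeholder names.

```ts
import { camera } from '@kit.CameraKit';

async function reconfigureOutputs(session: camera.Session, previewOutput1: camera.PreviewOutput,
  newPreviewOutput: camera.PreviewOutput): Promise<void> {
  // Stop the session before changing its outputs.
  await session.stop();
  // Re-enter the configuration state.
  session.beginConfig();
  // Remove the output that is no longer needed and add the new one.
  session.removeOutput(previewOutput1);
  session.addOutput(newPreviewOutput);
  // Commit the new configuration and restart the session.
  await session.commitConfig();
  await session.start();
}
```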

## API Calling Process

The figure below shows the recommended API calling process of the dual-channel preview solution.

![dual-preview-streams-instructions](figures/dual-preview-streams-instructions.png)

## How to Develop

- For the first preview stream, which is used for image processing, create an ImageReceiver object, obtain its surface ID to create the first preview stream, register an image listener, and process each frame of image data in the preview stream as required.
- For the second preview stream, which is used for image display, create an **XComponent**, obtain its surface ID to create the second preview stream, and render the preview stream data in the component.
- To enable both preview streams to obtain data, configure a camera session with both preview outputs, and start the session.

### First Preview Stream Used for Image Processing

1. Obtain the surface ID for the first preview stream. Specifically, create an ImageReceiver object and obtain the surface ID through the object.

    ```ts
    import { image } from '@kit.ImageKit';

    let imageWidth: number = 1920; // Use the width in the profile size supported by the device.
    let imageHeight: number = 1080; // Use the height in the profile size supported by the device.

    async function initImageReceiver(): Promise<void> {
      // Create an ImageReceiver object.
      let size: image.Size = { width: imageWidth, height: imageHeight };
      let imageReceiver = image.createImageReceiver(size, image.ImageFormat.JPEG, 8);
      // Obtain the surface ID for the first preview stream.
      let imageReceiverSurfaceId = await imageReceiver.getReceivingSurfaceId();
      console.info(`initImageReceiver imageReceiverSurfaceId:${imageReceiverSurfaceId}`);
    }
    ```

2. Register a listener to process each frame of image data in the preview stream. Specifically, use the **imageArrival** event of the ImageReceiver object to obtain the image data returned by the bottom layer. For details, see [Image API Reference](../../reference/apis-image-kit/js-apis-image.md).

    ```ts
    import { BusinessError } from '@kit.BasicServicesKit';
    import { image } from '@kit.ImageKit';

    function onImageArrival(receiver: image.ImageReceiver): void {
      // Subscribe to the imageArrival event.
      receiver.on('imageArrival', () => {
        // Obtain an image.
        receiver.readNextImage((err: BusinessError, nextImage: image.Image) => {
          if (err || nextImage === undefined) {
            console.error('readNextImage failed');
            return;
          }
          // Parse the image.
          nextImage.getComponent(image.ComponentType.JPEG, async (err: BusinessError, imgComponent: image.Component) => {
            if (err || imgComponent === undefined) {
              console.error('getComponent failed');
              // Release the image to keep the buffer queue of the surface rotating.
              nextImage.release();
              return;
            }
            if (imgComponent.byteBuffer) {
              // For details, see the description of parsing the image buffer data below. This example uses method 1.
              let width = nextImage.size.width; // Obtain the image width.
              let height = nextImage.size.height; // Obtain the image height.
              let stride = imgComponent.rowStride; // Obtain the image stride.
              console.debug(`getComponent with width:${width} height:${height} stride:${stride}`);
              // The value of stride is the same as that of width.
              if (stride == width) {
                let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
                  size: { height: height, width: width },
                  srcPixelFormat: 8,
                });
              } else {
                // The value of stride is different from that of width.
                const dstBufferSize = width * height * 1.5;
                const dstArr = new Uint8Array(dstBufferSize);
                for (let j = 0; j < height * 1.5; j++) {
                  const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
                  dstArr.set(srcBuf, j * width);
                }
                let pixelMap = await image.createPixelMap(dstArr.buffer, {
                  size: { height: height, width: width },
                  srcPixelFormat: 8,
                });
              }
            } else {
              console.error('byteBuffer is null');
            }
            // Release the resource when the buffer is not in use.
            // If an asynchronous operation is performed on the buffer, call nextImage.release() to release the resource after the asynchronous operation is complete.
            nextImage.release();
          });
        });
      });
    }
    ```

    The following methods are available for parsing the image buffer data by using [image.Component](../../reference/apis-image-kit/js-apis-image.md#component9).

    > **NOTE**
    > Check whether the width of the image is the same as **rowStride**. If they are different, process the buffer by using one of the following methods:

    Method 1: Remove the stride data from **imgComponent.byteBuffer**, obtain a new buffer by copying the data line by line, and pass the buffer to APIs that do not support stride.

    ```ts
    // For example, for NV21 (images in YUV_420_SP format), the formula for calculating the YUV_420_SP memory is as follows: YUV_420_SP memory = Width * Height + (Width * Height)/2.
    const dstBufferSize = width * height * 1.5;
    const dstArr = new Uint8Array(dstBufferSize);
    // Read the buffer data line by line.
    for (let j = 0; j < height * 1.5; j++) {
      // Copy the first width bytes of each line of data in imgComponent.byteBuffer to dstArr.
      const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
      dstArr.set(srcBuf, j * width);
    }
    let pixelMap = await image.createPixelMap(dstArr.buffer, {
      size: { height: height, width: width }, srcPixelFormat: 8
    });
    ```

    Method 2: Create a PixelMap based on the value of stride * height, and call **cropSync** of the PixelMap to crop redundant pixels.

    ```ts
    // Create a PixelMap, with width set to the value of stride.
    let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
      size: { height: height, width: stride }, srcPixelFormat: 8
    });
    // Crop extra pixels.
    pixelMap.cropSync({ size: { width: width, height: height }, x: 0, y: 0 });
    ```

    Method 3: Pass **imgComponent.byteBuffer** and **stride** to an API that supports stride, as shown in the sketch below.

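    The following is a minimal sketch of method 3. It assumes a hypothetical, application-defined function **processNv21Frame** that accepts a stride; it is not an Image Kit API and only illustrates forwarding the raw buffer together with its stride instead of copying it.

    ```ts
    // Hypothetical stride-aware consumer defined by the application; not an Image Kit API.
    function processNv21Frame(buffer: ArrayBuffer, width: number, height: number, stride: number): void {
      const data = new Uint8Array(buffer);
      // Walk the Y plane line by line; only the first "width" bytes of each "stride"-long row are valid pixels.
      for (let row = 0; row < height; row++) {
        const line = data.subarray(row * stride, row * stride + width);
        // Process the line here, for example, accumulate luminance statistics.
      }
    }

    // Pass the original buffer and its stride directly, without copying.
    processNv21Frame(imgComponent.byteBuffer, width, height, stride);
    ```
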
### Second Preview Stream Used for Image Display

To obtain the surface ID of the second preview stream, you must first create an **XComponent** for displaying the preview stream. For details about how to obtain the surface ID, see [getXComponentSurfaceId](../../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md#getxcomponentsurfaceid9). The **XComponent** capability is provided by ArkUI. For details, see [XComponent](../../reference/apis-arkui/arkui-ts/ts-basic-components-xcomponent.md).

```ts
@Component
struct example {
  xComponentCtl: XComponentController = new XComponentController();
  surfaceId: string = '';
  imageWidth: number = 1920;
  imageHeight: number = 1080;

  build() {
    XComponent({
      id: 'componentId',
      type: 'surface',
      controller: this.xComponentCtl
    })
      .onLoad(async () => {
        console.info('onLoad is called');
        this.surfaceId = this.xComponentCtl.getXComponentSurfaceId(); // Obtain the surface ID of the component.
        // Use the surface ID to create a preview stream and start the camera. The component renders the preview stream data of each frame in real time.
      })
      .width(px2vp(this.imageHeight))
      .height(px2vp(this.imageWidth))
  }
}
```


### Enabling Both Preview Streams to Obtain Data

Create two preview outputs with the two surface IDs, add both outputs to a camera session, and start the session to obtain the preview stream data. The session configuration and startup steps are sketched after the code below.

```ts
import { camera } from '@kit.CameraKit';

function createDualPreviewOutput(cameraManager: camera.CameraManager, previewProfile: camera.Profile,
  session: camera.Session, imageReceiverSurfaceId: string, xComponentSurfaceId: string): void {
  // Create the first preview output by using imageReceiverSurfaceId.
  let previewOutput1 = cameraManager.createPreviewOutput(previewProfile, imageReceiverSurfaceId);
  if (!previewOutput1) {
    console.error('createPreviewOutput1 error');
  }
  // Create the second preview output by using xComponentSurfaceId.
  let previewOutput2 = cameraManager.createPreviewOutput(previewProfile, xComponentSurfaceId);
  if (!previewOutput2) {
    console.error('createPreviewOutput2 error');
  }
  // Add the output of the first preview stream.
  session.addOutput(previewOutput1);
  // Add the output of the second preview stream.
  session.addOutput(previewOutput2);
}
```
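
The two preview outputs deliver data only after the session configuration is committed and the session is started. The following is a minimal sketch of that step, assuming that the camera input and the video session have already been created as shown in the complete sample below.

```ts
import { camera } from '@kit.CameraKit';

async function startDualPreviewSession(session: camera.VideoSession, cameraInput: camera.CameraInput,
  previewOutput1: camera.PreviewOutput, previewOutput2: camera.PreviewOutput): Promise<void> {
  // Start configuration for the session.
  session.beginConfig();
  // Add the camera input and both preview outputs.
  session.addInput(cameraInput);
  session.addOutput(previewOutput1);
  session.addOutput(previewOutput2);
  // Commit the configuration and start the session. Both surfaces then start to receive preview data.
  await session.commitConfig();
  await session.start();
}
```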


## Sample

```ts
import { camera } from '@kit.CameraKit';
import { image } from '@kit.ImageKit';
import { BusinessError } from '@kit.BasicServicesKit';

@Entry
@Component
struct Index {
  private imageReceiver: image.ImageReceiver | undefined = undefined;
  private imageReceiverSurfaceId: string = '';
  private xComponentCtl: XComponentController = new XComponentController();
  private xComponentSurfaceId: string = '';
  @State imageWidth: number = 1920;
  @State imageHeight: number = 1080;
  private cameraManager: camera.CameraManager | undefined = undefined;
  private cameras: Array<camera.CameraDevice> = [];
  private cameraInput: camera.CameraInput | undefined = undefined;
  private previewOutput1: camera.PreviewOutput | undefined = undefined;
  private previewOutput2: camera.PreviewOutput | undefined = undefined;
  private session: camera.VideoSession | undefined = undefined;

  onPageShow(): void {
    console.info('onPageShow');
    this.initImageReceiver();
    if (this.xComponentSurfaceId !== '') {
      this.initCamera();
    }
  }

  onPageHide(): void {
    console.info('onPageHide');
    this.releaseCamera();
  }

  /**
   * Create the ImageReceiver object and obtain its surface ID.
   */
  async initImageReceiver(): Promise<void> {
    if (!this.imageReceiver) {
      // Create an ImageReceiver object.
      let size: image.Size = { width: this.imageWidth, height: this.imageHeight };
      this.imageReceiver = image.createImageReceiver(size, image.ImageFormat.JPEG, 8);
      // Obtain the surface ID for the first preview stream.
      this.imageReceiverSurfaceId = await this.imageReceiver.getReceivingSurfaceId();
      console.info(`initImageReceiver imageReceiverSurfaceId:${this.imageReceiverSurfaceId}`);
      // Register a listener to listen for and process the image data of each frame in the preview stream.
      this.onImageArrival(this.imageReceiver);
    }
  }

  /**
   * Register a listener for the ImageReceiver object.
   * @param receiver
   */
  onImageArrival(receiver: image.ImageReceiver): void {
    // Subscribe to the imageArrival event.
    receiver.on('imageArrival', () => {
      console.info('image arrival');
      // Obtain an image.
      receiver.readNextImage((err: BusinessError, nextImage: image.Image) => {
        if (err || nextImage === undefined) {
          console.error('readNextImage failed');
          return;
        }
        // Parse the image.
        nextImage.getComponent(image.ComponentType.JPEG, async (err: BusinessError, imgComponent: image.Component) => {
          if (err || imgComponent === undefined) {
            console.error('getComponent failed');
            // Release the image to keep the buffer queue of the surface rotating.
            nextImage.release();
            return;
          }
          if (imgComponent.byteBuffer) {
            // Parse the buffer data by referring to the parsing methods described above. This example uses method 1.
            let width = nextImage.size.width; // Obtain the image width.
            let height = nextImage.size.height; // Obtain the image height.
            let stride = imgComponent.rowStride; // Obtain the image stride.
            console.debug(`getComponent with width:${width} height:${height} stride:${stride}`);
            // The value of stride is the same as that of width.
            if (stride == width) {
              let pixelMap = await image.createPixelMap(imgComponent.byteBuffer, {
                size: { height: height, width: width },
                srcPixelFormat: 8,
              });
            } else {
              // The value of stride is different from that of width.
              const dstBufferSize = width * height * 1.5; // For example, for NV21 (images in YUV_420_SP format), the formula is: YUV_420_SP memory = Width * Height + (Width * Height)/2.
              const dstArr = new Uint8Array(dstBufferSize);
              for (let j = 0; j < height * 1.5; j++) {
                const srcBuf = new Uint8Array(imgComponent.byteBuffer, j * stride, width);
                dstArr.set(srcBuf, j * width);
              }
              let pixelMap = await image.createPixelMap(dstArr.buffer, {
                size: { height: height, width: width },
                srcPixelFormat: 8,
              });
            }
          } else {
            console.error('byteBuffer is null');
          }
          // Release the resource when the buffer is not in use.
          // If an asynchronous operation is performed on the buffer, call nextImage.release() to release the resource after the asynchronous operation is complete.
          nextImage.release();
          console.info('image process done');
        });
      });
    });
  }

  build() {
    Column() {
      XComponent({
        id: 'componentId',
        type: 'surface',
        controller: this.xComponentCtl
      })
        .onLoad(async () => {
          console.info('onLoad is called');
          this.xComponentSurfaceId = this.xComponentCtl.getXComponentSurfaceId(); // Obtain the surface ID of the component.
          // Initialize the camera. The component renders the preview stream data of each frame in real time.
          this.initCamera();
        })
        .width(px2vp(this.imageHeight))
        .height(px2vp(this.imageWidth))
    }.justifyContent(FlexAlign.Center)
    .height('100%')
    .width('100%')
  }

  // Initialize the camera.
  async initCamera(): Promise<void> {
    console.info(`initCamera imageReceiverSurfaceId:${this.imageReceiverSurfaceId} xComponentSurfaceId:${this.xComponentSurfaceId}`);
    try {
      // Obtain the camera manager instance.
      this.cameraManager = camera.getCameraManager(getContext(this));
      if (!this.cameraManager) {
        console.error('initCamera getCameraManager failed');
      }
      // Obtain the list of cameras supported by the device.
      this.cameras = this.cameraManager.getSupportedCameras();
      if (!this.cameras) {
        console.error('initCamera getSupportedCameras failed');
      }
      // Select a camera device and create a CameraInput object.
      this.cameraInput = this.cameraManager.createCameraInput(this.cameras[0]);
      if (!this.cameraInput) {
        console.error('initCamera createCameraInput failed');
      }
      // Open the camera.
      await this.cameraInput.open().catch((err: BusinessError) => {
        console.error(`initCamera open fail: ${JSON.stringify(err)}`);
      });
      // Obtain the output capability supported by the camera device.
      let capability: camera.CameraOutputCapability =
        this.cameraManager.getSupportedOutputCapability(this.cameras[0], camera.SceneMode.NORMAL_VIDEO);
      if (!capability) {
        console.error('initCamera getSupportedOutputCapability failed');
      }
      // Select a supported preview stream profile based on service requirements.
      let previewProfile: camera.Profile = capability.previewProfiles[0];
      this.imageWidth = previewProfile.size.width; // Update the width of the XComponent.
      this.imageHeight = previewProfile.size.height; // Update the height of the XComponent.
      console.info(`initCamera imageWidth:${this.imageWidth} imageHeight:${this.imageHeight}`);
      // Create the first preview output by using imageReceiverSurfaceId.
      this.previewOutput1 = this.cameraManager.createPreviewOutput(previewProfile, this.imageReceiverSurfaceId);
      if (!this.previewOutput1) {
        console.error('initCamera createPreviewOutput1 failed');
      }
      // Create the second preview output by using xComponentSurfaceId.
      this.previewOutput2 = this.cameraManager.createPreviewOutput(previewProfile, this.xComponentSurfaceId);
      if (!this.previewOutput2) {
        console.error('initCamera createPreviewOutput2 failed');
      }
      // Create a camera session in recording mode.
      this.session = this.cameraManager.createSession(camera.SceneMode.NORMAL_VIDEO) as camera.VideoSession;
      if (!this.session) {
        console.error('initCamera createSession failed');
      }
      // Start configuration for the session.
      this.session.beginConfig();
      // Add the camera input.
      this.session.addInput(this.cameraInput);
      // Add the output of the first preview stream.
      this.session.addOutput(this.previewOutput1);
      // Add the output of the second preview stream.
      this.session.addOutput(this.previewOutput2);
      // Commit the session configuration.
      await this.session.commitConfig();
      // Start the configured input and output streams.
      await this.session.start();
    } catch (error) {
      console.error(`initCamera fail: ${JSON.stringify(error)}`);
    }
  }

  // Release the camera.
  async releaseCamera(): Promise<void> {
    console.info('releaseCamera');
    try {
      // Stop the session.
      await this.session?.stop();
      // Release the camera input stream.
      await this.cameraInput?.close();
      // Release the first preview output stream.
      await this.previewOutput1?.release();
      // Release the second preview output stream.
      await this.previewOutput2?.release();
      // Release the session.
      await this.session?.release();
    } catch (error) {
      console.error(`releaseCamera fail: ${JSON.stringify(error)}`);
    }
  }
}
```