Tuesday, August 5, 2014

Dissecting RosyWriter - Part 1

RosyWriter is a sample application that demonstrates how to use the AVFoundation framework to capture, process, preview, and save video on iOS devices.

When the application launches, it creates an AVCaptureSession with audio and video device inputs, and outputs for audio and video data. These outputs continuously supply frames of audio and video to the app via the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.

The app applies a very simple processing step to each video frame: it sets the green component of each pixel to zero, which gives the entire frame a pink ("rosy") tint. Audio frames are not processed.
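
Concretely, that step is just a tight loop over the pixel bytes. Below is a minimal sketch, assuming the video data output is configured for the kCVPixelFormatType_32BGRA pixel format (as the sample does); the method name mirrors the sample's processPixelBuffer:.

    - (void)processPixelBuffer:(CVImageBufferRef)pixelBuffer
    {
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer); // rows may be padded
        unsigned char *base = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

        for (size_t row = 0; row < height; row++) {
            unsigned char *pixel = base + row * bytesPerRow;
            for (size_t column = 0; column < width; column++) {
                pixel[1] = 0;   // BGRA: byte 1 is the green component
                pixel += 4;     // 4 bytes per BGRA pixel
            }
        }

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }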

After a frame of video is processed, RosyWriter uses OpenGL ES 2.0 to display it on the screen. This step uses the CVOpenGLESTextureCache API, new in iOS 5.0, for enhanced performance.

When the user chooses to record a movie, an AVAssetWriter is used to write the processed video and un-processed audio to a QuickTime movie file.
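
Before digging into the writer, here is a rough sketch of how the texture cache gets a processed pixel buffer onto the screen without copying. This is an illustration rather than the sample's exact code; oglContext stands in for the view's EAGLContext, and the GL formats assume the BGRA pipeline above.

    // Created once, tied to the view's EAGLContext:
    CVOpenGLESTextureCacheRef textureCache = NULL;
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                oglContext, NULL, &textureCache);

    // Per frame: wrap the pixel buffer in a GL texture, with no copy.
    CVOpenGLESTextureRef texture = NULL;
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       textureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_RGBA,
                                                       (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                       (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                       GL_BGRA,
                                                       GL_UNSIGNED_BYTE,
                                                       0,
                                                       &texture);

    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ... draw the textured quad, then release and flush:
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(textureCache, 0);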

RosyWriterViewController creates an instance of RosyWriterVideoProcessor and calls its setupAndStartCaptureSession method. It also creates an instance of RosyWriterPreviewView, which is an OpenGL ES display view. The view controller's toggleRecording method starts and stops recording; a sketch of that action follows.
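
A sketch of that action, assuming the processor exposes isRecording, startRecording, and stopRecording as in the sample:

    - (IBAction)toggleRecording:(id)sender
    {
        // Disable until the processor confirms via its delegate callbacks.
        [[self recordButton] setEnabled:NO];

        if ([videoProcessor isRecording])
            [videoProcessor stopRecording];   // asynchronous; see the callbacks below
        else
            [videoProcessor startRecording];
    }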

Once recording begins, the processor calls back with recordingWillStart, and then with recordingDidStart once the recording has actually started.
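
Minimal implementations of these callbacks in the view controller might look like the following. They hop to the main queue before touching UIKit, since the processor delivers them from its movie writing queue; the idle-timer line is an assumption based on typical camera apps.

    - (void)recordingWillStart
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            [[self recordButton] setEnabled:NO];
            // Keep the screen awake while recording.
            [UIApplication sharedApplication].idleTimerDisabled = YES;
        });
    }

    - (void)recordingDidStart
    {
        dispatch_async(dispatch_get_main_queue(), ^{
            // Re-enable the button so the user can stop the recording.
            [[self recordButton] setEnabled:YES];
        });
    }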

In startRecording, the asset writer is allocated as follows:

assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&error];
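
In context, the surrounding method looks roughly like this. This is a condensed sketch: recordingWillBeStarted and recording are the sample's guard flags, showError: is its error-reporting helper, and file removal is shown via NSFileManager for brevity.

    - (void)startRecording
    {
        dispatch_async(movieWritingQueue, ^{
            if (recordingWillBeStarted || recording)
                return;
            recordingWillBeStarted = YES;

            // Let the delegate update the UI immediately.
            [self.delegate recordingWillStart];

            // Remove any stale file at the output URL before writing.
            [[NSFileManager defaultManager] removeItemAtURL:movieURL error:NULL];

            // Create an asset writer targeting a QuickTime movie.
            NSError *error = nil;
            assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                                    fileType:(NSString *)kUTTypeQuickTimeMovie
                                                       error:&error];
            if (error)
                [self showError:error];
        });
    }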

- In the setupAndStartCaptureSession method of this class, the application performs several important tasks (a condensed sketch follows the list).

- Create the preview buffer queue: err = CMBufferQueueCreate(kCFAllocatorDefault, 1, CMBufferQueueGetCallbacksForUnsortedSampleBuffers(), &previewBufferQueue);
- Create a movie writing queue, which is a serial dispatch queue: movieWritingQueue = dispatch_queue_create("Movie Writing Queue", DISPATCH_QUEUE_SERIAL);
- If the capture session is not set up yet, set it up. The capture session setup performs these main tasks:
- Create a capture session: captureSession = [[AVCaptureSession alloc] init];
- Create the audio connection
- Create the video connection
- If the capture session is not already running, start it running.
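
Putting those steps together, setupAndStartCaptureSession boils down to something like this (a condensed sketch with error handling trimmed; previewBufferQueue, captureSession, and setupCaptureSession are the sample's ivars and helper):

    - (void)setupAndStartCaptureSession
    {
        // Shallow queue (capacity 1) for frames headed to the display;
        // late frames are simply dropped to keep preview latency low.
        OSStatus err = CMBufferQueueCreate(kCFAllocatorDefault, 1,
                                           CMBufferQueueGetCallbacksForUnsortedSampleBuffers(),
                                           &previewBufferQueue);
        if (err) {
            NSLog(@"Could not create preview buffer queue (%d)", (int)err);
            return;
        }

        // Serial queue so writer operations never run concurrently.
        movieWritingQueue = dispatch_queue_create("Movie Writing Queue", DISPATCH_QUEUE_SERIAL);

        // Configure inputs, outputs, and connections the first time through.
        if (!captureSession)
            [self setupCaptureSession];

        // Start the flow of data if it isn't flowing already.
        if (![captureSession isRunning])
            [captureSession startRunning];
    }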

The major work happens in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, which receives the sample buffer along with the connection it arrived on.
If it is the video connection, the application performs the following tasks (sketched after the list):

- Gets the frame rate from the buffer's presentation timestamp
- Processes the pixel buffer (the de-greening step above)
- Enqueues the processed buffer onto the preview buffer queue
- Calls back with pixelBufferReadyForDisplay: once the buffer is dequeued on the main queue
- Hands the sample buffer to the movie writing queue, where it is written by the asset writer
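
A condensed sketch of the delegate method tying these steps together. Writer state checks and error handling are trimmed; calculateFramerateAtTimestamp: and writeSampleBuffer:ofType: are the sample's helpers.

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (connection == videoConnection) {
            // 1. Update the running frame-rate estimate from the timestamp.
            CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            [self calculateFramerateAtTimestamp:timestamp];

            // 2. Process the pixel buffer in place (zero the green channel).
            [self processPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];

            // 3. Enqueue for preview; the shallow queue drops late frames.
            OSStatus err = CMBufferQueueEnqueue(previewBufferQueue, sampleBuffer);
            if (!err) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    CMSampleBufferRef sbuf = (CMSampleBufferRef)CMBufferQueueDequeueAndRetain(previewBufferQueue);
                    if (sbuf) {
                        // 4. Hand the processed buffer to the OpenGL ES view.
                        [self.delegate pixelBufferReadyForDisplay:CMSampleBufferGetImageBuffer(sbuf)];
                        CFRelease(sbuf);
                    }
                });
            }
        }

        // 5. Hand the buffer to the movie writing queue for the asset writer.
        CFRetain(sampleBuffer);
        dispatch_async(movieWritingQueue, ^{
            if (assetWriter)
                [self writeSampleBuffer:sampleBuffer
                                 ofType:((connection == videoConnection) ? AVMediaTypeVideo : AVMediaTypeAudio)];
            CFRelease(sampleBuffer);
        });
    }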

References: 
https://developer.apple.com/library/ios/samplecode/RosyWriter/Introduction/Intro.html
