Mini Tutorial: How to capture video of an iPhone app in Cocos2D

Someone asked me a while back if I knew how to record the screen of a Cocos2D app as a video. I didn't know how, so he sent me some code, but his problem was that it took its screenshots from the UIWindow, which doesn't capture Cocos2D's OpenGL content. My suggestion was to replace his screenshot code with AWScreenshot (by Manucorporat; search the Cocos2D forums for his code).

And here are the code bits:

#import <AVFoundation/AVFoundation.h>
#import <AVFoundation/AVAssetWriter.h>
#import <CoreVideo/CVPixelBuffer.h>
#import <CoreMedia/CMTime.h>

#import "AWScreenshot.h"

#define FRAME_WIDTH 320
#define FRAME_HEIGHT 480
#define TIME_SCALE 60 // CMTime timescale (ticks per second), not the capture frame rate

-(void) startScreenRecording
{  
    NSLog(@"start screen recording");
   
    // create the AVAssetWriter
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent: @"video.mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath])
    {   [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }
   
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSError *movieError = nil;
   
    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType: AVFileTypeQuickTimeMovie
                                               error: &movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];
   
    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor =  [[AVAssetWriterInputPixelBufferAdaptor  alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];
   
    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];
   
    // start writing samples to it
    [assetWriterTimer invalidate];   // stop any previous timer; the run loop owns it, so don't release it
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector (writeSample:)
                                                      userInfo:nil
                                                       repeats:YES] ;
   
}
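
(pathToDocumentsDirectory isn't part of the SDK; if you don't already have a helper like it, a one-liner such as this works:)

-(NSString*) pathToDocumentsDirectory
{   NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    return [paths objectAtIndex:0];
}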

-(void) stopScreenRecording
{   [assetWriterTimer invalidate];
    assetWriterTimer = nil;
   
    [assetWriter finishWriting];
    NSLog (@"finished writing");
}
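
For reference, the snippets above assume these instance variables in whatever class does the recording (the class name below is just a placeholder; the code uses manual retain/release):

@interface MyRecordingLayer : CCLayer
{
    AVAssetWriter *assetWriter;
    AVAssetWriterInput *assetWriterInput;
    AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferAdaptor;
    NSTimer *assetWriterTimer;
    CFAbsoluteTime firstFrameWallClockTime;
}
@end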

As you can see, startScreenRecording schedules a timer that calls writeSample: every 0.1 seconds. Here it is:

// release callback for CVPixelBufferCreateWithBytes: frees the copied CGImage data
// once the pixel buffer (and AVFoundation) is done with it, so frames don't leak
static void releaseScreenshotData (void *releaseRefCon, const void *baseAddress)
{   CFRelease((CFDataRef) releaseRefCon);
}

-(void) writeSample: (NSTimer*) _timer
{   if (assetWriterInput.readyForMoreMediaData)
    {
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef) [[self createARGBImageFromRGBAImage:[self screenshot]] CGImage];

        // prepare the pixel buffer (it wraps the CFData's bytes, no copy is made)
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32ARGB,
                                             (void*)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             releaseScreenshotData, // frees imageData later
                                             (void*)imageData,
                                             NULL,
                                             &pixelBuffer);
       
        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        //NSLog (@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime =  CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);
       
        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

        // the adaptor/writer retain the pixel buffer for as long as they need it,
        // so we can drop our own reference right away
        CVPixelBufferRelease(pixelBuffer);

        if (appended)
        {   NSLog (@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        } else
        {   NSLog (@"failed to append");
            [self stopScreenRecording];
        }
    }
}

And the code I used to take the screenshot:

- (UIImage*)screenshot
{   return [AWScreenshot takeAsImage];
}

Notice how I call [self createARGBImageFromRGBAImage:[self screenshot]] instead of passing the screenshot straight through. That's because the UIImage comes back as RGBA, while the pixel buffer's format type is kCVPixelFormatType_32ARGB, so I had to convert the image so the channel orders match; otherwise the video comes out with weird tints.

I Googled for the createARGBImageFromRGBAImage code, and here it is:

-(UIImage *) createARGBImageFromRGBAImage: (UIImage*)image
{   CGSize dimensions = [image size];
   
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * dimensions.width;
    NSUInteger bitsPerComponent = 8;
   
    unsigned char *rgba = malloc(bytesPerPixel * dimensions.width * dimensions.height);
    unsigned char *argb = malloc(bytesPerPixel * dimensions.width * dimensions.height);
   
    CGColorSpaceRef colorSpace = NULL;
    CGContextRef context = NULL;
   
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(rgba, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGContextDrawImage(context, CGRectMake(0, 0, dimensions.width, dimensions.height), [image CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
   
    for (int x = 0; x < dimensions.width; x++) {
        for (int y = 0; y < dimensions.height; y++) {
            NSUInteger offset = ((dimensions.width * y) + x) * bytesPerPixel;
            argb[offset + 0] = rgba[offset + 3];
            argb[offset + 1] = rgba[offset + 0];
            argb[offset + 2] = rgba[offset + 1];
            argb[offset + 3] = rgba[offset + 2];
        }
    }
   
    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(argb, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    image = [UIImage imageWithCGImage: imageRef];
    CGImageRelease(imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
   
    free(rgba);
    free(argb);
   
    return image;
}
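
If you want to try it end to end, the simplest hook-up I can think of is to start and stop the recording from the layer's onEnter/onExit (assuming the recording methods live on that layer; adjust to wherever you put them):

// hypothetical hook-up inside the CCLayer subclass that holds the methods above
-(void) onEnter
{   [super onEnter];
    [self startScreenRecording];
}

-(void) onExit
{   [self stopScreenRecording];
    [super onExit];
}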

And there we go, I managed to record the screen of my Cocos2d app and then save it as a video file.

My next problem is, how do I add audio to my video?
