iOS - CVPixelBuffer memory is not released correctly when making images into a video

Problem description:

I am turning an array of images into a video, but it always crashes with a memory warning because the allocations made by CVPixelBufferCreate keep piling up. I don't know how to handle this correctly. I have seen many similar topics, but none of them solved my problem.


Here is my code:

- (void) writeImagesArray:(NSArray*)array asMovie:(NSString*)path 
{ 
    NSError *error = nil; 
    UIImage *first = [array objectAtIndex:0]; 
    CGSize frameSize = first.size; 
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL: 
            [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie 
                   error:&error]; 
    NSParameterAssert(videoWriter); 

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
            AVVideoCodecH264, AVVideoCodecKey, 
            [NSNumber numberWithDouble:frameSize.width],AVVideoWidthKey, 
            [NSNumber numberWithDouble:frameSize.height], AVVideoHeightKey, 
            nil]; 

    AVAssetWriterInput* writerInput = [AVAssetWriterInput 
             assetWriterInputWithMediaType:AVMediaTypeVideo 
             outputSettings:videoSettings]; 

    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput 
                sourcePixelBufferAttributes:nil]; 

    [videoWriter addInput:writerInput]; 

    //Start Session 
    [videoWriter startWriting]; 
    [videoWriter startSessionAtSourceTime:kCMTimeZero]; 

    int frameCount = 0; 
    CVPixelBufferRef buffer = NULL; 
    for(UIImage *img in array) 
    { 
     buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize]; 
     if (self.adaptor.assetWriterInput.readyForMoreMediaData) 
     { 
      CMTime frameTime = CMTimeMake(frameCount,FPS); 
      [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime]; 
     } 
     if(buffer) 
      CVPixelBufferRelease(buffer); 

     frameCount++; 
    } 

    [writerInput markAsFinished]; 
    [videoWriter finishWritingWithCompletionHandler:^{ 

     if (videoWriter.status == AVAssetWriterStatusFailed) { 

      NSLog(@"Movie save failed."); 

     }else{ 

      NSLog(@"Movie saved."); 
     } 
    }]; 

    NSLog(@"Finished."); 
} 


- (CVPixelBufferRef)newPixelBufferFromCGImage: (CGImageRef) image andFrameSize:(CGSize)frameSize 
{ 
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
          [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
          [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, 
          nil]; 

    CVPixelBufferRef pxbuffer = NULL; 

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 
              frameSize.width, 
              frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options, 
              &pxbuffer); 

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

    CVPixelBufferLockBaseAddress(pxbuffer, 0); 
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 
    NSParameterAssert(pxdata != NULL); 

    CGBitmapInfo bitmapInfo = (CGBitmapInfo) kCGImageAlphaNoneSkipFirst; 
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(pxdata, 
               frameSize.width, 
               frameSize.height, 
               8, 
               4*frameSize.width, 
               rgbColorSpace, 
               bitmapInfo); 

    NSParameterAssert(context); 
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0)); 
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
              CGImageGetHeight(image)), image); 
    CGColorSpaceRelease(rgbColorSpace); 
    CGContextRelease(context); 
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

    return pxbuffer; 
} 

Update:

I split my video conversion into small segments and added [NSThread sleepForTimeInterval:0.00005]; inside the loop. The memory is just magically released.

But this line causes my UI to freeze for seconds. Is there a better solution?

for(UIImage *img in array) 
{ 
    buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize]; 
    //CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer); 
    if (adaptor.assetWriterInput.readyForMoreMediaData) 
    { 
     CMTime frameTime = CMTimeMake(frameCount,FPS); 
     [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime]; 
    } 

    if(buffer) 
     CVPixelBufferRelease(buffer); 

    frameCount++; 

    [NSThread sleepForTimeInterval:0.00005]; 
} 

Here is the memory usage:


From a quick review of your code, I can't see anything wrong in the management of the CVBuffer itself.
I think the source of your problem may be the array of UIImages.
UIImage behaves like this: until you request its CGImage property or draw it, the backing image is not decoded into memory, so unused images have a low memory footprint.
Your enumeration calls the CGImage property on every image and never gets rid of them, which could explain the constantly growing memory allocations.
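Building on that point, one way to avoid holding every decoded image at once is to keep only file paths and decode each frame inside its own `@autoreleasepool`, so the decoded bitmap can be reclaimed between iterations. A minimal sketch, not from the original post — it assumes an `imagePaths` array and reuses the question's `newPixelBufferFromCGImage:andFrameSize:` helper:

```objc
// Sketch: decode one frame at a time instead of holding a UIImage array.
// `imagePaths` (NSArray of NSString file paths), `frameSize`, `FPS`, and the
// question's newPixelBufferFromCGImage:andFrameSize: helper are assumed.
int frameCount = 0;
for (NSString *imagePath in imagePaths) {
    @autoreleasepool {
        UIImage *img = [UIImage imageWithContentsOfFile:imagePath];
        CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:img.CGImage
                                                     andFrameSize:frameSize];
        if (buffer) {
            if (self.adaptor.assetWriterInput.readyForMoreMediaData) {
                CMTime frameTime = CMTimeMake(frameCount, FPS);
                [self.adaptor appendPixelBuffer:buffer
                           withPresentationTime:frameTime];
            }
            CVPixelBufferRelease(buffer);
        }
        frameCount++;
    } // the pool drains here, releasing the decoded image data
}
```

Draining the pool per frame keeps peak memory at roughly one decoded image plus one pixel buffer, instead of the whole array.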


Thanks. But I found in Instruments that the images have no memory problem. I can actually see the allocations made by CVPixelBufferCreate, and they are never released correctly. – 2015-02-10 05:17:47


I'm not sure whether the buffers are created but correctly released after use. Have you tried running it on a real device? – Andrea 2015-02-10 08:22:55


The sleep works because the buffer adaptor is probably still using the buffer. I suggest you: 1 - don't use an array of images, use an array of image paths instead; 2 - the buffer adaptor has its own timing, so it's best if you use [_pixelBufferAdaptor.assetWriterInput requestMediaDataWhenReadyOnQueue:]; 3 - use a CVPixelBufferPool – Andrea 2015-02-10 08:31:14
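A sketch of suggestions 2 and 3 combined — not from the original answer. It assumes the adaptor was created with non-nil `sourcePixelBufferAttributes` (otherwise `pixelBufferPool` is nil), and that `totalFrames` and a routine to fill each buffer exist:

```objc
// Sketch: let the writer input pull frames at its own pace and reuse
// buffers from the adaptor's pool, so no sleep is needed and the UI
// thread is never blocked. `writerInput`, `videoWriter`, `adaptor`,
// `totalFrames`, and `FPS` are assumed to be set up as in the question.
dispatch_queue_t queue = dispatch_queue_create("video.writer", DISPATCH_QUEUE_SERIAL);
__block int frameCount = 0;
[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (writerInput.readyForMoreMediaData) {
        if (frameCount >= totalFrames) {
            [writerInput markAsFinished];
            [videoWriter finishWritingWithCompletionHandler:^{ /* done */ }];
            break;
        }
        CVPixelBufferRef buffer = NULL;
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                           adaptor.pixelBufferPool, &buffer);
        if (buffer) {
            // ...draw the pixels of frame `frameCount` into `buffer` here...
            [adaptor appendPixelBuffer:buffer
                  withPresentationTime:CMTimeMake(frameCount, FPS)];
            CVPixelBufferRelease(buffer); // hands the buffer back to the pool
        }
        frameCount++;
    }
}];
```

Because the pool recycles buffers once the writer is done with them, steady-state memory stays bounded regardless of how many frames are written.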

If you don't need the images afterwards, you can do it like this:

[images enumerateObjectsUsingBlock:^(UIImage * _Nonnull img, NSUInteger idx, BOOL * _Nonnull stop) { 
     CVPixelBufferRef pixelBuffer = [self pixelBufferFromCGImage:img.CGImage frameSize:[VDVideoEncodeConfig globalConfig].size]; 

     CMTime frameTime = CMTimeMake(frameCount, (int32_t)[VDVideoEncodeConfig globalConfig].frameRate); 
     frameCount++; 
     [_assetRW appendNewSampleBuffer:pixelBuffer pst:frameTime]; 

     CVPixelBufferRelease(pixelBuffer); 
     // This can release the memory 
     // The Image.CGImageRef result in the memory leak you see in the Instruments 
     images[idx] = [NSNull null]; // note: images must be an NSMutableArray 
    }];