Merge audio and video when they have different durations

Problem description:

I'm using this tutorial to build a video file that merges a single image (just one frame) with a few seconds of audio.

On iPhone devices the video duration matches the audio duration, and I see the image throughout the whole video.

However, when I share it to an Android device (via WhatsApp) and press play, the movie only lasts as long as the image track (one frame). I ran a test: if I create the movie file repeating the image a hundred times (10 fps, ten seconds), playback on the Android device lasts ten seconds.

I think Android devices only play up to the end of the shortest track, but if I change the time range of the video track added with addMutableTrackWithMediaType to the audio duration, nothing happens.

Any suggestions?

Thanks for your support.

I put all the code here:

-(void) writeImagesToMovieAtPath:(NSString *)path withSize:(CGSize) size { 

    NSMutableArray *m_PictArray = [NSMutableArray arrayWithCapacity:1]; 
    [m_PictArray addObject:[UIImage imageNamed:@"prueba.jpg"]]; 

    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]; 
    NSArray *dirContents = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:documentsDirectoryPath error:nil]; 
    for (NSString *tString in dirContents) { 
     if ([tString isEqualToString:@"essai.mp4"]) 
     { 
      [[NSFileManager defaultManager]removeItemAtPath:[NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,tString] error:nil]; 

     } 
    } 

    NSLog(@"Write Started"); 

    NSError *error = nil; 

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL: 
            [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4 
                   error:&error];  
    NSParameterAssert(videoWriter); 

    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
            [NSNumber numberWithInt:128000], AVVideoAverageBitRateKey, 
            [NSNumber numberWithInt:15],AVVideoMaxKeyFrameIntervalKey, 
            AVVideoProfileLevelH264Main30, AVVideoProfileLevelKey, 
            nil];  

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
            AVVideoCodecH264, AVVideoCodecKey, 
            codecSettings,AVVideoCompressionPropertiesKey, 
            [NSNumber numberWithInt:size.width], AVVideoWidthKey, 
            [NSNumber numberWithInt:size.height], AVVideoHeightKey, 
            nil];  

    AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput 
              assetWriterInputWithMediaType:AVMediaTypeVideo 
              outputSettings:videoSettings] retain]; 

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor 
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput 
                sourcePixelBufferAttributes:nil]; 

    NSParameterAssert(videoWriterInput); 

    NSParameterAssert([videoWriter canAddInput:videoWriterInput]); 
    //NO would suit a non-real-time source like this better, but YES is what the tutorial uses 
    videoWriterInput.expectsMediaDataInRealTime = YES; 
    [videoWriter addInput:videoWriterInput]; 
    //Start a session: 
    [videoWriter startWriting]; 
    [videoWriter startSessionAtSourceTime:kCMTimeZero]; 


    //Video encoding 

    CVPixelBufferRef buffer = NULL; 

    //convert uiimage to CGImage. 

    int frameCount = 0; 

    for(int i = 0; i<[m_PictArray count]; i++) 
    { 
     buffer = [self newPixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size]; 

     BOOL append_ok = NO; 
     int j = 0; 
     while (!append_ok && j < 30) 
     { 
      if (adaptor.assetWriterInput.readyForMoreMediaData) 
      { 
       printf("appending %d attemp %d\n", frameCount, j); 

       CMTime frameTime = CMTimeMake(frameCount,(int32_t) 10); 
       /* 
       Float64 seconds = 1; 
       int32_t preferredTimeScale = 10; 
       CMTime frameTime = CMTimeMakeWithSeconds(seconds, preferredTimeScale); 
       */ 
       append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime]; 
       CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool; 
       NSParameterAssert(bufferPool != NULL); 

       [NSThread sleepForTimeInterval:0.05]; 
      } 
      else 
      { 
       printf("adaptor not ready %d, %d\n", frameCount, j); 
       [NSThread sleepForTimeInterval:0.1]; 
      } 
      j++; 
     } 
     if (!append_ok) { 
      printf("error appending image %d times %d\n", frameCount, j); 
     } 
     frameCount++; 
     CVBufferRelease(buffer); 
    } 

    [videoWriterInput markAsFinished]; 
    [videoWriter finishWriting]; 

    [videoWriterInput release]; 
    [videoWriter release]; 

    [m_PictArray removeAllObjects]; 

    NSLog(@"Write Ended"); 

    [self saveVideoToAlbum:path]; 
} 
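
A note on the writer code above: the old-style [videoWriter finishWriting] call returns a BOOL that is ignored here, so a failed write goes unnoticed. A minimal sketch of guarding the saveVideoToAlbum: call at the end of the method with the writer's final state (same videoWriter instance; AVAssetWriter exposes status and error):

    //Only save when the write actually succeeded 
    if (videoWriter.status == AVAssetWriterStatusFailed) { 
     NSLog(@"Video write failed: %@", videoWriter.error); 
    } else { 
     [self saveVideoToAlbum:path]; 
    } 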


-(void)CompileFilesToMakeMovie { 

    NSLog(@"CompileFilesToMakeMovie"); 

    AVMutableComposition* mixComposition = [AVMutableComposition composition]; 

    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];  

    //Audio file in AAC 
    NSString* audio_inputFileName = @"zApY4o8QY.m4a"; 

    NSString* audio_inputFilePath = [NSString stringWithFormat:@"%@/%@",[[NSBundle mainBundle] resourcePath],audio_inputFileName]; 
    NSURL* audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath]; 

    NSString* video_inputFileName = @"essai.mp4"; 
    NSString* video_inputFilePath = [NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,video_inputFileName]; 
    NSURL* video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath]; 

    NSString* outputFileName = @"outputFile.mov"; 
    NSString* outputFilePath = [NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,outputFileName]; 

    NSURL* outputFileUrl = [NSURL fileURLWithPath:outputFilePath]; 

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) 
     [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil]; 


    CMTime nextClipStartTime = kCMTimeZero; 

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil]; 
    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil]; 


    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration); 
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil]; 

    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration); 
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil]; 



    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetLowQuality]; 
    _assetExport.shouldOptimizeForNetworkUse = YES; 
    _assetExport.outputFileType = @"com.apple.quicktime-movie"; 
    _assetExport.outputURL = outputFileUrl; 
    _assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration); 

    [_assetExport exportAsynchronouslyWithCompletionHandler: 
    ^(void) { 
     [self saveVideoToAlbum:outputFilePath]; 
    }  
    ]; 

    NSLog(@"CompileFilesToMakeMovie Finish"); 
} 
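
Note that exportAsynchronouslyWithCompletionHandler: returns immediately (so the final NSLog above fires before the export is done), and the handler saves unconditionally. A sketch of checking the session's status first, using the same _assetExport session:

    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) { 
     //Only save when the export actually completed 
     if (_assetExport.status == AVAssetExportSessionStatusCompleted) { 
      [self saveVideoToAlbum:outputFilePath]; 
     } else { 
      NSLog(@"Export failed: %@", _assetExport.error); 
     } 
    }]; 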

- (void) saveVideoToAlbum:(NSString*)path { 

    NSLog(@"saveVideoToAlbum"); 

    if(UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)){ 
     UISaveVideoAtPathToSavedPhotosAlbum (path, self, @selector(video:didFinishSavingWithError: contextInfo:), nil); 
    } 
} 

-(void) video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo { 
    if(error) 
     NSLog(@"Exportado con error: %@", error); 
    else 
     NSLog(@"Exportado OK"); 
} 

- (CVPixelBufferRef) newPixelBufferFromCGImage: (CGImageRef)image andSize:(CGSize)frameSize { 

    //Unused: only the commented-out CGContextConcatCTM below references it. 
    //If it were ever applied, an all-zeros transform would collapse the drawing; 
    //CGAffineTransformIdentity would be the safe placeholder. 
    CGAffineTransform frameTransform = CGAffineTransformMake(0, 0, 0, 0, 0, 0); 

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: 
          [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
          [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, 
          nil]; 
    CVPixelBufferRef pxbuffer = NULL; 

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width, 
              frameSize.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
              &pxbuffer); 
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

    CVPixelBufferLockBaseAddress(pxbuffer, 0); 
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer); 
    NSParameterAssert(pxdata != NULL); 

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width, 
               frameSize.height, 8, 4*frameSize.width, rgbColorSpace, 
               kCGImageAlphaNoneSkipFirst); 
    NSParameterAssert(context); 
    //CGContextConcatCTM(context, frameTransform); 
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
              CGImageGetHeight(image)), image); 
    CGColorSpaceRelease(rgbColorSpace); 
    CGContextRelease(context); 

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0); 

    return (CVPixelBufferRef)pxbuffer; 
} 
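
One detail worth flagging in newPixelBufferFromCGImage:andSize:: the CGContextDrawImage call uses the image's own dimensions, so an image whose size differs from frameSize is cropped or leaves part of the buffer untouched. If the image should be stretched to fill the frame, the draw rect would be (a sketch):

    //Stretch the image to cover the whole pixel buffer 
    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image); 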

Fixed it!

I create the movie file repeating the image X times, and then during the composition step I scale the video track to audioAsset.duration:

CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration); 
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
[a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil]; 
[a_compositionVideoTrack scaleTimeRange:video_timeRange toDuration:audioAsset.duration]; 

You need to repeat the image to give the scaling something to stretch, but if the movie has only 2 frames, Android still plays only eight seconds. So I made the video with the image repeated 10 times, which lets me get past the 45-second limit for video sharing on WhatsApp.
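
For reference, the repetition is just a matter of adding the image to m_PictArray more than once in writeImagesToMovieAtPath:withSize: (the 10 below matches the repeat count I ended up using):

    NSMutableArray *m_PictArray = [NSMutableArray arrayWithCapacity:10]; 
    for (int i = 0; i < 10; i++) { 
     [m_PictArray addObject:[UIImage imageNamed:@"prueba.jpg"]]; 
    } 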

Inside the CompileFilesToMakeMovie method, use video_timeRange instead of audio_timeRange where needed...
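
For example, once the video track has been scaled, the export window can simply follow the composition rather than the audio asset (a sketch; AVMutableComposition is an AVAsset, so it has a duration):

    _assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration); 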