iOS - Render YUV420p image in OpenGL ES with BiPlanar pixel format 2
Question:
I'm trying to render yuv420p-encoded video to an OpenGL ES2 texture using Swift 3, on an iPhone 6S running iOS 10.3.3.
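For context, the texture cache referred to below as media.vidTexCachePtr would have been created once up front, roughly like this (a minimal sketch; eaglContext stands in for whatever EAGLContext the renderer uses):

var vidTexCachePtr: CVOpenGLESTextureCache?
let cacheRet = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, eaglContext, nil, &vidTexCachePtr)
if cacheRet != 0 { print("error at `CVOpenGLESTextureCacheCreate`", cacheRet) }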
Texture setup:
var formatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
var lumaTexture: CVOpenGLESTexture?
var chromaTexture: CVOpenGLESTexture?
var mediapixelBuffer: CVPixelBuffer?
var ioSurfaceBuffer: CVPixelBuffer?
media.videoSamplesBuffer = media.assetReaderOutput?.copyNextSampleBuffer()
mediapixelBuffer = CMSampleBufferGetImageBuffer(media.videoSamplesBuffer!)!
CVPixelBufferLockBaseAddress(mediapixelBuffer!, .readOnly)
let bufferWidth0: Int = CVPixelBufferGetWidthOfPlane(mediapixelBuffer!, 0)
let bufferWidth1: Int = CVPixelBufferGetWidthOfPlane(mediapixelBuffer!, 1)
let bufferHeight0: Int = CVPixelBufferGetHeightOfPlane(mediapixelBuffer!, 0)
let bufferHeight1: Int = CVPixelBufferGetHeightOfPlane(mediapixelBuffer!, 1)
let bytesPerRow0: Int = CVPixelBufferGetBytesPerRowOfPlane(mediapixelBuffer!, 0)
let bytesPerRow1: Int = CVPixelBufferGetBytesPerRowOfPlane(mediapixelBuffer!, 1)
let pixelBufferBaseAddress = CVPixelBufferGetBaseAddress(mediapixelBuffer!)
let pixelBufferPlaneAddress0 = CVPixelBufferGetBaseAddressOfPlane(mediapixelBuffer!, 0)
let pixelBufferPlaneAddress1 = CVPixelBufferGetBaseAddressOfPlane(mediapixelBuffer!, 1)
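// Note: `attr` isn't shown in the question. CVOpenGLESTextureCacheCreateTextureFromImage
// requires an IOSurface-backed buffer, so `attr` presumably requests IOSurface backing,
// along the lines of:
let attr = [kCVPixelBufferIOSurfacePropertiesKey as String: [:] as [String: Any]] as CFDictionary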
let ioBufferRet = CVPixelBufferCreate(kCFAllocatorDefault,
                                      bufferWidth0,    // full-frame width (plane 0)
                                      bufferHeight0,   // full-frame height (plane 0)
                                      self.formatType,
                                      attr,
                                      &ioSurfaceBuffer)
if ioBufferRet != 0 { print("error at `CVPixelBufferCreate`", ioBufferRet) }
CVPixelBufferLockBaseAddress(ioSurfaceBuffer!, [])  // writable lock, since we copy into this buffer
let copyBufferPlaneAddress0 = CVPixelBufferGetBaseAddressOfPlane(ioSurfaceBuffer!, 0)
let copyBufferPlaneAddress1 = CVPixelBufferGetBaseAddressOfPlane(ioSurfaceBuffer!, 1)
memcpy(copyBufferPlaneAddress0, pixelBufferPlaneAddress0, bufferHeight0 * bytesPerRow0) // Y plane
memcpy(copyBufferPlaneAddress1, pixelBufferPlaneAddress1, bufferHeight1 * bytesPerRow1) // interleaved CbCr plane
glActiveTexture(GLenum(GL_TEXTURE0))
if nil != ioSurfaceBuffer && nil != media.vidTexCachePtr {
    let cvRet = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                             media.vidTexCachePtr!,
                                                             ioSurfaceBuffer!,
                                                             nil,
                                                             GLenum(GL_TEXTURE_2D),
                                                             GLint(GL_RED_EXT),
                                                             GLsizei(bufferWidth0),
                                                             GLsizei(bufferHeight0),
                                                             GLenum(GL_RED_EXT),
                                                             GLenum(GL_UNSIGNED_BYTE),
                                                             0,  // plane 0: luma
                                                             &lumaTexture)
    if cvRet != 0 { print("0 error at `CVOpenGLESTextureCacheCreateTextureFromImage`", cvRet) }
}
if nil != lumaTexture {
    glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture!), CVOpenGLESTextureGetName(lumaTexture!))
}
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE)
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_CLAMP_TO_EDGE)
glActiveTexture(GLenum(GL_TEXTURE1))
if nil != ioSurfaceBuffer && nil != media.vidTexCachePtr {
    let cvRet = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                             media.vidTexCachePtr!,
                                                             ioSurfaceBuffer!,
                                                             nil,
                                                             GLenum(GL_TEXTURE_2D),
                                                             GLint(GL_RG_EXT),
                                                             GLsizei(bufferWidth1),
                                                             GLsizei(bufferHeight1),
                                                             GLenum(GL_RG_EXT),
                                                             GLenum(GL_UNSIGNED_BYTE),
                                                             1,  // plane 1: interleaved CbCr
                                                             &chromaTexture)
    if cvRet != 0 { print("1 error at `CVOpenGLESTextureCacheCreateTextureFromImage`", cvRet) }
}
if nil != chromaTexture {
    glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture!), CVOpenGLESTextureGetName(chromaTexture!))
}
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE)
glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_CLAMP_TO_EDGE)
CVPixelBufferUnlockBaseAddress(mediapixelBuffer!, .readOnly)
CVPixelBufferUnlockBaseAddress(ioSurfaceBuffer!, [])  // matches the writable lock above
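One thing worth noting: for the two texture-creation calls above to see a luma plane and an interleaved CbCr plane, the asset reader has to decode into the biplanar format. A sketch of those output settings (videoTrack is a hypothetical AVAssetTrack from the asset):

let outputSettings: [String: Any] =
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
let assetReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)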
Fragment shader:
#version 100
precision mediump float;
varying vec2 vUV;
uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;
void main() {
    mediump vec3 yuv;
    lowp vec3 rgb;
    yuv.x = texture2D(SamplerY, vUV).r;
    yuv.yz = texture2D(SamplerUV, vUV).rg - vec2(0.5, 0.5);
    // Using BT.709, which is the standard for HDTV
    rgb = mat3(      1,       1,      1,
                     0, -.18732, 1.8556,
               1.57481, -.46813,      0) * yuv;
    gl_FragColor = vec4(rgb, 1);
}
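(For reference: GLSL mat3 constructors are column-major, so that matrix expands to the usual full-range BT.709 equations: R = Y + 1.57481*Cr, G = Y - 0.18732*Cb - 0.46813*Cr, B = Y + 1.8556*Cb.)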
The luma texture on its own looks right, but the chroma texture on its own seems to contain only the Cr channel. I know that because the video is 4:2:0 the second chroma channel is empty, so maybe I shouldn't expect to "see" the Cb channel, but the final result (which should be color bars) looks like this. It's missing red. (I believe that's because the output is BGRA; if it were RGBA, the blue would be missing instead.) How do I get the red back?
This post describes a similar issue to the one I'm experiencing, but that solution uses 3 planes (Y, U, and V separately), whereas I'm trying to achieve this with 2 planes (Y and UV). I tried using the kCVPixelFormatType_420YpCbCr8Planar format type to access 3 planes, but then CVOpenGLESTextureCacheCreateTextureFromImage failed to create an IOSurface. I've also tried a few different YUV-to-RGB shader equations, and looked into using ffmpeg to supply the CVPixelBuffer, but I couldn't get it to build for my iPhone's architecture (arm64). Thanks in advance; any help would be greatly appreciated!
Answer:
It turns out the SamplerUV texture was never actually being sent to the shader. (It was visible in the GPU frame capture, which was misleading.) I assumed, wrongly, that because the SamplerY texture was sent to the shader automatically, the second texture, SamplerUV, would be too. So what I was seeing before was the luma texture being used for both the Y and UV textures.
The missing lines that fixed the problem:
var SamplerY: GLint = 0
var SamplerUV: GLint = 1
SamplerY = glGetUniformLocation(shaderProgram, "SamplerY")
SamplerUV = glGetUniformLocation(shaderProgram, "SamplerUV")
glUniform1i(SamplerY, 0)
glUniform1i(SamplerUV, 1)
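Note that glGetUniformLocation only returns a valid location once the program has been linked, and glUniform1i applies to whichever program is currently bound, so the calls have to come in roughly this order (a sketch; shaderProgram is the linked program from the question):

glLinkProgram(shaderProgram)
glUseProgram(shaderProgram)
let samplerY = glGetUniformLocation(shaderProgram, "SamplerY")
let samplerUV = glGetUniformLocation(shaderProgram, "SamplerUV")
glUniform1i(samplerY, 0)   // texture unit 0 (GL_TEXTURE0) -> luma
glUniform1i(samplerUV, 1)  // texture unit 1 (GL_TEXTURE1) -> chroma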