I just wanted to emphasize that the API returns a texture handle to you.
size_t frameWidth  = CVPixelBufferGetWidth(pixelBuffer);   // pixel buffer from the video decoder
size_t frameHeight = CVPixelBufferGetHeight(pixelBuffer);

CVOpenGLESTextureRef texture = NULL;                       // the cache creates this for us
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            videoTextureCache,
                                                            pixelBuffer,
                                                            NULL,             // texture attributes
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,          // internal format
                                                            frameWidth,
                                                            frameHeight,
                                                            GL_BGRA,          // pixel format of the buffer
                                                            GL_UNSIGNED_BYTE,
                                                            0,                // plane index
                                                            &texture);
Notice that we pass in the actual pixel buffer along with its attributes (the frame width and height).
Note the '&texture' in the API call ... that is how the texture is passed back to you. You can also query the size of the texture you get back. The API determines the size of the texture because you passed in the attributes of the pixel buffer (as compared to your snippet, where you specify them yourself).
Now that it exists, you can bind it as needed:
glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
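A fuller per-frame sketch, assuming the same 'videoTextureCache' and an OpenGL ES 2 context: check the return code, set sampling state on the returned texture (the cache does not do this for you), and release the texture and flush the cache once the frame is drawn. The drawing step itself is elided.

```c
if (err != kCVReturnSuccess || texture == NULL) {
    NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed: %d", err);
    return;
}

glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));

// Video frames are usually non-power-of-two, so under ES 2 the wrap
// mode must be CLAMP_TO_EDGE; filtering is up to you.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// ... draw the frame ...

// When done with this frame, release the texture and flush the cache
// so recycled pixel buffers can be reused.
CFRelease(texture);
CVOpenGLESTextureCacheFlush(videoTextureCache, 0);
```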
Yes ... it's a bit different ... others call it weird. The API gives the texture handle back to you instead of you submitting one.
Some game engines (e.g., Unity) have a rigid rendering pipeline that only allows the traditional approach I mention above using glTexImage2D. That approach is 'old school' ... not as old as Bainer Hall from my UCD days ... but still 'old school'.
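For contrast, the traditional glTexImage2D path looks roughly like this (a sketch; 'frameWidth' and 'frameHeight' come from the same pixel buffer, and GL_BGRA on iOS relies on the GL_APPLE_texture_format_BGRA8888 extension). Note that here you create the handle yourself and copy the pixels to GL every frame, which is the cost the texture cache avoids:

```c
GLuint tex;
glGenTextures(1, &tex);                  // you create the handle yourself ...
glBindTexture(GL_TEXTURE_2D, tex);

CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
// ... and submit the pixels to GL, copying the whole frame.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             (GLsizei)frameWidth, (GLsizei)frameHeight, 0,
             GL_BGRA, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(pixelBuffer));
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```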
Even though you may not have a Lua script for this, I am hoping that your rendering pipeline allows for this type of option.
It would be nice to have the Lua layer of abstraction ... then you could use Lua for rapid prototyping of multimedia assets in virtual worlds.