I disagree. Being able to render video to a texture would be an excellent feature for middleware. I wouldn't mind implementing it for Ogg Theora video, but I don't know enough about the engine to do so, and I don't think it's possible with the SDK without a little more information. Could someone who knows more than I do answer these questions?
If we could somehow get a byte* to the actual texture data, would filling that buffer with a frame update the texture on screen? We can currently create a texture with the TTexture CreateTexture( int width, int height, int format=TEXTURE_RGBA ) function, but what does that give us? What is the in-memory format of a TTexture?
Actually, is there anything we can do with CreateTexture at all? It seems we can create a texture, but there's no way to fill it with data unless we know the TTexture data format, and none of the other texture functions seem to expose it. Unless it's only there for copying color information from a buffer so you can render to texture (remote video camera, reflections, etc.)?
Anyway, I may be way off (my 3D-fu is weak), but I would love to take a shot at it.