DO NOT IMPLEMENT!!!
WEBGL_webcodecs_video_frame
WebGL working group (public_webgl 'at' khronos.org)
Jie Chen (jie.a.chen 'at' intel.com)
Kenneth Russell (kbr 'at' google.com)
Dale Curtis (dalecurtis 'at' google.com)
Dan Sanders (sandersd 'at' google.com)
Members of the WebGL working group
Last modified date: December 28, 2020
Revision: 2
WebGL extension #NN
Written against the WebGL API 1.0 specification.
This extension imports a WebCodecs VideoFrame into WebGL and returns a WebGLWebCodecsVideoFrameHandle, which contains all the information about the textures imported for the VideoFrame, such as the texture target, sampler type, color space, pixel format, and the required GL extension. With this information, WebGL can flexibly support more platform-specific extensions, including but not limited to GL_OES_EGL_image_external, GL_NV_EGL_stream_consumer_external, and GL_ARB_texture_rectangle, to manipulate the VideoFrame in WebGL.
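As an illustrative sketch (not part of the specification text), a page can detect the extension before relying on it:

let ext = gl.getExtension('WEBGL_webcodecs_video_frame');
if (!ext) {
    // Extension unavailable; fall back to an existing upload path, for
    // example texImage2D with an HTMLVideoElement source.
}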
When this extension is enabled:
importVideoFrame imports a VideoFrame from WebCodecs and returns a WebGLWebCodecsVideoFrameHandle. If the VideoFrame cannot be imported, a TypeError exception is raised. The VideoFrame may remain locked until releaseVideoFrame is called. While it is locked, WebCodecs must NOT manipulate the VideoFrame.
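The lifetime contract above can be illustrated with a minimal sketch (not part of the sample code below; gl and webcodecsVideoFrame are assumed to already exist):

let ext = gl.getExtension('WEBGL_webcodecs_video_frame');
let handle = null;
try {
    // Throws a TypeError if the VideoFrame can't be imported.
    handle = ext.importVideoFrame(webcodecsVideoFrame);
    // ... sample the imported textures while the VideoFrame is locked ...
} finally {
    if (handle) {
        // Unlocks the VideoFrame so WebCodecs may use it again.
        ext.releaseVideoFrame(handle);
    }
}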
dictionary WebGLWebCodecsTextureInfo {
    // WebGL texture
    WebGLTexture texture;
    // texture target: { GL_TEXTURE_2D, GL_TEXTURE_EXTERNAL_OES, ... }
    GLenum target;
    // { "sampler2D", "samplerExternalOES", ... }
    DOMString samplerType;
    // { "texture2D", ... }
    DOMString samplerFunc;
    // { "r", "rg", "rgb", ... }
    DOMString components;
};

dictionary WebGLWebCodecsVideoFrameHandle {
    FrozenArray<WebGLWebCodecsTextureInfo>? textureInfoArray;
    // { "GL_NV_EGL_stream_consumer_external", ... }
    DOMString? requiredExtension;
    VideoFrameColorSpace colorSpace;
    // { "NV12", "I420", "ABGR", ... }
    DOMString? pixelFormat;
    // This defines a GLSL "vec3 DoColorConversion(vec3 color)" function, which
    // can be used to convert the video frame from its original color space to
    // the current WebGL context's canvas color space.
    DOMString colorConversionShaderFunc;
};

[
    RuntimeEnabled=WebCodecs,
    LegacyNoInterfaceObject
] interface WebGLWebCodecsVideoFrame {
    [CallWith=ExecutionContext, RaisesException]
    WebGLWebCodecsVideoFrameHandle importVideoFrame(VideoFrame videoFrame);

    [CallWith=ExecutionContext, RaisesException]
    boolean releaseVideoFrame(WebGLWebCodecsVideoFrameHandle handle);
};
First we need to import a VideoFrame
from WebCodecs.
let ext = gl.getExtension('WEBGL_webcodecs_video_frame');
let videoFrameHandle = ext.importVideoFrame(webcodecsVideoFrame);
Next we can assemble the GLSL fragment shader to access the video frame textures.
// Note: there could be many textures for 1 VideoFrame. To keep it simple,
// the sample here assumes only 1 texture.
let texInfo0 = videoFrameHandle.textureInfoArray[0];
let fSource =
    "#extension " + videoFrameHandle.requiredExtension + " : require\n" +
    "precision mediump float;\n" +
    "varying mediump vec2 vTexCoord;\n" +
    "uniform " + texInfo0.samplerType + " _aSampler_0_;\n" +
    "void main() {\n" +
    "  vec4 texel;\n" +
    "  texel = " + texInfo0.samplerFunc + "(_aSampler_0_, vTexCoord);\n" +
    "  gl_FragColor = texel;\n" +
    "}\n";
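The handle also provides colorConversionShaderFunc, which defines a GLSL "vec3 DoColorConversion(vec3 color)" function. As a rough sketch (this variant is not in the original sample), it could be spliced into the fragment shader and applied to the sampled texel:

let fSourceConverted =
    "#extension " + videoFrameHandle.requiredExtension + " : require\n" +
    "precision mediump float;\n" +
    "varying mediump vec2 vTexCoord;\n" +
    "uniform " + texInfo0.samplerType + " _aSampler_0_;\n" +
    // Inject the "vec3 DoColorConversion(vec3 color)" helper from the handle.
    videoFrameHandle.colorConversionShaderFunc + "\n" +
    "void main() {\n" +
    "  vec4 texel = " + texInfo0.samplerFunc + "(_aSampler_0_, vTexCoord);\n" +
    "  gl_FragColor = vec4(DoColorConversion(texel.rgb), texel.a);\n" +
    "}\n";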
After we have compiled the shader and linked the program, the last thing we need to do is bind the textures to the right texture target.
// Note: we also assume only 1 texture for the VideoFrame.
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(texInfo0.target, texInfo0.texture);
gl.texParameteri(texInfo0.target, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.uniform1i(gl.getUniformLocation(program, "_aSampler_0_"), 0);
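For multi-plane pixel formats such as NV12 or I420, textureInfoArray holds one entry per plane. A rough sketch of binding every entry (the _aSampler_i_ uniform naming is an assumption carried over from the sample above, not mandated by the extension):

for (let i = 0; i < videoFrameHandle.textureInfoArray.length; ++i) {
    let texInfo = videoFrameHandle.textureInfoArray[i];
    gl.activeTexture(gl.TEXTURE0 + i);
    gl.bindTexture(texInfo.target, texInfo.texture);
    gl.texParameteri(texInfo.target, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.uniform1i(gl.getUniformLocation(program, "_aSampler_" + i + "_"), i);
}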
Finally, everything is ready and we can draw the VideoFrame. Don't forget to release the VideoFrame when you are done with it.
gl.clearColor(0.0, 0.0, 0.0, 0.0);
gl.clear(gl.COLOR_BUFFER_BIT);
gl.drawArrays(gl.TRIANGLES, 0, 6);
gl.bindTexture(texInfo0.target, null);
ext.releaseVideoFrame(videoFrameHandle);
For a full-fledged sample, please refer to this webgl_webcodecs_video_frame helper. It handles all the complexity, and makes your life much easier.
Revision 1, 2020/11/25
Revision 2, 2020/12/28