wasabia / flutter_gl

Cross-platform OpenGL API calls from Dart through dart:ffi. Provides OpenGL rendering to a Texture widget in Flutter.

Interoperability with Native Flutter Context

hwinnemo opened this issue · comments

Love this project and was able to get some nice multi-pass shaders going with it.
One issue I don't understand is how to set up a shared context properly.
Specifically, let's say I want to process frames from the video_player plugin.
The video_player keeps a textureId that holds the content of the current frame, which can then be rendered onto a Texture Widget.
However, when I try to use that textureId in flutter_gl it doesn't work. And, in fact, the same textureId is handed out by flutter_gl, which indicates that they are not using the same context.
I am assuming that video_player uses some sort of "standard" Flutter context, since it doesn't have to do any magic with flutterGlPlugin.updateTexture() in the end.
So then, how would I make flutter_gl leverage that same, shared context?
Thanks for your help!

Maybe you have to write your own video decoder, or modify video_player yourself.

For example, a video decoder on Android using Kotlin:

  1. Get the share context from flutter_gl:
var _shareContext = ThreeEgl.getContext("shareContext");
  2. Create a new video decoder thread, create a new OpenGL context with _shareContext, and bind the new OpenGL context to that thread (see the combined sketch after this list):
eglContext = EGL14.eglCreateContext(eglDisplay, eglConfig, _shareContext, attributes, 0)
// make the new context current
  3. Prepare a new Surface:
val textureIds = IntArray(1)
glGenTextures(1, textureIds, 0)
textureId = textureIds[0]

glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId!!)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, null);
glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR)

surfaceTexture = SurfaceTexture(textureId!!)
surfaceTexture.setDefaultBufferSize(width, height);

surfaceTexture.setOnFrameAvailableListener(this)
surface = Surface(surfaceTexture)
  4. The textureId holds the video frame data while the video is decoded or played; you can use this textureId with flutter_gl.
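
Putting steps 2 and 3 together, the decoder-thread setup looks roughly like this (a minimal sketch using the standard EGL14 API and an off-screen pbuffer surface; the attribute lists and the helper name makeDecoderContextCurrent are only illustrative, not flutter_gl's actual EglEnv code):

import android.opengl.EGL14
import android.opengl.EGLConfig
import android.opengl.EGLContext
import android.opengl.EGLExt
import com.futouapp.threeegl.ThreeEgl

// Sketch of steps 2-3: create a context shared with flutter_gl and make it
// current on the decoder thread.
fun makeDecoderContextCurrent(): EGLContext {
    val display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY)
    val version = IntArray(2)
    EGL14.eglInitialize(display, version, 0, version, 1)

    // Illustrative config: GLES 3, RGBA8888.
    val configAttribs = intArrayOf(
        EGL14.EGL_RENDERABLE_TYPE, EGLExt.EGL_OPENGL_ES3_BIT_KHR,
        EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8, EGL14.EGL_BLUE_SIZE, 8,
        EGL14.EGL_ALPHA_SIZE, 8,
        EGL14.EGL_NONE
    )
    val configs = arrayOfNulls<EGLConfig>(1)
    val numConfigs = IntArray(1)
    EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1, numConfigs, 0)

    // Sharing with flutter_gl's context makes texture names visible in both contexts.
    val shareContext = ThreeEgl.getContext("shareContext")
    val contextAttribs = intArrayOf(EGL14.EGL_CONTEXT_CLIENT_VERSION, 3, EGL14.EGL_NONE)
    val eglContext = EGL14.eglCreateContext(display, configs[0], shareContext, contextAttribs, 0)

    // Any surface is enough for off-screen decoding; a 1x1 pbuffer will do.
    val surfaceAttribs = intArrayOf(EGL14.EGL_WIDTH, 1, EGL14.EGL_HEIGHT, 1, EGL14.EGL_NONE)
    val pbuffer = EGL14.eglCreatePbufferSurface(display, configs[0], surfaceAttribs, 0)
    EGL14.eglMakeCurrent(display, pbuffer, pbuffer, eglContext)
    return eglContext
}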

Thank you for the answer!! I am happy to modify the video_player sources (I already did, to expose the textureId member variable). However, there are some issues I don't quite understand:

  • What is ThreeEgl? I know there are some Java classes inside of threeegl.aar, but I was not able to find any documentation on it.
  • I can use the video_player textureId to render the video texture into a Texture Widget in flutter, so it seems like the textures of video_player and flutter are compatible.
  • I can use the textures generated in flutter_gl on a Texture Widget in flutter, so it seems like the textures of flutter_gl and flutter are compatible.
  • However, I cannot use the texture from video_player in flutter_gl -- why is that? It almost seems like textures are attached to an individual GLContext (unless that context is shared)... But under the hood the textures are Surfaces, which can be shared more easily? What really is a Surface as it relates to textures?

Specifically, the way that video_player creates a texture is not via OpenGL calls, but via the TextureRegistry. The corresponding call can be found here.
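
For reference, the registration pattern I am talking about looks roughly like this (a sketch of the usual Flutter Android embedding TextureRegistry API, not the actual video_player source):

import android.view.Surface
import io.flutter.view.TextureRegistry

fun registerVideoSurface(textureRegistry: TextureRegistry): Pair<Long, Surface> {
    // Flutter creates and owns the SurfaceTexture; the plugin only gets an entry for it.
    val entry: TextureRegistry.SurfaceTextureEntry = textureRegistry.createSurfaceTexture()
    val surface = Surface(entry.surfaceTexture())   // the decoder renders into this Surface
    // entry.id() is the textureId that the Dart side passes to the Texture widget.
    return Pair(entry.id(), surface)
}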

Or are you saying that I could create a surface, like you describe in step 3, and then pass that surface to the video_player here?

ThreeEgl is just a library for keeping the shared OpenGL context, so that other Flutter plugins can obtain the shared OpenGL context through it.

Yes, you need to use a new Surface to replace video_player's Surface.

1. Create a new thread for video decoding.
2. Create a new OpenGL context with the shareContext and bind it to the new thread.
3. Convert the video texture to a normal texture, i.e. convert GL_TEXTURE_EXTERNAL_OES to GL_TEXTURE_2D.
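
The conversion needs a fragment shader that samples the external texture with samplerExternalOES, roughly like this (a sketch of what OpenGLProgram.getProgramOES() compiles; the varying name below is only illustrative):

// Fragment shader for copying the external (OES) video texture into a normal
// GL_TEXTURE_2D through an FBO. samplerExternalOES requires the OES extension directive.
val oesFragmentShader = """
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTextureCoords;
    uniform samplerExternalOES Texture0;
    void main() {
        gl_FragColor = texture2D(Texture0, vTextureCoords);
    }
""".trimIndent()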

The following code may help:

package com.futouapp.texture_render

import android.content.Context
import android.graphics.Bitmap
import android.graphics.SurfaceTexture
import android.media.*
import android.opengl.*
import android.opengl.GLES32.*
import android.opengl.GLU.gluErrorString
import android.os.*
import android.provider.MediaStore
import android.util.Log
import android.view.Surface
import com.futouapp.threeegl.ThreeEgl
import java.io.BufferedOutputStream
import java.io.FileOutputStream
import java.io.IOException
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.FloatBuffer
import java.util.*
import java.util.concurrent.Semaphore
import kotlin.collections.ArrayList
import kotlin.math.pow
import kotlin.math.roundToInt


class VideoOutput : SurfaceTexture.OnFrameAvailableListener {


    private var filePath: String? = null
    private var surface: Surface? = null

    private lateinit var mediaExtractor: MediaExtractor;
    private lateinit var videoDecoder: MediaCodec;
    private lateinit var videoMediaFormat: MediaFormat;

    lateinit var context: Context;

    lateinit var surfaceTexture: SurfaceTexture;


    var waitDecodeFrames: MutableList<Double> = ArrayList();
    var videoFramesFrameDecoder: VideoFrameDecoder? = null;

    var degrees: Int = 0;
    private var duration: Double = 0.0
    private var playing = false;
    private var seeking = false;
    private var seekingTo: Long = 0;

    var width: Int = 0
    var height: Int = 0


    var renderWidth: Int;
    var renderHeight: Int;

    var screenScale: Float = 1.0f;


    var textureId: Int? = null;

    var openGLProgram: OpenGLProgram? = null;

    lateinit var vertexBuffer1: FloatBuffer;
    lateinit var textureBuffer1: FloatBuffer;

    lateinit var vertexBuffer2: FloatBuffer;
    lateinit var textureBuffer2: FloatBuffer;


    lateinit var vertexBuffer90: FloatBuffer;
    lateinit var textureBuffer90: FloatBuffer;

    lateinit var vertexBuffer180: FloatBuffer;
    lateinit var textureBuffer180: FloatBuffer;

    lateinit var vertexBuffer270: FloatBuffer;
    lateinit var textureBuffer270: FloatBuffer;

    lateinit var vertexBuffer0: FloatBuffer;
    lateinit var textureBuffer0: FloatBuffer;

    var tmpFrameBuffer = IntArray(1);
    var tmpTextures = IntArray(1);


    var videoBufferInfo = MediaCodec.BufferInfo();

    var disposed: Boolean = false;

    var running: Boolean = false;
    var currentVideoTime: Long = -999999;
    var readEnd = false

    var renderToVideo: Boolean = false;


    var syncSampleTimes = ArrayList<Long>();
    var lastRenderTime: Long = 0L;

    var videoFps: Int = 25;

    // microseconds
    var videoFrameDuration: Long = 40 * 1000L;

    var decoderThread: HandlerThread = HandlerThread("decoderThread");
    private var decoderThreadHandler: Handler? = null;

    var lastTime: Long = 0L;

    var ready = false;

    lateinit var eglEnv: EglEnv;

    constructor(filePath: String, renderWidth: Int, renderHeight: Int, renderToVideo: Boolean) {
        this.filePath = filePath;
        this.renderToVideo = renderToVideo;
        this.renderWidth = renderWidth;
        this.renderHeight = renderHeight;
    }

    fun getInfo(): Map<String, Any> {
        var info = mutableMapOf<String, Any>();

        playVideo(0.0, "running");

        this.executeSync {
            info["width"] = this.width;
            info["height"] = this.height;

            val _textureID = getTextureAt(0.0);

            info["texture"] = _textureID!!;
        }



        return info;
    }


    // asynchronous: post the task to the decoder thread without waiting
    fun execute(task: () -> Unit) {
        decoderThreadHandler?.post {
            task.invoke()
        }
    }

    fun executeSync(task: () -> Unit) {
        val semaphore = Semaphore(0)
        decoderThreadHandler?.post {
            task.invoke()
            semaphore.release()
        }
        semaphore.acquire()
    }

    fun setup() {
        decoderThread.start()
        decoderThreadHandler = Handler(decoderThread.looper)

        this.executeSync {
            this.setup2();
        }
    }

    fun setup2() {

        this.videoFramesFrameDecoder = VideoFrameDecoder(this.filePath!!, renderWidth, renderHeight);

        this.openGLProgram = OpenGLProgram();

        screenScale = 1.0f;


        mediaExtractor = MediaExtractor()
        try {
            mediaExtractor.setDataSource(filePath!!)
        } catch (e: IOException) {
            e.printStackTrace()
        }

        calcVideoInfo();

        var _shareContext = ThreeEgl.getContext("shareContext");
        
        this.eglEnv = EglEnv();
        eglEnv.setupRender(_shareContext!!)
        eglEnv.buildOffScreenSurface(width, height);

        eglEnv.makeCurrent();
        glViewport(0, 0, width, height);


        this.createSurfaceAndTexture();

        initVideoDecoder();


        setupFBO();

        setupVBO0();
        setupVBO1();
        setupVBO2();


        setupVBO90();
        setupVBO180();
        setupVBO270();

        checkGlError("mediacodec video output setup ")

        getAllKeyframeTimes();

        ready = true;
    }


    fun initVideoDecoder() {
        for (i in 0 until mediaExtractor.trackCount) {
            val mediaFormat = mediaExtractor.getTrackFormat(i)
            val mime = mediaFormat.getString(MediaFormat.KEY_MIME)!!

            if (mime.startsWith(KEY_VIDEO)) { // match the video track
                val videoWidth = mediaFormat.getInteger(MediaFormat.KEY_WIDTH)
                val videoHeight = mediaFormat.getInteger(MediaFormat.KEY_HEIGHT)


                mediaExtractor.selectTrack(i) // select the video track
                videoMediaFormat = mediaFormat;

                videoFps = videoMediaFormat.getInteger(MediaFormat.KEY_FRAME_RATE);

                videoFrameDuration = (1000 * 1000 / videoFps).toLong();

//                println(" videoFps: ${videoFps} ");

                try {
                    videoDecoder = MediaCodec.createDecoderByType(mime)
                    showSupportedColorFormat(videoDecoder.getCodecInfo().getCapabilitiesForType(mime));
                    videoMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);

                    // a flags value of 1 would configure the codec as an encoder; 0 means decoder
                    videoDecoder.configure(videoMediaFormat, this.surface, null, 0)
                    videoDecoder.start()
                } catch (e: IOException) {
                    e.printStackTrace()
                }
                break
            }
        }
    }

    private fun showSupportedColorFormat(caps: MediaCodecInfo.CodecCapabilities) {
        print("supported color format: ")
        for (c in caps.colorFormats) {
            print(c.toString() + "\t")
        }
        println()
    }

    fun calcVideoInfo() {
        val retriever = MediaMetadataRetriever()
        retriever.setDataSource(filePath)

        degrees = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION)!!.toInt()
        duration = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)!!.toDouble();

        val _width = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH)!!.toDouble()
        val _height = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT)!!.toDouble()

        retriever.release()

        var nw = _width;
        var nh = _height;

        if (degrees == 90 || degrees == 270) {
            nw = _height;
            nh = _width;
        }

//        println("1 calcVideoInfo width: ${nw} height: ${nh} degrees: ${degrees}  ");
//        println("filePath: ${filePath}");

        this.width = nw.toInt();
        this.height = nh.toInt();
    }

    fun getAllKeyframeTimes() {

        syncSampleTimes.add(0L);

        while (true) {
            if (mediaExtractor!!.sampleFlags == MediaExtractor.SAMPLE_FLAG_SYNC)
                syncSampleTimes.add(mediaExtractor!!.sampleTime)

            if (!mediaExtractor!!.advance())
                break
        }

        lastTime = syncSampleTimes.last();

        syncSampleTimes.reverse();

        println(" getAllKeyframeTimes =============>>>: ${syncSampleTimes.count()} ");
//        println(syncSampleTimes);

    }

    fun isPlaying(): Boolean {
        return playing;
    }


    fun createSurfaceAndTexture() {

        val textureIds = IntArray(1)
        glGenTextures(1, textureIds, 0)


        if (EGL14.eglGetCurrentContext().equals(EGL14.EGL_NO_CONTEXT)) {
            println(" ------------------- EGL_NO_CONTEXT ---------------------- ")
        }


        if (textureIds[0] <= 0) {
            val error: Int = glGetError();
            val info = gluErrorString(error);

            println(" glGenTextures error ${info} ")

            glGenTextures(1, textureIds, 0)
            println(" createSurfaceAndTexture2: ${textureIds[0]} ");
            return;
        }


        textureId = textureIds[0]

        glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId!!)
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, null);


        glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
        glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR)

        // Create the SurfaceTexture and Surface, bind them to the decoder, and receive frame-available callbacks
        surfaceTexture = SurfaceTexture(textureId!!)
        surfaceTexture.setDefaultBufferSize(width, height);

        println("surfaceTexture.setDefaultBufferSize width: ${width} ${height} ")

        surfaceTexture.setOnFrameAvailableListener(this)
        surface = Surface(surfaceTexture)
    }


    fun playVideo(time: Double, status: String) {
        playing = status == "playing"
        seek(time);
    }


    fun pause() {
        playing = false;
        running = false;
    }


    fun seek(time: Double) {
//        println(" seeking.... ");

        if (!ready) {
            return;
        }

        playing = false;
        seekingTo = (time * 1000 * 1000).toLong();


        this.videoRunner(seekingTo);
    }

    fun getPrevTime(time: Long): Long {
        var _t = time;
        if (_t > lastTime) {
            _t = lastTime;
        }
        if (_t < 0) {
            _t = 0;
        }

//        println("getPrevTime time: ${time}  ");

        if(syncSampleTimes.count() > 0) {
            return syncSampleTimes.first { it <= _t };
        } else {
            return 0L;
        }
    }

    override fun onFrameAvailable(surfaceTexture: SurfaceTexture?) {
        this.surfaceTexture.updateTexImage();

    }

    fun getTextureAt(time: Double): Int? {

//        println("getTextureAt: ${time} ready: ${ready} ");

        if (!ready) {
            return null;
        }
        var lastTextures = convertTexture();

//        println(" lastTextures: ${lastTextures} ")

        return lastTextures?.get(0);
    }

    fun videoRunner(time: Long) {
        var _start = System.currentTimeMillis();

//        println(" time: ${time} videoRunner d: ${_start - lastRenderTime} ");


        var _tf = ((time / 1000.0 / 1000.0) * videoFps).roundToInt();
        var _vf = ((currentVideoTime / 1000.0 / 1000.0) * videoFps).roundToInt();

//        println(" _tf: ${_tf} _vf: ${_vf} ");

        if (_tf == _vf) {
            return;
        }

        var targetTime = (_tf * (1000 * 1000.0 / videoFps)).toLong();

        this.executeSync {
            decodeVideo(targetTime);
        }

//        var _start2 = System.currentTimeMillis();
//        println("videoRunner cost : ${_start2 - _start}  ")
    }

    fun decodeVideo(time: Long) {
//        println(" decodevideo ${time} ------------------- ")

        running = true;

        if (readEnd) {
            videoDecoder.flush();
            currentVideoTime = -99999;
            readEnd = false;
        }

        var _prevTime = getPrevTime(time)

        if (currentVideoTime >= _prevTime && currentVideoTime < time) {
            // no seek needed; keep decoding forward
//            mediaExtractor.seekTo(_prevTime, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
        } else {
            if(currentVideoTime > _prevTime) {
                videoDecoder.flush();
            }

            mediaExtractor.seekTo(_prevTime, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
        }


        var _start = System.currentTimeMillis();

        var doRender = false
        var isTimeout = false;
        var timeOutCount = 0;

//
//        println("seekto: ${time}  _prevTime: ${_prevTime} currentVideoTime: ${currentVideoTime} ");
//        println("mediaExtractor time: ${mediaExtractor.sampleTime}  ")


        while (running) {
            if (disposed) {
                break;
            }

            var mt = mediaExtractor.sampleTime

            var _start1 = System.currentTimeMillis();
            var _cost = _start1 - _start;

            if (_cost > 2000) {
                println(" decode time: ${time} cost: ${_cost} time out mt: ${mt} ")
                break;
            }


            val outputBufferIndex = videoDecoder.dequeueOutputBuffer(videoBufferInfo, TIMEOUT_US)


//            println("dequeueOutputBuffer t: ${time}  sampleTime: ${mt}   ")

            if (outputBufferIndex >= 0) {
                if (videoBufferInfo.size > 0) {
                    val currentSampleTime = videoBufferInfo.presentationTimeUs

                    currentVideoTime = currentSampleTime;

                    val diffTime = time - currentSampleTime;

                    if (isTimeout) {
//                        println(" dorendertime: ${time} currentSampleTime: ${currentSampleTime} diffTime: ${diffTime}")
                    }

                    if (diffTime < videoFrameDuration) {
                        doRender = true;
                    } else {
                        doRender = false;
                    }
                }

                // render and release the output buffer
                videoDecoder.releaseOutputBuffer(outputBufferIndex, doRender)

                if (doRender) {
                    var _start1 = System.currentTimeMillis();
                    if (VERBOSE) println("time: ${time} doRender true: ${_start1 - _start}  ");
                    break;
                }

            } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                Log.v(TAG, "format changed")
            } else if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
//                Log.v(TAG, "video decode timed out on the current frame, time: ${time} lastTime: ${lastTime} currentVideoTime: ${currentVideoTime} _prevTime: ${_prevTime} ")
            }

//            println("1 readEnd: ${readEnd} ")

            if (!readEnd) {
                readEnd = putBufferToCoder(mediaExtractor, videoDecoder)
            }

//            println("2 readEnd: ${readEnd} videoBufferInfo.flags: ${videoBufferInfo.flags}   MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0: ${ MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0}")

            if (videoBufferInfo.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) {
                Log.v(TAG, "buffer stream end time: ${time}");
                currentVideoTime = -99999;
                videoDecoder.flush();
                mediaExtractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);
                readEnd = true;
                break
            }
        }

//        println("decode end : ${time}  ");

        running = false;
    }


    /**
     * Feed a buffer of sample data from the extractor to the decoder.
     *
     * @param extractor
     * @param decoder
     * @return true if the end of the file has been reached; false otherwise
     */
    private fun putBufferToCoder(extractor: MediaExtractor, decoder: MediaCodec): Boolean {
//        println(" putBufferToCoder ..........  ");
        var isMediaEnd = false
        val inputBufferIndex = decoder.dequeueInputBuffer(TIMEOUT_US)
        if (inputBufferIndex >= 0) {
            val inputBuffer = decoder.getInputBuffer(inputBufferIndex);

            val sampleSize = extractor.readSampleData(inputBuffer!!, 0)

//            println(" putBufferToCoder sampleSize: ${sampleSize}..........  ");

            if (sampleSize < 0) {
                decoder.queueInputBuffer(inputBufferIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
                isMediaEnd = true
            } else {
                val flag = extractor.sampleFlags
                decoder.queueInputBuffer(inputBufferIndex, 0, sampleSize, extractor.sampleTime, 0)
                extractor.advance()
            }
        }
        return isMediaEnd
    }



    fun durationInDouble(): Double {
        return duration;
    }

    fun dispose() {
        decoderThread.quit();

        disposed = true;
    }

    // convert GL_TEXTURE_EXTERNAL_OES to GL_TEXTURE_2D
    fun convertTexture(): IntArray? {
        // texture ID of the external (OES) video texture
        var textureID0 = textureId!!;


        var _tmpProgram = OpenGLProgram().getProgramOES();


        glUseProgram(_tmpProgram)

        glBindFramebuffer(GL_FRAMEBUFFER, tmpFrameBuffer[0]);


        val _filterTexture0Uniform = glGetUniformLocation(_tmpProgram, "Texture0");

        glActiveTexture(GL_TEXTURE0);

        glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureID0);

        glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

//        checkGlError("mediacodec video output convertTexture 3 ")

        glUniform1i(_filterTexture0Uniform, 0);
        // Activate texture unit GL_TEXTURE0 and bind the texture.
        // The sampler uniform is set to 0, which corresponds to GL_TEXTURE0;
        // if 1 were used here, GL_TEXTURE1 would have to be used above.

        val _positionSlot = glGetAttribLocation(_tmpProgram, "Position")
        val _textureSlot = glGetAttribLocation(_tmpProgram, "TextureCoords")
        glEnableVertexAttribArray(_positionSlot);
        glEnableVertexAttribArray(_textureSlot);

//        println(" convertTexture degrees ${degrees}  wdith: ${width} height: ${height} ")

        //TODO
//        if (degrees == 90) {
//            vertexBuffer90.position(0);
//            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer90);
//            textureBuffer90.position(0);
//            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer90);
//        } else if (degrees == 180) {
//            vertexBuffer180.position(0);
//            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer1);
//            textureBuffer180.position(0);
//            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer1);
//        } else if (degrees == 270) {
//            vertexBuffer270.position(0);
//            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer270);
//            textureBuffer270.position(0);
//            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer270);
//        } else {

//            vertexBuffer2.position(0);
//            textureBuffer2.position(0);
//
//            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer2);
//            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer2);
//        }

        if(degrees == 0) {
            vertexBuffer0.position(0);
            textureBuffer0.position(0);

            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer0);
            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer0);
        } else if(degrees == 90) {
            vertexBuffer1.position(0);
            textureBuffer1.position(0);

            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer1);
            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer1);
        } else if(degrees == 180) {
            vertexBuffer180.position(0);
            textureBuffer180.position(0);

            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer180);
            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer180);
        } else if(degrees == 270) {
            vertexBuffer270.position(0);
            textureBuffer270.position(0);

            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer270);
            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer270);
        } else {
            println(" VideoOutput degrees is not support ${degrees} ");
            vertexBuffer0.position(0);
            textureBuffer0.position(0);

            glVertexAttribPointer(_positionSlot, 3, GL_FLOAT, false, 0, vertexBuffer0);
            glVertexAttribPointer(_textureSlot, 2, GL_FLOAT, false, 0, textureBuffer0);
        }

//        println(" VideoOutput degrees is  ${degrees} ");

        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
        glFinish();

//        if(seekingTo < 10 * 1000 * 1000L) {
//            saveFrame();
//        }

//        saveFrame();

        return tmpTextures;
    }

    fun decodeVideoAt(filePath: String, time: Double) : Int {

        var _texture: Int = 0;

        seek(time);

        this.executeSync {
            _texture = getTextureAt(0.0)!!;
        }

        return _texture;
    }


    fun saveFrame() {

        var filename = "${seekingTo}.png";

        var mPixelBuf = ByteBuffer.allocateDirect(width * height * 4);
        mPixelBuf.order(ByteOrder.LITTLE_ENDIAN);

        mPixelBuf.rewind()
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, mPixelBuf)

        println(" mPixelBuf ------------------------------------------------ ")



        println(" ${mPixelBuf.get(0)} ${mPixelBuf.get(1)} ${mPixelBuf.get(2)} ${mPixelBuf.get(3)} ${mPixelBuf.get(4)} ")

        var bos: BufferedOutputStream? = null;

        try {
            var path = TextureRenderPlugin.context.getExternalFilesDir(null)!!.getAbsolutePath();
            bos = BufferedOutputStream(FileOutputStream("${path}${filename}"))
            val bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
            mPixelBuf.rewind()
            bmp.copyPixelsFromBuffer(mPixelBuf)
            bmp.compress(Bitmap.CompressFormat.PNG, 90, bos)
            saveBitmapTofile(bmp, filename);


            bmp.recycle()


        } finally {
            if (bos != null) bos!!.close()
        }

    }

    fun saveBitmapTofile(bmp: Bitmap, filename: String) {
        if (bmp == null || filename == null) {
            return;
        }

        val context = TextureRenderPlugin.context;

        MediaStore.Images.Media.insertImage(
                context.contentResolver,
                bmp,
                filename,
                "Image of $filename"
        )
    }


    fun setupFBO() {
        glGenFramebuffers(1, tmpFrameBuffer, 0);
        glGenTextures(1, tmpTextures, 0);


        var glWidth = this.width;
        var glHeight = this.height;

        glActiveTexture(GL_TEXTURE10);
        glBindTexture(GL_TEXTURE_2D, tmpTextures[0]);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, glWidth, glHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, null);

        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glBindFramebuffer(GL_FRAMEBUFFER, tmpFrameBuffer[0]);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tmpTextures[0], 0);
        // glFramebufferTexture2D attaches the texture as the framebuffer's color output, so the shader's draw result becomes the texture contents.

    }

    fun getValueFor(value: Int): Int {
        var i: Int = 1;

        while (2.0.pow(i) < value) {
            i = i + 1;
        }

        return 2.0.pow(i).toInt();
    }


    fun setupVBO2() {
        var w: Float = 1.0f;
        var h: Float = 1.0f;

        var verticesPoints = floatArrayOf(-w, -h, 0.0f, w, -h, 0.0f, -w, h, 0.0f, w, h, 0.0f);
        var texturesPoints = floatArrayOf(0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f);


        // create the vertex buffer
        vertexBuffer2 = BufferUtils.createFloatBuffer(verticesPoints);

        // set the texture coordinate data
        textureBuffer2 = BufferUtils.createFloatBuffer(texturesPoints);
    }

    fun setupVBO0() {
        var w: Float = 1.0f;
        var h: Float = 1.0f;

        var verticesPoints = floatArrayOf(-w, h, 0.0f, w, h, 0.0f, -w, -h, 0.0f, w, -h, 0.0f);
        var texturesPoints = floatArrayOf(0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f);

        // create the vertex buffer
        vertexBuffer0 = BufferUtils.createFloatBuffer(verticesPoints);

        // set the texture coordinate data
        textureBuffer0 = BufferUtils.createFloatBuffer(texturesPoints);
    }

    fun setupVBO1() {
        var w: Float = 1.0f;
        var h: Float = 1.0f;

        var verticesPoints = floatArrayOf( w, h, 0.0f, w, -h, 0.0f, -w, h, 0.0f, -w, -h, 0.0f );
        var texturesPoints = floatArrayOf( 0.0f, 0.0f,  1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f );

        // create the vertex buffer
        vertexBuffer1 = BufferUtils.createFloatBuffer(verticesPoints);

        // set the texture coordinate data
        textureBuffer1 = BufferUtils.createFloatBuffer(texturesPoints);
    }

    fun setupVBO270() {
        var w: Float = 1.0f;
        var h: Float = 1.0f;

        var verticesPoints = floatArrayOf(-w, -h, 0.0f, -w, h, 0.0f,  w, -h, 0.0f, w, h, 0.0f

        );
        var texturesPoints = floatArrayOf( 0.0f,0.0f, 1.0f,0.0f, 0.0f,1.0f, 1.0f,1.0f);

        // create the vertex buffer
        vertexBuffer270 = BufferUtils.createFloatBuffer(verticesPoints);

        // set the texture coordinate data
        textureBuffer270 = BufferUtils.createFloatBuffer(texturesPoints);
    }


    fun setupVBO90() {
        var w: Float = 1.0f;
        var h: Float = 1.0f;

        var verticesPoints = floatArrayOf(w, -h, 0.0f, w, h, 0.0f, -w, -h, 0.0f, -w, h, 0.0f);
        var texturesPoints = floatArrayOf(0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f);

        // create the vertex buffer
        vertexBuffer90 = BufferUtils.createFloatBuffer(verticesPoints);

        // set the texture coordinate data
        textureBuffer90 = BufferUtils.createFloatBuffer(texturesPoints);
    }

    fun setupVBO180() {
        var w: Float = 1.0f;
        var h: Float = 1.0f;

        var verticesPoints = floatArrayOf(-w, h, 0.0f, w, h, 0.0f, -w, -h, 0.0f, w, -h, 0.0f);
        var texturesPoints = floatArrayOf(0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f);

        // create the vertex buffer
        vertexBuffer180 = BufferUtils.createFloatBuffer(verticesPoints);

        // set the texture coordinate data
        textureBuffer180 = BufferUtils.createFloatBuffer(texturesPoints);
    }


    // check whether a GL operation produced an error
    fun checkGlError(op: String) {
        val error: Int = glGetError();
        if (error != GL_NO_ERROR) {
            Log.e("ES20_ERROR", "$op: glError $error")
            throw RuntimeException("$op: glError $error")
        }
    }


    companion object {
        private const val VERBOSE = true;
        private const val TAG = "hwtPlay"
        private const val KEY_VIDEO = "video/"
        private const val KEY_AUDIO = "audio/"
        private const val TIMEOUT_US: Long = 0L;
    }
}
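
Usage from the plugin side then looks roughly like this (an illustrative sketch; the real TextureRenderPlugin method-channel handler may differ):

import io.flutter.plugin.common.MethodChannel

// Illustrative: create the decoder, run setup on its own thread, read back the
// converted GL_TEXTURE_2D id, and return it to Dart through the MethodChannel result.
fun handleLoadVideo(filePath: String, renderWidth: Int, renderHeight: Int, result: MethodChannel.Result) {
    val output = VideoOutput(filePath, renderWidth, renderHeight, false)
    output.setup()               // blocks until the decoder thread and GL state are ready
    // getInfo() decodes frame 0 and returns "width", "height" and "texture",
    // where "texture" is the converted GL_TEXTURE_2D id from convertTexture().
    val info = output.getInfo()
    result.success(info)         // the Dart side can now use info["texture"] with flutter_gl
}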