google / filament

Filament is a real-time physically based rendering engine for Android, iOS, Windows, Linux, macOS, and WebGL2

Home Page: https://google.github.io/filament/

Why does the second triangle not render the camera texture? Help!

steven-gao opened this issue · comments

Describe the bug
Hello everyone,
why does the second triangle not render the camera texture? Help me, thank you very much!
To Reproduce
Steps to reproduce the behavior:

  1. I adapted the sample-hello-camera project, wanting to render the camera stream as a background, but I failed.

Expected behavior
The camera stream is fully rendered (on both triangles of the quad).
Screenshots
Screenshot_20220317-155750_Hello Camera

Logs
My MainActivity.kt is as follows; if the code is not written correctly, please correct me:

```kotlin
class MainActivity : Activity(), ActivityCompat.OnRequestPermissionsResultCallback {
companion object {
init {
Filament.init()
}
}
private lateinit var surfaceView: SurfaceView
private lateinit var uiHelper: UiHelper
private lateinit var displayHelper: DisplayHelper
private lateinit var choreographer: Choreographer
private lateinit var engine: Engine
private lateinit var renderer: Renderer
private lateinit var scene: Scene
private lateinit var view: View
// This helper wraps the Android camera2 API and connects it to a Filament material.
private lateinit var cameraHelper: CameraHelper
// This is the Filament camera, not the phone camera. :)
private lateinit var camera: Camera
// Other Filament objects:
private lateinit var material: Material
private lateinit var materialInstance: MaterialInstance
private lateinit var vertexBuffer: VertexBuffer
private lateinit var indexBuffer: IndexBuffer
// Filament entity representing a renderable object
@Entity private var renderable = 0
@Entity private var light = 0
// A swap chain is Filament's representation of a surface
private var swapChain: SwapChain? = null
// Performs the rendering and schedules new frames
private val frameScheduler = FrameCallback()
private val animator = ValueAnimator.ofFloat(0.0f, 50.0f)

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    surfaceView = SurfaceView(this)
    setContentView(surfaceView)
    if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.CAMERA), 1)
    } else {
        initViews()
    }
}

fun initViews(){
    choreographer = Choreographer.getInstance()
    displayHelper = DisplayHelper(this)
    setupSurfaceView()
    setupFilament()
    setupView()
    setupScene()
    cameraHelper = CameraHelper(this, engine,materialInstance)
    cameraHelper.openCamera()
}

private fun setupSurfaceView() {
    uiHelper = UiHelper(UiHelper.ContextErrorPolicy.DONT_CHECK)
    uiHelper.renderCallback = SurfaceCallback()
    uiHelper.attachTo(surfaceView)
}

private fun setupFilament() {
    engine = Engine.create()
    renderer = engine.createRenderer()
    scene = engine.createScene()
    view = engine.createView()
    camera = engine.createCamera(engine.entityManager.create())
}

private fun setupView() {
    scene.skybox = Skybox.Builder().color(0.035f, 0.035f, 0.035f, 1.0f).build(engine)
    view.camera = camera
    view.scene = scene
}

private fun setupScene() {
    loadMaterial()
    setupMaterial()
    createTriangle()
    // To create a renderable we first create a generic entity
    renderable = EntityManager.get().create()

    RenderableManager.Builder(1)
        .castShadows(false)
        .receiveShadows(false)
        // Disable frustum culling so we don't need to specify a bounding box
        .culling(false)
        .priority(7)
        // Sets the mesh data of the first primitive (two triangles, 6 indices)
        .geometry(0, PrimitiveType.TRIANGLES, vertexBuffer, indexBuffer, 0, 2 * 3)
        // Sets the material of the first primitive
        .material(0, materialInstance)
        .build(engine, renderable)

    scene.addEntity(renderable)

    // We now need a light, let's create a directional light
    light = EntityManager.get().create()
    // Create a color from a temperature (5,500K)
    val (r, g, b) = Colors.cct(5_500.0f)
    LightManager.Builder(LightManager.Type.DIRECTIONAL)
            .color(r, g, b)
            // Intensity of the sun in lux on a clear day
            .intensity(110_000.0f)
            // The direction is normalized on our behalf
            .direction(0.0f, -0.5f, -1.0f)
            .castShadows(true)
            .build(engine, light)

    // Add the entity to the scene to light it
    scene.addEntity(light)
    // Set the exposure on the camera, this exposure follows the sunny f/16 rule
    // Since we've defined a light that has the same intensity as the sun, it
    // guarantees a proper exposure
    camera.setExposure(16.0f, 1.0f / 125.0f, 100.0f)
    // Move the camera back to see the object
    camera.lookAt(0.0, 0.0, 6.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0)
}

private fun loadMaterial() {
    readUncompressedAsset("materials/camera_feed.filamat").let {
        material = Material.Builder().payload(it, it.remaining()).build(engine)
    }
}

private fun setupMaterial() {
    materialInstance = material.createInstance()
}

val CAMERA_UVS = floatArrayOf(1f, 1f, 0f, 1f, 0f, 0f, 0f, 0f, 0f, 0f, 1f, 0f)

private fun createTriangle() {
    val floatSize = 4
    val shortSize = 2
    // A vertex is a position only: 3 floats for XYZ
    // (the UVs are supplied in a separate buffer)
    val vertexSize = 3 * floatSize
    // Define a vertex and a function to put a vertex in a ByteBuffer
    data class Vertex(val x: Float, val y: Float, val z: Float)
    fun ByteBuffer.put(v: Vertex): ByteBuffer {
        putFloat(v.x)
        putFloat(v.y)
        putFloat(v.z)
        return this
    }

    // We are going to generate a quad made of two triangles
    val vertexCount = 4

    val v1 = Vertex(-1f, 1f, 0.0f)
    val v2 = Vertex(-1f, -1f, 0.0f)
    val v3 = Vertex(1f, 1f, 0.0f)
    val v4 = Vertex(1f, -1f, 0.0f)

    val uvBuffer =
        ByteBuffer.allocateDirect(CAMERA_UVS.size * Float.SIZE_BYTES)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer()
    uvBuffer.put(CAMERA_UVS)
    uvBuffer.rewind()

    val vertexData = ByteBuffer.allocate(vertexCount * vertexSize)
        // It is important to respect the native byte order
        .order(ByteOrder.nativeOrder())
        .put(v1)
        .put(v2)
        .put(v3)
        .put(v4)
        // Make sure the cursor is pointing in the right place in the byte buffer
        .flip()

    // Declare the layout of our mesh
    vertexBuffer = VertexBuffer.Builder()
        .bufferCount(2)
        .vertexCount(vertexCount)
        // Positions and UVs live in separate (de-interleaved) buffers,
        // so each attribute gets its own buffer index
        .attribute(VertexAttribute.POSITION, 0, AttributeType.FLOAT3, 0, vertexSize)
        .attribute(VertexAttribute.UV0, 1, AttributeType.FLOAT2,
            0, (CAMERA_UVS.size / vertexCount) * 4)
        .build(engine)

    // Feed the vertex data to the mesh:
    // buffer 0 holds the positions, buffer 1 holds the UVs
    vertexBuffer.setBufferAt(engine, 0, vertexData)
    vertexBuffer.setBufferAt(engine, 1, uvBuffer)

    val indexCount = 6

    // Create the indices
    val indexData = ByteBuffer.allocate(indexCount * shortSize)
        .order(ByteOrder.nativeOrder())
        .putShort(0)
        .putShort(1)
        .putShort(2)
        .putShort(1)
        .putShort(2)
        .putShort(3)
        .flip()
    indexBuffer = IndexBuffer.Builder()
        .indexCount(indexCount)
        .bufferType(IndexBuffer.Builder.IndexType.USHORT)
        .build(engine)
    indexBuffer.setBuffer(engine, indexData)

}


override fun onResume() {
    super.onResume()
    choreographer.postFrameCallback(frameScheduler)
    animator.start()
    cameraHelper.onResume()
}

override fun onPause() {
    super.onPause()
    choreographer.removeFrameCallback(frameScheduler)
    animator.cancel()
    cameraHelper.onPause()
}

override fun onDestroy() {
    super.onDestroy()
    // Stop the animation and any pending frame
    choreographer.removeFrameCallback(frameScheduler)
    animator.cancel()
    // Always detach the surface before destroying the engine
    uiHelper.detach()
    // Cleanup all resources
    engine.destroyEntity(renderable)
    engine.destroyRenderer(renderer)
    engine.destroyVertexBuffer(vertexBuffer)
    engine.destroyIndexBuffer(indexBuffer)
    engine.destroyMaterialInstance(materialInstance)
    engine.destroyMaterial(material)
    engine.destroyView(view)
    engine.destroyScene(scene)
    engine.destroyCameraComponent(camera.entity)
    // Engine.destroyEntity() destroys Filament related resources only
    // (components), not the entity itself
    val entityManager = EntityManager.get()
    entityManager.destroy(light)
    entityManager.destroy(renderable)
    entityManager.destroy(camera.entity)
    // Destroying the engine will free up any resource you may have forgotten
    // to destroy, but it's recommended to do the cleanup properly
    engine.destroy()
}

inner class FrameCallback : Choreographer.FrameCallback {
    @RequiresApi(Build.VERSION_CODES.P)
    override fun doFrame(frameTimeNanos: Long) {
        // Schedule the next frame
        choreographer.postFrameCallback(this)
        // This check guarantees that we have a swap chain
        if (uiHelper.isReadyToRender) {
            cameraHelper.pushExternalImageToFilament()
            // If beginFrame() returns false you should skip the frame
            // This means you are sending frames too quickly to the GPU
            if (renderer.beginFrame(swapChain!!, frameTimeNanos)) {
                renderer.render(view)
                renderer.endFrame()
            }
        }
    }
}

inner class SurfaceCallback : UiHelper.RendererCallback {
    override fun onNativeWindowChanged(surface: Surface) {
        swapChain?.let { engine.destroySwapChain(it) }
        swapChain = engine.createSwapChain(surface)
        displayHelper.attach(renderer, surfaceView.display)
    }

    override fun onDetachedFromSurface() {
        displayHelper.detach()
        swapChain?.let {
            engine.destroySwapChain(it)
            // Required to ensure we don't return before Filament is done executing the
            // destroySwapChain command, otherwise Android might destroy the Surface
            // too early
            engine.flushAndWait()
            swapChain = null
        }
    }

    override fun onResized(width: Int, height: Int) {
        // val aspect = width.toDouble() / height.toDouble()
        // camera.setProjection(45.0, aspect, 0.1, 20.0, Camera.Fov.VERTICAL)
        val zoom = 1.5
        val aspect = width.toDouble() / height.toDouble()
        camera.setProjection(Camera.Projection.ORTHO,
            -aspect * zoom, aspect * zoom, -zoom, zoom, 0.0, 10.0)
        view.viewport = Viewport(0, 0, width, height)
    }
}

private fun readUncompressedAsset(@Suppress("SameParameterValue") assetName: String): ByteBuffer {
    assets.openFd(assetName).use { fd ->
        val input = fd.createInputStream()
        val dst = ByteBuffer.allocate(fd.length.toInt())

        val src = Channels.newChannel(input)
        src.read(dst)
        src.close()

        return dst.apply { rewind() }
    }
}

override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<String>, grantResults: IntArray) {
    // if (!cameraHelper.onRequestPermissionsResult(requestCode, grantResults)) {
    //     this.onRequestPermissionsResult(requestCode, permissions, grantResults)
    // }
}

}
```

Desktop (please complete the following information):

  • OS: macOS
  • GPU: [e.g. NVIDIA GTX 1080]
  • Backend: [OpenGL/Vulkan]

Smartphone (please complete the following information):

  • Device: Samsung S10
  • OS: Android 10

Please use the Discussions tab for questions instead of opening issues. The problem is likely the winding of your second triangle; it's probably back-facing. Change the order of the indices you use.

Try to update:

val indexData = ByteBuffer.allocate(indexCount * shortSize)
    .order(ByteOrder.nativeOrder())
    .putShort(0)
    .putShort(1)
    .putShort(2)
    .putShort(2) // <- second triangle re-wound: 2, 3, 0 instead of 1, 2, 3
    .putShort(3)
    .putShort(0)
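For reference, a quick way to sanity-check winding: under the usual convention that counter-clockwise triangles (in the XY plane, as seen by the camera) are front-facing and back faces are culled, a triangle with negative signed area will not be drawn. The helper below is a minimal debugging sketch, not part of the Filament API:

```kotlin
// Minimal winding check for a triangle in the XY plane (debugging aid only,
// not a Filament API). Assumes the usual convention that counter-clockwise
// triangles are front-facing and back faces are culled.
fun isCounterClockwise(
    x0: Float, y0: Float,
    x1: Float, y1: Float,
    x2: Float, y2: Float
): Boolean {
    // Twice the signed area of the triangle; positive means counter-clockwise
    val signedArea = (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)
    return signedArea > 0f
}
```

With the vertices posted above, the first triangle (indices 0, 1, 2) comes out counter-clockwise, while the original second triangle (indices 1, 2, 3) comes out clockwise, which is why only one of them is visible.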

> Please use the Discussions tab for questions instead of opening issues.

Sorry, it won't be like this next time.

No worries, we just try to keep the two separate so we have a good idea of how many bugs we have to fix :) Thanks!

Thank you, it's true that the order was wrong; that solved it.