Why doesn't a line draw after instantiating it on a thread other than the renderer thread?

August 8, 2015

I’m making a 2D game with OpenGL ES 2.0 on Android, as an excuse to learn a bunch of things. I’m drawing many lines to make a grid, but setting everything up takes time, and I’d like to render a loading screen while the setup happens in the background.

While trying to tackle this, I ran into behavior I can’t explain, so my question is why I’m seeing it.

The code:

This is the class whose instances will represent each line:

This is just a slightly modified example I found on the web. There’s a call to Thread.sleep that’s only meant to increase the time it takes to construct an instance, so that the problem I’m facing is easier to reproduce.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import android.opengl.GLES20;

public class LineExample {

    private final String vertexShaderCode =
                "attribute vec4 vPosition;" +
                "void main() {" +
                "   gl_Position = vPosition;" +
                "}";

    private final String fragmentShaderCode =
                "precision mediump float;" +
                "void main() {" +
                "  gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);" +
                "}";

    private int mProgram;
    private FloatBuffer vertexBuffer;
    private ByteBuffer bb;

    // number of coordinates per vertex in this array
    static final int COORDS_PER_VERTEX = 2;
    private final int vertexStride = COORDS_PER_VERTEX * 4; // bytes per vertex
    private float lineCoords[] = new float[2*COORDS_PER_VERTEX];
    private int vertexCount = lineCoords.length / COORDS_PER_VERTEX;

    public LineExample(float[] lineCoords) {

        try {
            Thread.sleep(5000); // artificial ~5 s delay to make the problem visible
        } catch (InterruptedException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        this.lineCoords = lineCoords;

        // initialize vertex byte buffer for shape coordinates
        bb = ByteBuffer.allocateDirect(
        // (# of coordinate values * 4 bytes per float)
                lineCoords.length * 4);
        // use the device hardware's native byte order
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        // add the coordinates to the FloatBuffer and rewind it
        vertexBuffer.put(lineCoords);
        vertexBuffer.position(0);

        // prepare shaders and OpenGL program
        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER,
                vertexShaderCode);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER,
                fragmentShaderCode);

        mProgram = GLES20.glCreateProgram();             // create empty OpenGL Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // create OpenGL program executables
    }


    public void draw() {

        // Add program to OpenGL environment
        GLES20.glUseProgram(mProgram);

        // get handle to vertex shader's vPosition member
        int mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");

        // Enable a handle to the line vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);

        // Prepare the line coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX,
                                     GLES20.GL_FLOAT, false,
                                     vertexStride, vertexBuffer);

        // Draw
        GLES20.glDrawArrays(GLES20.GL_LINES, 0, vertexCount);

        // Disable vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
    }

    private static int loadShader(int type, String shaderCode){

        // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
        // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
        int shader = GLES20.glCreateShader(type);

        // add the source code to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        return shader;
    }
}

Here’s the renderer:

public class MyRenderer2 implements Renderer {

    LineExample currentLineToRender = null;

    public void onDrawFrame(GL10 unused) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        if (currentLineToRender != null) {
            currentLineToRender.draw();
        }
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        initializeLines(); // or initializeLinesSameThread();
    }

    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    }

    private void initializeLines(){
        float[] waitingLineVerts = {-0.5f,0.0f , 0.5f,0.0f};
        currentLineToRender = new LineExample(waitingLineVerts);
        new Thread(){
            public void run() {
                float[] doneLineVerts = {0.0f,-0.5f , 0.0f,0.5f};
                currentLineToRender = new LineExample(doneLineVerts);
            }
        }.start();
    }

    private void initializeLinesSameThread(){
        float[] waitingLineVerts = {-0.5f,0.0f , 0.5f,0.0f};
        currentLineToRender = new LineExample(waitingLineVerts);
        float[] doneLineVerts = {0.0f,-0.5f , 0.0f,0.5f};
        currentLineToRender = new LineExample(doneLineVerts);
    }
}

The goal is to see a horizontal line for 5 seconds (analogous to a loading screen), then see a vertical line (analogous to the actual game grid). However, with this code I get a dark screen for ~10 seconds and then the second line is drawn.

I figured the thread was busy with the initialization code, so it never drew the first line. To let it reach the onDrawFrame method and draw the first line, I moved the initialization of the second line to another thread (by calling initializeLines() instead of initializeLinesSameThread() in onSurfaceCreated). But with that change, I only see the first line and never get to see the second.

My guess is that some things that are done in the other thread aren’t accessible by the renderer thread when the line’s draw method is called.

My question is why is this happening?

(and I’d be grateful if you could give me a suggestion on how to achieve what I want to do)

2 Responses to “Why doesn't a line draw after instantiating it on a thread other than the renderer thread?”

  1. It turns out you can’t use GLES20 static methods outside of the renderer’s thread. Even though the lines are being “drawn” on the renderer thread, I was using OpenGL calls to prepare their shader programs during initialization, which was happening on another thread. Those calls appear to always return 0 when made off the GL thread, which is why the line wouldn’t be drawn later, even though draw() itself ran on the renderer thread.

    I’ve had success using the renderer’s onDrawFrame method to initialize some assets if needed and show the loading screen at the same time but Shivan Dragon’s suggestion is probably much more reasonable for most cases.
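    The hand-off pattern described above can be sketched in plain Java. This is a minimal, hypothetical model with no Android or OpenGL dependencies (all class and field names here are invented for illustration): a background thread does the slow, non-GL preparation, then queues the GL-dependent construction onto a task queue that the render loop drains at the start of each frame — the same idea as GLSurfaceView.queueEvent, which hands a Runnable to the GL thread.

    ```java
    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.concurrent.CountDownLatch;

    public class RenderThreadHandOff {
        // Tasks queued from other threads, to be executed on the render thread
        // (same idea as GLSurfaceView.queueEvent).
        private final ConcurrentLinkedQueue<Runnable> pendingGlTasks =
                new ConcurrentLinkedQueue<>();

        // Stands in for currentLineToRender; volatile so the render loop
        // sees writes made from the queued task.
        volatile String currentLine = "loading-line";

        void queueEvent(Runnable r) {
            pendingGlTasks.add(r);
        }

        // Called once per frame on the render thread, like onDrawFrame.
        void onDrawFrame() {
            Runnable task;
            while ((task = pendingGlTasks.poll()) != null) {
                task.run(); // GL-dependent work happens here, on the render thread
            }
            // ... draw currentLine ...
        }

        void startBackgroundInit(CountDownLatch done) {
            new Thread(() -> {
                // Slow, non-GL preparation can safely happen off-thread.
                float[] verts = {0.0f, -0.5f, 0.0f, 0.5f};
                // Only the GL-dependent construction is deferred to the render thread.
                queueEvent(() -> {
                    currentLine = "game-line built from " + verts.length + " coords";
                    done.countDown();
                });
            }).start();
        }

        public static void main(String[] args) throws InterruptedException {
            RenderThreadHandOff r = new RenderThreadHandOff();
            CountDownLatch done = new CountDownLatch(1);
            r.startBackgroundInit(done);
            // Simulated render loop: keep drawing frames until the new line is in.
            while (done.getCount() > 0) {
                r.onDrawFrame();
                Thread.sleep(10);
            }
            System.out.println(r.currentLine); // game-line built from 4 coords
        }
    }
    ```

    With this split, nothing that needs the GL context ever runs off the renderer thread, while the loading screen keeps drawing because onDrawFrame is never blocked by the slow preparation.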

  2. If you don’t care that the loading screen is not drawn in OpenGL, you can just make a plain View which displays the image corresponding to your loading screen, display that view and then start the view that holds your OpenGL surface. Once that view is loaded, hide the loading screen view.
