Ran into a mind-boggling problem with OpenGL.

After I finally managed to draw a circle using LWJGL with OpenGL, I moved my code from glBegin/glEnd calls to VBOs, and today I tried porting it to shaders. That's where the problem appeared.

I draw the circle by slicing it into a number of segments and calculating the coordinates of the points on the edge, then drawing them with GL_TRIANGLE_FAN or GL_LINE_LOOP. As an arbitrary test color I set the red channel to cos(theta), where the i-th theta is 2 * pi * i divided by the total number of segments.
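Roughly, the vertex data I build looks like this (a minimal sketch of the idea, not my exact code; the interleaved x, y, r, g, b layout and the names are just for illustration):

    // Hypothetical sketch: interleaved x, y, r, g, b floats for a GL_TRIANGLE_FAN,
    // center vertex first, red = cos(theta) for each rim vertex.
    float[] buildCircle(int segments, float radius) {
        float[] data = new float[(segments + 2) * 5]; // 1 center + (segments + 1) rim vertices
        int k = 0;
        data[k++] = 0f; data[k++] = 0f;                 // center of the fan at (0, 0)
        data[k++] = 1f; data[k++] = 0f; data[k++] = 0f; // center color (arbitrary)
        for (int i = 0; i <= segments; i++) {           // <= so the fan closes on itself
            double theta = 2.0 * Math.PI * i / segments;
            data[k++] = (float) (radius * Math.cos(theta)); // x
            data[k++] = (float) (radius * Math.sin(theta)); // y
            data[k++] = (float) Math.cos(theta);            // red = cos(theta), negatives clamp to 0
            data[k++] = 0f;                                 // green
            data[k++] = 0f;                                 // blue
        }
        return data;
    }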

If I draw the circle using GL_LINE_LOOP, the colors on the edge come out correct: cos(theta) is positive only for theta between -pi/2 and pi/2, so only the right-hand half of the circle gets any red. Picture 1 is a comparison: on the left I used the GL_COLOR_ARRAY client state to supply the colors, and on the right I used shaders but computed cos(theta) before the fragment shader (computing it on the CPU or in the vertex shader gives the same result). In picture 2 I compute the cosine in the fragment shader from the theta passed in by the vertex shader; that's the circle on the right of the picture, and it looks pretty odd.
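In LWJGL the shader source is just a string, and the two variants differ only in where the cosine is taken. Roughly like this (a hypothetical sketch with made-up names, not my actual shaders):

    // Vertex shader: outputs both the raw angle and its cosine.
    String vertexSrc = """
        #version 330 core
        layout(location = 0) in vec2 aPos;
        layout(location = 1) in float aTheta;
        out float vTheta;     // raw angle; this is what gets interpolated in picture 2
        out float vCosTheta;  // cosine taken per vertex; this is what gets interpolated in picture 1
        void main() {
            vTheta = aTheta;
            vCosTheta = cos(aTheta);
            gl_Position = vec4(aPos, 0.0, 1.0);
        }
        """;

    // Picture 1 (right side): use the already-interpolated cosine directly.
    String fragSrcA = """
        #version 330 core
        in float vCosTheta;
        out vec4 fragColor;
        void main() { fragColor = vec4(vCosTheta, 0.0, 0.0, 1.0); }
        """;

    // Picture 2: take the cosine of the interpolated angle instead.
    String fragSrcB = """
        #version 330 core
        in float vTheta;
        out vec4 fragColor;
        void main() { fragColor = vec4(cos(vTheta), 0.0, 0.0, 1.0); }
        """;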

I know there are other stages between the vertex shader and the fragment shader, but how do they affect the color or the theta value? Also, on the left side of the picture, the color gradient produced by the fixed-function pipeline looks a little weird too.

BTW, I have no idea where to find documentation for this. LWJGL doesn't document OpenGL itself in detail (which I guess is to be expected), and on the OpenGL side the documentation just overwhelms me...

@skyblond vars specified per vertex are linearly interpolated per fragment.
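For a concrete example: take two rim vertices with theta = 0 and theta = pi/2. At the fragment halfway between them the interpolated theta is pi/4, so computing cos in the fragment shader gives about 0.707, while interpolating cos(theta) itself gives (1 + 0) / 2 = 0.5. Because cos is not linear, the two orders of operation only agree exactly at the vertices.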

You should also check your normals and see if you are correctly transforming them.

@aluaces I see. So if I pass theta out of the vertex shader, the theta value gets linearly interpolated, and each fragment ends up with a theta that isn't quite right. If I compute cos(theta) before that interpolation and pass it to the fragment shader instead, it's the red value itself that gets linearly interpolated, and the result looks much better.

I changed red to theta / (2 * pi) to visualize how the theta value is interpolated across the circle, and it seems I need the first vertex of the fan to be at (0, 0) for the center of the circle to be positioned properly.

@skyblond I'm not sure I'm following you correctly, but if I had to pass anything to the fragment shader, I would mainly pass variables that can safely be interpolated that way, like the x, y coordinates, and then compute theta and cos(theta) in the fragment shader.
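Something along these lines, I mean (a hypothetical sketch; GLSL's two-argument atan(y, x) returns theta in [-pi, pi], and this assumes the circle is centered at the origin):

    // Hypothetical fragment shader source string for LWJGL: recover theta from the
    // interpolated position instead of interpolating theta itself.
    String fragSrc = """
        #version 330 core
        in vec2 vPos;                            // x, y passed through by the vertex shader
        out vec4 fragColor;
        void main() {
            float theta = atan(vPos.y, vPos.x);  // in [-pi, pi]
            fragColor = vec4(cos(theta), 0.0, 0.0, 1.0);
        }
        """;

Since positions are interpolated linearly across each triangle, the theta recovered this way is exact for every fragment inside the circle, not just at the vertices.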

@aluaces
That's a good idea, recovering theta from the coordinates. At first I was worried about a performance drop if I put all the calculation in the vertex shader: from what I know about CPU pipelines, each stage should take about the same time for the best performance, and I guess the GPU pipeline works the same way.

@skyblond you are right, but you are still well short of hitting hardware limits, so you can afford to perform the precise computations. Only if you are hard pressed should you resort to approximations and tricks such as precomputing the values you need into textures.
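If it ever comes to that, the lookup-texture trick could look roughly like this in LWJGL (a hypothetical sketch using org.lwjgl.opengl.GL11 / GL30, 256 texels, run with a GL context current):

    // Bake cos(theta) for theta in [0, 2*pi) into a 1D texture; in the fragment shader
    // you would then sample texture(cosLut, theta / (2.0 * 3.14159265)).r instead of calling cos().
    int size = 256;
    java.nio.FloatBuffer lut = org.lwjgl.BufferUtils.createFloatBuffer(size);
    for (int i = 0; i < size; i++) {
        lut.put((float) Math.cos(2.0 * Math.PI * i / size));
    }
    lut.flip();
    int tex = GL11.glGenTextures();
    GL11.glBindTexture(GL11.GL_TEXTURE_1D, tex);
    GL11.glTexImage1D(GL11.GL_TEXTURE_1D, 0, GL30.GL_R32F, size, 0,
                      GL11.GL_RED, GL11.GL_FLOAT, lut);
    GL11.glTexParameteri(GL11.GL_TEXTURE_1D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
    GL11.glTexParameteri(GL11.GL_TEXTURE_1D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);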
