
Integer to void* conversion

I came across this intriguing OpenGL code snippet:

glVertexAttribPointer(colorAttrib,              // attribute location
                      1,                        // number of components per attribute
                      GL_FLOAT,                 // component type
                      GL_FALSE,                 // don't normalize
                      3*sizeof(float),          // stride: bytes between consecutive vertices
                      (void*)(2*sizeof(float)) // FTW?!
                      );

glVertexAttribPointer's signature says the last argument is of type void*. This argument is supposed to tell OpenGL the byte offset of the attribute's data within the vertex buffer.
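For reference, here is how glVertexAttribPointer is declared in the usual GL 2.0+ headers, along with a hypothetical interleaved vertex layout that would match the stride and offset in the snippet above. The actual buffer isn't shown in the original code, so this layout is my guess:

#include <cstddef> // offsetof

// Declaration from the OpenGL headers:
// void glVertexAttribPointer(GLuint index, GLint size, GLenum type,
//                            GLboolean normalized, GLsizei stride,
//                            const GLvoid* pointer);

// Hypothetical vertex: 3 floats per vertex => stride is 3*sizeof(float),
// and the one-float color attribute sits 2 floats into each vertex
// => offset is 2*sizeof(float)
struct Vertex {
    float x, y;     // position: 2 floats at byte offset 0
    float colorVal; // color attribute: 1 float at byte offset 8
};

static_assert(sizeof(Vertex) == 3*sizeof(float), "matches the stride argument");
static_assert(offsetof(Vertex, colorVal) == 2*sizeof(float), "matches the offset argument");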

This surely looked wrong to me--how the hell can you cast an integral type to a pointer and use that pointer meaningfully? To demonstrate that this doesn't compute, I wrote a simple program like so:

#include <iostream>
using namespace std;

void fooFunc(void* ptr) {
    int* intptr = reinterpret_cast<int*>(ptr);
    cout << *intptr << endl; // **Crashes here as expected** -- dereferencing address 0x24 is undefined behavior
}

int main() {
    fooFunc((void*)(12*3));
    return 0;
}

Sure enough, it crashes when it tries to de-reference the int pointer: the void* that the int* was converted from points at garbage memory--nothing was ever allocated at address 36, so dereferencing it is undefined behavior (a segfault on most platforms).

While it was clear that it was garbage memory, it wasn't immediately clear to me how the value of that pointer could be used. Understanding that was the key to unlocking how this awkward OpenGL API was working.

In fooFunc, ptr's value was 0x24. Clearly, garbage memory. But 0x24 is just 36 in hex--the result of the integer expression 12*3 from which the pointer was created in the calling code! Now, how can this be useful for anyone?!
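You can see this without crashing by printing the pointer itself instead of dereferencing it--streaming a void* to cout prints its value, typically in hex. A minimal sketch:

#include <iostream>
using namespace std;

void fooFunc(void* ptr) {
    cout << ptr << endl; // ** prints 0x24 -- the pointer's value is the integer 36 **
}

int main() {
    fooFunc((void*)(12*3));
    return 0;
}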

Consider this new version of fooFunc:

void fooFunc(void* ptr) {
    size_t intval = reinterpret_cast<size_t>(ptr);
    cout << intval << endl; // ** prints 36 **
}

Now we have managed to make something out of that bogus void pointer! We recovered the integral value that the caller wanted to pass to this function--except the caller never passed an integer; it passed a pointer whose value is that integer!
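A side note on the cast: size_t happens to be wide enough to hold a pointer on common platforms, but the type the standard provides specifically for round-tripping pointers through integers is uintptr_t (which is also where the second reference below ends up). A sketch of the same function using it:

#include <cstdint>  // uintptr_t
#include <iostream>
using namespace std;

void fooFunc(void* ptr) {
    // uintptr_t is guaranteed to be able to hold any void* value
    uintptr_t intval = reinterpret_cast<uintptr_t>(ptr);
    cout << intval << endl; // ** prints 36 **
}

// called as before: fooFunc((void*)(12*3));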

Now, why can't glVertexAttribPointer simply take an integer parameter and save us the trouble of staring hard at this mysterious-looking code? The answer, as it most often is in cases like this, is legacy. There was a time when glVertexAttribPointer's last parameter really was a pointer: it pointed to the vertex data array in CPU (client) memory. With VBOs, we transfer the vertex data to the GPU via the glBufferData() call and no longer need to pass it through glVertexAttribPointer(). To keep the API backward compatible, the Khronos Group overloaded the last parameter: when a buffer is bound, it is interpreted as an offset into the vertex buffer's data. And so we ended up with this awkward (void*)(2*sizeof(float)) cast from integer to pointer.
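If the bare cast bothers you, a small helper keeps the call sites readable. This is just a sketch of a common idiom (old tutorials did the same thing with a BUFFER_OFFSET macro); it's not part of the OpenGL API:

#include <cstddef> // size_t

// The "pointer" glVertexAttribPointer expects when a VBO is bound is really
// a byte offset into the buffer -- wrap the ugly cast once, here
inline const void* bufferOffset(size_t byteOffset) {
    return reinterpret_cast<const void*>(byteOffset);
}

// Usage:
// glVertexAttribPointer(colorAttrib, 1, GL_FLOAT, GL_FALSE,
//                       3*sizeof(float), bufferOffset(2*sizeof(float)));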

References:
https://stackoverflow.com/questions/28063307/in-opengl-why-does-glvertexattribpointer-require-the-pointer-argument-to-be-p
https://stackoverflow.com/questions/58679610/how-to-cast-an-integer-to-a-void-without-violating-c-core-guidelines
