I'm trying to write a function that initializes a VAO at a specific index with some data. The problem is that when I take sizeof of the vertices parameter inside the function, it returns the wrong size.
Here is the data:
typedef struct {
GLfloat XYZW[4];
GLfloat RGBA[4];
} Vertex;
const Vertex Vertices2[] =
{
{ { 0.25f, 0.25f, 0.0f, 1.0f },{ 1.0f, 0.0f, 0.0f, 1.0f } },
{ { 0.75f, 0.25f, 0.0f, 1.0f },{ 0.0f, 1.0f, 0.0f, 1.0f } },
{ { 0.50f, 0.75f, 0.0f, 1.0f },{ 0.0f, 0.0f, 1.0f, 1.0f } }
};
const GLubyte Indices2[] = {....};
I call the function like this:
createArrayObjects(0, Vertices2, Indices2);
void createArrayObjects(int index, const Vertex vertices[], const GLubyte indices[]){
cout << sizeof(vertices) << endl; // prints 4
cout << sizeof(Vertices2) << endl; // prints 96
...
}
If I use sizeof(Vertices2) to fill the VBO, the program runs fine. Without the correct size of the input vertices, I can't fill the VAO and VBO and visualize the data correctly.
sizeof() doesn't return what you expect here. In C++, an array passed as a function parameter decays to a pointer, so inside the function sizeof(vertices) gives you the size of a Vertex* (4 bytes on your platform), not the size of the array. If you're not passing objects that know their own size (e.g. STL container classes such as std::vector), you'll need to pass the array length as well as the array pointer into the function.