iOS - glDrawElements bad access in Xcode when used outside of GLKViewController


I'm pretty new to OpenGL ES. I'm trying to draw indexed vertices using glDrawElements in a Character class. I've gotten it to work before inside of a GLKViewController class, but when I tried creating a Character class to perform its own rendering, I got nothing but EXC_BAD_ACCESS. Here is the Character class:

#import "character.h"  @interface character() {     gluint _vertexbuffer;     gluint _indexbuffer;     gluint _vertexarray; }  @property(nonatomic, weak) glkbaseeffect *effect;  @end  typedef struct {     float position[3];     float color[4]; } vertex;  const vertex vertices[] = {     {{1, -1, 0}, {1, 0, 0, 1}},     {{1, 1, 0}, {0, 1, 0, 1}},     {{-1, 1, 0}, {0, 0, 1, 1}},     {{-1, -1, 0}, {0, 0, 0, 1}} };  const glushort indices[] = {     0, 1, 2,     2, 3, 0 };  @implementation character  - (id)initwitheffect:(glkbaseeffect *)effect {     if (self = [super init])     {         self.effect = effect;         [self setupgl];     }      return self; }  - (void)setupgl {     glgenvertexarraysoes(1, &_vertexarray);     glbindvertexarrayoes(_vertexarray);      glgenbuffers(1, &_vertexbuffer);     glbindbuffer(gl_array_buffer, _vertexbuffer);     glbufferdata(gl_array_buffer, sizeof(vertices), vertices, gl_static_draw);      glgenbuffers(1, &_indexbuffer);     glbindbuffer(gl_element_array_buffer, _indexbuffer);     glbufferdata(gl_element_array_buffer, sizeof(indices), indices, gl_static_draw);       glenablevertexattribarray(glkvertexattribposition);     glvertexattribpointer(glkvertexattribposition, 3, gl_float, gl_false, sizeof(vertex), (const glvoid *) offsetof(vertex, position));     glenablevertexattribarray(glkvertexattribcolor);     glvertexattribpointer(glkvertexattribcolor, 4, gl_float, gl_false, sizeof(vertex), (const glvoid *) offsetof(vertex, color));       glbindvertexarrayoes(0); }  - (void)teardowngl {     gldeletebuffers(1, &_vertexbuffer);     gldeletebuffers(1, &_indexbuffer); }  - (void)render {     self.effect.transform.modelviewmatrix = glkmatrix4translate(glkmatrix4identity, self.position.x, self.position.y, self.position.z);     self.effect.transform.modelviewmatrix = glkmatrix4rotate(self.effect.transform.modelviewmatrix, self.rotation, 0.0f, 0.0f, 1.0f);      [self.effect preparetodraw];     glbindvertexarrayoes(_vertexarray);     gldrawelements(gl_triangles, sizeof(indices) / sizeof(indices[0]), gl_unsigned_short, 0); }  @end 

Then in the view controller:

- (void)viewDidLoad
{
    [super viewDidLoad];

    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

    GLKView *view = (GLKView *)self.view;
    view.context = self.context;

    character = [[Character alloc] initWithEffect:self.effect];
    character.position = GLKVector3Make(self.view.bounds.size.width / 2, self.view.bounds.size.height / 2, 0.0f);
    [self setupGL];
}

It's rendered using:

- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(0.65f, 0.65f, 0.65f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    [character render];
}

I know it isn't a simple miscalculation of byte size or the like, because I've been at this for a couple of days now.

Hard to be sure on a quick read, but it looks like the problem is that you don't have a current GL context when you're setting up the VAO in -[Character setupGL].
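One quick way to confirm this (the assert is a hypothetical addition, not part of your original code): check for a current context at the top of -[Character setupGL]. If it fires, the VAO and buffers below are being created with no context to own them.

- (void)setupGL
{
    // Hypothetical sanity check: if no EAGLContext is current here,
    // every glGen*/glBind* call below has no context to operate on.
    NSAssert([EAGLContext currentContext] != nil,
             @"setupGL called without a current EAGLContext");

    // ... rest of setupGL as above ...
}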

Creating an EAGLContext object doesn't make it the current context. If the rest of the code in your view controller looks like Xcode's "OpenGL Game" template, your view controller doesn't set the current context until its own setupGL method, which you call after creating the Character instance, which in turn attempts to set up OpenGL ES resources.
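A minimal sketch of the fix, assuming your viewDidLoad otherwise matches the code above: make the context current as soon as it's created, before the Character (or anything else) tries to allocate GL resources.

- (void)viewDidLoad
{
    [super viewDidLoad];

    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

    GLKView *view = (GLKView *)self.view;
    view.context = self.context;

    // Make the new context current *before* creating the Character,
    // so that its setupGL runs against a live context.
    [EAGLContext setCurrentContext:self.context];

    character = [[Character alloc] initWithEffect:self.effect];
    character.position = GLKVector3Make(self.view.bounds.size.width / 2, self.view.bounds.size.height / 2, 0.0f);
    [self setupGL];
}

Alternatively, move the creation of the Character to after your own [self setupGL] call, since the template's setupGL already calls setCurrentContext:.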

