Deferred Rendering with OpenGL, experiencing heavy pixelization near lit boundaries on surfaces
Problem explanation
I am implementing point lights for my deferred renderer and am having trouble tracking down the source of the heavy pixelization/triangulation that is noticeable near the borders of the lights.
The problem appears to be caused by a loss of precision somewhere, but I have been unable to track down the precise source. The normals were an obvious possibility, but I have a classmate who is using DirectX and handling his normals in a similar manner with no issues.
From about 2 meters away in our game's units (64 units/meter):
From a few centimeters away. Note that the "pixelization" does not change size in the world as I approach it; however, it does appear to swim if I change the camera's orientation:
For comparison, a closeup from our forward renderer demonstrates the spherical banding one would expect of an RGBA8 render target (only 0-255 possible values for each color). Note that in the deferred picture the walls exhibit the normal spherical banding:
The light volume is shown here as a green wireframe:
As can be seen, the effect isn't visible unless you get close to a surface (around 1 meter in our game's units).
Position reconstruction
First, I should mention that I am using a spherical mesh to render only the portion of the screen that the light overlaps. I render only the back-faces, and only if their depth is greater than or equal to the depth buffer, as suggested here.
To reconstruct the camera-space position of a fragment, I take the vector to the camera-space fragment on the light volume, normalize it, and scale it by the linear depth stored in my G-buffer. This is sort of a hybrid of the methods discussed here (using linear depth) and here (spherical light volumes). A minimal sketch of the reconstruction follows.
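The sketch below assumes the first G-buffer target stores the camera-space distance to the shaded surface, and that the vertex shader passes along the camera-space position of the light-volume fragment; the names (u_dist_tex, u_screen_size, v_view_pos) are illustrative, not the renderer's actual identifiers.

    // Lighting pass fragment shader (sketch).
    uniform sampler2D u_dist_tex;    // e_dist_32f: camera-space distance to the shaded surface
    uniform vec2      u_screen_size; // viewport size in pixels

    in vec3 v_view_pos;              // camera-space position of this light-volume fragment

    vec3 reconstruct_view_pos()
    {
        vec2  uv   = gl_FragCoord.xy / u_screen_size;
        float dist = texture(u_dist_tex, uv).r;  // linear depth stored in the G-buffer
        vec3  ray  = normalize(v_view_pos);      // direction from the camera through this fragment
        return ray * dist;                       // camera-space position of the surface being lit
    }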
Geometry buffer
My G-buffer setup is:
    enum render_targets { e_dist_32f = 0, e_diffuse_rgb8, e_norm_xyz8_specpow_a8, e_light_rgb8_specintes_a8, num_rt };
    //...
    GLint internal_formats[num_rt] = { GL_R32F,  GL_RGBA8, GL_RGBA8, GL_RGBA8 };
    GLint formats[num_rt]          = { GL_RED,   GL_RGBA,  GL_RGBA,  GL_RGBA  };
    GLint types[num_rt]            = { GL_FLOAT, GL_FLOAT, GL_FLOAT, GL_FLOAT };

    for (uint i = 0; i < num_rt; ++i)
    {
        glBindTexture(GL_TEXTURE_2D, _render_targets[i]);
        glTexImage2D(GL_TEXTURE_2D, 0, internal_formats[i], _width, _height, 0, formats[i], types[i], nullptr);
    }

    // Separate non-linear depth buffer used for depth testing
    glBindTexture(GL_TEXTURE_2D, _depth_tex_id);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32, _width, _height, 0, GL_DEPTH_COMPONENT, GL_FLOAT, nullptr);
Normal precision
The problem was that my normals didn't have enough precision. At 8 bits per component that means only 256 discrete possible values. Examining the normals in the G-buffer overlaid on top of the lighting showed a 1-to-1 correspondence between normal values and the lit "pixels".
I am unsure why my classmate does not get the same issue (he is going to investigate further).
After more research I found that the term for this is quantization. Another example of it can be seen here, with a specular highlight on page 19. A small sketch of the effect follows.
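For illustration only (this is not code from the renderer), this is effectively what an 8-bit normalized target does to a unit normal: each component is snapped to one of 256 levels, so nearby fragments that should shade smoothly end up sharing the same stored normal.

    // Simulate the quantization an RGBA8 target applies to a unit normal.
    vec3 quantize_to_rgb8(vec3 n)
    {
        // Remap [-1, 1] to [0, 1], round to the nearest of 256 levels
        // (what an 8-bit normalized channel actually keeps), then remap back.
        vec3 stored = floor((n * 0.5 + 0.5) * 255.0 + 0.5) / 255.0;
        return normalize(stored * 2.0 - 1.0);
    }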
Solution
After changing the normal render target to RG16F the problem is resolved.
Using the method suggested here to store and retrieve normals, I get the following results:
I need to store my normals more compactly (I only have room for 2 components). This is a good survey of techniques if anyone finds themselves in the same situation; a sketch of one such two-component encoding follows.
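For reference, one common two-component scheme from that family is the spheremap transform; a minimal GLSL sketch is below. This is not necessarily the exact method linked above, and as noted in the edit further down, compressing and decompressing the normal introduces some error of its own.

    // Encode a camera-space unit normal into two components (spheremap transform).
    // Note: n.xy == vec2(0.0) (a normal pointing straight along the view axis)
    // is a degenerate case for this encoding.
    vec2 encode_normal(vec3 n)
    {
        vec2 enc = normalize(n.xy) * sqrt(-n.z * 0.5 + 0.5);
        return enc * 0.5 + 0.5;
    }

    // Decode the two stored components back into a camera-space unit normal.
    vec3 decode_normal(vec2 enc)
    {
        vec4 nn = vec4(enc, 0.0, 0.0) * vec4(2.0, 2.0, 0.0, 0.0) + vec4(-1.0, -1.0, 1.0, -1.0);
        float l = dot(nn.xyz, -nn.xyw);
        nn.z = l;
        nn.xy *= sqrt(l);
        return nn.xyz * 2.0 + vec3(0.0, 0.0, -1.0);
    }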
[Edit 1]
As both Andon and GuyRT have pointed out in the comments, 16 bits is a bit overkill for what I need. I've switched to RGB10_A2 as suggested and it gives satisfactory results, even on rounded surfaces. The extra 2 bits help a lot (256 vs 1024 discrete values).
Here's what it looks like now.
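With a normalized-integer target like RGB10_A2 the shaders only need a scale and bias rather than a compress/decompress step; a minimal sketch (function names are illustrative):

    // Geometry pass: remap a camera-space normal from [-1, 1] to [0, 1]
    // so it fits the unsigned-normalized RGB10_A2 target.
    vec3 pack_normal(vec3 n)
    {
        return normalize(n) * 0.5 + 0.5;
    }

    // Lighting pass: remap back to [-1, 1] after sampling the target.
    vec3 unpack_normal(vec3 stored)
    {
        return normalize(stored * 2.0 - 1.0);
    }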
It should be noted (for anyone referencing this post in the future) that the image posted for RG16F has some undesirable banding that comes from the method used to compress/decompress the normal (there is some error involved).
[Edit 2]
After discussing the issue more with my classmate (who is using RGB8 with no ill effects), I think it is worth mentioning that I might have the perfect combination of elements to make this appear. The game I'm building the renderer for is a horror game that places you in pitch-black environments with a sonar-like ability. In a scene you normally have a number of lights hitting surfaces at different angles (my classmate's environments are well lit; they're making an outdoor racing game). That, combined with the fact that the artifact only appears on round objects viewed relatively close up, might be why I provoked it. This is a (slightly educated) guess on my part.