glsl - How to normalize image coordinates for texture space in OpenGL?


Say I have an image of size 320x240. When sampling from a sampler2D with integer image coordinates ux, uy, I must normalize these coordinates from [0, size] (where size is the width or height) into texture space [0, 1].

Now, I wonder whether I should normalize like this:

texture(image, vec2(ux/320.0, uy/240.0)) 

or like this:

texture(image, vec2(ux/319.0, uy/239.0)) 

because ux = 0 ... 319 and uy = 0 ... 239. Only the latter covers the whole range [0, 1], correct? That would mean 0 corresponds to, e.g., the left-most pixels and 1 corresponds to the right-most pixels, right?

I also want to keep filtering, so I don't want to use texelFetch.

Can anyone clarify this? Thanks.

No, the first one is correct:

texture(image, vec2(ux/320.0, uy/240.0)) 

Your premise "ux = 0 ... 319, uy = 0 ... 239" is incorrect. If you render a 320x240 quad, say, then ux = 0 ... 320 and uy = 0 ... 240.

This is because pixels and texels are squares that are sampled at half-integer coordinates. For example, let's assume you render a 320x240 texture onto a 320x240 quad. The bottom-left pixel (0,0) is sampled at screen coordinates (0.5, 0.5). You normalize by dividing by (320, 240); OpenGL then multiplies the normalized coordinates back by (320, 240) to get the actual texel coordinates, so it samples the texture at (0.5, 0.5), which corresponds to the center of texel (0,0) and returns its exact color.
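As a minimal fragment-shader sketch of that scenario (the #version, the image uniform name, and the fragColor output are my assumptions, not from the question):

    #version 330 core
    uniform sampler2D image;
    out vec4 fragColor;

    void main()
    {
        // gl_FragCoord.xy is (x + 0.5, y + 0.5) for pixel (x, y), i.e. the pixel center.
        // Dividing by the full texture size maps pixel centers onto texel centers.
        vec2 uv = gl_FragCoord.xy / vec2(320.0, 240.0);
        fragColor = texture(image, uv);
    }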

It is important to think of pixels in OpenGL as squares, where coordinates (0,0) correspond to the bottom-left corner of the bottom-left pixel, and the non-normalized (w,h) corresponds to the top-right corner of the top-right pixel (for a texture of size (w,h)).
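If you really do start from integer texel indices ux = 0 ... 319, uy = 0 ... 239 and want to hit texel centers while keeping filtering, a common trick is to add the half-texel offset yourself. A sketch under that assumption (sampleTexelCenter is a hypothetical helper; textureSize requires GLSL 1.30+):

    // Sample the center of texel (ux, uy) while keeping linear filtering.
    vec4 sampleTexelCenter(sampler2D image, int ux, int uy)
    {
        vec2 size = vec2(textureSize(image, 0));  // e.g. (320.0, 240.0)
        vec2 uv = (vec2(ux, uy) + 0.5) / size;    // + 0.5 moves corner -> center
        return texture(image, uv);
    }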


