How do I convert a texture coordinate to a vertex position?

Hi

I'm a beginner at using Metal.


In my tutorial I have a kernel function that selects a color in a video:

kernel void image_LAB(constant MBELABHSVParameters &params [[buffer(7)]],
                      texture2d<half, access::read> sourceTexture [[texture(7)]],
                      texture2d<half, access::write> destTexture [[texture(8)]],
                      sampler samp [[sampler(7)]],
                      device PositionVertex *inPosition [[buffer(9)]],
                      device PositionVertex *outPosition [[buffer(10)]],
                      uint2 gridPosition [[thread_position_in_grid]])
{
    // ...color-selection logic, which produces:
    half4 colorReturn = rgba;
    destTexture.write(colorReturn, gridPosition);

    uint2 sourceCoord = uint2(gridPosition);
    half2 coordinate = normalize(sourceTexture.read(sourceCoord).xy);
}

How do I convert the pixel's coordinate into the position of a vertex, to be used in a subsequent command encoder?

Maybe what I'm doing is not accurate; what is the best way to obtain the vertex position?

If you want to convert a texture (screen-aligned) coordinate to a vertex coordinate (in NDC space, I guess), you have to keep the following in mind.

1) In Metal, the texel (and fragment) Y axis is inverted (top to bottom), so for a given float2 texcoord:

float2 tmp = float2( texcoord.x, 1.0f - texcoord.y );

2) Now, texture coordinates range from 0.0 to 1.0 and NDCs from -1 to 1, so:

float2 ndc = 2.0f * tmp - 1.0f;  // "vertex" is a reserved keyword in Metal, so pick another name

and that should work. Do not cast texel or fragment coordinates to any integer type, because in Metal texel/fragment centers are at (i + 0.5f, j + 0.5f).


Hope that helps

Michal

Thank you for your help.

OK, I know how coordinates are oriented in a texture.

I have set this in the kernel:

    half2 coordA = normalize(sourceTexture.read(sourceCoord).xy);


Maybe I should not use normalize?

I performed the same calculation you suggested, but separating x and y.

The strange thing is that the texture output is correct.

But let's say, for example, that I want the coordinate of the first white pixel found while reading the texture, and I copy it into "outPosition". When I try to draw it with the next render pass, it is not in the same location as the white pixels of the texture.

And another thing: I reset the output buffer every time, but when the next frame of the video is read, the white pixels in the texture have a new, correct position, while the drawn pixel remains static and does not follow the white point. Can you explain this to me?


Now I will try to do as you suggested.

Thank you very much.

I'd like to help, Frank, but I am afraid (English is not my strongest point :-( ) that I do not quite understand what you're trying to achieve here. What I described above is, I believe, the correct way of converting from a viewport-aligned texture coordinate (such as when rendering to a texture) to a vertex coordinate in the same viewport. But... there are lots of buts. For example, it all depends on what you're using the vertex coordinates for.


What are you trying to achieve? Get all the white pixels in a video frame, store their locations, and write over them with something else using vertex coordinates? Like points? This should work, provided you store all the pixel coordinates separately and do not (for example) overwrite them.


Regards

Michal
