Is the MPSDynamicScene example correctly computing the motion vector texture?

I'm trying to implement de-noising of AO in my app, using the MPSDynamicScene example as a guide: https://developer.apple.com/documentation/metalperformanceshaders/animating_and_denoising_a_raytraced_scene

In that example, it computes motion vectors in UV coordinates, resulting in very small values:

    // Compute motion vectors
    if (uniforms.frameIndex > 0) {
        // Map current pixel location to 0..1
        float2 uv = in.position.xy / float2(uniforms.width, uniforms.height);

        // Unproject the position from the previous frame then transform it from
        // NDC space to 0..1
        float2 prevUV = in.prevPosition.xy / in.prevPosition.w * float2(0.5f, -0.5f) + 0.5f;

        // Next, remove the jittering which was applied for antialiasing from both
        // sets of coordinates
        uv -= uniforms.jitter;
        prevUV -= prevUniforms.jitter;

        // Then the motion vector is simply the difference between the two
        motionVector = uv - prevUV;
    }

Yet the header documentation for MPSSVGF seems to indicate the offsets should be expressed in texels:

> The motion vector texture must be at least a two channel texture representing how many texels each texel in the source image(s) have moved since the previous frame. The remaining channels will be ignored if present. This texture may be nil, in which case the motion vector is assumed to be zero, which is suitable for static images.

Is this a mistake in the example code?

Asking because doing something similar in my own app leaves AO trails, which would suggest the motion vector texture values are too small in magnitude. I don't really see trails in the example, even when I speed up the animation, but that could be because the example scene is monochrome.

Update:

If I multiply the UV offsets by the size of the texture, I get a bad result, which suggests the header comment is misleading and the values are in fact expected in UV coordinates. So perhaps the trails I'm seeing in my app have some other cause.

I also wonder who is actually using this API other than me? I would think most game engines are doing their own thing. Perhaps some of Apple's own code uses it.