James L.

Remote

iOS AR Engineer

Project-Based

Description

The work is technical and precise: we render directly onto a tracked face mesh at 60 fps, and every pixel matters.

You need to have shipped something real with ARKit. Not a demo. A product where face tracking drove rendering decisions at the vertex level.

The work involves:

- Face mesh geometry and UV mapping
- Real-time texture generation from geometric contours
- Sub-pixel optical edge detection against live camera frames
- Coordinate space transformations between 3D, image, and texture space
- Threading discipline across ARKit, Vision, CoreGraphics, and SceneKit
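To make the coordinate-space point concrete: Vision reports landmarks in a normalized space with a bottom-left origin, while image and texture space typically put the origin at the top-left, so every hand-off needs an explicit flip. A minimal sketch (the helper name is hypothetical, not part of any listed framework):

```swift
import Foundation

// Hypothetical helper: convert a Vision-style normalized point (origin at
// the bottom-left, y pointing up) into pixel coordinates in image/texture
// space (origin at the top-left, y pointing down).
func pixelPoint(fromVisionNormalized p: CGPoint,
                imageWidth: CGFloat,
                imageHeight: CGFloat) -> CGPoint {
    CGPoint(x: p.x * imageWidth,
            y: (1.0 - p.y) * imageHeight) // flip y: bottom-left -> top-left
}
```

Forgetting that y-flip is the classic cause of generated textures landing upside down against the live camera frame.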

You are the right person if:

- You have debugged a rendering artifact by reasoning about coordinate spaces, not by guessing blend modes
- You know what happens to Vision landmark coordinates when you pass the wrong orientation to VNImageRequestHandler
- You can explain why the same ARKit vertex produces different UV values depending on head angle
- Working with raw pixel buffers feels normal to you
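On the orientation question: independent of Vision's API specifics, a wrong orientation hint means landmarks come back in a rotated frame. The underlying remap for a 90° rotation in normalized top-left-origin space is short enough to write down (a standalone sketch for illustration, not the VNImageRequestHandler implementation):

```swift
import Foundation

// Rotate a normalized point (top-left origin, y pointing down) by 90
// degrees clockwise: the remap a landmark effectively undergoes when the
// pixel data was rotated but the orientation hint claimed it was not.
func rotated90CW(_ p: CGPoint) -> CGPoint {
    CGPoint(x: 1.0 - p.y, y: p.x)
}
```

Sanity check: the top-left corner (0, 0) maps to the top-right (1, 0), which is exactly the kind of systematic landmark drift a wrong orientation produces.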

You are not the right person if:

- Your ARKit experience stops at placing virtual objects in a room
- You have never read BGRA pixel values directly from a CVPixelBuffer
- Catmull-Rom splines and even-odd fill rules are unfamiliar to you
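For reference, the uniform Catmull-Rom spline named above is a short formula rather than exotic math: the curve between p1 and p2 is a cubic weighted by the two neighboring control points. A self-contained per-coordinate sketch:

```swift
import Foundation

// Evaluate a uniform Catmull-Rom spline segment between p1 and p2 at
// parameter t in [0, 1]; p0 and p3 are the neighboring control points.
func catmullRom(_ p0: CGFloat, _ p1: CGFloat, _ p2: CGFloat, _ p3: CGFloat,
                t: CGFloat) -> CGFloat {
    let t2 = t * t
    let t3 = t2 * t
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t3)
}
```

Applied once per coordinate, this traces a smooth contour through 2D landmark points; at t = 0 it returns p1 exactly and at t = 1 it returns p2, so the curve interpolates every control point.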

Budget: USD 35/hour

Proposals: 9 freelancers have applied

Skills

iOS
