# WEARFITS Texture Projector
A tool for projecting 2D images onto 3D GLB models. Works in browser (GPU-accelerated) and headless via API (for automation and AI-driven texture editing).
## Two Ways to Use

### 1. Browser Tool (Interactive)

Browser-based editor with real-time 3D preview. Available at `/texture-painter/` when deployed.

### 2. Headless API (Automation)

API-driven workflow for batch processing and AI integrations. Uses a Modal backend for GPU rendering and a Cloudflare Worker for projection.
## Browser Workflow
- Load a GLB model via drag & drop or "Load GLB" button
- Position the camera by dragging to rotate, scrolling to zoom
- Render the current view to PNG with transparent background
- Edit the PNG in your favorite image editor (Photoshop, GIMP, DALL-E, etc.)
- Upload the edited image to project it back onto the 3D model's texture
- Export the modified texture or complete GLB
## Development

The dev server opens at http://localhost:5173.

### Production Build

The build outputs to `../../public/texture-painter/` for deployment with the main API.
## Loading Models

- **Drag & Drop**: Drag a GLB file onto the viewer.
- **File Picker**: Click the "Load GLB" button.
- **URL Parameter**: Add `?model=URL` to load a model from a URL.
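The `?model=` parameter needs URL encoding when the model URL itself contains `://` and `/`. A minimal sketch (both URLs below are placeholders):

```typescript
// Build a texture-painter link that auto-loads a model from a URL.
function painterLink(base: string, modelUrl: string): string {
  // Encode the model URL so it survives intact as a query-string value.
  return `${base}?model=${encodeURIComponent(modelUrl)}`;
}

const link = painterLink(
  "https://example.com/texture-painter/",
  "https://example.com/model.glb"
);
// link: "https://example.com/texture-painter/?model=https%3A%2F%2Fexample.com%2Fmodel.glb"
```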
## Controls
- Left click + drag: Orbit camera
- Right click + drag: Orbit camera (alternative)
- Scroll: Zoom in/out
- Middle click + drag: Pan camera
### Keyboard Shortcuts
| Key | Action |
|---|---|
| Escape | Close preview overlay |
## Headless API Workflow
For automation, batch processing, or AI-driven texture editing:
```
┌─────────────────────────────────────────────────────────────────┐
│                     Shared TypeScript Code                      │
│  projectWithUVMap(imgData, uvMap, textures) → modifiedTextures  │
│                     src/core/projection.ts                      │
└─────────────────────────────────────────────────────────────────┘
         │                     │                     │
    ┌────┴────┐           ┌────┴────┐           ┌────┴────┐
    │ Browser │           │  Modal  │           │CF Worker│
    │ Three.js│           │pyrender │           │  (API)  │
    │  WebGL  │           │   EGL   │           │         │
    └─────────┘           └─────────┘           └─────────┘
   Renders UV map       Renders UV map        Calls Modal
   locally (GPU)        headless (GPU)        for UV map
```
### Step 1: Render View and UV Map
Note: The `v1-texture-render` and `v1-texture-project` endpoints are not deployed to production (to stay under Modal's 8-endpoint limit). For the headless workflow:

- Local development: run `modal serve modal_app.py` to start local endpoints
- Production: use the browser-based tool at `/texture-painter/`, which runs projection client-side
- AI enhancement: use `v1-texture-enhance-glb`, which handles the full render → enhance → project pipeline
Call the Modal `v1-texture-render` endpoint (local dev only):
```bash
curl -X POST https://wearfits--v1-texture-render.modal.run \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <API_KEY>" \
  -d '{
    "glb_url": "https://example.com/model.glb",
    "camera_position": [0, 0, 3],
    "camera_target": [0, 0, 0],
    "fov": 45,
    "width": 1024,
    "height": 1024
  }'
```
Response:

```json
{
  "status": "completed",
  "view_url": "https://api.wearfits.com/files/signed?...",
  "uv_map_url": "https://api.wearfits.com/files/signed?...",
  "uv_map_width": 1024,
  "uv_map_height": 1024,
  "mesh_info": [
    { "index": 0, "name": "geometry_0", "texture_size": [2048, 2048], "has_texture": true, "has_uvs": true }
  ],
  "processing_time_ms": 3200
}
```
### Step 2: Edit the View

Download `view_url` (PNG with transparent background) and edit it:
- Manual: Photoshop, GIMP, Procreate, etc.
- AI: DALL-E inpainting, Stable Diffusion img2img, Claude vision
- Programmatic: PIL, OpenCV, any image processing library
The edits you make on the 2D image will be projected back onto the 3D texture.
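A minimal programmatic edit might look like the sketch below, which assumes the downloaded view has already been decoded into a raw RGBA buffer (the same layout as the `TextureBuffer` interface used by the projection code; PNG decoding itself is out of scope here):

```typescript
// Raw RGBA pixel buffer, matching the shared TextureBuffer interface.
interface TextureBuffer {
  data: Uint8ClampedArray;
  width: number;
  height: number;
}

// Tint every non-transparent pixel toward red. Fully transparent
// (background) pixels are left untouched, so only the model's
// silhouette receives edits when projected back.
function tintRed(view: TextureBuffer, strength = 0.5): void {
  const { data } = view;
  for (let i = 0; i < data.length; i += 4) {
    if (data[i + 3] === 0) continue; // skip background
    data[i] = data[i] + (255 - data[i]) * strength;
  }
}

// Example: a 2-pixel view — one opaque dark-red pixel, one background pixel.
const view: TextureBuffer = {
  data: new Uint8ClampedArray([55, 0, 0, 255,  10, 0, 0, 0]),
  width: 2,
  height: 1,
};
tintRed(view); // first pixel R: 55 + (255 - 55) * 0.5 = 155; background untouched
```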
### Step 3: Project Edits Back

Use the `TextureProjectionService` in the Cloudflare Worker:
```typescript
import { createTextureProjectionService } from './services/texture-projection-service';

const textureProjection = createTextureProjectionService(env);

// Project edited image onto GLB textures
const modifiedGlb = await textureProjection.projectImage({
  glbUrl: 'https://example.com/model.glb',
  editedImageUrl: 'https://example.com/edited-view.png',
  uvMapUrl: renderResult.uvMapUrl,
  uvMapWidth: renderResult.uvMapWidth,
  uvMapHeight: renderResult.uvMapHeight,
  meshInfo: renderResult.meshInfo // Include for reliable texture alignment
});

// modifiedGlb is an ArrayBuffer containing the modified GLB
```
### Complete TypeScript Example
```typescript
import { createTextureProjectionService } from './services/texture-projection-service';

async function editTexture(glbUrl: string, editFn: (viewUrl: string) => Promise<string>) {
  const service = createTextureProjectionService(env);

  // 1. Render view and UV map
  const render = await service.renderViewAndUVMap({
    glbUrl,
    camera: { position: [0, 0, 3], target: [0, 0, 0], fov: 45 },
    width: 1024,
    height: 1024
  });

  // 2. Edit the view (your custom logic)
  const editedImageUrl = await editFn(render.viewUrl);

  // 3. Project edits back (pass meshInfo for reliable texture alignment)
  const modifiedGlb = await service.projectImage({
    glbUrl,
    editedImageUrl,
    uvMapUrl: render.uvMapUrl,
    uvMapWidth: render.uvMapWidth,
    uvMapHeight: render.uvMapHeight,
    meshInfo: render.meshInfo
  });

  return modifiedGlb;
}
```
## Shared Projection Algorithm

The core projection logic is in `src/core/projection.ts` and works identically in all environments:
```typescript
import { projectWithUVMap, createTextureBuffer, createUVMapBuffer } from './core/projection';

// Works in browser, Node.js, and Cloudflare Workers
projectWithUVMap(editedImage, uvMap, textures, { uvPrecision: 'float32' });
```
### Interface
```typescript
interface TextureBuffer {
  data: Uint8ClampedArray;
  width: number;
  height: number;
}

interface UVMapBuffer {
  data: Float32Array | Uint8Array;
  width: number;
  height: number;
}

function projectWithUVMap(
  editedImage: TextureBuffer,
  uvMap: UVMapBuffer,
  textures: TextureBuffer[],
  options?: { uvPrecision?: 'float32' | 'uint8'; alphaThreshold?: number }
): void;
```
### UV Map Format
The UV map encodes texture coordinates per pixel:
| Channel | Type | Description |
|---|---|---|
| R | float (0-1) | U texture coordinate |
| G | float (0-1) | V texture coordinate |
| B | float (mesh_index/255) | Mesh index (1-indexed, 0 = background) |
| A | float (0-1) | normal·view (for falloff, 1 = facing camera) |
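Reading the channel layout above back out of a float32 UV map can be sketched as follows (the helper names here are illustrative, not part of the shared API):

```typescript
// One decoded UV-map sample, following the channel table above.
interface UVSample {
  u: number;         // R: U texture coordinate, 0..1
  v: number;         // G: V texture coordinate, 0..1
  meshIndex: number; // B: 0 = background, otherwise 1-indexed mesh
  facing: number;    // A: normal·view, 1 = facing camera
}

// Decode one RGBA float32 pixel at the given pixel index.
function decodeUVPixel(uvMap: Float32Array, pixelIndex: number): UVSample {
  const o = pixelIndex * 4;
  return {
    u: uvMap[o],
    v: uvMap[o + 1],
    meshIndex: Math.round(uvMap[o + 2] * 255),
    facing: uvMap[o + 3],
  };
}

// Map a sample onto a texture of the given size.
function toTexel(s: UVSample, texW: number, texH: number) {
  return { x: Math.floor(s.u * (texW - 1)), y: Math.floor(s.v * (texH - 1)) };
}

// Example: one pixel belonging to mesh 1, facing the camera.
const sample = decodeUVPixel(new Float32Array([0.5, 0.25, 1 / 255, 1]), 0);
const texel = toTexel(sample, 2048, 2048); // { x: 1023, y: 511 }
```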
### Precision
Both browser (WebGL FloatType) and headless (Float32 binary) use full 32-bit floating point precision for UV coordinates. This ensures accurate texture mapping even for high-resolution textures (4096×4096+).
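A quick back-of-the-envelope check of why 8-bit UVs would not suffice (the formula below is a worst-case estimate, not taken from the codebase):

```typescript
// Worst-case texel error when a 0..1 UV coordinate is quantized to a
// given bit depth, for a square texture of the given size.
function maxTexelError(textureSize: number, bits: number): number {
  const steps = 2 ** bits - 1;    // representable UV levels
  return textureSize / steps / 2; // half a quantization step, in texels
}

const uint8Error = maxTexelError(4096, 8);    // ≈ 8 texels off in the worst case
const float32Error = maxTexelError(4096, 24); // ≈ 0.0001 texels (24-bit mantissa)
```

So on a 4096×4096 texture, 8-bit UVs can land about 8 texels away from the intended position, while float32 stays well below a single texel.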
## Technical Details
### Browser Rendering

- Uses three.js with WebGL2
- UV map rendered to a `FloatType` render target for full precision
- Custom shader outputs UV coordinates as fragment colors
### Headless Rendering (Modal)
- Uses pyrender with EGL backend
- GPU-accelerated offscreen rendering
- UV map stored as raw Float32 binary (not PNG)
### Projection Algorithm
The projection algorithm:
1. Iterates over each pixel in the edited image
2. Looks up UV coordinates from the UV map (same pixel position, Y-flipped)
3. Skips transparent pixels (alpha < 10)
4. Maps UV to texture coordinates
5. Alpha-blends the edited pixel onto the texture
This is pure pixel math with no GPU/DOM dependencies, making it portable across environments.
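The steps above can be sketched as a single-mesh, single-texture loop (a simplification of `projectWithUVMap` for illustration; the real implementation handles multiple meshes and textures):

```typescript
type Tex = { data: Uint8ClampedArray; width: number; height: number };

// Simplified per-pixel projection: edited view → texture.
// `uvMap` is RGBA float32 with the same dimensions as `edited`.
function projectSketch(
  edited: Tex,
  uvMap: Float32Array,
  texture: Tex,
  alphaThreshold = 10
): void {
  const { width, height } = edited;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      const alpha = edited.data[i + 3];
      if (alpha < alphaThreshold) continue;        // skip transparent pixels
      // UV map is read Y-flipped relative to the image.
      const j = ((height - 1 - y) * width + x) * 4;
      if (uvMap[j + 2] === 0) continue;            // B = 0 means background
      // Map UV to texture coordinates.
      const tx = Math.min(texture.width - 1, Math.floor(uvMap[j] * texture.width));
      const ty = Math.min(texture.height - 1, Math.floor(uvMap[j + 1] * texture.height));
      const t = (ty * texture.width + tx) * 4;
      // Alpha-blend the edited pixel onto the texture.
      const a = alpha / 255;
      for (let c = 0; c < 3; c++) {
        texture.data[t + c] = edited.data[i + c] * a + texture.data[t + c] * (1 - a);
      }
      texture.data[t + 3] = 255;
    }
  }
}

// Example: project a single opaque red pixel onto a 1×1 green texture.
const edited: Tex = { data: new Uint8ClampedArray([255, 0, 0, 255]), width: 1, height: 1 };
const uvMap = new Float32Array([0, 0, 1 / 255, 1]); // U=0, V=0, mesh 1, facing camera
const texture: Tex = { data: new Uint8ClampedArray([0, 255, 0, 255]), width: 1, height: 1 };
projectSketch(edited, uvMap, texture); // texture pixel becomes red
```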
## File Structure
```
tools/texture-painter/
├── index.html              # Browser entry point
├── package.json            # Dependencies
├── tsconfig.json           # TypeScript config
├── vite.config.ts          # Vite build config (outputs to public/)
├── src/
│   ├── main.ts             # Browser app entry
│   ├── TextureProjector.ts # Browser UI and state
│   ├── UI.ts               # UI controls
│   ├── style.css           # Styles
│   └── core/
│       ├── TextureProjectorCore.ts # Browser Three.js integration
│       └── projection.ts           # Shared projection algorithm (portable)
└── README.md               # This file
```
## Export Options

### Browser
- Export Texture: Downloads the largest texture as PNG
- Export GLB: Downloads the modified 3D model with projected textures
### Headless

`TextureProjectionService.projectImage()` returns an `ArrayBuffer` containing the complete modified GLB file.
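In a Node.js context the returned `ArrayBuffer` can be written straight to disk; a minimal sketch (the file name is a placeholder):

```typescript
import { writeFileSync } from "node:fs";

// Persist the ArrayBuffer returned by projectImage() as a .glb file.
// Returns the number of bytes written.
function saveGlb(modifiedGlb: ArrayBuffer, path: string): number {
  const bytes = Buffer.from(modifiedGlb); // zero-copy view over the ArrayBuffer
  writeFileSync(path, bytes);
  return bytes.length;
}

// Example with a hypothetical 4-byte payload:
const written = saveGlb(new ArrayBuffer(4), "model-edited.glb");
```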
## Use Cases

### Manual Touch-ups
- Render view of 3D model
- Open in Photoshop
- Paint fixes, add details, remove artifacts
- Project back to 3D
### AI Texture Generation
- Render view of untextured or placeholder-textured model
- Send to DALL-E/Stable Diffusion for texture generation
- Project AI-generated texture onto model
### Batch Processing
- Render multiple views of multiple models
- Apply consistent edits programmatically
- Project all edits back in parallel
### Real-time Preview Loop
- Render view
- Make quick edits
- Project and preview
- Repeat until satisfied
- Export final GLB
## Integration with WEARFITS API
The texture projection service integrates with the main WEARFITS API:
```jsonc
// In wrangler.jsonc
"POSE_TRANSFER_API_URL": "https://wearfits--v1-pose-transfer.modal.run"
```

```typescript
// Service uses same auth as pose transfer
import { createTextureProjectionService } from './services/texture-projection-service';
const textureProjection = createTextureProjectionService(env);
```
## Troubleshooting

### "No UVs found on mesh"
The GLB model must have UV coordinates. Most 3D modeling software exports UVs by default.
### Projection looks offset
Ensure the camera state (position, target, FOV) matches exactly between render and projection. The headless API returns the exact dimensions used.
### UV precision issues

Use `uvPrecision: 'float32'` for high-resolution textures. The default Float32 format provides enough precision for 4K+ textures.
### Slow headless rendering
First render may have cold start latency (~5-10s). Subsequent renders are fast (~2-3s). The Modal worker stays warm for 3 minutes.