Building Cycling Training Intelligence in One Week with Claude Code

Introduction: Beyond Data Display

Route viewers show you data. CTI makes it explorable.

The goal was simple: transform indoor cycling training files into an interactive, cinematic experience. Not just charts — a way to discover insights about routes, locations, and performance through exploration.

What is Rouvy?

Rouvy is an indoor cycling platform featuring video routes from real-world locations. Each ride generates .fit files containing GPS coordinates, power, heart rate, cadence, and speed data. CTI analyzes these files to visualize your training sessions in 3D.

What Was Built

Production features shipped in ~1 week:

  • Authentication: Supabase auth with email/password, session management, RLS policies
  • File Upload: TUS protocol via Uppy—resumable, chunked uploads with progress tracking
  • FIT Parsing: Binary format parsing for GPS coordinates and telemetry
  • 3D Terrain: Mapbox GL with cinematic camera animations following your route
  • Real-time Charts: Elevation, speed, power, HR, cadence tracking synchronized with map
  • Responsive Design: Desktop and mobile support with touch interactions

Development Approach with Claude Code

Human-in-the-Loop (HITL) Coding:

  1. Foundation Phase: Used Claude Sonnet 4.5 to design data architecture and parsing strategy
  2. Visualization Phase: Implemented 3D terrain and camera animations with AI guidance
  3. Refinement Phase: Iteratively polished cinematic experience and performance

Testing

HITL is essential for frontend development. Visual testing is crucial: for example, spotting that a small change has dropped the frame rate and made rendering janky. Automated tests are still important, but they can't replace visual inspection.

Cost & Timeline:

  • Total spend: ~$50 USD
  • Duration: ~1 week
  • Starting point: Next.js + Supabase template, heavily customized

The AI handled boilerplate, suggested optimizations, and caught edge cases while I focused on architecture and UX decisions.

Tech Stack

Frontend:

  • React 19 (React Compiler enabled)
  • Next.js 16 (App Router, Server Components)
  • TypeScript + Tailwind CSS + shadcn/ui
  • Mapbox GL JS (3D terrain)
  • Recharts (performance visualization)

Backend:

  • Supabase (PostgreSQL, auth, storage)
  • TUS resumable upload protocol
  • Zod schema validation
  • Next.js API routes

Key Libraries:

  • fit-file-parser: Binary FIT file decoding
  • @turf/turf: Geospatial calculations (bearing, distance, interpolation)
  • @uppy/core + @uppy/tus: Polished upload UX with retry logic

FIT Files and Data Parsing

What are .fit files?

A binary format from Garmin (Flexible and Interoperable Data Transfer). It stores GPS tracks, heart rate, power, cadence, speed, elevation, and session metadata such as TSS and intensity factor.

Parsing Implementation (lib/fit-parser.ts):

export interface RoutePoint {
  lat: number;
  lng: number;
  elevation: number;
  speed?: number;    // km/h
  power?: number;    // watts
  heartRate?: number; // bpm
  cadence?: number;   // rpm
}

export interface FitMetadata {
  totalDistance?: number;     // km
  totalAscent?: number;       // meters
  movingTime?: number;        // seconds
  avgSpeed?: number;          // km/h
  avgHeartRate?: number;      // bpm
  avgPower?: number;          // watts
  normalizedPower?: number;   // watts
  trainingStressScore?: number;
  intensityFactor?: number;
  // ... more fields
}

Key parsing steps (a minimal sketch follows the list):

  1. Configure fit-file-parser with force mode, normalized units (km/h, celsius)
  2. Extract route points from record messages: lat/lng (handle semicircle conversion), elevation, metrics
  3. Extract session metadata from session messages: totals, averages, maximums
  4. Validate GPS data exists—reject workout files without position data
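
A minimal sketch of these steps using fit-file-parser's documented options and field names; the project's actual parser may differ in details such as import style, typings, and unit configuration:

import FitParser from "fit-file-parser";

export function parseFitFile(buffer: ArrayBuffer): Promise<{
  points: RoutePoint[];
  metadata: FitMetadata;
}> {
  // Step 1: force mode + normalized units
  const parser = new FitParser({
    force: true,
    speedUnit: "km/h",
    lengthUnit: "km",
    temperatureUnit: "celsius",
  });

  return new Promise((resolve, reject) => {
    parser.parse(buffer, (error: Error | null, data: any) => {
      if (error) return reject(error);

      // Step 2: route points from record messages
      const points: RoutePoint[] = (data.records ?? [])
        .filter((r: any) => r.position_lat != null && r.position_long != null)
        .map((r: any) => ({
          lat: r.position_lat,
          lng: r.position_long,
          elevation: r.altitude ?? 0,
          speed: r.speed,
          power: r.power,
          heartRate: r.heart_rate,
          cadence: r.cadence,
        }));

      // Step 3: session metadata from the first session message
      const session = data.sessions?.[0] ?? {};
      const metadata: FitMetadata = {
        totalDistance: session.total_distance,
        totalAscent: session.total_ascent,
        avgSpeed: session.avg_speed,
        avgHeartRate: session.avg_heart_rate,
        avgPower: session.avg_power,
      };

      // Step 4: reject workout files that carry no position data
      if (points.length === 0) {
        return reject(new Error("FIT file contains no GPS data"));
      }

      resolve({ points, metadata });
    });
  });
}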

Semicircle coordinate conversion (from lib/fit-parser.ts):

// FIT files store in semicircles, but parser may auto-convert
if (Math.abs(lat) > 180 || Math.abs(lng) > 180) {
  lat = (lat * 180) / Math.pow(2, 31);
  lng = (lng * 180) / Math.pow(2, 31);
}

Cinematic Camera Implementation

This was the most technically interesting part of the build, inspired by Mapbox's article on cinematic route animations.

Camera Configuration (components/map-viewer.tsx):

const TERRAIN_EXAGGERATION = 1.0;
const CAMERA_PITCH = 60;           // Cinematic perspective
const CAMERA_ZOOM = 16;
const BEARING_LERP_FACTOR = 0.3;   // 30% blend for smooth turns

Bearing Smoothing with LERP:

Sharp turns cause jarring camera rotations. Linear interpolation (LERP) smooths bearing transitions while handling 360° wrap-around:

function lerpBearing(prevBearing: number, newBearing: number, factor: number): number {
  // Calculate shortest angular difference
  let diff = newBearing - prevBearing;

  // Normalize to -180 to 180 range
  while (diff > 180) diff -= 360;
  while (diff < -180) diff += 360;

  // Apply LERP interpolation
  const smoothBearing = prevBearing + diff * factor;

  // Normalize to 0-360 range
  return (smoothBearing + 360) % 360;
}

Without normalization, rotating from 350° to 10° would spin 340° backward instead of 20° forward. The LERP factor (0.3) controls smoothing strength—lower values = smoother but more lag.

Dynamic Camera Positioning:

// Position camera at current progress distance
const currentDistance = newProgress * routeLength.current;
const cameraPoint = turf.along(routeLineString.current, currentDistance, {
  units: "kilometers",
});

// Look ahead 500m for bearing calculation
const lookAheadDistance = Math.min(currentDistance + 0.5, routeLength.current);
const lookAheadPoint = turf.along(routeLineString.current, lookAheadDistance, {
  units: "kilometers",
});

// Calculate bearing from camera to look-ahead point
const targetBearing = turf.bearing(cameraPoint, lookAheadPoint);

// Smooth bearing transitions
const smoothBearing = lerpBearing(previousBearing.current, targetBearing, BEARING_LERP_FACTOR);
previousBearing.current = smoothBearing;

// Update camera without easing (we control smoothness)
map.jumpTo({
  center: cameraPoint.geometry.coordinates as [number, number],
  bearing: smoothBearing,
  zoom: zoom,
  pitch: CAMERA_PITCH,
});

Animation Loop:

// `now` is the timestamp passed to the requestAnimationFrame callback
// Speed-scaled elapsed time
const elapsed = (now - startTimeRef.current) * speed * SPEED_SCALE_FACTOR;
const newProgress = Math.min(elapsed / animationDuration, 1);

// Update route gradient (throttled to ~30fps)
if (now - lastGradientUpdateRef.current >= 33) {
  lastGradientUpdateRef.current = now;
  map.setPaintProperty("route", "line-gradient", [
    "step",
    ["line-progress"],
    "#FFD700",              // Gold color for travelled section
    newProgress,
    "rgba(255, 215, 0, 0)", // Transparent for upcoming section
  ]);
}

animationFrameRef.current = requestAnimationFrame(animate);

The gradient creates a "breadcrumb trail" effect—gold shows where you've been, transparent shows what's ahead. Updates at 30fps while camera runs at 60fps.

Key Techniques:

  • turf.along() for distance-based positioning (not index-based—handles varying point density)
  • 500m look-ahead for bearing calculation (prevents camera from looking at feet)
  • jumpTo() instead of easeTo() (manual smoothing via LERP, no automatic easing)
  • requestAnimationFrame for 60fps camera, throttled gradient updates

Supabase Integration

Authentication:

  • Server-side with cookies (@supabase/ssr); see the sketch after this list
  • Middleware handles session refresh automatically
  • Protected routes redirect to login
  • Email/password signup with confirmation emails
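
The server-side client follows the standard @supabase/ssr cookie pattern. A minimal sketch, with the file location and environment variable names assumed:

// lib/supabase/server.ts (path assumed)
import { createServerClient } from "@supabase/ssr";
import { cookies } from "next/headers";

export async function createClient() {
  const cookieStore = await cookies();

  return createServerClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!,
    {
      cookies: {
        getAll() {
          return cookieStore.getAll();
        },
        setAll(cookiesToSet) {
          // Server Components cannot write cookies; middleware handles refresh there
          try {
            cookiesToSet.forEach(({ name, value, options }) =>
              cookieStore.set(name, value, options)
            );
          } catch {
            // ignore when called from a Server Component
          }
        },
      },
    }
  );
}

The middleware counterpart builds a similar client against the request/response cookies and calls supabase.auth.getUser() on each request, which keeps sessions refreshed and lets protected routes redirect unauthenticated users.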

Database Schema:

CREATE TABLE attachments (
  id UUID PRIMARY KEY,
  user_id UUID REFERENCES auth.users NOT NULL,
  file_name TEXT NOT NULL,
  file_size BIGINT NOT NULL,
  storage_path TEXT NOT NULL,
  fit_metadata JSONB,
  route_title TEXT,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

-- RLS policies: users can SELECT/INSERT/DELETE only their own files
CREATE POLICY "Users can view own attachments"
  ON attachments FOR SELECT USING (auth.uid() = user_id);

Indexes on user_id and created_at optimize queries.

Storage:

  • user-attachments bucket with TUS protocol support
  • File path pattern: {user_id}/{filename}
  • Signed URLs with 1-hour expiry for downloads (see the sketch after this list)
  • Proxy endpoint (app/api/storage/upload/resumable/[[...file]]/route.ts) forwards TUS requests to Supabase with auth headers
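
Signed downloads use the storage client's createSignedUrl; a quick sketch with the bucket name and path pattern from the list above (supabase and user are assumed to be in scope):

// 1-hour signed URL for a stored .fit file
const { data, error } = await supabase.storage
  .from("user-attachments")
  .createSignedUrl(`${user.id}/${fileName}`, 60 * 60);

if (error) throw error;
const downloadUrl = data.signedUrl;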

TUS File Upload with Uppy

Client Implementation (components/file-list-drawer.tsx):

const uppy = new Uppy({
  restrictions: {
    maxFileSize: 10 * 1024 * 1024, // 10MB
    allowedFileTypes: ['.fit'],
  },
})
.use(Tus, {
  endpoint: '/api/storage/upload/resumable',
  chunkSize: 6 * 1024 * 1024, // 6MB chunks
  retryDelays: [0, 1000, 3000, 5000],
});

Pre-upload Validation:

Before uploading, parse the .fit file locally (sketched after this list) to:

  1. Validate it's a valid FIT file with GPS data
  2. Extract metadata for database storage
  3. Reject invalid files immediately (no wasted uploads)
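
One way this could be wired is through Uppy's pre-processor hook, reusing a parseFitFile helper like the one sketched earlier (the project may instead validate on the file-added event):

uppy.addPreProcessor(async (fileIDs) => {
  for (const id of fileIDs) {
    const file = uppy.getFile(id);
    const buffer = await (file.data as Blob).arrayBuffer();

    // Rejects (and fails the upload) if the file has no GPS data
    const { metadata } = await parseFitFile(buffer);

    // Carry the parsed metadata along for the post-upload database record
    uppy.setFileMeta(id, { fitMetadata: JSON.stringify(metadata) });
  }
});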

Server Proxy:

Forwards TUS requests (POST, PATCH, HEAD, OPTIONS, DELETE) to Supabase Storage with the session auth token, enabling resumable uploads even if the connection drops.
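
Roughly the shape of that proxy as a Next.js catch-all route handler. The Supabase resumable-upload endpoint is the documented one, but the helper import path is an assumption, and a production handler would also rewrite the TUS Location header back to the proxy path:

// app/api/storage/upload/resumable/[[...file]]/route.ts (simplified)
import { NextRequest, NextResponse } from "next/server";
import { createClient } from "@/lib/supabase/server"; // path assumed

const TUS_HEADERS = ["tus-resumable", "upload-length", "upload-metadata", "upload-offset", "content-type"];

async function proxy(req: NextRequest) {
  const supabase = await createClient();
  const { data: { session } } = await supabase.auth.getSession();
  if (!session) return new NextResponse("Unauthorized", { status: 401 });

  // Preserve any TUS upload id appended after the proxy prefix
  const suffix = req.nextUrl.pathname.replace("/api/storage/upload/resumable", "");
  const target = `${process.env.NEXT_PUBLIC_SUPABASE_URL}/storage/v1/upload/resumable${suffix}`;

  // Forward only the TUS headers, swapping in the user's access token
  const headers = new Headers();
  for (const name of TUS_HEADERS) {
    const value = req.headers.get(name);
    if (value) headers.set(name, value);
  }
  headers.set("authorization", `Bearer ${session.access_token}`);

  const upstream = await fetch(target, {
    method: req.method,
    headers,
    body: ["POST", "PATCH"].includes(req.method) ? await req.arrayBuffer() : undefined,
  });

  return new NextResponse(upstream.body, { status: upstream.status, headers: upstream.headers });
}

export { proxy as POST, proxy as PATCH, proxy as HEAD, proxy as OPTIONS, proxy as DELETE };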

Flow:

  1. User drops .fit file in drawer
  2. Parse locally → validate GPS data, extract metadata
  3. Upload via TUS with progress tracking
  4. On success → create database record with metadata (sketched after this list)
  5. UI updates with new route in file list
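
Step 4 in sketch form, using the column names from the schema above (supabase, user, file, and metadata are assumed to be in scope from the preceding steps):

const { error } = await supabase.from("attachments").insert({
  id: crypto.randomUUID(), // or rely on a database default
  user_id: user.id,        // must match auth.uid() to satisfy RLS
  file_name: file.name,
  file_size: file.size,
  storage_path: `${user.id}/${file.name}`,
  fit_metadata: metadata,  // parsed client-side before the upload
  route_title: file.name.replace(/\.fit$/i, ""),
});

if (error) throw error;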

Chart Performance Optimizations

Problem: Training files contain thousands of GPS points (1-2 second intervals). Rendering all points in Recharts causes sluggish performance.

M4 Downsampling Algorithm (lib/m4-downsample.ts):

Min/max bucketing in the spirit of the M4 algorithm reduces the data while preserving visual fidelity:

export interface DataPoint {
  index: number; // position in the original series, used to restore order
  value: number; // metric plotted on the chart (e.g. elevation)
}

export function m4Downsample<T extends DataPoint>(
  data: T[],
  targetPoints: number
): T[] {
  if (data.length <= targetPoints) return data;

  const bucketSize = (data.length - 2) / (targetPoints - 2);
  const downsampled: T[] = [data[0]]; // Always keep first

  for (let i = 0; i < targetPoints - 2; i++) {
    const bucketStart = Math.floor(i * bucketSize) + 1;
    const bucketEnd = Math.min(Math.floor((i + 1) * bucketSize) + 1, data.length - 1);

    // Find min and max in bucket
    let minPoint = data[bucketStart];
    let maxPoint = data[bucketStart];
    for (let j = bucketStart + 1; j < bucketEnd; j++) {
      if (data[j].value < minPoint.value) minPoint = data[j];
      if (data[j].value > maxPoint.value) maxPoint = data[j];
    }

    // Add in chronological order (preserves visual peaks)
    const [first, second] =
      minPoint.index <= maxPoint.index ? [minPoint, maxPoint] : [maxPoint, minPoint];
    downsampled.push(first);
    if (first !== second) downsampled.push(second);
  }

  downsampled.push(data[data.length - 1]); // Always keep last

  // Deduplicate and restore chronological order
  return Array.from(new Set(downsampled)).sort((a, b) => a.index - b.index);
}

Strategy:

  • Divide data into buckets
  • Keep min/max per bucket (preserves peaks and valleys)
  • Always preserve first/last points
  • Result: 500 points for 310px chart width (~90% reduction)

Throttling (hooks/use-throttled-progress.ts):

Limit chart re-renders to 30 FPS:

import { useEffect, useRef, useState } from "react";

export function useThrottledProgress(progress: number, fps: number = 30) {
  const [throttledProgress, setThrottledProgress] = useState(progress);
  const lastUpdateRef = useRef(0);
  const frameInterval = 1000 / fps;

  useEffect(() => {
    const now = performance.now();
    if (now - lastUpdateRef.current >= frameInterval) {
      setThrottledProgress(progress);
      lastUpdateRef.current = now;
    } else {
      // Schedule update for next frame interval
      const timeout = setTimeout(() => {
        setThrottledProgress(progress);
        lastUpdateRef.current = performance.now();
      }, frameInterval - (now - lastUpdateRef.current));
      return () => clearTimeout(timeout);
    }
  }, [progress, frameInterval]);

  return throttledProgress;
}

Additional Optimizations:

  • useMemo for expensive calculations (cumulative distances, grade percentages)
  • Dual datasets: full precision for data lookup, downsampled for rendering
  • React.memo on chart components to prevent unnecessary re-renders (see the sketch after this list)
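
A minimal sketch of that memoization pattern; the component, import paths, and the toChartPoints mapper are hypothetical stand-ins:

import { memo, useMemo } from "react";
import { Line, LineChart, XAxis, YAxis } from "recharts";
import { m4Downsample, type DataPoint } from "@/lib/m4-downsample"; // path assumed
import type { RoutePoint } from "@/lib/fit-parser";                 // path assumed

// Hypothetical mapper: chart points need index/value pairs for downsampling
const toChartPoints = (points: RoutePoint[]): DataPoint[] =>
  points.map((p, index) => ({ index, value: p.elevation }));

export const ElevationChart = memo(function ElevationChart({ points }: { points: RoutePoint[] }) {
  // Downsample once per dataset, not on every playback frame
  const chartData = useMemo(() => m4Downsample(toChartPoints(points), 500), [points]);

  // The playback cursor (driven by the throttled progress) is omitted from this sketch
  return (
    <LineChart width={310} height={130} data={chartData}>
      <XAxis dataKey="index" hide />
      <YAxis hide />
      <Line type="monotone" dataKey="value" dot={false} isAnimationActive={false} />
    </LineChart>
  );
});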

Result: Smooth 60fps map animation with charts updating at 30fps; the visual difference from the full dataset is imperceptible.

Desktop & Mobile Support

Responsive Design:

  • Safe area insets for iPhone notch and home indicator (see the sketch after this list)
  • Tablet landscape detection (wider layouts)
  • Phone landscape mode (compact controls)
  • Chart sizing: 310px × 130px (optimized for mobile screens)
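
For the safe-area insets specifically, one option is Tailwind arbitrary values reading the env() insets (this assumes viewport-fit=cover is set in the viewport metadata):

import type { ReactNode } from "react";

// Bottom control bar padded clear of the home indicator and curved screen edges
export function ControlBar({ children }: { children: ReactNode }) {
  return (
    <div className="fixed inset-x-0 bottom-0 pb-[env(safe-area-inset-bottom)] pl-[env(safe-area-inset-left)] pr-[env(safe-area-inset-right)]">
      {children}
    </div>
  );
}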

Touch Interactions:

  • Drag-to-collapse elevation panel
  • Touch-friendly play/pause, speed controls, zoom buttons
  • File drawer slides from left with drag gesture
  • Popover instead of hover cards (works on touch screens)

Accessibility:

  • Keyboard navigation for controls
  • ARIA labels on interactive elements
  • Focus management for modals and drawers

Future Development Ideas

I am thinking about an intelligence layer, pace cards, route discovery and social sharing.

Potential enhancements:

  • Multi-ride Comparison: Overlay routes, compare performance across sessions
  • Segment Analysis: Auto-detect climbs, sprints, intervals
  • Training Load: Track TSS, intensity factor, recovery metrics over time
  • Social Features: Share routes, public leaderboards, kudos
  • Platform Integration: Import from Strava, TrainingPeaks, Zwift
  • Route Recommendations: Suggest rides based on fitness level and goals
  • Weather Overlay: Historical weather data along route
  • Power Analysis: Critical power curve, FTP estimation, L-R balance
  • Export Options: Video renders of routes, PDF training reports

Conclusion

Building CTI demonstrated the power of AI-assisted development for complex, interactive applications. The combination of modern web technologies (React 19, Next.js 16, Mapbox GL) with Claude Code's guidance enabled rapid iteration from concept to production.

Key takeaways:

  • HITL approach works: AI handles boilerplate and optimization, humans make architectural decisions, and visual testing remains crucial for performance and polish.
  • Cost-effective: $50 and one week for a production-ready app
  • Performance matters: Downsampling and throttling enable smooth 60fps with large datasets
  • Cinematic UX: Small details (LERP smoothing, look-ahead bearing) make a huge difference in feel

The result is an app that transforms static training data into an explorable, entertaining experience — exactly what we set out to build.


Technologies: React 19, Next.js 16, TypeScript, Tailwind CSS, Mapbox GL JS, Supabase, TUS Protocol, Uppy

Built with: Claude Sonnet 4.5 via Claude Code CLI