Camera Mounting & Analytics Guide

Optimal Camera Placement for AI-Powered Traffic Analytics
Complete Technical Manual for WINK AI Traffic & WINK Analytics
WINK Streaming, Inc.
Version 1.0 - 2025

Optimal Results - Camera Mounting and Placement

Best practices for camera installation to maximize traffic detection accuracy and reliability

Why Good Camera Mounting Matters

Proper camera mounting is critical for accurate traffic detection. A correctly mounted camera keeps the scene stable, minimizes occlusion and glare, and gives the detection models a consistent view of every lane.

Optimal Mounting Angles

Top-Down View (60-80°)

Recommended
Advantages:
  • Minimal vehicle overlap
  • Clear lane separation
  • Reduced shadow interference
  • Accurate vehicle counting
Best For:
  • Vehicle counting
  • Speed estimation
  • Lane occupancy
  • FHWA classification

Side View (20-40°)

Alternative
Advantages:
  • Good lane separation
  • License plate visibility
  • Vehicle type classification
  • Incident detection
Best For:
  • Lane-specific analysis
  • Vehicle identification
  • Behavioral analysis
  • Queue detection

Mounting Height Guidelines

Camera Type      | Optimal Height | Coverage Area | Use Case
Overhead Gantry  | 20-30 feet     | 3-4 lanes     | Multi-lane counting
Side Pole Mount  | 15-20 feet     | 2-3 lanes     | Lane-specific analysis
Bridge Mount     | 25-40 feet     | All lanes     | Full highway coverage
Cantilever Mount | 18-25 feet     | 2-4 lanes     | Partial coverage

Camera Stability Requirements

Critical Factors

  • Vibration Dampening: Use shock-absorbing mounts to minimize camera shake
  • Wind Resistance: Ensure mounts can withstand 80+ mph wind loads
  • Temperature Stability: Account for thermal expansion/contraction
  • Rigid Mounting: Avoid flexible poles or loose connections

Best Practices

  • Guy Wires: Use for tall poles in windy areas
  • Concrete Foundations: Minimum 4-foot depth for pole mounts
  • Regular Maintenance: Check mount tightness quarterly
  • Image Stabilization: Enable electronic stabilization when available

Impact on Detection Accuracy

Mounting Issue | Detection Impact        | False Positive Rate  | Solution
Camera Shake   | -35% accuracy           | +45% false motion    | Stabilize mount, add dampeners
Poor Angle     | -25% accuracy           | +20% missed vehicles | Adjust to 60-80° angle
Low Height     | -40% coverage           | +30% occlusions      | Raise to 20+ feet
Side Glare     | -20% dawn/dusk accuracy | +15% shadows         | Add sun shields, adjust angle

Understanding Margins of Error

Common Sources of Detection Error

Vehicle Occlusion

When vehicles block other lanes or partially hide other vehicles:

  • Cross-lane occlusion: Large trucks blocking adjacent lanes (±5-15% error)
  • Same-lane occlusion: Tailgating vehicles appear as one (±3-8% error)
  • Merge/weave areas: Complex overlapping patterns (±10-20% error)
Environmental Factors
  • Heavy rain/snow: Reduced visibility (±15-25% error)
  • Sun glare: Dawn/dusk periods (±10-15% error)
  • Night conditions: Headlight glare and shadows (±5-10% error)
Technical Limitations
  • Frame rate: fast vehicles can cross the detection zone between consecutive frames (±2-5% error)
  • Resolution limits: Distant lane detection (±3-7% error)
  • Processing latency: Missed short-duration events (±1-3% error)

Typical Error Rates by Scenario

Best Case: Clear weather, optimal mounting: ±2-5% error

Average Case: Mixed conditions: ±8-12% error

Worst Case: Heavy traffic, poor weather: ±15-25% error
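These margins can be applied as a quick sanity band around any reported count. A minimal sketch in Python, using the upper bound of each scenario's quoted range as a conservative choice (the function and scenario keys are illustrative, not part of WINK Analytics):

```python
# Conservative error bands around a reported vehicle count, using the
# upper bound of each scenario's error range quoted above.
ERROR_RATE = {"best": 0.05, "average": 0.12, "worst": 0.25}

def count_band(reported: int, scenario: str) -> tuple[int, int]:
    """Return (low, high) plausible true-count bounds for a reported count."""
    rate = ERROR_RATE[scenario]
    return round(reported * (1 - rate)), round(reported * (1 + rate))

low, high = count_band(1000, "average")  # -> (880, 1120)
```

For example, a reported hourly count of 1,000 vehicles under mixed conditions should be read as "somewhere between roughly 880 and 1,120."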

Calibration Strategies

Secondary Camera Counting

Use a dedicated counting camera to calibrate detection accuracy:

Implementation:
  • Position counting camera perpendicular to traffic flow
  • Use simple line-crossing algorithm for ground truth
  • Compare detection results with actual counts
  • Calculate correction factors per lane/time of day
Benefits:
  • Real-time accuracy validation
  • Automatic bias correction
  • Weather-specific calibration factors
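The correction-factor step above reduces to a ratio of ground-truth to detected counts, bucketed per lane and time of day. A minimal sketch (the function and tuple layout are illustrative, not a WINK API):

```python
from collections import defaultdict

def correction_factors(samples):
    """samples: iterable of (lane, hour, detected, ground_truth) tuples,
    where ground_truth comes from the counting camera's line-crossing
    counts. Returns {(lane, hour): factor} with true_count ~ detected * factor."""
    detected = defaultdict(int)
    truth = defaultdict(int)
    for lane, hour, d, t in samples:
        detected[(lane, hour)] += d
        truth[(lane, hour)] += t
    return {k: truth[k] / detected[k] for k in detected if detected[k] > 0}

# Detection camera undercounts lane 1 at 07:00, so the factor is > 1.
factors = correction_factors([(1, 7, 90, 100), (1, 7, 95, 100)])
```

Applying the factor at query time (rather than altering raw detections) keeps the uncorrected counts available for later re-calibration.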

Multi-Camera Comparison

Multiple cameras on same roadway for cross-validation:

Overlapping Coverage

Cameras with 20-30% overlap zones for verification

  • Compare counts in overlap areas
  • Identify systematic biases
  • Validate vehicle classifications
Sequential Validation

Cameras at entry/exit points for continuity checks

  • Track vehicle flow conservation
  • Detect missing/phantom vehicles
  • Calibrate travel time estimates
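Sequential validation amounts to a conservation check: vehicles entering a closed segment should equal vehicles exiting it, adjusted for any ramp flow. A minimal sketch, with an illustrative tolerance rather than a WINK default:

```python
def flow_discrepancy(entry_count: int, exit_count: int,
                     ramp_in: int = 0, ramp_out: int = 0) -> float:
    """Fractional mismatch between expected and observed exit counts.
    Values well above the typical error band (say > 0.10) suggest
    missed or phantom vehicles at one of the two cameras."""
    expected_exit = entry_count + ramp_in - ramp_out
    if expected_exit == 0:
        return 0.0
    return abs(exit_count - expected_exit) / expected_exit

# 500 in, 20 on-ramp, 30 off-ramp -> 490 expected; 470 observed (~4.1%).
d = flow_discrepancy(500, 470, ramp_in=20, ramp_out=30)
```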
Note: While multi-camera validation improves accuracy, the computational cost often outweighs benefits. Single detection + counting camera typically provides 90% of the accuracy at 50% of the cost.

Technical Configuration Parameters

Confidence Thresholds

Vehicle Type | Day Threshold | Night Threshold | Bad Weather
Cars         | 0.75          | 0.65            | 0.60
Trucks       | 0.80          | 0.70            | 0.65
Motorcycles  | 0.70          | 0.60            | 0.55
Buses        | 0.85          | 0.75            | 0.70

Lower thresholds in poor conditions reduce missed detections but may increase false positives
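In configuration terms, the table above is a lookup keyed by vehicle type and condition. A minimal sketch (the dict mirrors the table; key names are illustrative, not WINK configuration keys):

```python
# Per-type confidence thresholds from the table above.
CONFIDENCE = {
    "car":        {"day": 0.75, "night": 0.65, "bad_weather": 0.60},
    "truck":      {"day": 0.80, "night": 0.70, "bad_weather": 0.65},
    "motorcycle": {"day": 0.70, "night": 0.60, "bad_weather": 0.55},
    "bus":        {"day": 0.85, "night": 0.75, "bad_weather": 0.70},
}

def accept(vehicle_type: str, condition: str, score: float) -> bool:
    """Keep a detection only if its score clears the per-type threshold."""
    return score >= CONFIDENCE[vehicle_type][condition]

accept("truck", "night", 0.72)  # True: 0.72 >= 0.70
```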

Lane Distance Parameters

Lane Width Detection
  • Standard lane: 12 feet (3.7m)
  • Tolerance: ±1 foot (0.3m)
  • Shoulder detection: >14 feet from centerline
Cross-Lane Tracking
  • Max lateral speed: 10 ft/sec (lane change)
  • Min dwell time: 2 seconds in lane
  • Lane assignment confidence: 0.85
Occlusion Handling
  • Partial occlusion: >40% visible = valid
  • Max occlusion time: 3 seconds
  • Reacquisition distance: 50 feet
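The occlusion-handling and lane-assignment parameters combine into a simple per-track validity test. A minimal sketch using the thresholds listed above (the function shape is illustrative):

```python
def detection_valid(visible_fraction: float,
                    seconds_occluded: float,
                    lane_confidence: float) -> bool:
    """Apply the occlusion and lane-assignment rules from above:
    more than 40% of the vehicle visible, occluded for at most 3 s,
    and lane assignment confidence of at least 0.85."""
    return (visible_fraction > 0.40
            and seconds_occluded <= 3.0
            and lane_confidence >= 0.85)

detection_valid(0.55, 1.2, 0.90)   # True
detection_valid(0.30, 0.0, 0.99)   # False: too heavily occluded
```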

Camera Installation Checklist

Pre-Installation

  • Survey site for optimal viewing angle
  • Check power and network availability
  • Verify mounting structure stability
  • Calculate required camera height
  • Plan cable routing and protection

During Installation

  • Use calibrated angle finder for precision
  • Install vibration dampeners
  • Secure all mounting hardware
  • Apply thread-locking compound
  • Test live view before finalizing

Post-Installation

  • Verify detection accuracy
  • Check for image stability
  • Test in various weather conditions
  • Document final angles and height
  • Schedule maintenance intervals

Highway Camera Analytics – Technical Requirements

Essential technical specifications for optimal AI-powered traffic monitoring and vehicle detection

Technical Foundation for Accurate Analytics

The success of AI-based traffic monitoring depends heavily on meeting technical requirements. This guide outlines the critical specifications needed for reliable detection, tracking, and classification.

Video Capture Specifications

Parameter  | Minimum          | Recommended                | Notes
Frame Rate | 15 fps           | 25-30 fps                  | Tracking is unreliable below 15 fps
Resolution | 1280×720 (720p)  | 1920×1080 (1080p) or 4MP   | Needed for detection & classification
Codec      | H.264            | H.265 (HEVC)               | H.265 reduces bandwidth at the same quality
Protocol   | RTSP, UDP, RTP   | RTSP (TCP), SRT            | RTSP over TCP for unreliable networks
Bitrate    | 2 Mbps           | 4-8 Mbps                   | Per stream; depends on settings
WDR        | Optional         | Yes                        | Handles sun, glare, and shadows
IR/Night   | Yes              | Yes                        | 24/7 analytics support
Streaming Protocols:
  • Direct UDP streaming (raw or RTP) delivers performance equivalent to RTSP
  • Supported protocols: RTSP (TCP or UDP), UDP, RTP, SRT, or RTMP
  • For H.265: Keep keyframe intervals low (1-2 seconds) to minimize video loss from packet drops
  • On unreliable networks: RTSP over TCP or SRT are preferred to prevent packet loss
⚠️ Important: HLS (HTTP Live Streaming) is not recommended for real-time analytics due to high latency, chunked delivery, and unreliable frame arrival timing. Use RTSP or SRT for low-latency, frame-accurate video required by AI tracking and vehicle classification systems.
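The keyframe-interval advice follows from how loss propagates in inter-coded video: a dropped packet can corrupt decoding until the next keyframe arrives. A rough expected-loss estimate (a deliberately simplified model in which each drop event corrupts, on average, half a keyframe interval; illustrative, not a WINK metric):

```python
def expected_loss_seconds(drops_per_hour: float,
                          keyframe_interval_s: float) -> float:
    """Rough expected unusable video per hour: each drop event corrupts
    decoding until the next keyframe, on average half a GOP."""
    return drops_per_hour * keyframe_interval_s / 2

expected_loss_seconds(20, 2.0)   # 20.0 s/hour with 2 s keyframes
expected_loss_seconds(20, 10.0)  # 100.0 s/hour with 10 s keyframes
```

Under this model, shortening the keyframe interval from 10 s to 2 s cuts expected loss fivefold on the same lossy link, which is why short intervals are recommended for H.265 over UDP.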

Lighting & Visibility Requirements

Camera Features

  • IR/Night Vision: Required for dark conditions
  • WDR (Wide Dynamic Range): Required for high contrast scenes
  • Minimum Illumination: 0.01 lux (color), 0.001 lux (B&W)
  • Smart IR: Prevents overexposure of nearby objects

Site Lighting

  • External Lighting: Adequate roadway illumination for night accuracy
  • Light Pollution: Shield cameras from direct light sources
  • Headlight Compensation: HLC or BLC features recommended
  • Dawn/Dusk Performance: Auto-exposure adjustment critical

Performance Impact by Configuration

Configuration   | Tracking Quality                    | Classification Accuracy            | Typical Use Case
<15 fps         | Poor tracking, frequent ID switches | Missed vehicles, unreliable counts | Not recommended
15-20 fps, 720p | Basic tracking possible             | 7-class classification             | Minimum viable configuration
30+ fps, 4MP+   | Excellent tracking, minimal losses  | Full classification, >95% accuracy | Optimal for complex intersections

Stream Quality Impact on Detection

Quality Tier | Resolution | Frame Rate | Compression      | Detection Rate | Result
Poor         | <720p      | <15 fps    | High compression | 45%            | High false positives
Acceptable   | 720p-1080p | 15-25 fps  | H.264/H.265      | 75%            | 7-class tracking
Optimal      | 1080p-4K   | 25-30+ fps | Low latency      | 95%+           | Full classification

Low Quality Impact

  • Vehicles merge into single blobs
  • Cannot distinguish vehicle types
  • Tracking breaks frequently
  • Unusable in rain/night

Medium Quality Trade-offs

  • Basic vehicle counting works
  • Simple classification possible
  • Some tracking losses at high speed
  • Reduced accuracy at night

High Quality Benefits

  • Full vehicle classification
  • Reliable multi-lane tracking
  • Works in all weather conditions
  • Accurate speed estimation

Bandwidth vs Quality Trade-offs

Optimization Strategies

Limited Bandwidth (< 5 Mbps)
  • Use H.265 codec for 50% bandwidth savings
  • Reduce to 720p if needed
  • Consider SRT for packet loss recovery
  • Implement adaptive bitrate
Moderate Bandwidth (5-10 Mbps)
  • 1080p @ 25fps with H.264/H.265
  • RTSP over TCP for reliability
  • Enable WDR for lighting changes
  • Standard keyframe intervals
High Bandwidth (> 10 Mbps)
  • 4K resolution for maximum detail
  • 30+ fps for smooth tracking
  • Raw UDP/RTP for lowest latency
  • Multiple streams for redundancy

System Integration Considerations

Bandwidth Calculations

Estimate bandwidth requirements for your deployment:

Configuration        | Bandwidth
1080p @ 30fps, H.264 | 4-8 Mbps per camera
1080p @ 30fps, H.265 | 2-4 Mbps per camera
4MP @ 25fps, H.265   | 4-6 Mbps per camera

Add 20% overhead for network protocols and bursts
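Those figures can be turned into a quick fleet-sizing estimate. A minimal sketch using the per-camera ranges above and the 20% protocol overhead (profile names are illustrative labels, not WINK settings):

```python
# Per-camera bitrate ranges in Mbps, from the table above.
PROFILES = {
    "1080p30_h264": (4.0, 8.0),
    "1080p30_h265": (2.0, 4.0),
    "4mp25_h265":   (4.0, 6.0),
}
OVERHEAD = 1.20  # +20% for network protocols and bursts

def deployment_bandwidth(profile: str, cameras: int) -> tuple[float, float]:
    """Return (low, high) total Mbps for a fleet of identical cameras."""
    low, high = PROFILES[profile]
    return (low * cameras * OVERHEAD, high * cameras * OVERHEAD)

deployment_bandwidth("1080p30_h265", 10)  # roughly 24-48 Mbps total
```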

FHWA Vehicle Classification Guide

Understanding the Federal Highway Administration's 13-class vehicle classification system for traffic monitoring

Overview of FHWA Classification

The Federal Highway Administration (FHWA) defines 13 distinct vehicle classes for standardized traffic monitoring and reporting. Our system performs this classification as a background process after initial vehicle detection, ensuring accurate categorization without impacting real-time performance.

How Classification Works

Real-time Detection (vehicle found) → Crop Extraction (isolate vehicle) → Background Queue (async processing) → FHWA Classification (classes 1-13)

Benefits of Background Processing

No Impact on Real-time

Detection continues at full speed while classification happens asynchronously

Higher Accuracy

More computational resources available for detailed analysis of each vehicle

Flexible Models

Can use different models for different vehicle types without slowing detection

Easy Updates

Classification models can be updated without affecting core detection

The 13 FHWA Vehicle Classes

1. Motorcycles: 2 or 3 wheels
2. Passenger Cars: sedans, coupes, station wagons
3. Light Trucks/SUVs: pickups, vans, SUVs
4. Buses: all bus types
5. Single Unit (2 Axle, 6 Tire): box trucks, large vans
6. Single Unit (3 Axle): dump trucks, concrete mixers
7. Single Unit (4+ Axle): heavy single units
8. Single Trailer (3-4 Axle): small semi-trailers
9. Single Trailer (5 Axle): standard 18-wheelers
10. Single Trailer (6+ Axle): heavy semi-trailers
11. Multi-Trailer (5 or Fewer Axles): double bottoms
12. Multi-Trailer (6 Axle): triple trailers
13. Multi-Trailer (7+ Axle): turnpike doubles

Implementation Strategies

Progressive Implementation

Start simple and add complexity as your system matures:

1. Basic Categories: start with Car/Truck/Bus/Motorcycle (4 classes)
2. Size Differentiation: add small/medium/large truck categories (7 classes)
3. Axle Detection: implement axle counting for truck sub-classification
4. Full FHWA: complete the 13-class implementation with validation
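The first step of the progression, collapsing the 13 FHWA classes into 4 basic categories, can be expressed as a simple mapping. A minimal sketch (whether class 3 pickups/SUVs count as cars or trucks is a deployment choice; here they are grouped with cars, as is common for passenger vehicles):

```python
def basic_category(fhwa_class: int) -> str:
    """Collapse the 13 FHWA classes into Car/Truck/Bus/Motorcycle."""
    if fhwa_class == 1:
        return "motorcycle"
    if fhwa_class in (2, 3):        # passenger cars, light trucks/SUVs
        return "car"
    if fhwa_class == 4:
        return "bus"
    if 5 <= fhwa_class <= 13:       # single-unit and combination trucks
        return "truck"
    raise ValueError(f"unknown FHWA class {fhwa_class}")

basic_category(9)  # "truck": a standard 18-wheeler
```

Because the mapping is many-to-one, counts collected at the 4-class stage remain comparable after the full 13-class model is deployed.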

Technical Considerations

Model Selection
  • Use lightweight models for initial detection
  • Deploy specialized models for classification
  • Consider ensemble approaches for difficult classes
Data Requirements
  • Collect diverse samples for each class
  • Account for regional vehicle variations
  • Include different weather/lighting conditions
Validation Methods
  • Cross-reference with manual counts
  • Compare with loop detector data if available
  • Regular accuracy assessments

Integration with Traffic Systems

FHWA classification integrates with other traffic monitoring components such as vehicle counting, speed estimation, and incident detection.

Conclusion

This guide combines best practices for camera mounting, technical specifications, and FHWA vehicle classification to help organizations deploy successful AI-powered traffic analytics systems with WINK AI Traffic and WINK Analytics.

For technical support or additional guidance, contact WINK Streaming, Inc.

© 2025 WINK Streaming, Inc. All rights reserved.
