Mobility in African cities is not only a geometric problem; it is a socio-political challenge shaped by road culture, infrastructural gaps, informal transport systems, and environmental uncertainty.

Assistive Futures engineers context-aware mobility intelligence for persons with disabilities (PWDs), building systems that understand local realities and operate autonomously in complex, unpredictable environments.

Four Mobility Systems

Autonomous Wheelchair Systems

A-CHAIR

A fusion of robotics, computer vision, and decision intelligence designed specifically for Ugandan terrains and urban environments.

Technical Foundation

LiDAR + Stereo-Vision

3D environmental mapping and real-time obstacle detection
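As a rough illustration of how fused LiDAR/stereo data can drive real-time obstacle detection, the sketch below flags points from a 3D point cloud that fall inside the chair's forward "danger corridor." The coordinate convention, corridor dimensions, and ground-clearance threshold are all invented assumptions, not A-CHAIR's actual parameters.

```python
# Hypothetical sketch: flag obstacles in a forward corridor from a fused
# LiDAR/stereo point cloud. Points are (x, y, z) in metres, with x forward,
# y left, z up relative to the chair. All thresholds are assumptions.

def detect_obstacles(points, max_range=2.0, corridor_half_width=0.5,
                     ground_clearance=0.08):
    """Return points that lie inside the chair's path and above ground level."""
    obstacles = []
    for x, y, z in points:
        in_path = 0.0 < x <= max_range and abs(y) <= corridor_half_width
        above_ground = z > ground_clearance  # ignore the road surface itself
        if in_path and above_ground:
            obstacles.append((x, y, z))
    return obstacles

cloud = [(1.2, 0.1, 0.4),   # box ahead -> obstacle
         (1.5, 1.5, 0.3),   # off to the side -> ignored
         (0.8, 0.0, 0.02)]  # ground return -> ignored
print(detect_obstacles(cloud))
```

A production system would cluster points and track obstacles across frames; this filter only shows the basic geometric test.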

Decision Transformers

AI models that learn optimal navigation strategies

Local SLAM Maps

Simultaneous localization and mapping for complex spaces

Tactile Obstacle Modeling

Understanding terrain textures and surface conditions

Fail-Safe Mechanical Overrides

Manual control always accessible for user autonomy
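The override principle can be sketched as a simple priority rule: any physical input from the user pre-empts the autonomous planner, so manual control is always live. Command shapes and names here are illustrative assumptions, not the actual control interface.

```python
# Hypothetical command arbitration: hardware stop > user input > autonomy.

def arbitrate(manual_cmd, planner_cmd, emergency_stop=False):
    """Pick the command actually sent to the motors."""
    if emergency_stop:          # hardware kill switch: highest priority
        return ("stop", 0.0)
    if manual_cmd is not None:  # the user's joystick always overrides the AI
        return manual_cmd
    return planner_cmd          # autonomy runs only when the user is hands-off

print(arbitrate(None, ("forward", 0.5)))           # autonomy drives
print(arbitrate(("left", 0.3), ("forward", 0.5)))  # user overrides
```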

Adaptive Capabilities

  • Learns user preferences and navigation patterns over time
  • Optimizes for safety in crowded, unpredictable environments
  • Operates fully offline without cloud dependencies
  • Designed for rough terrain, potholes, and informal pathways

Intelligent Mobility Stick

Auto-Canopy

A motor-assisted, sensor-rich mobility tool that augments the traditional white cane with computational intelligence.

Ultrasonic Sensing

Detects obstacles at multiple heights and distances, providing early warnings of environmental hazards
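Ultrasonic ranging rests on one relationship: distance is half the round-trip echo time multiplied by the speed of sound. The sketch below applies that formula and adds tiered warning thresholds; the specific thresholds are assumptions for illustration.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def echo_to_distance(echo_time_s):
    """Round-trip echo time -> obstacle distance in metres."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def warning_level(distance_m):
    """Assumed early-warning tiers for an obstacle at a given distance."""
    if distance_m < 0.5:
        return "urgent"
    if distance_m < 1.5:
        return "caution"
    return "clear"

# An 11.7 ms round trip corresponds to an obstacle roughly 2 m away:
d = echo_to_distance(0.0117)
print(round(d, 2), warning_level(d))
```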

Haptic Feedback

Vibration patterns communicate distance, direction, and urgency of obstacles through touch
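One plausible way to encode distance and direction through touch is to pulse faster as an obstacle gets closer and route the pulses to a motor on the obstacle's side. The motor layout, distance range, and timing scale below are invented for illustration, not Auto-Canopy's actual encoding.

```python
def haptic_pattern(distance_m, bearing_deg):
    """Map an obstacle to (motor, pulse_interval_s).

    Closer obstacles pulse faster; bearing selects the left/centre/right
    motor. Thresholds and motor layout are illustrative assumptions.
    """
    if bearing_deg < -15:
        motor = "left"
    elif bearing_deg > 15:
        motor = "right"
    else:
        motor = "centre"
    # Clamp distance into [0.2, 3.0] m, then scale to a 0.1-1.0 s interval
    d = max(0.2, min(distance_m, 3.0))
    interval = 0.1 + 0.9 * (d - 0.2) / 2.8
    return motor, round(interval, 2)

print(haptic_pattern(0.4, -30))  # close obstacle to the left -> fast pulses
```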

Micro-Navigation AI

Learns familiar routes and provides intelligent path suggestions based on context
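Route learning can be illustrated with a toy frequency model: remember which route the user takes for each origin/destination pair and suggest the most common one. A real micro-navigation system would fuse GPS traces and context; the class and names below are purely illustrative.

```python
from collections import Counter

class RouteMemory:
    """Toy sketch of familiar-route learning by frequency counting."""

    def __init__(self):
        self.history = Counter()

    def record(self, origin, destination, route):
        self.history[(origin, destination, route)] += 1

    def suggest(self, origin, destination):
        """Return the most-travelled route for this trip, or None."""
        candidates = {r: n for (o, d, r), n in self.history.items()
                      if (o, d) == (origin, destination)}
        return max(candidates, key=candidates.get) if candidates else None

mem = RouteMemory()
mem.record("home", "market", "via-main-road")
mem.record("home", "market", "via-main-road")
mem.record("home", "market", "via-footpath")
print(mem.suggest("home", "market"))
```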

Voice-Guided Wayfinding

Spoken directions and environmental descriptions delivered through bone conduction audio

Design Philosophy

Auto-Canopy enhances rather than replaces the skills blind users already possess. It respects existing mobility techniques while adding computational awareness of the environment.

eSaferide Platform

Accessible Transport for PWDs

A locally engineered alternative to platforms like Uber and SafeBoda, specifically adapted for persons with disabilities.

Wheelchair-Compatible Vehicles

The fleet includes vehicles with ramps, lifts, and secure wheelchair anchoring systems. Drivers receive specialized training in disability etiquette and safe transport protocols.

Rider Behavioral Scoring

Drivers are evaluated on accessibility compliance, respectful communication, and assistance quality. PWDs have control over rating criteria.
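A driver's score could be computed as a weighted average over the three criteria the platform names. The weights below are assumptions; per the text, the actual criteria and their relative importance would be set by the disability community itself.

```python
# Illustrative weighted scoring over the three criteria named above.
DEFAULT_WEIGHTS = {"accessibility": 0.5, "respect": 0.3, "assistance": 0.2}

def driver_score(ratings, weights=DEFAULT_WEIGHTS):
    """ratings: criterion -> 0-5 stars; returns a weighted 0-5 score."""
    return sum(weights[c] * ratings[c] for c in weights)

print(round(driver_score({"accessibility": 5, "respect": 4, "assistance": 3}), 2))
```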

Disability-Aware Route Optimization

The algorithm considers accessible pathways, avoids stairs and steep inclines, prioritizes smooth roads, and identifies accessible building entrances.
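One standard way to realise this is shortest-path search over a road graph whose edges carry accessibility penalties: stairs get an infinite penalty so they are never chosen, while steep or rough segments get large but finite ones. The tiny graph and weights below are invented for illustration.

```python
import heapq

# Edges: node -> [(neighbour, distance_m, accessibility_penalty)].
# A shortcut with stairs is geometrically shorter but effectively forbidden.
GRAPH = {
    "pickup":     [("stairs_cut", 50, float("inf")), ("ramp_road", 80, 0)],
    "stairs_cut": [("dropoff", 30, 0)],
    "ramp_road":  [("dropoff", 40, 0)],
    "dropoff":    [],
}

def accessible_route(graph, start, goal):
    """Dijkstra over distance + penalty; returns (path, cost)."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return path, cost
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist, penalty in graph[node]:
            heapq.heappush(pq, (cost + dist + penalty, nxt, path + [nxt]))
    return None, float("inf")

print(accessible_route(GRAPH, "pickup", "dropoff"))  # takes the ramp road
```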

Acoustic Alerts for Blind Riders

Audio notifications for vehicle arrival, route updates, and destination proximity. Integration with Auto-Canopy for seamless mobility chains.

Emergency-Health Escalation

Direct connection to healthcare services if medical emergencies occur during transit. Drivers trained in basic emergency response.

Local Ownership: eSaferide is built on sovereign infrastructure—no data extraction by foreign corporations. All ride data remains within Uganda, controlled by the disability community.

Acoustic Environment Translation

AET

An AI system that "speaks the environment"—interpreting acoustic scenes and creating a computational sensory prosthesis for blind users.

🚶 Moving Object Identification

Detects and announces approaching pedestrians, vehicles, cyclists, and animals based on sound patterns

🏙️ Scene Interpretation

Describes environments acoustically: "You are entering a busy market," "Quiet residential street ahead"

👥 Crowd Density Analysis

Estimates number of people, movement patterns, and safe navigation corridors through crowded spaces
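As a very rough sketch of acoustic density estimation, the snippet below classifies a short audio frame by its RMS energy. The thresholds are invented, and a real system would use a trained acoustic classifier rather than raw loudness; this only shows the shape of the pipeline's first stage.

```python
import math

def crowd_level(samples):
    """Classify crowd density from the RMS energy of an audio frame.

    samples: amplitudes in [-1, 1]. Thresholds are illustrative assumptions.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms > 0.5:
        return "dense"
    if rms > 0.1:
        return "moderate"
    return "sparse"

quiet_street = [0.01, -0.02, 0.015, -0.01]
busy_market = [0.6, -0.7, 0.65, -0.55]
print(crowd_level(quiet_street), crowd_level(busy_market))
```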

⚠️ Hazard Prediction

Identifies potential dangers: construction zones, open manholes, aggressive vendors, irregular traffic

📋 Visual Signage Reading

Combines computer vision with audio output to read street signs, shop names, bus numbers, and public notices

🧠 Contextual Learning

Adapts to user's familiar routes, learning to highlight only relevant information and reducing cognitive load

Beyond Vision Replacement

AET does not attempt to "restore sight"—it creates a new sensory modality. Blind users gain augmented environmental awareness through sound, developing hybrid perceptual capabilities that exceed what sighted navigation offers.

Mobility as Augmented Agency

These systems transform mobility from mere movement into augmented agency—the ability to navigate space with autonomy, dignity, and computational support.

We design for African realities: dusty roads, informal settlements, unreliable infrastructure, and rich social environments. Our mobility technologies are not imported solutions—they are built from the ground up with local knowledge, local materials, and local innovation.

Explore More Technologies

Mobility and navigation are foundational to our comprehensive assistive intelligence ecosystem.