We explore interfaces that move beyond the visual and textual paradigms imposed by Western HCI traditions.

Our speculative sensory architecture treats touch, sound, and vibration as computational media—not secondary access channels, but primary modalities for engaging with digital worlds.

Four Sensory Systems

Tactile Educational Tablets

Graph-Based Rendering for STEM Learning

Dynamic tactile displays that render complex visual information through raised patterns, textures, and responsive surfaces—making STEM education accessible through touch.

Geometric Shapes

2D and 3D forms rendered with varying heights and textures, allowing tactile exploration of polygons, polyhedra, and geometric relationships

Diagrams & Charts

Flow diagrams, organizational charts, and data visualizations translated into tactile graphs with distinct textural elements

Maps & Spatial Data

Geographic maps, building layouts, and spatial information with elevation changes and landmark textures

Mathematical Structures

Functions, equations, and algebraic expressions rendered as tangible patterns revealing mathematical relationships

Spatial Narratives

Story structures, timelines, and conceptual frameworks presented as navigable tactile landscapes
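The rendering step behind the "Mathematical Structures" category can be sketched minimally: sample a function z = f(x, y) over a grid and quantize the result into discrete pin heights for a refreshable tactile array. The grid size, coordinate range, and eight-level pin resolution below are illustrative assumptions, not the specification of any particular display.

```python
def render_height_map(f, width=16, height=16, x_range=(-2.0, 2.0),
                      y_range=(-2.0, 2.0), levels=8):
    """Sample f(x, y) on a grid and quantize to discrete pin heights."""
    samples = [[f(x_range[0] + (x_range[1] - x_range[0]) * i / (width - 1),
                  y_range[0] + (y_range[1] - y_range[0]) * j / (height - 1))
                for i in range(width)] for j in range(height)]
    lo = min(min(row) for row in samples)
    hi = max(max(row) for row in samples)
    span = hi - lo or 1.0
    # Normalize to 0..levels-1 so the full pin travel is always used
    return [[round((v - lo) / span * (levels - 1)) for v in row]
            for row in samples]

# Example: a paraboloid z = x^2 + y^2 becomes a bowl the fingertips can trace
grid = render_height_map(lambda x, y: x * x + y * y)
```

A student exploring this grid feels the bowl's curvature directly: pins rise toward the corners and fall toward the center, making the function's shape a tangible fact rather than a plotted image.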

Pedagogical Innovation

These tablets do more than "translate" visual content: they reimagine how abstract concepts can be understood through touch. Blind students develop spatial reasoning through embodied exploration, building cognitive models that differ from visual learning pathways but are no less valid.

Low-Cost Haptic Communication Devices

Wearable Communicators

Compact, affordable devices that enable silent, discreet communication through vibration patterns—creating new linguistic possibilities beyond spoken or signed language.

Silent Alerts

Notification systems that communicate urgency, category, and context through nuanced vibration patterns. Users learn to distinguish between social messages, emergencies, navigation cues, and environmental alerts without sound or sight.
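One way such an alert vocabulary might be organized, assuming a simple on/off vibration motor: each category carries a distinctive base rhythm, and urgency scales how often it repeats. The category names and millisecond timings below are hypothetical.

```python
# Hypothetical alert vocabulary: on/off durations in milliseconds.
# Categories differ in rhythm; urgency scales repetition.
PATTERNS = {
    "social":      [80, 120],            # one short pulse
    "navigation":  [80, 80, 80, 120],    # two short pulses
    "environment": [200, 120],           # one long pulse
    "emergency":   [200, 80, 200, 80],   # alternating long pulses
}

def alert_pattern(category, urgency=1):
    """Build a vibration timeline: repeat the base rhythm `urgency` times."""
    base = PATTERNS[category]
    return base * max(1, min(urgency, 3))  # cap repetition at three cycles

# An urgent navigation cue: the two-pulse rhythm played three times
timeline = alert_pattern("navigation", urgency=3)
```

Because rhythm carries category and repetition carries urgency, a user can triage an alert by feel alone, without breaking eye contact or silence.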

Directional Cues

Spatial orientation communicated through multi-point vibration: left-right guidance, front-back awareness, and three-dimensional navigation instructions delivered through wearable arrays.
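A minimal sketch of multi-point directional rendering, assuming a ring of evenly spaced vibration motors worn around the body: a target bearing is blended across the two nearest motors, so directions that fall between motors remain distinguishable. The eight-motor layout is an assumption for illustration.

```python
def motor_intensities(bearing_deg, n_motors=8):
    """Map a target bearing to per-motor intensities on a wearable ring.

    Motor 0 faces forward; motors are spaced evenly clockwise. The two
    motors nearest the bearing vibrate, weighted by angular proximity.
    """
    step = 360.0 / n_motors
    intensities = [0.0] * n_motors
    pos = (bearing_deg % 360.0) / step          # fractional motor index
    i = int(pos) % n_motors
    frac = pos - int(pos)
    intensities[i] = 1.0 - frac                 # nearer motor, stronger
    intensities[(i + 1) % n_motors] = frac      # farther motor, weaker
    return intensities

# A target 22.5° to the right: motors 0 and 1 share the signal equally
ring = motor_intensities(22.5)
```

Blending between neighbors gives continuous directional resolution from a handful of discrete motors, the same trick audio panning uses between two speakers.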

Encoded Tactile Language

Rhythmic patterns encoding linguistic information—a haptic orthography that enables text transmission through touch. Users develop fluency in reading vibration sequences as naturally as reading text or sign.
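A haptic orthography of this kind could encode text as timed pulse sequences. The sketch below borrows Morse rhythms for a handful of letters purely for familiarity; a real tactile language would evolve its own symbol set and timings, and all durations here are illustrative.

```python
# Illustrative haptic orthography: each letter is a rhythm of short (.)
# and long (-) pulses, borrowing Morse code for familiarity.
CODE = {"s": "...", "o": "---", "e": ".", "t": "-"}
DOT_MS, DASH_MS, GAP_MS, LETTER_GAP_MS = 60, 180, 60, 240

def encode(word):
    """Translate a word into (vibrate_ms, pause_ms) pairs."""
    timeline = []
    for letter in word:
        for symbol in CODE[letter]:
            on = DOT_MS if symbol == "." else DASH_MS
            timeline.append((on, GAP_MS))
        # lengthen the pause after the letter's final pulse
        on, _ = timeline[-1]
        timeline[-1] = (on, LETTER_GAP_MS)
    return timeline

# "sos" -> nine pulses: three short, three long, three short
pulses = encode("sos")
```

With practice, readers parse these rhythms pre-attentively, much as experienced braille readers no longer feel individual dots.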

Universal Design Principle

Though designed for deaf-blind communication, these devices prove useful across contexts: silent coordination in noisy environments, discreet communication in formal settings, and assistive alerts for anyone seeking non-intrusive notifications.

Acoustic–Tactile Dual Interfaces

Mixed-Signal Feedback Loop

Interfaces that merge audio interpretation with haptic textures, enabling blind users to "read" scenes through a hybrid perceptual system that exceeds either modality alone.

Acoustic Layer

Environmental Sonification

Spatial audio rendering of object positions, movement vectors, and distance relationships

Semantic Audio Labels

Spoken descriptions of scene elements, object identities, and contextual information

Adaptive Audio Resolution

Detail level adjusts based on user focus—granular information on demand, ambient awareness by default
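The acoustic layer can be sketched as a mapping from an object's position to stereo cues, assuming simple constant-power panning and a linear distance falloff. A production system would use head-related transfer functions and richer reverberation cues; this is only the skeleton of the idea.

```python
import math

def sonify(azimuth_deg, distance_m, max_range_m=20.0):
    """Map an object's position to simple stereo cues.

    Returns (left_gain, right_gain): constant-power panning from azimuth
    (0 deg = straight ahead, +90 deg = hard right), with loudness
    falling off linearly to silence at max_range_m.
    """
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))   # -1 left .. +1 right
    angle = (pan + 1.0) * math.pi / 4.0             # 0 .. pi/2
    loudness = max(0.0, 1.0 - distance_m / max_range_m)
    return (math.cos(angle) * loudness, math.sin(angle) * loudness)

# An object dead ahead at 10 m: equal gain in both ears, half loudness
left, right = sonify(0.0, 10.0)
```

Constant-power panning keeps perceived loudness stable as a sound sweeps across the stereo field, so movement reads as direction change rather than volume change.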

Tactile Layer

Texture Mapping

Different surface materials rendered as distinct vibration patterns and textures

Spatial Haptic Grid

Multi-point vibration arrays creating 2D or 3D tactile representations of scenes

Dynamic Intensity Scaling

Vibration strength indicates proximity, urgency, or significance of elements
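Combining the spatial haptic grid with dynamic intensity scaling, a depth map of the scene can be converted into a frame of per-point vibration intensities, nearer surfaces vibrating harder. The linear falloff and five-metre range are illustrative assumptions.

```python
def haptic_frame(depth_grid, max_depth=5.0):
    """Convert a depth map (metres) into a grid of vibration intensities.

    Nearer surfaces vibrate harder (dynamic intensity scaling); anything
    at or beyond max_depth is silent. Intensities are clamped to 0..1.
    """
    return [[max(0.0, min(1.0, 1.0 - d / max_depth)) for d in row]
            for row in depth_grid]

# A 2x2 scene: a near obstacle top-left, open space bottom-right
frame = haptic_frame([[0.5, 4.0],
                      [2.5, 6.0]])
```

Streaming such frames to a vibration array turns proximity into pressure on the skin: obstacles loom tangibly before they are ever named by the audio layer.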

Multimodal Synergy

The power lies not in redundancy but in complementarity. Audio provides semantic context and temporal dynamics; haptics deliver spatial structure and material properties. Together, they create a perceptual gestalt exceeding what either modality offers independently.

Multi-Sensory Cultural Heritage Interpreters

Embodied Cultural Learning

Tools that make cultural heritage sites, museums, and historical narratives accessible through immersive multi-sensory experiences—transforming passive observation into active, embodied engagement.

🖼️ Immersive Tactile Museum Experiences

3D-printed replicas of artifacts with embedded sensors and actuators. Touching different parts triggers contextual audio descriptions, vibration patterns encoding cultural significance, and temperature variations representing materials and age.
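A replica's touch-to-response behavior might be organized as a simple lookup from touch zones to audio, vibration, and temperature outputs. Every zone name, file name, and value below is hypothetical, shown only to make the interaction model concrete.

```python
# Hypothetical response table for one replica artifact: each touch zone
# triggers an audio clip, a vibration pattern (ms), and a surface
# temperature hinting at the original material (all values illustrative).
ZONES = {
    "handle":      {"audio": "handle_history.ogg", "vibration": [120, 80, 120], "temp_c": 18},
    "inscription": {"audio": "inscription.ogg",    "vibration": [60] * 6,       "temp_c": 22},
    "base":        {"audio": "materials.ogg",      "vibration": [200],          "temp_c": 16},
}

def on_touch(zone):
    """Look up the multi-sensory response for a touched zone."""
    response = ZONES.get(zone)
    if response is None:
        # Unmapped regions fall back to a general overview narration
        return {"audio": "overview.ogg", "vibration": [], "temp_c": 20}
    return response
```

The table makes curation legible: museum staff can author or revise an artifact's sensory narrative by editing data, not firmware.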

📚 Embodied Cultural Learning

Historical narratives presented through tactile timelines, spatial audio dramas, and haptic storytelling. Users "walk through" history by exploring physical/digital hybrid spaces where touch reveals layered historical information.

🌍 Non-Visual Tourism in Heritage Sites

Guided systems combining tactile maps, acoustic environment interpretation, and haptic navigation. Visitors with visual impairments explore architecture through detailed tactile models synchronized with spatial audio narratives.

Decolonizing Heritage Access

These systems challenge the visual bias in museum design and heritage interpretation. They create opportunities for indigenous tactile knowledge practices to inform interface design, centering non-visual ways of knowing that have been marginalized by colonial epistemologies.

Touch, Sound, Vibration as Computational Media

Across these four systems, touch, sound, and vibration function as computational media in their own right, each with its own affordances, aesthetics, and epistemologies.

This approach refuses the ableist assumption that visual interfaces are "normal" and all others are "accommodations." Instead, we design sensory-first systems that reveal new possibilities for human-computer interaction beyond the screen.

Explore More Technologies

Sensory interfaces are one dimension of our comprehensive assistive intelligence ecosystem.