
DAIMON Robotics Unleashes Largest Tactile Dataset to Give Robot Hands a Sense of Touch

Last updated: 2026-05-04 13:24:30

Introduction

In a significant leap for physical artificial intelligence, Hong Kong-based DAIMON Robotics has unveiled Daimon-Infinity — a groundbreaking omni-modal robotic dataset that brings high-resolution tactile sensing to the forefront of robot manipulation. Released this April, the dataset is the largest of its kind, covering a wide array of tasks from folding laundry in homes to precision assembly in factories. The launch marks a strategic shift for the two-and-a-half-year-old company, known for its advanced tactile sensor hardware, as it moves to accelerate the real-world adoption of embodied AI.

Source: spectrum.ieee.org

The Vision Behind Daimon-Infinity

The core motivation behind Daimon-Infinity is to address a critical gap in modern robotics: the lack of tactile feedback. While vision-based systems have dominated, robots still struggle with dexterous manipulation tasks that humans handle effortlessly, such as handling fragile objects or adjusting grip strength. DAIMON’s dataset aims to bridge this gap by providing millions of hours of multimodal data, including ultra-high-resolution tactile readings alongside visual and language inputs.

“The missing piece has been the insensitivity of robot manipulation,” explains Prof. Michael Yu Wang, co-founder and chief scientist at DAIMON Robotics. “Current models rely heavily on Vision-Language-Action (VLA) architectures, but without tactile feedback, robots cannot adapt to real-world unpredictability.”

What Makes the Dataset Unique

Daimon-Infinity stands out not only for its scale but also for its scope. It encompasses:

  • Million-hour scale multimodal data: Combining visual, tactile, language, and action sequences.
  • Ultra-high-resolution tactile feedback: Leveraging DAIMON’s proprietary monochromatic, vision-based tactile sensor that packs over 110,000 effective sensing units into a fingertip-sized module.
  • Diverse real-world scenarios: Data collected from over 80 environments and 2,000+ human skills, ranging from household chores to industrial manufacturing.

The dataset is supported by a distributed out-of-lab collection network that can generate millions of hours of data annually, ensuring continuous improvement and expansion.
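To make the modality alignment concrete, here is a minimal sketch of what a single omni-modal sample could look like. All field names, array shapes, and the 332x332 taxel layout are illustrative assumptions (332^2 = 110,224, close to the stated 110,000+ sensing units), not the actual Daimon-Infinity format.

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical schema for one omni-modal sample. Field names and shapes
# are assumptions for illustration, not DAIMON's published data format.
@dataclass
class OmniModalSample:
    rgb: np.ndarray        # camera frame, e.g. (H, W, 3) uint8
    tactile: np.ndarray    # fingertip tactile map; ~110,000 taxels,
                           # here assumed as a 332x332 grid (332^2 = 110,224)
    instruction: str       # natural-language task description
    action: np.ndarray     # commanded motion, e.g. pose delta + gripper
    timestamp: float       # capture time in seconds

def make_dummy_sample() -> OmniModalSample:
    """Build a synthetic sample to show how the four modalities line up."""
    return OmniModalSample(
        rgb=np.zeros((480, 640, 3), dtype=np.uint8),
        tactile=np.zeros((332, 332), dtype=np.float32),
        instruction="fold the towel in half",
        action=np.zeros(7, dtype=np.float32),  # assumed 6-DoF delta + gripper
        timestamp=0.0,
    )

sample = make_dummy_sample()
print(sample.tactile.size)  # 110224 taxels in this assumed layout
```

The point of the schema is that tactile data sits beside vision, language, and action at every timestep, rather than being logged in a separate, unsynchronized stream.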

Prof. Michael Yu Wang’s Pioneering VTLA Architecture

At the heart of this initiative is the Vision-Tactile-Language-Action (VTLA) architecture, a framework that elevates tactile sensing to a modality on par with vision and language. Prof. Wang, an IEEE Fellow and former Editor-in-Chief of IEEE Transactions on Automation Science and Engineering, brings four decades of expertise in manipulation and robotics. His academic journey includes a PhD at Carnegie Mellon under the guidance of Matt Mason and founding the Robotics Institute at the Hong Kong University of Science and Technology.

The VTLA architecture integrates tactile feedback as a core component, allowing robots to perform tasks that require a delicate sense of touch — from picking up a wine glass to handling flimsy materials. This approach contrasts with traditional VLA models that treat touch as a secondary or absent modality.

Source: spectrum.ieee.org

Accelerating Real-World Deployment

DAIMON is not just about building datasets; the company is actively pushing for practical applications. To accelerate the adoption of embodied AI, DAIMON has open-sourced 10,000 hours of its data, hoping to spur innovation across research labs and industries. The company envisions initial deployments in settings like hotels (for cleaning and laundry services) and convenience stores (for restocking and customer interaction) across China.

“This dataset is a key enabler for transferring robotic manipulation from controlled labs into messy, natural environments,” says Prof. Wang. “We are focusing on scenarios that demand dexterity and adaptability — where a robot must adjust its grip based on the object’s texture, weight, or flexibility.”

Collaborations and Open-Source Initiatives

The Daimon-Infinity project is a collaborative effort involving top institutions worldwide, including Google DeepMind, Northwestern University, and the National University of Singapore. These partnerships ensure a rich cross-pollination of ideas and data, accelerating progress in tactile sensing for physical AI.

By releasing the dataset now — rather than focusing solely on product development — DAIMON aims to establish tactile feedback as a standard modality for future robotic systems. The company believes that sharing this resource will catalyze breakthroughs in manipulation, just as large language models revolutionized natural language processing.

Conclusion

DAIMON Robotics’ Daimon-Infinity dataset represents a bold step toward giving robots the sense of touch they have long lacked. With its massive scale, high-resolution tactile data, and open-source philosophy, it promises to reshape how robots interact with the physical world. As Prof. Wang notes, “The future of robotics lies not just in ‘seeing’ but also in ‘feeling’ — and we are making that future a reality today.”