At Neuromeka AI Lab, we tackle industry-specific challenges by focusing on practical, field-ready AI solutions. By integrating advanced data-driven methodologies into industrial machines, we aim to deliver real-world impact and drive meaningful transformation in automation.
We are looking for creative and motivated team players!
Our current research focuses on equipping robots with new capabilities to address practical challenges in industrial settings. We begin by targeting specific industrial applications, with the broader goal of developing more generalizable robotic systems in the future.
Learning in Industry: We focus on applying advanced AI methods to real-world industrial problems. This includes building large, domain-specific datasets and models to ensure our AI systems remain robust and effective in dynamic factory environments. Our goal is to bridge the gap between cutting-edge AI research and practical deployment on the factory floor.
Visuomotor Policy Learning: We develop robust visuomotor policies that bridge perception and control, enabling robots to interpret visual information and respond in real time. We explore learning-based control methods—such as imitation learning (IL) and reinforcement learning (RL)—both in simulation and on physical platforms.
We develop fast and efficient collision avoidance techniques for collaborative robots (cobots) using point cloud data from sensors such as Intel RealSense cameras and 3D LiDAR. Our approach leverages constrained reinforcement learning (CRL) to ensure safe, tool-aware motion in dynamic environments while maintaining efficiency and responsiveness.
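To make the constraint side of this concrete, here is a minimal sketch of a point-cloud-based safety cost of the kind a constrained RL agent could be trained against. The `collision_cost` function, the 5 cm margin, and the brute-force distance computation are assumptions for illustration, not our production collision checker.

```python
# Sketch of a point-cloud safety signal for constrained RL (illustrative).
# Idea: a per-step constraint cost that becomes positive whenever any robot
# "control point" (e.g., on the tool or links) gets closer to the observed
# point cloud than a safety margin. Names and thresholds are assumptions.
import numpy as np

def collision_cost(robot_points: np.ndarray,
                   cloud: np.ndarray,
                   margin: float = 0.05) -> float:
    """robot_points: (M, 3) sampled points on the robot/tool in the world frame.
    cloud: (N, 3) point cloud from a depth camera or LiDAR, same frame.
    Returns a nonnegative cost: 0 if everything is farther than `margin`."""
    # Pairwise distances (M, N); fine for small M, use a KD-tree for large N.
    diff = robot_points[:, None, :] - cloud[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    min_dist = dists.min()
    return float(max(0.0, margin - min_dist))

# In a Lagrangian-style constrained RL setup, this cost is accumulated per
# episode and kept below a budget d: maximize expected return subject to
# E[sum of costs] <= d, e.g., via a learned multiplier.
```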
We are actively exploring learning-based approaches that incorporate real-robot data and human demonstrations. In 2024, we presented a public demonstration based on ACT (Action Chunking with Transformers), showcasing the potential of imitation learning for novel skill acquisition.
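The sketch below illustrates the core action-chunking idea behind that demo: predicting a short chunk of future actions per observation instead of a single action, which helps reduce compounding errors in imitation learning. It is deliberately simplified; the actual ACT model is a transformer-based CVAE, and every dimension here is a placeholder.

```python
# Simplified illustration of action chunking (the core idea behind ACT):
# the policy predicts the next `horizon` actions at once, and the robot
# executes the chunk before querying the policy again. Not the full ACT
# architecture; the network and dimensions below are placeholders.
import torch
import torch.nn as nn

class ChunkingPolicy(nn.Module):
    def __init__(self, obs_dim: int = 64, action_dim: int = 7, horizon: int = 20):
        super().__init__()
        self.horizon = horizon
        self.action_dim = action_dim
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ReLU(),
            nn.Linear(256, horizon * action_dim),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        # obs: (B, obs_dim) -> (B, horizon, action_dim) chunk of future actions
        return self.net(obs).view(-1, self.horizon, self.action_dim)

policy = ChunkingPolicy()
obs = torch.rand(1, 64)          # placeholder observation features
chunk = policy(obs)[0]           # (horizon, action_dim)
for action in chunk:             # execute the chunk step by step
    pass                         # send `action` to the robot controller here
```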

We are preparing mobile manipulators, including Neuromeka’s Moby and Moby Outdoor. Our current focus is on building a GPU-accelerated navigation framework with real-time local mapping and path planning; a toy sketch of the idea appears below.
Once our baselines are fully established, we will advance toward autonomous mobile manipulation in both structured and unstructured outdoor environments.
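For a sense of what GPU-resident local mapping and grid-based planning can look like, here is a toy sketch in PyTorch: a point cloud is rasterized into a local occupancy grid kept on the GPU, and a wavefront cost-to-go field toward a goal cell is computed with iterative tensor operations. The grid size, resolution, and the wavefront planner itself are assumptions for illustration, not our navigation framework.

```python
# Toy sketch of GPU-resident local mapping + grid planning (illustrative only).
import torch
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"
res, size = 0.05, 128                      # 5 cm cells, 6.4 m x 6.4 m window

# 1) Local mapping: mark grid cells hit by (x, y) points around the robot.
points = torch.rand(5000, 2, device=device) * size * res   # placeholder scan
idx = (points / res).long().clamp(0, size - 1)
occ = torch.zeros(size, size, dtype=torch.bool, device=device)
occ[idx[:, 0], idx[:, 1]] = True

# 2) Planning: wavefront distance-to-goal over free cells.
INF = 1e9
goal = (size - 2, size - 2)
occ[goal] = False                          # keep the goal cell free in this toy
dist = torch.full((size, size), INF, device=device)
dist[goal] = 0.0
for _ in range(2 * size):                  # enough sweeps to cover the grid
    # 3x3 neighborhood minimum via max-pooling on the negated field
    nmin = -F.max_pool2d(-dist[None, None], 3, stride=1, padding=1)[0, 0]
    dist = torch.minimum(dist, nmin + 1.0)
    dist[occ] = INF                        # obstacles stay unreachable

# The robot then steps toward the neighboring cell with the lowest `dist`.
```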