AMA with Head of Hardware on Design, Innovation & User Experience

Robot vacuums have been around for over two decades, but most still fall short in the same familiar ways. They’re loud. They get tangled up in cords. They bump or get stuck all the time. These aren’t just inconveniences. They’re symptoms of hardware not built for autonomy.
At Matic, we took a different approach. In this exclusive AMA (dated 04/24/2025) with Anshuman Kumar, Head of Hardware at Matic Robots, we dive deep into the design philosophy, engineering challenges, and customer-focused innovations behind Matic’s autonomous floor cleaner. From how the robot cleans edges and corners with its unique square shape, to the complexities of building a user-friendly, low-noise product that’s easy to maintain, Anshuman shares candid insights into the journey from prototype to production.
About Anshuman: Anshuman Kumar leads Hardware at Matic Robots. His engineering journey includes pivotal roles at Tesla Motors, where he addressed critical reliability and scaling challenges for the Model S and Model 3 traction inverters. He holds a Master's in Product Design from Carnegie Mellon University and a Bachelor's in Mechanical Engineering from IIT Delhi. Anshuman also founded and led the Carnegie Mellon Hyperloop team, which was recognized in the SpaceX Hyperloop competition.
From Tesla to Matic: What Changes—and What Doesn’t—When Designing for Scale

Q1: "How does designing for mass scale (like Tesla) differ from early-stage robotics (like Matic)?"
Anshuman: That’s a great question. I was thinking about that earlier, and I think it’s not that different, and it’s very different. It’s not like Matic isn’t building for mass scale. We’re not building a very niche scientific instrument that’s only useful to 20 super advanced researchers somewhere in the world. We know this is a robotic floor cleaner, and we intend to make it so good that hopefully millions of people will use it one day. But at the same time, we’re just starting out, so we’re at a different point on that journey.
When you’re designing something like the Model 3 today, you just have a lot more understanding of what the customer wants. You also understand the technology a lot better, because Tesla, as an organization, has gone through the Roadster, Model S, Model X, and then gotten to Model 3. Over that decade, they’ve learned a lot about what it takes to make a product like that work.
The third piece is that the supply chain is much more evolved. The suppliers you’re working with are already building the kinds of components you need. Take motors, for example. Tesla is a car company, and cars have existed for decades—so there are suppliers already making suspensions, wheels, and motors. You don’t have to reinvent the wheel, so to speak.
At Matic, all three of those things are less developed. We’re still learning what the customer really wants. And that’s the exciting part of building a product that hasn’t been perfected yet. We genuinely believe robotic vacuums can be 10x, even 100x better than what exists today. A big part of the process is understanding what’s actually broken about them, and we’ve learned a lot through that design process. The technology also wasn’t well understood when we started. Six years ago, who could say what the best way to build a mopping system was? Or how to build the widest robot in the world?
And then there’s the supply chain. There really isn’t a supplier making exactly the kind of motors or actuators we need.
So because of all that, we need to stay really agile and open to change in our design process. We can’t set aggressive cost targets up front—we have to accept higher costs and give ourselves room to keep iterating.
Practically, that means we don’t jump straight into high-scale processes like injection molding. We do a lot of 3D printing, vacuum casting, and rapid prototyping. And not just in how we manufacture—our design thinking reflects that too. We build quickly, test quickly, and if something doesn’t work, we change it and try again.
When I was at Tesla, I was working on super fine optimizations—saving a few cents on cost, improving inverter efficiency by 5%, that kind of thing. But at Matic, it’s a complete zero-to-one game. We’re starting from scratch. We’re defining the scope. And we go through, I’d say, 10x more iterations. It’s very fun, and really satisfying as a designer—to start from basically nothing and build out the product.
The Button That Broke Five Robots: A Field Failure Story from Matic

Q2: "What was the most surprising mechanical failure mode discovered during in-home testing, and how did the team catch and resolve it?"
Anshuman: Yeah, I think there have been a lot of interesting ones. I’d say every month after we put robots in people’s homes, two new issues would come up, and we’d have to go back and change the hardware design. And all of those changes are pretty big tear-ups, because once you’ve cut tools, like injection molding tools, changing hardware gets a lot more difficult than, say, pushing a software update.
The funniest story I can think of: we were just starting to film videos of Matic in action. We had a shoot scheduled in New York and shipped five robots there for the first time from our facility in Mountain View. Up until then, we had mostly been hand-carrying robots very gingerly—from the office to our homes—to test them. But obviously you can’t do that for a shoot in New York. So we boxed them up and sent them by air.
I remember opening the first box and finding the robot was broken—specifically, the cleaning head actuation cable was snapped. That cable is what moves the cleaning head up and down. It runs beneath a panel, and on delivery, you can see the head slightly elevated. But on arrival, it was fully collapsed. That was disheartening, but I figured: okay, good thing we sent five. I opened the second box—same issue. And one by one, I discovered all five had the same failure.
At first, we assumed a production or packaging error. But then I started tinkering and noticed something interesting: when the robot experienced a vertical shock, like the kind it might get during shipping, it would turn on inside the box. Turns out, the button lacked debounce logic. So any shock could trigger the robot to power on, and once on, it would lift the cleaning head. Doing that while still inside the packaging made the cable more likely to fail.
It was a totally unexpected chain of events. But once we figured it out, we implemented debounce logic on the switch and never saw that failure again. Of course, that didn’t help us in the moment—all five robots were broken just hours before the shoot. We had to ask Josh, our production supervisor, to get on a plane that day, fly to New York with parts, and fix all the robots in the field.
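The fix Anshuman describes, adding debounce logic to the power switch, can be sketched as follows. This is a minimal illustrative model, not Matic's actual firmware: the idea is that the raw input must stay asserted for a full hold window before a press registers, so a brief mechanical jolt never counts as a button press. The class name and hold time are made up for the example.

```python
class DebouncedButton:
    """Software debounce: the raw switch signal must stay asserted for a
    full hold window before a press is registered. A short mechanical
    shock (like a jolt during shipping) asserts the line only briefly,
    so it never satisfies the hold window and is ignored."""

    def __init__(self, hold_ms=50):
        self.hold_ms = hold_ms
        self._pressed_since = None  # timestamp when the raw input went high

    def sample(self, raw_high, now_ms):
        """Feed one raw sample; return True once a debounced press fires."""
        if not raw_high:
            self._pressed_since = None  # line released: reset the timer
            return False
        if self._pressed_since is None:
            self._pressed_since = now_ms  # line just went high: start timing
        return now_ms - self._pressed_since >= self.hold_ms


btn = DebouncedButton(hold_ms=50)
# A 5 ms shock blip never satisfies the 50 ms hold window:
print(btn.sample(True, 0), btn.sample(True, 5), btn.sample(False, 10))
# A deliberate press held for 60 ms does:
print(btn.sample(True, 100), btn.sample(True, 160))
```

With this in place, a shipping shock that closes the switch for a few milliseconds is filtered out, while a real finger press held for the full window still powers the robot on.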
I’d say that’s one of the crazier field failures we’ve seen. I never would’ve guessed it would be the power button causing the problem. But that’s the fun (and challenge) of robotics: everything is connected to everything. One small issue can trigger a totally different failure somewhere else.
When Software Evolves, How Do You Keep Up in Hardware?

Q3: "When Software Evolves, How Do You Keep Up in Hardware?"
Anshuman: That's a good question. I think there are different levels of change. Sometimes, and we try really hard to make this the case most of the time, software can run in parallel and not force a change on the hardware side.
And the way you do that is by building very versatile systems. All the way from how we selected our SoC, to the kind of cameras we picked, to how much battery we allocate for the robot—we try to think ahead. We look at the features the software team already has on the roadmap, and especially the ones that could impact hardware. That helps us avoid needing to change hardware when software evolves.
A good example out in the wild is what Tesla tried to do—might not have executed it perfectly—but they shipped cars with FSD hardware hoping software updates would eventually make it work. That’s not always possible.
In our case, we had a big moment when the software team realized the previous SoC just wasn’t cutting it for the kind of compute-intensive algorithms we’re now running. Mehul talked recently about how we now have 10x better SLAM than anything else out there. That includes very fine depth mapping, semantic classification of carpet vs. hardwood, and soon we’ll start identifying things like pet waste. Those algorithms need a very powerful and efficient computing platform.
We realized our old SoC—and even the SSD—just wasn’t good enough and wasn’t going to get better. So we made a big call to switch to NVIDIA’s Orin Nano. That change took over a year. And it wasn’t just the SoC—it was everything that plugs into it, including the thermal system needed to cool it. We had to overhaul that entire system.
So to answer the question, it depends on what kind of software change you’re talking about. Most of the time, we don’t have to touch hardware. But sometimes, when the bar gets raised, we go back and refresh the hardware to keep up.
Why Five Cameras?

Q4: "What cameras do you use? Why?"
Anshuman: That’s a good question. If I take a step back, I’ll also explain why cameras are such a core choice for us in the first place. Our philosophy is rooted in the idea that robots live in people’s homes, and those spaces are really unique and complex. Just like humans rely heavily on their eyes to navigate, vision is a super-powerful sensor. It gives you the richest information about the environment—much more than, say, LIDAR, sonar, radar, ultrasonic sensors, or time-of-flight sensors.
Also, cameras have the huge advantage of massive industry support because of smartphones—thank you, Steve Jobs! The supply chain, manufacturing scale, and cost efficiencies make cameras way more accessible and affordable compared to most other sensors.
Self-driving cars also helped push camera technology forward, so from a high-level technical standpoint, it was a clear decision to rely solely on cameras for our robot.
Initially, we started with just RGB cameras—red, green, and blue pixels—to capture color info. That’s important because even if the robot could perfectly understand depth from other sensors, RGB is needed to build the user-facing map. This map is the key communication tool between what the robot “sees” and what the customer sees on their app.
But RGB cameras have limitations: in low-light or nighttime, they just don’t capture enough. So we realized early on that we needed to add IR illumination and IR sensitivity to the cameras. On the robot itself, you can see this: the front stereo pair cameras have an IR-transparent plastic panel in the middle with IR LEDs behind it. These LEDs light up the scene in darkness or low light. The same setup is mirrored on the back of the robot, and the “up camera” on top also has a matching IR panel.
At first, we tried RGB IR sensors where the same pixels detect both visible and infrared light. But that caused issues—like when sunlight would wash out the camera because it couldn’t separate IR from visible colors. You end up not knowing what’s what because the signals are merged.
To solve that, you either need to physically filter IR light (like mechanical sunglasses for the camera) or separate RGB and IR into different pixels. Mechanical filters add complexity, cost, and reliability issues, and we want to keep the robot compact and simple. So we went with RGB plus IR cameras—separate pixels for each, which allows us to filter and process the signals fully in software.
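The software-side filtering Anshuman mentions can be illustrated with a toy sketch. This is an assumption about how such a correction might look, not Matic's pipeline: because each pixel site reports a dedicated IR measurement, a calibrated fraction of that IR level can be subtracted from the color channels, which a pure RGB-IR (shared-pixel) sensor cannot do since the signals arrive merged. The function name and leak coefficient are invented for illustration.

```python
def remove_ir_leak(rgb, ir, leak=0.2):
    """Subtract IR contamination from color channels in software.
    rgb: (r, g, b) channel values in [0, 1]; ir: the co-located IR
    pixel's reading; leak: calibrated fraction of IR that bleeds
    into each color channel (hypothetical value here)."""
    return tuple(max(0.0, c - leak * ir) for c in rgb)


# A pixel washed out by strong IR, e.g. direct sunlight hitting the lens:
corrected = remove_ir_leak((0.9, 0.8, 0.7), ir=1.0)
print(corrected)
```

The point of the separate-pixel design is exactly that `ir` is a known quantity per site, so this correction stays a cheap per-pixel subtraction rather than a guess.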
Stereo vision is also crucial. The front has two cameras spaced apart (a stereo pair). Each camera sees the scene from a slightly different angle. The robot knows exactly how the cameras relate to each other, including lens distortion and position. By comparing where a pixel in one camera appears in the other (the horizontal shift), it calculates depth, just like human eyes do when you close one eye and then the other and notice how things shift.
That’s how we build a depth map. We could have just had a front stereo pair, but that would make backing out of tight spaces tricky because the robot wouldn’t see behind it. Instead of adding mechanical parts like a rotating “neck,” we added a second stereo pair on the back—another set of eyes.
So that’s four cameras total for stereo depth perception. The fifth camera—the “up camera”—was one of the most debated decisions inside the company. With Mehul and Navneet’s background in gesture recognition, we wanted the robot to interact with humans more naturally.
Humans rely on seeing faces, gestures, and body language to communicate. If the robot can’t see you, it limits interaction. The up camera lets the robot look at a person’s face and see where they’re pointing.
This enables a feature we call “Come Here Clean This.” Imagine standing near the robot and pointing to a spot on the floor while saying, “Clean that mess.” The robot can understand your gesture, map the pointing line in 3D space, and go clean exactly that spot.
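Geometrically, "map the pointing line in 3D space" comes down to intersecting a ray with the floor plane. Here is a simplified sketch of that step under the assumption of a flat floor at z = 0; the function name and coordinates are invented, and the real feature of course also involves detecting the gesture itself.

```python
def pointing_target_on_floor(origin, direction):
    """Intersect a 3D pointing ray with the floor plane z = 0.
    origin: (x, y, z) of the fingertip; direction: (dx, dy, dz)
    vector from eye/shoulder through the fingertip.
    Returns the (x, y) floor point, or None if the ray never
    reaches the floor (pointing upward or parallel to it)."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz >= 0:          # ray is not heading downward
        return None
    t = -oz / dz         # solve oz + t * dz == 0 for t
    return (ox + t * dx, oy + t * dy)


# Fingertip at 1.2 m height, pointing forward and down at 45 degrees:
print(pointing_target_on_floor((0.0, 0.0, 1.2), (1.0, 0.0, -1.0)))  # (1.2, 0.0)
```

Once the robot has that floor point in its map frame, navigating to it is the same problem as driving to any other cleaning target.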
Privacy has always been a top priority. Many devices with cameras offload complex computation to servers, which poses risks since user data leaves the device.
We decided early that all compute must happen on the robot itself—on-device. That’s why we chose hardware, including the NVIDIA Orin Nano SoC, that can handle all the heavy processing locally.
Because of this, Matic never needs to collect or store user data like images or videos. The only time we collect any data is when customers explicitly allow it for debugging.
We treat privacy as sacred because we know it’s critical to our customers.
Packaging: Unboxing Made Simple, Fast, and Fun

Q5: "Can you go over the retail packaging design? It's so cool"
Anshuman: I think the retail packaging is the first physical interaction anyone has with the robot. We wanted to make sure it’s inviting and sets the tone to expect great things from the product. Personally, having worked on this hardware for a long time and tested many competitor robots, I’ve unpacked pretty much everything out there.
Without fail, the most hated part is opening a big cardboard box with another box inside it. Then you have to figure out how to get the inner box out — maybe by laying it flat or cutting open the outer box. The second box usually opens from the top, so you cut open the flaps and then have to pull out something that can weigh up to 10 kilograms. I don’t know the exact weight of a heavy dock, but it’s probably at least that. Lifting that much weight vertically out of a box is a very bad experience. Then you have to cut open a bunch of tape and plastic.
Before you know it, you’ve spent 30 minutes just getting the hardware out of the box. Onboarding is another challenge, but I won’t get into that here. For us, we wanted to short-circuit the time from having the box in your home to the robot cleaning as fast as possible — ideally within five minutes.
The hardware is responsible for a lot of that, but we also made the box design very intuitive. When the box arrives, it’s inside a cardboard shipping box for protection. But we added a simple handle on top so you can pull the box out without cutting the outer box. Then you can lay it on the floor or table and open the latches — no tape cutting needed. There are four latches on each side. You open them all, then the top box pulls right off.
At home, the robot is sitting on its base, ready to go. One theatrical element I love is the ramp that opens up in front of the robot. You just press the play button on top of the robot, and it starts cleaning. That’s it.
Then you go through the app onboarding, which doesn’t take long — probably a minute or two if you can set up the dock quickly. We’ve achieved our goal: get the robot cleaning within five minutes.
We’re really happy with how the packaging is shaping up, and we plan to make it even more impressive. We’ve toyed with ideas like shooting confetti when the box opens and having the robot clean it up. We haven’t pulled the trigger on that yet — it’s a risky feature, but exciting. I’ve been at parties like that. Yeah. It’s a party box.
How Matic Ensures Effective Edge Cleaning and Thoughts on Stair-Climbing

Q6: "With Matic's flat front and brush layout, how do you ensure edge/corner cleaning is effective? Any plans for stair-climbing in the future?"
Anshuman: There are two questions here: edge/corner cleaning and stair climbing.
For edge and corner cleaning, the robot has an advantage because of its square shape. This lets us go all the way to the edge without relying solely on a side brush. We expose the main brush fully to the edges where we want to clean, so we get the full power of the brush and suction. Of course, the side brush also helps—it’s positioned in the corner and flicks out debris from the edge into the cleaning path for the robot to suck up.
We shipped the toe kick cleaning feature a few weeks ago. The robot uses its unique shape to go under toe kicks, exposing dirt to full suction power. It cleans that area and then backs out, repeating as it moves along. This allows very effective cleaning in toe kick areas. The robot drives along edges and uses its side brush to sweep debris toward the main brush.
On stair climbing: I would have loved to build that feature because the mechanical engineer in me loves complex mechanisms. But it requires a lot of work, so we decided not to.
From an efficiency standpoint, if you have multiple floors, it would take too long for one robot to clean all floors by climbing stairs repeatedly. It’s more practical to have multiple robots—one per floor.
Also, the robot weighs about seven kilograms (around 15 pounds). If it fell down stairs, that could be dangerous. Adding a stair-climbing mechanism would increase risk. We’ve built reliable features to detect stairs and avoid falling, which works great. But climbing stairs is far more difficult.
Stairs vary widely—marble, carpet, different shapes and curvatures—so building a stair-climbing robot would be very expensive, likely 2–3 times the cost. We prefer to keep the robot affordable and let people buy multiple units if needed, rather than one complex, costly stair-climbing model. That’s how we thought about it.
From Prototype to Production

Q7: "What is your process for going from prototype to production?"
Anshuman: We’re very focused on building a great design. Of course, there are economies of scale to consider, costs to manage, supply chain challenges, tariffs, and all those things that we need to handle to be a viable long-term business.
But for us, it starts and stops with the customer experience. If we can’t build a product that is loved by at least some people, then cost doesn’t matter. So the design process heavily emphasizes building the right features and building them the right way.
This means understanding customer priorities. On the hardware side, it’s always been about superior cleaning and an extremely low probability of user intervention—especially unstructured intervention like rescuing the robot from eating iPhone cables or getting stuck on carpets.
Cleaning efficacy and minimizing user intervention make the experience great. Anything the user needs to do should be very easy and intuitive. Noise is also a very big deal for us—we want the robot to be very quiet.
A small but important detail in user experience: if you’ve used a Roomba, you know you have to grab the robot and turn it over to access the brush and bin. Matic is designed so you can access the brush from the top down, and the mop removes easily from the side. The bag and solvent tank are also accessible from the top.
We don’t want the user to have to flip the robot over because it could spill dirt and create chaos. So we emphasize user experience over cost in decisions like this and stick to those principles.
We also don’t shy away from building complexity if needed. For example, we didn’t like how mopping worked on other robots, so we built a better system. It’s had many technical challenges, but we kept pushing forward to create something useful and delightful.
For me, removing that peg is super easy, and there’s no mess everywhere. The team has done a great job keeping everything in one place.
How I Stay Motivated Through Setbacks and Long Hardware Cycles

Q8: "How do you personally stay motivated through the inevitable setbacks and long cycles of hardware innovation?"
Anshuman: I think it'd make sense to answer this question in the frame of reference of Matic because I think I did not do that at Tesla. When I left Tesla after two years, it was a tough decision. Tesla was a dream job I’d wanted for a long time, so walking away wasn’t easy. Plus, coming from an Asian household, my parents were skeptical and concerned — “Why leave a great company for an unknown startup?”
But my inspiration came from the idea of building a product like Matic from scratch. That “why” gave me the motivation to keep going. The “how” — the challenges and obstacles — could be figured out along the way.
If you’re doing something because you decided to do it, and you believe in it, motivation comes much easier. But if you’re just doing what others expect from you, it’s much harder to stay committed.
On another note — as Matic grows, we’re hiring across the board! We’ve started shipping and the customer feedback has been phenomenal. We literally have bulletin boards filled with “love letters” and photos from happy customers using our robots.
Because of this demand, we need to scale production quickly. We’re looking for talented electrical and mechanical engineers, as well as production team members. One exciting role is our Robotics Production Internship — a great hands-on opportunity for those wanting to build products that actually ship.
We’re also growing our supply chain and vendor management teams, seeking people with experience in contract negotiations, electronics, and hardware manufacturing. We want exceptional people driven by the idea of building something special — a robot that can end up in millions of homes. There’s no other American company building home robots at the scale Matic aims for, so there are amazing opportunities ahead.

When "Easy" Features Turn Into Production Headaches

Q9: "Is there a feature or design element the team thought would be easier to implement, but turned out to be a real headache in production?"
Anshuman: Many! I could say the bag design was more complicated than we initially expected. We use a bag because we collect both dirt and dirty water, and we wanted to give users a truly hassle-free “use and throw” experience. The idea was that users shouldn’t have to scrape out a bin or deal with messes.
However, making a bag that’s both waterproof and effective turned out to be tricky. The bag is actually a hybrid — half plastic, half fabric — to achieve the right waterproofing. We also added sealing features at the front so it properly seals with the robot, plus a vent fullness system at the back.
All of these design choices improved the user experience dramatically — the bag works great and is easy to handle. But from a production perspective, these complexities made manufacturing more challenging than we anticipated.
That said, we’re making rapid progress. We have plans to simplify the design over the next few months, which will allow us to produce bags faster and at scale.