According to Bloomberg Business, a startup called Lyte has emerged from stealth after raising about $107 million. It was founded in 2021 by three former Apple engineers—Alexander Shpunt, Arman Hajati, and Yuval Gerson—who were instrumental in building the depth-sensing tech for Face ID. Shpunt previously co-founded PrimeSense, which Apple acquired for $350 million in 2013 and whose tech also powered Microsoft’s Kinect. Lyte’s flagship product, LyteVision, combines a camera, inertial motion sensing, and a 4D sensor into one system to help robots perceive and act safely. The company, now with 100 employees, plans to use its funding to expand and tackle the critical challenge of robot safety over the next three to five years.
The Apple Playbook for Robots
Here’s the thing: this isn’t just another robotics sensor startup. This is a team that has already shipped computer vision at a scale of billions of units with Face ID. They’re explicitly trying to bring Apple’s notorious obsession with detail, integration, and user experience to a market that’s famously fragmented and DIY. Their whole pitch is about cutting down the “years” it takes to integrate sensors into a working robot by offering a pre-baked, plug-and-play solution. That’s a huge value proposition if they can pull it off. Basically, they want to do for robot vision what the iPhone did for mobile computing—create a cohesive, reliable platform so others can build on top of it without reinventing the wheel every time.
Why This Matters Beyond Humanoids
Everyone gets excited about humanoid robots, but Lyte’s potential impact is much broader. They mention applications from robotaxis to mobile warehouse robots. The real pain point they’re addressing belongs to industrial companies: a McKinsey stat cited in the article says 60% of these firms lack the internal skill to implement robotic automation. That’s the market Lyte is chasing. For companies that need reliable, safe automation but don’t have a team of perception PhDs, a turnkey “visual brain” could be a game-changer. It lowers the barrier to entry significantly. Lyte’s philosophy is to provide the core, hardened tech so customers can focus on their specific application.
The Big Challenge: Safety
Shpunt says they want to show “meaningful progress” on safety in 3-5 years. That’s a telling timeframe. It’s not “we’ve solved it,” but it’s a focused goal. Making robots that aren’t “zombies,” as he puts it, and can react immediately to a dynamic world is the holy grail. It’s what’s holding back widespread deployment in unstructured environments. Lyte’s combo of a 4D sensor (adding velocity to 3D data) with cameras and motion sensors is a serious hardware stack aimed at that problem. But hardware is only part of the equation. The software and AI that turns all that sensor data into split-second, safe decisions is the monumental task. They’ve got the pedigree and the funding. Now we see if they can build the “visual cortex” that the robotics industry has been waiting for.
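To make the 4D idea concrete: having a velocity reading attached to each 3D point lets a perception system estimate time-to-contact directly from a single frame, rather than differencing positions across frames and hoping the object was tracked correctly. Here’s a minimal sketch of that logic; this is my own illustration, not Lyte’s actual API, and every name in it is hypothetical:

```python
from dataclasses import dataclass
import math

@dataclass
class Point4D:
    """Hypothetical 4D measurement: 3D position in meters (robot frame)
    plus radial velocity in m/s (negative = approaching the sensor)."""
    x: float
    y: float
    z: float
    radial_velocity: float

def time_to_contact(p: Point4D) -> float:
    """Seconds until the point would reach the sensor at its current
    closing speed; math.inf if it is stationary or moving away."""
    distance = math.sqrt(p.x**2 + p.y**2 + p.z**2)
    closing_speed = -p.radial_velocity  # positive when approaching
    if closing_speed <= 0:
        return math.inf
    return distance / closing_speed

def should_slow_down(points: list[Point4D], horizon_s: float = 2.0) -> bool:
    """Trigger a safety response if anything could reach us within the horizon."""
    return any(time_to_contact(p) < horizon_s for p in points)

# A person walking toward the robot at 1.5 m/s from 2.4 m away:
pedestrian = Point4D(x=2.4, y=0.0, z=0.0, radial_velocity=-1.5)
print(time_to_contact(pedestrian))     # 1.6 seconds to contact
print(should_slow_down([pedestrian]))  # True
```

The point of the exercise: with velocity baked into the measurement, the “is this a threat?” decision needs only one frame of data, which is exactly the kind of immediate reaction to a dynamic world that Shpunt is describing. The real system obviously fuses this with camera and inertial data and does far more than a threshold check.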
