the sensor substrate

The most consequential technology being built right now runs on sensor data. Figure is putting humanoid robots on factory floors. Tesla FSD is processing camera and radar telemetry across millions of miles. World Labs is building spatial intelligence from visual sensors. NVIDIA is running physics simulations that both consume and generate synthetic sensor data at industrial scale. K2 Space operates satellites that live or die by telemetry. Apple is building spatial computing on LiDAR and environmental sensing. Different companies. Different markets. Same substrate. High-fidelity data from physical systems is becoming the foundational layer of modern technology, and it is arriving at a volume and velocity that simply did not exist five years ago.

At Postmates, we built a real-time, distributed network connecting millions of merchants, couriers, and customers. At Serve Robotics, telemetry was critical infrastructure, and I watched very strong teams struggle to replay it, reason over it, and learn and train against it in sim-first environments at scale. Pendulum was a year of field research on neural net-based sensor fusion running on edge devices under electronic warfare conditions, where the signals themselves were contested and the models had to reason through degraded, adversarial inputs in real time. At Anduril, the stakes tightened to the centimeter, in environments where a misread signal can cost a mission.

The pattern became obvious: the machines were getting dramatically more capable, but the infrastructure to learn from what they generate was not keeping pace. What makes this moment different is that AI can now reason over physical system behavior in real time, on device. Not in a dashboard hours later. Not in a batch job someone runs on Thursday. At the point of operation, while the machine is running. That is a new paradigm. The gap between sensing and understanding starts to collapse. An anomaly that used to require a senior engineer to notice, investigate, and contextualize can now surface as structured evidence the moment it happens. I have seen the failure mode this replaces at every hardware company I have worked at: physical systems fail because humans cannot process signal fast enough, cannot preserve institutional knowledge long enough, and cannot be everywhere at once across increasingly complex machines.
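To make the shift concrete, here is a minimal sketch of the idea of surfacing an anomaly at the point of operation rather than in a later batch job: a rolling-baseline z-score check over a single telemetry channel that emits a structured evidence record the moment a reading deviates. All names here (the detector, the `Evidence` record, the channel) are illustrative assumptions, not any particular product's API.

```python
# Illustrative only: a streaming anomaly check that runs as samples arrive,
# emitting a structured record instead of waiting for offline analysis.
import math
import time
from collections import deque
from dataclasses import dataclass
from typing import Optional


@dataclass
class Evidence:
    """Structured record emitted the moment a reading looks anomalous."""
    channel: str
    value: float
    zscore: float
    timestamp: float


class StreamingDetector:
    def __init__(self, channel: str, window: int = 50, threshold: float = 4.0):
        self.channel = channel
        self.buf = deque(maxlen=window)  # rolling baseline of recent readings
        self.threshold = threshold

    def ingest(self, value: float) -> Optional[Evidence]:
        """Score one sample against the rolling baseline; flag large deviations."""
        if len(self.buf) >= 10:  # wait for a minimal baseline before judging
            mean = sum(self.buf) / len(self.buf)
            var = sum((x - mean) ** 2 for x in self.buf) / len(self.buf)
            std = math.sqrt(var) or 1e-9  # guard against a flat signal
            z = (value - mean) / std
            if abs(z) > self.threshold:
                self.buf.append(value)
                return Evidence(self.channel, value, z, time.time())
        self.buf.append(value)
        return None
```

Real systems would use hardened statistics and learned models rather than a plain z-score, but the shape is the point: the decision happens inline with the data stream, and what comes out is evidence a machine or an engineer can act on immediately.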

The senior validation engineer who knows why a specific anomaly matters on a specific vehicle is often the scarcest resource in the building. When that person leaves, the knowledge leaves with them. Real-time AI on sensor data does not replace those people. It multiplies the scarcest expertise in the industry. That is what we are building at Sift: turning raw sensor data into intelligence that physical systems can consume and make decisions against.
