At CPSquare, we spend our time on the problems that keep autonomous systems from quite making it out of the lab: environments that refuse to cooperate, models that are a little wrong, teammates that drop out, sensors that lie. Our projects span autonomous vehicles, aerial drones, and robot manipulators, and they tend to pull together threads from control theory, motion planning, perception, and learning into what's now being called Physical AI. The three directions below describe where we focus most.
We are always looking for ambitious graduate and undergraduate students interested in cyber-physical systems and robotics. If you're motivated to tackle fundamental challenges in autonomy and would like to explore theoretical or experimental research with us, feel free to contact sfarzan@calpoly.edu.
Most of our work lives here. We develop planning and control methods that let autonomous agents (ground vehicles, aerial drones, mobile manipulators) operate safely and predictably, either alone or as part of a coordinated team. Real systems have to contend with sensor noise, disturbances, unmodeled dynamics, degraded teammates, and dropped communication, so we lean on adaptive and robust control, control barrier functions, and optimization-based planning to build methods that degrade gracefully rather than catastrophically. The problems that pull us in most right now are highway platoons, aerial swarms, distributed estimation across sensor networks, and fault-tolerant coordination in heterogeneous fleets.
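To make the control-barrier-function piece concrete, here is a minimal sketch of the standard CBF safety filter, specialized to a single-integrator robot (xdot = u) avoiding a circular obstacle so that the one-constraint quadratic program has a closed-form solution. This is a generic textbook construction, not our implementation, and the names (cbf_safety_filter, x_obs, alpha) are just for the example.

```python
import numpy as np

def cbf_safety_filter(x, u_des, x_obs, r, alpha=1.0):
    """Minimally modify u_des so a single-integrator robot (xdot = u)
    keeps h(x) = ||x - x_obs||^2 - r^2 >= 0 around a circular obstacle.

    Solves  min ||u - u_des||^2  s.t.  grad_h . u >= -alpha * h(x),
    which for a single affine constraint reduces to the projection below.
    """
    h = np.dot(x - x_obs, x - x_obs) - r**2
    grad_h = 2.0 * (x - x_obs)           # gradient of h; L_g h for xdot = u
    slack = grad_h @ u_des + alpha * h   # constraint residual at u_des
    if slack >= 0.0:                     # desired input is already safe
        return u_des
    # otherwise project onto the constraint boundary grad_h . u = -alpha * h
    return u_des - slack * grad_h / (grad_h @ grad_h)

# example: command the robot straight at the obstacle
x = np.array([-2.0, 0.1])
u = cbf_safety_filter(x, u_des=np.array([1.0, 0.0]),
                      x_obs=np.zeros(2), r=1.0)
print(u)  # approach slowed just enough to satisfy the barrier condition
```

The same min-norm-modification structure carries over to richer dynamics; there the quadratic program is solved numerically at each control step rather than in closed form.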
A lot of what holds robots back in manufacturing and logistics isn't mechanical; it's the gap between what the robot perceives and what it can confidently act on. Our work in this area tries to close that gap by tightly coupling perception with motion planning and control. We're drawn to visual servoing for precise manipulation, grasp planning under partial or noisy observations, and handling the deformable, variable objects (stacks of envelopes, fruit, mixed parcels) that historically required a human touch. The goal is flexible automation that holds up outside the carefully controlled environments where traditional industrial robots have always lived.
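For readers unfamiliar with visual servoing, the sketch below shows the classical image-based law: stack the interaction (image Jacobian) matrices of the tracked features and command the camera twist that drives the feature error toward zero via a pseudo-inverse. It is a generic illustration that assumes the feature depths are known, not our pipeline; ibvs_step and its arguments are hypothetical names.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix of a point feature (x, y) in normalized image
    coordinates at depth Z, mapping the 6-DoF camera velocity
    [vx, vy, vz, wx, wy, wz] to the feature's image velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_step(features, targets, depths, gain=0.5):
    """One step of classical image-based visual servoing: drive the
    feature error e = s - s* to zero with v = -gain * pinv(L) @ e."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(targets)).ravel()
    return -gain * np.linalg.pinv(L) @ e

# example: four corner features slightly offset from their goal positions
s      = [(0.11, 0.10), (-0.10, 0.11), (-0.11, -0.10), (0.10, -0.11)]
s_star = [(0.10, 0.10), (-0.10, 0.10), (-0.10, -0.10), (0.10, -0.10)]
v = ibvs_step(s, s_star, depths=[0.5] * 4)
print(v)  # commanded camera twist
```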
Farms are one of the hardest environments to automate: unstructured, ever-changing, weather-beaten, and unforgiving, which is exactly what makes them compelling. Our research brings motion planning, computer vision, and learned perception together to tackle specific problems in specialty crop agriculture: autonomous apple harvesting, in-row navigation for ground vehicles, aerial vision-based pest monitoring, and semantic segmentation of crops under varying light and seasonal conditions. The long-term aim is to ease the labor burden and reduce the environmental footprint of producing food, while working in settings that continually push the limits of what our algorithms can do.
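As a taste of the in-row navigation problem, here is a minimal pure-pursuit row follower: pick a waypoint a lookahead distance down the row and steer along the arc that passes through it. This is a generic textbook controller rather than our field implementation; the waypoint representation, pose convention, and lookahead value are assumptions made for the sketch.

```python
import math

def pure_pursuit_curvature(path, pose, lookahead=1.0):
    """Pure-pursuit steering for row following.

    path: list of (x, y) waypoints down the row, world frame
    pose: (x, y, heading) of the robot, world frame
    Returns the arc curvature through the chosen lookahead point.
    """
    x, y, theta = pose
    # first waypoint at least one lookahead distance away (else the last)
    goal = next((p for p in path
                 if math.hypot(p[0] - x, p[1] - y) >= lookahead),
                path[-1])
    # lateral offset of the goal in the robot frame
    dx, dy = goal[0] - x, goal[1] - y
    y_l = -math.sin(theta) * dx + math.cos(theta) * dy
    d2 = dx * dx + dy * dy
    return 2.0 * y_l / d2  # pure-pursuit curvature: kappa = 2 * y_l / L^2

# example: robot slightly left of a straight row, heading down it
row = [(i * 0.5, 0.0) for i in range(1, 40)]
kappa = pure_pursuit_curvature(row, pose=(0.0, 0.3, 0.0), lookahead=1.5)
print(kappa)  # negative: steer right, back toward the row centerline
```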