Aurora Driver
Built from industry-leading hardware and software, the Aurora Driver is designed to adapt to a variety of vehicle types and use cases, allowing us to deliver the benefits of self-driving across several industries, including long-haul trucking, local goods delivery, and people movement. The Aurora Driver consists of sensors that perceive the world, software that plans a safe path through it, and the computer that powers both and integrates them with the vehicle. It is designed to operate any vehicle type, from a sedan to a Class 8 truck. The Aurora Computer is the central hub that connects our hardware and autonomy software and enables the Aurora Driver to integrate seamlessly with every vehicle type. Our custom-designed sensor suite, including FirstLight Lidar, long-range imaging radar, and high-resolution cameras, works together to build a 3D representation of the world, giving the Aurora Driver a 360° view of what's happening around the vehicle in real time.
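To make the idea of a fused 360° view concrete, here is a minimal sketch of grouping detections from multiple sensors by bearing around the ego vehicle. The `Detection` record, its field names, and the sector-based grouping are all illustrative assumptions, not Aurora's actual interfaces.

```python
from dataclasses import dataclass
from math import atan2, degrees

# Hypothetical detection record; field names are illustrative, not Aurora's API.
@dataclass
class Detection:
    sensor: str      # "lidar", "radar", or "camera"
    x: float         # meters, vehicle frame (forward)
    y: float         # meters, vehicle frame (left)
    obj_type: str    # e.g. "vehicle", "pedestrian"

def bearing_deg(d: Detection) -> float:
    """Bearing of a detection around the ego vehicle, in [0, 360) degrees."""
    return degrees(atan2(d.y, d.x)) % 360.0

def fuse_by_sector(detections, sector_width=30.0):
    """Group detections from all sensors into angular sectors to form
    a coarse 360-degree picture of the vehicle's surroundings."""
    sectors = {}
    for d in detections:
        key = int(bearing_deg(d) // sector_width)
        sectors.setdefault(key, []).append(d)
    return sectors

dets = [
    Detection("lidar", 20.0, 0.0, "vehicle"),
    Detection("radar", 21.0, 0.5, "vehicle"),    # same object, second sensor
    Detection("camera", -5.0, -3.0, "pedestrian"),
]
view = fuse_by_sector(dets)
```

A real system would associate and track objects over time rather than bucket them by angle, but the sketch shows how independent sensor streams contribute to one shared picture of the surroundings.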
Oxbotica Selenium
Selenium is our flagship product: a full-stack autonomy system and the product of over 500 person-years of effort. It is an on-vehicle software suite that, given a drive-by-wire interface and very modest compute hardware, brings full autonomy to a land-based vehicle. Selenium can transform any suitable vehicle platform into an autonomous vehicle, both at prototype volume and at scale. It is a collection of interoperable software modules that allow the vehicle to answer three key questions: Where am I? What's around me? What do I do next? Selenium spans the technological spectrum, from low-level device drivers through calibration, 4-modal localization, mapping, perception, machine learning, and planning; its vertical integration even covers user-interface and data-export systems. It does not need GPS or HD maps, although these can still be utilized if available.
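The three questions above map naturally onto a localization-perception-planning loop. The sketch below is a deliberately trivial illustration of that decomposition under invented interfaces; none of the function names or logic reflect Oxbotica's actual modules. The localization step dead-reckons from relative motion to echo the point that GPS is not required.

```python
# Illustrative sketch of the three-question loop; names are hypothetical.
def where_am_i(odometry):
    """Localization: estimate pose from on-board sensing alone (no GPS)."""
    x, y = 0.0, 0.0
    for dx, dy in odometry:      # dead-reckon over relative motion steps
        x += dx
        y += dy
    return (x, y)

def whats_around_me(detections):
    """Perception: summarize nearby objects by type."""
    summary = {}
    for obj in detections:
        summary[obj] = summary.get(obj, 0) + 1
    return summary

def what_do_i_do_next(surroundings):
    """Planning: toy policy -- stop for pedestrians, otherwise proceed."""
    return "stop" if surroundings.get("pedestrian", 0) > 0 else "proceed"

pose = where_am_i([(1.0, 0.0), (1.0, 0.5)])
plan = what_do_i_do_next(whats_around_me(["vehicle", "pedestrian"]))
```

The value of the decomposition is that each module can be developed, calibrated, and swapped independently behind a stable interface, which is what lets one stack move between vehicle platforms.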
Apollo Autonomous Vehicle Platform
Various sensors, such as LiDAR, cameras, and radar, collect environmental data surrounding the vehicle. Using sensor fusion technology, perception algorithms can determine in real time the type, location, velocity, and orientation of objects on the road. This perception system is backed by Baidu's big data and deep learning technologies, as well as a vast collection of real-world labeled driving data, processed on a large-scale deep-learning platform and GPU clusters. Simulation provides the ability to virtually drive millions of kilometers daily using an array of real-world traffic and autonomous-driving data. Through the simulation service, partners gain access to a large number of autonomous driving scenes to quickly test, validate, and optimize models with comprehensive coverage in a way that is safe and efficient.
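The simulation workflow described above amounts to replaying many recorded scenes against the driving stack and tallying results. Here is a minimal sketch of that batch-validation idea; the scene format and the pass criterion (keeping a minimum gap to every obstacle) are invented for illustration and are not Apollo's actual scenario schema.

```python
# Hedged sketch of batch scenario validation, as a simulation service might
# run it. Scene format and pass criterion are hypothetical.
def run_scenario(scene, min_gap=2.0):
    """A scenario 'passes' if the ego vehicle keeps at least min_gap
    meters from every logged obstacle at every timestep."""
    return all(gap >= min_gap for gap in scene["gaps"])

def validate(scenes):
    """Run every scene and tally pass/fail counts."""
    results = {"pass": 0, "fail": 0}
    for scene in scenes:
        results["pass" if run_scenario(scene) else "fail"] += 1
    return results

scenes = [
    {"name": "merge", "gaps": [5.0, 3.2, 2.1]},
    {"name": "cut-in", "gaps": [4.0, 1.5, 3.0]},  # gap drops below 2.0 m
]
report = validate(scenes)
```

Because each scene is independent, a real service can shard this loop across a cluster, which is how millions of virtual kilometers per day become feasible.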
Applied Intuition Vehicle OS
Applied Intuition Vehicle OS is a scalable, modular platform that enables automakers, commercial fleets, and defense integrators to develop, deploy, and update comprehensive vehicle software, hardware, and AI applications across all domains, from ADAS and infotainment to autonomy and digital services. The on-board SDK provides an embedded real-time OS, drivers, middleware, and a reference compute architecture for safety-critical and consumer-facing functions, while the off-board platform supports cloud-based data logging, remote diagnostics, OTA vehicle updates, and digital twin management. Developers work within a unified Workbench environment featuring integrated build and testing tools, CI pipelines, and automated validation workflows. It bridges vehicle intelligence across ecosystems by combining autonomy stacks, simulation suites (including vehicle dynamics and sensor simulation), and a vibrant developer toolchain.
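As a concrete example of the OTA-update workflow the off-board platform handles, here is a minimal sketch of deciding which on-vehicle components are stale relative to a fleet manifest. The manifest format and semantic-version comparison are assumptions for illustration, not Applied Intuition's actual update protocol.

```python
# Minimal OTA update-decision sketch; manifest format is hypothetical,
# and versions are assumed to follow "major.minor.patch" numbering.
def parse_version(v):
    """Turn '1.4.0' into a comparable tuple (1, 4, 0)."""
    return tuple(int(part) for part in v.split("."))

def needs_update(installed, manifest):
    """Compare each installed component against the fleet manifest and
    return the names of components that should be updated over the air."""
    stale = []
    for name, version in installed.items():
        target = manifest.get(name)
        if target and parse_version(target) > parse_version(version):
            stale.append(name)
    return stale

installed = {"perception": "1.4.0", "infotainment": "2.0.1"}
manifest = {"perception": "1.5.0", "infotainment": "2.0.1"}
stale = needs_update(installed, manifest)
```

A production system would add signing, staged rollout, and rollback on failed health checks, but the core decision is this version comparison per component.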