Lightbox: Autonomy Visualization at Aurora

  • By Nathan Falk, Marion Le Borgne, Peter Bentley, Brian Becker

How We Built a New Tool That Lets Autonomy Engineers Move 10X Faster

To build autonomous trucking and ride-hailing products, we need to understand what the autonomy system is experiencing. In other words, we need to be able to see what the Aurora Driver sees—is it picking up those construction cones, is it registering that speed change sign, does it notice that motorcycle coming from behind? What’s more, we need to be able to see how and why the Aurora Driver sees what it sees—are some of the cones obstructed and not perceptible, is the sign confusing or illegible, is the motorcycle still recognizable when it suddenly does a wheelie?

Access to all of this information allows our operations teams to quickly spot and diagnose issues that appear while our vehicles are out testing; it allows our simulation teams to reliably test our technology offline at scale; and it allows our engineering teams to understand how the code they write affects the way our vehicles perform out in the real world. Engineers on Aurora’s Autonomy team, for example, generally run thousands of short simulations to evaluate whether a code change or new model should be implemented. Small adjustments often result in a number of improvements and regressions, and autonomy engineers typically have to inspect dozens of these short video snippets to verify whether those improvements and regressions were expected or whether they warrant deeper investigation.

We’ve built several custom tools over the years that enable autonomy visualization for various key workflows, but as we’ve grown and more people have begun using these tools, we’ve run into a few challenges:

  • Our autonomy visualization tools were built on different tech stacks at different times.
  • Each tool was built for a specific use case and generalizes poorly to new ones.
  • The tools are slow and struggle to keep up with increasing demand.

To solve these pain points, a cross-functional group from product, design, and engineering came together to develop a new autonomy visualization platform. After rigorous user research, engineers on Aurora’s Visualization team began experimenting with a first-principles approach while our product and design team mocked up visual prototypes. Several iterations later, we landed on a new visualization application that would allow our autonomy engineers to move 10x faster.

Introducing Lightbox: Aurora’s Next-Generation Visualization Tool

Lightbox is a completely web-based application that allows users to quickly customize their viewing experience and dig deep into the parts of the autonomy stack they are trying to improve. It provides users with an easy interface to a 3D scene with visualizations of sensor, mapping, perception, simulation, and motion planning data, giving our teams the ability to “preview” before we hit “publish” on changes to the Aurora Driver.

Lightbox is designed to be broadly usable and approachable so that users across teams can iterate on Aurora’s autonomy stack with a common language and visualization framework. Its decentralized contribution model empowers autonomy developers to create visualizations that support their workflows without having to rely on a central web team. Lightbox is also designed to be extensible and configurable to enable users to focus on whatever information is most important to them.

So how does it work?

Speed & Reliability

Users across teams often need to quickly review a short log or simulation, sometimes over a connection as slow as 5 MB/s. They also often need to review many short videos in rapid succession and to find and jump to the right moment in each snippet. This meant we had to optimize for very fast loading over low bandwidth while supporting live scrubbing (dragging the video slider and immediately seeing the 3D view update), starting playback at any timestamp without delay, and viewing at faster than real time.

To do this, we had to figure out how to load the same data 10x faster. In many use cases, users only care about certain kinds of data—the rest is unnecessary. So Lightbox is designed around a specialized data loading scheme and configurations that control bandwidth and application size.

Rather than loading all log data at once, Lightbox initially loads only lower-fidelity, lightweight data to give the user a general sense of the scene. Instead of requesting everything through an API, Lightbox relies on a metadata file to locate this essential data, load it first at low resolution, and display it instantly based on timestamp. The system then reads the configuration in the metadata file to dynamically load the higher-fidelity, heavyweight data (such as high-resolution camera frames or denser lidar points) when the user pauses to inspect a specific scene or seeks to a desired timestamp. In this way, Lightbox enables fast scrubbing and much finer-grained, on-demand data loading while effectively eliminating long load times and buffering. By opting for a custom data flow, we avoid unnecessary dependencies and overhead, add compression, and include only the information that is relevant to our users.
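
As a rough illustration of this two-tier scheme, a client might look something like the sketch below. The `LogMetadata` shape, stream tiers, and chunked URLs are assumptions made for illustration, not Lightbox’s actual implementation.

```typescript
// Hypothetical sketch of metadata-driven, two-tier data loading.
// Stream names, URLs, and types are illustrative, not Lightbox's real API.

interface StreamDescriptor {
  name: string;            // e.g. "/lidar/points"
  tier: 'light' | 'heavy'; // lightweight preview vs. full-fidelity data
  chunkUrls: string[];     // one chunk per time slice
  chunkDurationS: number;  // seconds of log time covered by each chunk
}

interface LogMetadata {
  startTimeS: number;
  endTimeS: number;
  streams: StreamDescriptor[];
}

class LogLoader {
  private cache = new Map<string, ArrayBuffer>();

  constructor(private metadata: LogMetadata) {}

  // Load every lightweight stream up front so scrubbing is instant.
  async loadPreview(): Promise<void> {
    const light = this.metadata.streams.filter(s => s.tier === 'light');
    await Promise.all(
      light.flatMap(s => s.chunkUrls.map(url => this.fetchChunk(url)))
    );
  }

  // Fetch full-fidelity chunks only for the moment the user is inspecting.
  async loadDetailAt(timestampS: number): Promise<ArrayBuffer[]> {
    const heavy = this.metadata.streams.filter(s => s.tier === 'heavy');
    return Promise.all(
      heavy.map(s => {
        const raw = Math.floor(
          (timestampS - this.metadata.startTimeS) / s.chunkDurationS
        );
        const idx = Math.min(s.chunkUrls.length - 1, Math.max(0, raw));
        return this.fetchChunk(s.chunkUrls[idx]);
      })
    );
  }

  private async fetchChunk(url: string): Promise<ArrayBuffer> {
    const cached = this.cache.get(url);
    if (cached) return cached;
    const buffer = await (await fetch(url)).arrayBuffer();
    this.cache.set(url, buffer);
    return buffer;
  }
}
```

During playback, a viewer built this way renders from the lightweight tier alone; only when the user pauses or seeks does it call `loadDetailAt`, so scrubbing never blocks on heavyweight downloads.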

Customization & Collaboration

Different teams need different things from their visualization tool. For example, our Perception teams need to see sensor data, tracks, and labels over time, with varying levels of granularity. Our Planning teams need to access data on the Aurora Driver’s motion planner and controller, the state of the autonomy system as a whole, and other road users. Meanwhile, our Triage and Data Science teams need to easily spot road features and map discrepancies. And, most importantly, all of these teams need to collaborate to build the Aurora Driver.

To be as flexible as possible, Lightbox supports dynamic configuration and gives users full control of their workspace. Users can override the preset data configurations so that Lightbox loads exactly the data their workflow requires. For example, a perception engineer might enable lidar data and disable planner decisions entirely.
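
As a hypothetical example of what such an override could look like (the stream names and option fields below are illustrative; Lightbox’s actual configuration schema is not public):

```typescript
// Hypothetical workspace override for a perception-focused session.
// Stream names and options are illustrative, not Lightbox's real schema.
const perceptionWorkspace = {
  streams: {
    '/lidar/points': {enabled: true, maxPointsPerFrame: 200_000},
    '/camera/front': {enabled: true, resolution: 'high'},
    '/planner/decisions': {enabled: false}, // irrelevant to perception triage
  },
};
```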

In addition to tweaking the application’s settings, autonomy developers can use Lightbox’s C++ API to create their own visualizations; think of developers combining visual Lego blocks to understand the state of Aurora’s autonomy stack in Lightbox. This developer framework is called XVIZ and is part of Aurora’s open-source offering. This broadened contribution model removes the need for a specialized tool-maintenance team, reducing resource spend and eliminating a potential bottleneck.
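
To make the “Lego blocks” idea concrete, here is a minimal sketch of a developer-contributed layer. Lightbox’s real interface is the C++ XVIZ API mentioned above; the TypeScript `VizBuilder` below is a hypothetical stand-in used only for illustration.

```typescript
// Hypothetical builder-style API for a developer-contributed visualization.
// VizBuilder and its methods are illustrative; XVIZ's real interface differs.

interface PrimitiveWriter {
  polygon(vertices: [number, number, number][]): PrimitiveWriter;
  style(style: {fillColor?: string; strokeColor?: string}): PrimitiveWriter;
  id(objectId: string): PrimitiveWriter;
}

interface VizBuilder {
  primitive(streamId: string): PrimitiveWriter;
}

interface TrackedObject {
  id: string;
  footprint: [number, number, number][]; // polygon outline in world frame
  isPedestrian: boolean;
}

// A planning engineer might publish predicted object footprints as a new
// stream, which the viewer then renders alongside the built-in layers.
function emitPredictedFootprints(
  builder: VizBuilder,
  objects: TrackedObject[]
): void {
  for (const obj of objects) {
    builder
      .primitive('/prediction/footprints')
      .polygon(obj.footprint)
      .id(obj.id)
      .style({fillColor: obj.isPedestrian ? '#FFA500' : '#00AAFF'});
  }
}
```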

We also designed Lightbox with built-in functionalities to address some of the most common use cases.

  • To support our offline evaluation process, Lightbox generates short visualizations (normally less than 30 seconds long) that include labels, simulation test results, scenario constraints, timelines, and other data our engineering teams need to rapidly review offline results.
  • A playlist functionality allows for rapid viewing of many short clips in succession. Users can add specific logs to a list or generate a list from a report (for example, a playlist of simulations that failed today) and simply click to play the next video instead of loading a new page or toggling between tabs. Lightbox preloads the lightweight data of the next video in the playlist, allowing users to instantly jump from one log to the next (see the sketch after this list).
  • Lightbox also includes a side-by-side viewing functionality to compare old and new versions of our autonomy software. This enables autonomy engineers to quickly determine and visualize the effects of a code change within a single workspace.
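
Sketching the playlist preloading behavior (and reusing the hypothetical `LogLoader` from the loading example above), the logic might look like:

```typescript
// Hypothetical playlist that warms the next log's lightweight tier
// while the current one plays; builds on the LogLoader sketch above.

class Playlist {
  private current = 0;

  constructor(private loaders: LogLoader[]) {}

  async play(): Promise<LogLoader> {
    const loader = this.loaders[this.current];
    await loader.loadPreview();  // current log is ready to scrub
    void this.preloadNext();     // warm the next log in the background
    return loader;
  }

  async next(): Promise<LogLoader> {
    if (this.current < this.loaders.length - 1) this.current += 1;
    return this.play();          // usually instant: preview is already cached
  }

  private async preloadNext(): Promise<void> {
    const next = this.loaders[this.current + 1];
    if (next) await next.loadPreview();
  }
}
```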

Performance & Operational Scale

Since the release of our next-generation visualization tool, we’ve seen an overwhelmingly positive response from teams across the company—users feel that Lightbox allows for a continuous and dynamic introspection of Aurora’s autonomy stack. We’ve also seen strong preliminary results in terms of speed, reliability, and efficiency. We estimate that Lightbox’s fast loading capabilities translate to 300+ hours of engineering time saved per month.

We have also successfully turned the corner on a major operational shift: Lightbox removed a significant organizational bottleneck by enabling other teams to develop their own visualization features. They no longer have to wait on a centralized web team to get new features and are able to iterate much faster. Lightbox has become a decentralized development hub, greatly helping Aurora scale feature development across teams.

Aurora has always prioritized investing in building strong infrastructure and technological foundations. Doing so sets us up for long-term success by equipping our teams with the tools and support they need to move quickly and efficiently. The creation of Lightbox is one example of a strategic investment that will meaningfully accelerate autonomy development and bring us closer to the commercial launch of the Aurora Driver.

This article was originally published by Aurora Innovation Inc.
