Use Cases
  • Fill the data gap when training data for developing and validating autonomy algorithms is in short supply.
  • Customized 3D environments as test cases for refining autonomous behaviours and testing performance.
  • Cost-effective simulation at scale.
  • Exhaustive testing of perception algorithms using edge case scenarios that would be otherwise too dangerous or expensive to acquire in the real world.
  • Bootstrap deep learning algorithms with synthetic data for rapid prototype development (see the training sketch after this list).
  • Test autonomy algorithms on local compute infrastructure to ensure autonomous machines are fully operational before validating them in the real world.
  • Create high-fidelity videos for early-stage research on intelligent systems.
  • Automatic generation of ground truth with various occlusion and visibility constraints.
  • Validate sensor models in adverse weather conditions.
  • Meet scalability demands by distributing simulations across compute clusters.
  • Comparative sensor-placement analyses for rapid product development.
  • Automatically labelled training data to train autonomy algorithms.
  • Accelerate training and validation by leveraging past simulation efforts through co-simulation.
  • Collection of multi-fidelity sensor data in situations where the sensors have yet to be fully developed.
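
To make the synthetic-bootstrapping use case above concrete, the sketch below pretrains a small classifier on abundant synthetic data and then fine-tunes it on a scarce real-world set. The tensors are random placeholders standing in for exported simulator frames and real captures, and the model and loops are generic PyTorch, not part of this platform's API.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Random placeholders: many synthetic (simulator-exported) samples,
# few hand-labelled real-world samples.
synthetic = TensorDataset(torch.randn(512, 3, 64, 64), torch.randint(0, 4, (512,)))
real = TensorDataset(torch.randn(64, 3, 64, 64), torch.randint(0, 4, (64,)))

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 4),
)
loss_fn = nn.CrossEntropyLoss()

def run_epochs(dataset, lr, epochs):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in DataLoader(dataset, batch_size=32, shuffle=True):
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

run_epochs(synthetic, lr=1e-3, epochs=5)  # bootstrap on synthetic data
run_epochs(real, lr=1e-4, epochs=2)       # fine-tune on real data
```
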
Features

Simulations

  • Vast suite of sensor modalities, including Visible Light (RGB) cameras, LiDAR, mmWave Radar and thermal sensor outputs to match physical devices.
  • Time-stamped sensor data ensures precision and consistency across multiple sources (see the alignment sketch after this list).
  • Highly optimized rendering and sensor-simulation engine, enabling real-time Hardware-in-the-Loop (HIL), Human-in-the-Loop (HITL), or Software-in-the-Loop (SIL) simulation even with complex sensor configurations.
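
One practical payoff of per-sample timestamps is post-hoc alignment of streams captured at different rates. Below is a minimal sketch, assuming each modality is exported as a time-ordered list of (timestamp, payload) records; the record layout is an assumption for illustration, not the platform's actual export format.

```python
import bisect
from dataclasses import dataclass
from typing import Any

@dataclass
class SensorFrame:
    timestamp: float  # seconds since simulation start
    payload: Any      # e.g. an image array or a LiDAR point cloud

def nearest_frame(frames: list[SensorFrame], t: float) -> SensorFrame:
    """Return the frame whose timestamp is closest to t (frames sorted by time)."""
    times = [f.timestamp for f in frames]
    i = bisect.bisect_left(times, t)
    candidates = frames[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda f: abs(f.timestamp - t))

# Pair each 10 Hz LiDAR sweep with the closest 30 Hz camera frame.
camera = [SensorFrame(k / 30.0, f"img_{k}") for k in range(90)]
lidar = [SensorFrame(k / 10.0, f"sweep_{k}") for k in range(30)]
pairs = [(sweep, nearest_frame(camera, sweep.timestamp)) for sweep in lidar]
```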

Extensibility

  • Extensible and intuitive Application Programming Interface (API) giving full control over every entity in the world, environmental and weather conditions, sensor placement and configuration, and simulation control, including pause/resume (see the sketch after this list).
  • Python Software Development Kit (SDK) for interacting with the API using the full power of Python, allowing complex agent and sensor behaviours, including logical if-then relationships with all other entities in the world.
  • Designed from the ground up for co-simulation, enabling integration with external engines such as ANVEL, Project Chrono, and SUMO.
  • Scalable architecture built on a standard messaging system (pub/sub bus), allowing large cluster simulations across server networks, ideal for highly complex environments and scenarios that require real-time sensor data feeds from multiple sources.
  • Flexible plugin system that can be extended to customize any step in the simulation pipeline and dataset generation, including proprietary sensor-modelling functions (LiDAR noise/intensity, etc.), image augmentations, and data export/streaming.
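
To make the API, SDK, and pub/sub bullets concrete, here is a minimal sketch of what a scripted session could look like. Every name in it (the `simclient` module and its `connect`, `set_weather`, `attach_sensor`, `subscribe`, `pause`, and `resume` calls) is hypothetical, illustrating the pattern of control described above rather than the product's actual interface.

```python
import simclient  # hypothetical client module; the real SDK's names will differ

sim = simclient.connect("localhost:9000")

# Environmental control: heavy rain at dusk.
sim.set_weather(precipitation="heavy_rain", time_of_day="18:30")

# Sensor placement: mount a LiDAR on an existing vehicle entity.
ego = sim.get_entity("ego_vehicle")
lidar = ego.attach_sensor("lidar", position=(0.0, 0.0, 1.8), rate_hz=10)

# Pub/sub data feed: react to each sweep as it arrives on the bus,
# with a logical if-then rule tied to simulation state.
def on_sweep(frame):
    if frame.point_count < 1000:
        sim.pause()   # inspect a suspicious frame ...
        sim.resume()  # ... then continue the run

lidar.subscribe(on_sweep)
sim.run(duration_s=60)
```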

Dataset Generation

  • Powerful spawner tools for generating high-entropy, multi-class datasets tailored for neural network training.
  • Configurable sensor-capture frequency to yield dataset generation at real-time rates even when the scenario itself is too complex to run in real time.
  • Generation of ground-truth features and labels, including 2D/3D bounding boxes, segmentation maps, position, motion, and contextual entity information such as vehicle make/model, pedestrian clothing details, and traffic-light states (see the parsing sketch below).
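
As an illustration of consuming the generated ground truth, the sketch below parses one annotated frame into typed records. The JSON layout, including keys such as `bbox_2d`, `make_model`, and `traffic_light_state`, is an assumed example schema, not the platform's documented export format.

```python
import json
from dataclasses import dataclass

@dataclass
class Annotation:
    entity_id: str
    class_name: str
    bbox_2d: tuple  # (x_min, y_min, x_max, y_max) in pixels
    context: dict   # e.g. vehicle make/model, clothing, traffic-light state

# Assumed example of one exported frame's annotations.
raw = """
{"frame": 42, "objects": [
  {"id": "veh_7", "class": "car",
   "bbox_2d": [104.0, 220.5, 310.2, 360.0],
   "context": {"make_model": "sedan_generic", "traffic_light_state": null}}
]}
"""

frame = json.loads(raw)
annotations = [
    Annotation(o["id"], o["class"], tuple(o["bbox_2d"]), o["context"])
    for o in frame["objects"]
]
for a in annotations:
    print(a.entity_id, a.class_name, a.bbox_2d)
```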