AMIGOS
AMIGOS: Autonomous Multimodal Ingestion for Goal Oriented Support is a project funded by the Defense Advanced Research Projects Agency (DARPA) and awarded to the Palo Alto Research Center, Inc. (PARC). We are subcontractors to PARC, a very prominent American research company known, among other things, for the development of Ethernet, the modern personal computer, the graphical user interface, the computer mouse, and the field of ubiquitous computing. AMIGOS introduces fundamentally new methods for the offline extraction of procedural knowledge from sparsely labeled multimedia content through joint cross-modal learning on language, videos, and illustrations. It will learn to recognise common task failures a priori by simulating failures not explicitly encoded in expert demonstrations and manuals. During live operation, a novel multimodal mixed-initiative dialogue manager will engage a multimodal perception module, a neuro-symbolic state tracker, and an adaptive goal-aware user model for interactive Augmented Reality (AR) guidance.
Our subproject deals with the transfer of knowledge from textual and visual sources into planning models. These models are later used for user modelling in order to track the user's state and detect problems in the execution of tasks. The subproject builds on two tools developed in our group: the Text2HBM tool, which handles automated knowledge extraction and model learning from textual sources, and the CCBM tool, which is used for user modelling and behaviour tracking.
The subproject started in January 2022 and will continue for four years.
Short Facts
- Project title: AMIGOS: Autonomous Multimodal Ingestion for Goal Oriented Support
- Sub-project at CoMSA²t: Knowledge Transfer and User Modelling
- Project homepage:
- Runtime: 01.01.2022 – 30.09.2025
- Sponsor: DARPA
- Budget: 600,000 EUR at CoMSA²t (5,800,000 USD in total)