FAQ [OG3]

PERASPERA / FAQ – Frequently Asked Questions

(This FAQ and Web portal reflect the PSA consortium’s view. The EC and REA are not responsible for any use that may be made of the information they contain.)

Q8 [OG2] [OG3]: Is the path planning functionality included in OG2 (AF) or OG3 (CDFF)?

A: The path planning function is included in OG2. The OG2 AF will create a semantic model of its environment, describing its understanding of the rover’s situation with respect to surface-level hazards, objects, terrain and sites of interest, in order to (re)plan optimum navigation paths and implement the best possible course of navigation or action for the platform’s situation.
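Purely as an illustration of the kind of (re)planning described above, path planning over a hazard map is commonly realized as graph search. The sketch below is a minimal grid-based A* in Python; the occupancy grid, unit step costs and Manhattan heuristic are our own assumptions, not part of any OG specification:

```python
import heapq

def astar(grid, start, goal):
    """A* search over a 2D occupancy grid (0 = free, 1 = hazard).

    Returns the list of cells from start to goal, or None if no
    path exists. Manhattan distance is the (admissible) heuristic.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(h(start), start)]      # priority queue of (f-score, cell)
    came_from = {}                      # cell -> predecessor on best path
    g_score = {start: 0}                # cheapest known cost to each cell
    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:
            # Reconstruct the path by walking predecessors back to start.
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_score[cell] + 1
                if ng < g_score.get(nxt, float("inf")):
                    g_score[nxt] = ng
                    came_from[nxt] = cell
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None

# Hypothetical hazard map: row 1 is blocked except the rightmost cell.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

Replanning after a new hazard is detected simply amounts to marking the affected cells and re-running the search from the rover’s current cell.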

The OG3 (CDFF) is in charge of building a 3D model of the environment. The term “Navigation” here corresponds to the “N” in GNC: it provides the geometric/kinematic/dynamic state of a machine with respect to a reference frame (e.g. an inertial frame) or its local environment, through an estimation process that combines processed data from absolute and/or relative sensors with a priori information.
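As an illustration of such an estimation process, the sketch below fuses a relative measurement (e.g. wheel odometry) and an absolute measurement into a scalar state estimate, in the manner of a one-dimensional Kalman filter. All numbers and variances are hypothetical; the CDFF itself is sensor-agnostic and far more general:

```python
def predict(x, p, dx, q):
    """Propagate estimate (x, variance p) with a relative
    measurement dx (e.g. odometry) of variance q."""
    return x + dx, p + q

def fuse(x, p, z, r):
    """Correct estimate (x, variance p) with an absolute
    measurement z of variance r; returns the updated pair."""
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1 - k) * p

# Illustration: rover position along one axis.
x, p = 0.0, 1.0                          # a priori position and variance
x, p = predict(x, p, dx=1.0, q=0.5)      # relative sensor: moved ~1 m
x, p = fuse(x, p, z=1.2, r=0.3)          # absolute sensor: measured 1.2 m
```

Note how the fused variance (0.25) is smaller than either input variance alone, which is the point of combining absolute and relative sources.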

Q9 [OG2] [OG3]: Is the Opportunistic Science capability included in OG2 (AF) or OG3 (CDFF)?

A: The task of identifying an object of scientific interest (by color, shape, composition, etc.) is included in OG3, but it is OG2 that takes the decision to go to the object of scientific interest and does the planning/(re)planning.

Q17 [OG3] [OG4]: We are trying to determine whether our current project, a standalone instrument for orbital rendezvous that incorporates not only data acquisition but also high-level processing to extract the desired measurement (features relating to both OG3 and OG4), is admissible for the COMPET-04-2016 call.

A: First of all, we draw your attention to the text of the compendium, and particularly the summaries of the OG3 and OG4 challenge descriptions:

  • OG3: The challenge is the development of a software framework implementing data fusion techniques for various sensors such as LIDAR, Imagers, radar, sonar, IMUs, sun sensors, etc.
  • OG4: The challenge is to realize a suite of perception sensors that allow localization and map making for robotic inspection of orbital assets (under space representative conditions) and for planetary surface exploration.

This short summary clearly states that the SW development foreseen in OG3 aims at dealing with a large variety of sensor data types rather than focusing on a particular sensor. In the same way, the sensor suite to be considered in OG4 must deal with the whole specified set of sensors and not a single one. As written further down in the OG4 description, the emphasis is put on the modularity and flexibility of the inspection suite: “INSES shall be implemented with high modularity and flexibility in order to allow the sensor suite sub-set re-configuration. Each sensor unit shall be designed to be self-consistent with unified interfaces in order to be added/removed easily, without affecting the overall system functionality (plug-in like concept)”.
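As a sketch of the “plug-in like concept” quoted above, the following hypothetical Python interface shows how sensor units exposing a unified interface can be added to or removed from a suite without affecting the rest of the system. The names (SensorUnit, SensorSuite, DummyImu) are our own illustrations, not taken from the INSES specification:

```python
from abc import ABC, abstractmethod

class SensorUnit(ABC):
    """Unified interface every sensor unit must expose."""
    @abstractmethod
    def read(self) -> dict:
        """Return one measurement as a tagged dictionary."""

class SensorSuite:
    """Plug-in registry: units can be added or removed at any
    time without affecting the other registered units."""
    def __init__(self):
        self._units = {}

    def add(self, name, unit: SensorUnit):
        self._units[name] = unit

    def remove(self, name):
        self._units.pop(name, None)     # removal is a no-op if absent

    def poll(self):
        """Read every registered unit through the common interface."""
        return {name: u.read() for name, u in self._units.items()}

class DummyImu(SensorUnit):
    """Stand-in unit returning a fixed sample, for illustration only."""
    def read(self):
        return {"type": "imu", "gyro": (0.0, 0.0, 0.01)}

suite = SensorSuite()
suite.add("imu0", DummyImu())
sample = suite.poll()
```

Because the suite only ever calls `read()`, swapping a LIDAR unit for a radar unit (or dropping one entirely) leaves the rest of the system untouched.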

In addition, the relevance of addressing two OGs in the same proposal is discussed earlier on this FAQ page, in question Q12.

Q19 [OG3] [OG4] [All OGs]: Should we show, at the end of the project, that we master the whole set of sensors and that these sensors can offer the required interface and modularity capabilities, or should we build a complete robotic system including data processing and fusion functionalities?

A: The first alternative is the right one. It is definitely not foreseen within OG4 to build a robotic system with data processing and fusion capabilities, since this would overlap with the OG3 activity.

Nevertheless, it shall be proven that the OG4 end product can be interfaced with the OG3 functionalities to demonstrate the various capabilities of a perception/navigation system. This type of activity will be performed within OG6 in coordination with all relevant OGs (OG3 for the most part). The demonstration requirements will likewise be elaborated in a collaborative mode.

Bidders’ attention must be drawn to what is indicated on page 5 of the “SRC Guidelines Space Robotics Technologies (COMPET-4-2016)” document:

“Even though the integration of OG outputs is in the future, all OGs need to interact in order to prepare for the future integration. The interaction among OGs will be based on:

  • Cross delivery of interface specifications
  • Cross delivery of preliminary/final outputs
  • Integration and demonstration of the final outputs of OGs in common test platforms

Each OG Consortium is expected to nominate an “interface engineer” in order to coordinate establishment and maintenance of interface specifications with other OGs.”