Conversation

@jlaserna (Contributor) commented Dec 9, 2024

This PR fixes a bug where evaluation metrics were not saved during the loop closure pipeline run. Specifically, self.results.compute_closures_and_metrics() was called after the configuration had been saved and the results logged, so the output file (evaluation_metrics.txt) was written before any metrics had been computed, leaving it empty.

The change moves the compute_closures_and_metrics() call so that it runs before the configuration is saved and the results are logged, ensuring that all metrics are computed and included in the output; see the sketch below.
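
For illustration, a minimal sketch of the reordering. The helper names _run_pipeline, _save_config, and _log_results are hypothetical; only compute_closures_and_metrics() comes from the actual code:

    # Before (hypothetical sketch): evaluation_metrics.txt was written
    # before any metrics existed, so the file came out empty.
    def run(self):
        self._run_pipeline()
        self._save_config()   # writes evaluation_metrics.txt (still empty)
        self._log_results()
        self.results.compute_closures_and_metrics()  # computed too late

    # After: compute the metrics first, then save and log them.
    def run(self):
        self._run_pipeline()
        self.results.compute_closures_and_metrics()  # metrics ready
        self._save_config()   # evaluation_metrics.txt now has the metrics
        self._log_results()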

Issue Details

Before this fix, running the pipeline with the command:

map_closure_pipeline -e --dataloader mulran ./data/MulRan/KAIST03/ ./output

produced an evaluation_metrics.txt file containing only the table header and no data:

+----------+----------------+-----------------+-----------------+-----------+--------+----------+
| #Inliers | True Positives | False Positives | False Negatives | Precision | Recall | F1 score |
+==========+================+=================+=================+===========+========+==========+

                                 Loop Closure Evaluation Metrics                                 

After the fix, the metrics are properly calculated and saved, as shown below:

+----------+----------------+-----------------+-----------------+-----------+--------+----------+
| #Inliers | True Positives | False Positives | False Negatives | Precision | Recall | F1 score |
+==========+================+=================+=================+===========+========+==========+
|    5     |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    6     |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    7     |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    8     |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    9     |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    10    |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    11    |     54062      |       502       |     1504324     | 0.9908    | 0.0347 | 0.0670   |
|    12    |     52302      |       16        |     1506084     | 0.9997    | 0.0336 | 0.0649   |
|    13    |     52302      |       16        |     1506084     | 0.9997    | 0.0336 | 0.0649   |
|    14    |     52302      |       16        |     1506084     | 0.9997    | 0.0336 | 0.0649   |
+----------+----------------+-----------------+-----------------+-----------+--------+----------+

                                 Loop Closure Evaluation Metrics                                 

@saurabh1002 (Collaborator)

Thanks @jlaserna for detecting this bug.
I will merge the PR soon.

@saurabh1002 saurabh1002 added the bug Something isn't working label Dec 9, 2024
@saurabh1002 saurabh1002 merged commit 356d15b into PRBonn:main Dec 9, 2024
10 checks passed
@jlaserna jlaserna deleted the bugfix/evaluation_metricts_results branch December 9, 2024 16:43