
Add MaximumMeanDiscrepancy metric#3243

Merged
vfdev-5 merged 11 commits into pytorch:master from kzkadc:mmd
May 7, 2024
Conversation

@kzkadc
Contributor

@kzkadc kzkadc commented May 1, 2024

Description: Added the maximum mean discrepancy (MMD) metric, which measures the discrepancy between two distributions.
https://www.onurtunali.com/ml/2019/03/08/maximum-mean-discrepancy-in-machine-learning.html

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)
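For readers unfamiliar with the metric, here is a minimal sketch of a squared-MMD estimate with a Gaussian kernel. The kernel choice, the bandwidth `sigma=1.0`, and the biased estimator form are illustrative assumptions for this sketch, not necessarily what the PR implements:

```python
import torch

def gaussian_kernel(a: torch.Tensor, b: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Pairwise Gaussian kernel values between rows of a and rows of b
    d2 = torch.cdist(a, b) ** 2
    return torch.exp(-d2 / (2 * sigma**2))

def mmd2(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # Biased estimate of squared MMD: E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2 * kxy

torch.manual_seed(0)
x = torch.randn(100, 3)
y = torch.randn(100, 3) + 2.0  # samples from a shifted distribution

print(mmd2(x, x).item())  # zero for identical samples
print(mmd2(x, y).item())  # clearly positive for the shifted samples
```

Identical samples give a squared MMD of zero, while samples drawn from different distributions give a positive value that grows with the discrepancy.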

@github-actions github-actions bot added the docs and module: metrics labels May 1, 2024
Collaborator

@vfdev-5 vfdev-5 left a comment


Thanks for the PR, @kzkadc!
Right now CI is failing with related broken tests, can you please check them?

Collaborator

@vfdev-5 vfdev-5 left a comment


LGTM, thanks a lot for the PR @kzkadc !

@vfdev-5 vfdev-5 merged commit 8c1912a into pytorch:master May 7, 2024
Collaborator

vfdev-5 commented May 8, 2024

@kzkadc can you please check and fix the issue with the doctest CI job:

**********************************************************************
File "../../ignite/metrics/maximum_mean_discrepancy.py", line ?, in default
Failed example:
    metric = MaximumMeanDiscrepancy()
    metric.attach(default_evaluator, "mmd")
    x = torch.tensor([[-0.80324818, -0.95768364, -0.03807209],
                    [-0.11059691, -0.38230813, -0.4111988],
                    [-0.8864329, -0.02890403, -0.60119252],
                    [-0.68732452, -0.12854739, -0.72095073],
                    [-0.62604613, -0.52368328, -0.24112842]])
    y = torch.tensor([[0.0686768, 0.80502737, 0.53321717],
                    [0.83849465, 0.59099726, 0.76385441],
                    [0.68688272, 0.56833803, 0.98100778],
                    [0.55267761, 0.13084654, 0.45382906],
                    [0.0754253, 0.70317304, 0.4756805]])
    state = default_evaluator.run([[x, y]])
    print(state.metrics["mmd"])
Expected:
    1.0726975202560425
Got:
    1.072697639465332

https://github.com/pytorch/ignite/actions/runs/8991931786/job/24700585654#step:7:895
Thanks!
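The two values in the log above differ only around the seventh decimal place, which is typical float32 run-to-run or platform noise rather than a real regression, so a doctest that compares full-precision printed output is brittle. A quick check (the values are copied from the log; the tolerance here is an illustrative choice, in the spirit of `math.isclose` / `torch.allclose`):

```python
import math

expected = 1.0726975202560425  # value hard-coded in the doctest
got = 1.072697639465332        # value produced on the CI runner

# The absolute difference is on the order of 1e-7, i.e. float32 precision.
print(abs(expected - got))

# A tolerance-based comparison passes where the exact string match fails.
print(math.isclose(expected, got, rel_tol=1e-5))
```

Rounding the printed value, or loosening the comparison to a tolerance, are common ways to make such doctests robust across platforms.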

