add list_evaluation_modules function #59

Merged · lvwerra merged 6 commits into main from list-modules-from-hub on May 26, 2022
Conversation

lvwerra (Member) commented May 23, 2022

Renames the list_metrics function to list_evaluation_modules. It uses the https://huggingface.co/api/spaces?filter={type} endpoint to query all available modules from the Hub.

cc @julien-c
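
(For reference, a minimal sketch of what querying that endpoint looks like; the concrete filter value "metric" is an assumption for illustration.)

import requests

# Hypothetical illustration: filter the Spaces API by a module type such as "metric".
url = "https://huggingface.co/api/spaces?filter={type}".format(type="metric")
r = requests.get(url)
r.raise_for_status()
# Each entry is expected to carry a repo "id", e.g. "evaluate-metric/accuracy".
print([space["id"] for space in r.json()])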

@lvwerra lvwerra requested review from osanseviero and lhoestq May 23, 2022 09:03
@lvwerra lvwerra changed the title from "add list evaluation modules function" to "add list_evaluation_modules function" May 23, 2022
HuggingFaceDocBuilderDev commented May 23, 2022

The documentation is not available anymore as the PR was closed or merged.

osanseviero (Contributor) left a comment

Looks very good! 🔥 thanks for the PR!

I left some questions/suggestions

-    with_details (:obj:`bool`, optional, default ``False``): Return the full details on the metrics instead of only the short name.
+    type (:obj:`string`, optional, default ``None``): Type of evaluation modules to list. If ``None`` all types are listed.
+    include_community (:obj:`bool`, optional, default ``True``): Include community modules in the list.
+    with_details (:obj:`bool`, optional, default ``False``): Return the full details on the metrics instead of only the ID.
Contributor
Maybe for consistency with huggingface_hub this should be renamed to full. No strong thoughts though.
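
(For illustration, a hedged sketch of how the new signature reads at the call site, based on the parameters documented above; the exact return shapes are assumptions.)

import evaluate

# List only canonical metric modules, with full details instead of plain ids
# (the exact structure of the detailed entries is assumed here).
detailed = evaluate.list_evaluation_modules(type="metric", include_community=False, with_details=True)

# With the defaults (type=None, include_community=True, with_details=False),
# all module types are listed as plain ids, e.g. ["accuracy", ...].
all_ids = evaluate.list_evaluation_modules()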

@@ -31,41 +32,67 @@ class SplitsNotFoundError(ValueError):
     pass


-def list_metrics(with_community_metrics=True, with_details=False):
+def list_evaluation_modules(type=None, include_community=True, with_details=False):
     """List all the metrics script available on the Hugging Face Hub.
Contributor
Maybe you should clarify that this returns a list of repo ids. The name tells me it returns a list of modules, which is a bit confusing since it really returns ids. Similarly for the docstring.

lvwerra (Member, Author)
I actually changed this a bit: for canonical modules the namespace is omitted, so this function returns exactly what the user should pass to the load() function. Technically, load("evaluate-metric/accuracy") also works, but I think we should use load("accuracy") as the standard.
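
(A short usage note; both calls below are taken from the comment above.)

import evaluate

# Canonical modules are listed without their namespace, so the id can be passed to load() directly.
accuracy = evaluate.load("accuracy")

# The fully-qualified form also works, but the short form is the standard.
accuracy = evaluate.load("evaluate-metric/accuracy")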


+    r = requests.get(HF_LIST_ENDPOINT.format(type=type))
+    r.raise_for_status()
+    d = r.json()
Contributor
At one point it would be cool to have list_spaces in huggingface_hub. I opened huggingface/huggingface_hub#881
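
(Continuing from the snippet above, a hypothetical sketch of how the response could be reduced to loadable ids; the response shape and the evaluate-{type} namespace convention are assumptions, not the PR's exact code.)

def _extract_module_ids(d, type, include_community=True):
    # `d` is the JSON list returned by the Spaces endpoint; each entry is assumed
    # to carry an "id" of the form "namespace/name".
    ids = []
    for space in d:
        namespace, _, name = space["id"].partition("/")
        if namespace == "evaluate-{type}".format(type=type):
            # Canonical modules drop their namespace so the result can be passed
            # to load("accuracy") directly.
            ids.append(name)
        elif include_community:
            ids.append(space["id"])
    return ids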

@lvwerra lvwerra requested a review from osanseviero May 25, 2022 12:35
osanseviero (Contributor) left a comment

Looks great!! Thank you 🚀

Co-authored-by: Omar Sanseviero <osanseviero@gmail.com>
@lvwerra lvwerra merged commit bcc2443 into main May 26, 2022
@lvwerra lvwerra deleted the list-modules-from-hub branch July 24, 2022 12:29
3 participants