Sony AI
FHIBE breaks new ground as the first publicly available, consensually collected, and globally diverse fairness evaluation dataset designed specifically for human-centric computer vision tasks. Created by Sony AI in 2024, it addresses a critical gap in AI development: the lack of ethically sourced, diverse data for testing fairness across different populations. Unlike many existing datasets that raise consent and representation concerns, FHIBE gives researchers and developers a clean, comprehensive benchmark for evaluating whether their computer vision systems perform equitably across demographic groups.
The dataset stands out in three key ways that address longstanding issues in fairness evaluation: it is publicly available, it was collected with informed consent from participants, and it spans a globally diverse range of subjects.
FHIBE enables several critical fairness evaluation scenarios across human-centric computer vision tasks.
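One common scenario of this kind is disaggregated evaluation: computing a model's metric separately for each demographic group and reporting the largest gap. The sketch below is illustrative only, using fabricated records rather than real FHIBE data or any official FHIBE API; the function and record layout are assumptions.

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Compute accuracy per demographic group and the largest gap.

    records: iterable of (group, prediction, label) tuples.
    Returns (dict mapping group -> accuracy, max accuracy gap).
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, pred, label in records:
        total[group] += 1
        correct[group] += int(pred == label)
    acc = {g: correct[g] / total[g] for g in total}
    gap = max(acc.values()) - min(acc.values())
    return acc, gap

# Fabricated example records: (group, prediction, ground-truth label)
records = [
    ("A", 1, 1), ("A", 0, 1), ("A", 1, 1), ("A", 1, 1),
    ("B", 0, 0), ("B", 1, 0), ("B", 0, 1), ("B", 0, 0),
]
acc, gap = per_group_accuracy(records)
# acc["A"] == 0.75, acc["B"] == 0.5, gap == 0.25
```

A large gap flags a group for which the model underperforms, which is exactly the kind of disparity a fairness benchmark is meant to surface.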
Getting started with FHIBE requires understanding both the technical integration and the evaluation methodology.
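On the integration side, a typical first step is loading the dataset's demographic annotations and bucketing image IDs by an attribute so that evaluation can be sliced per group. The annotation schema below (field names like `pronoun_category` and `age_group`) is a hypothetical stand-in, not FHIBE's documented file layout:

```python
import json

# Hypothetical annotation records; FHIBE's real schema may differ.
SAMPLE_ANNOTATIONS = """
[
  {"image_id": "img_001", "pronoun_category": "she/her", "age_group": "30-39"},
  {"image_id": "img_002", "pronoun_category": "he/him",  "age_group": "18-29"}
]
"""

def group_by_attribute(annotations, attribute):
    """Bucket image IDs by a demographic attribute for sliced evaluation."""
    buckets = {}
    for record in annotations:
        buckets.setdefault(record[attribute], []).append(record["image_id"])
    return buckets

annotations = json.loads(SAMPLE_ANNOTATIONS)
buckets = group_by_attribute(annotations, "pronoun_category")
# buckets == {"she/her": ["img_001"], "he/him": ["img_002"]}
```

With the buckets in hand, the evaluation methodology reduces to running the model on each bucket and comparing the resulting per-group metrics.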
Published
2024
Jurisdiction
Global
Category
Datasets and benchmarks
Access
Public access