Sony AI
FHIBE breaks new ground as the first publicly available, consensually collected, and globally diverse fairness evaluation dataset specifically designed for human-centric computer vision tasks. Created by Sony AI in 2024, this dataset addresses a critical gap in AI development: the lack of ethically sourced, diverse data for testing fairness across different populations. Unlike many existing datasets that raise consent and representation concerns, FHIBE provides researchers and developers with a clean, comprehensive benchmark for evaluating whether their computer vision systems perform equitably across different demographic groups.
The dataset stands out by being publicly available, consensually collected, and globally diverse — three qualities that address longstanding issues in fairness evaluation.
FHIBE enables several critical fairness evaluation scenarios, centered on comparing how a model performs across different demographic groups.
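As an illustration of the kind of group-wise comparison such a dataset supports, the sketch below computes per-group accuracy and the largest accuracy gap between groups. The data, group names, and function names here are invented for illustration and are not part of FHIBE itself.

```python
# Hedged sketch: comparing a model's accuracy across demographic groups,
# the core check a fairness evaluation dataset enables. All data is made up.
from collections import defaultdict

def per_group_accuracy(records):
    """records: iterable of (group, prediction, label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, pred, label in records:
        total[group] += 1
        correct[group] += int(pred == label)
    return {g: correct[g] / total[g] for g in total}

def accuracy_gap(records):
    """Largest difference in per-group accuracy; 0 means parity."""
    accs = per_group_accuracy(records)
    return max(accs.values()) - min(accs.values())

# Toy predictions for two hypothetical groups.
records = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 0), ("group_a", 1, 1),
    ("group_b", 1, 1), ("group_b", 0, 1), ("group_b", 0, 0), ("group_b", 0, 1),
]
print(per_group_accuracy(records))  # {'group_a': 0.75, 'group_b': 0.5}
print(accuracy_gap(records))        # 0.25
```

A gap near zero suggests equitable performance on this metric; a large gap flags a group the model underserves.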
Getting started with FHIBE requires understanding both the technical integration and the evaluation methodology.
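A minimal sketch of what that integration loop might look like: load annotations, run a model over each image, and tally results per group. The annotation schema, field names, and the stand-in model below are assumptions for illustration, not FHIBE's actual format or API.

```python
# Hedged sketch of a group-wise evaluation loop. The JSON schema and
# field names ("image", "group", "label") are illustrative assumptions.
import json

annotations = json.loads("""[
  {"image": "img_001.jpg", "group": "A", "label": "face"},
  {"image": "img_002.jpg", "group": "B", "label": "face"},
  {"image": "img_003.jpg", "group": "A", "label": "no_face"}
]""")

def dummy_model(image_path):
    # Stand-in for a real detector; always predicts "face".
    return "face"

results = {}
for ann in annotations:
    pred = dummy_model(ann["image"])
    grp = results.setdefault(ann["group"], {"correct": 0, "total": 0})
    grp["total"] += 1
    grp["correct"] += int(pred == ann["label"])

for group, r in sorted(results.items()):
    print(f"{group}: {r['correct']}/{r['total']}")
```

In practice the dummy model would be replaced by the system under test, and the per-group tallies would feed whatever fairness metric the evaluation methodology calls for.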
Published
2024
Jurisdiction
Global
Category
Datasets and benchmarks
Access
Public access