Sony AI
FHIBE breaks new ground as the first publicly available, consensually collected, and globally diverse fairness evaluation dataset specifically designed for human-centric computer vision tasks. Created by Sony AI in 2024, the dataset addresses a critical gap in AI development: the lack of ethically sourced, diverse data for testing fairness across different populations. Unlike many existing datasets that raise consent and representation concerns, FHIBE provides researchers and developers with a clean, comprehensive benchmark for evaluating whether their computer vision systems perform equitably across demographic groups.
The dataset stands out in three key ways that address longstanding issues in fairness evaluation: it is publicly available, it was collected with informed consent, and it is globally diverse.
FHIBE enables several critical fairness evaluation scenarios, chief among them testing whether a model's performance holds up consistently across demographic groups rather than only in aggregate.
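A per-group evaluation of this kind can be sketched in a few lines. The snippet below is a minimal, dataset-agnostic illustration, not FHIBE's official tooling: it takes (prediction, label, group) records, computes accuracy per demographic group, and reports the largest accuracy gap between any two groups (0 means parity on this metric).

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Accuracy per demographic group from (prediction, label, group) records."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for pred, label, group in records:
        totals[group] += 1
        hits[group] += int(pred == label)
    return {g: hits[g] / totals[g] for g in totals}

def accuracy_gap(records):
    """Largest pairwise accuracy difference across groups; 0.0 indicates parity."""
    accs = per_group_accuracy(records)
    return max(accs.values()) - min(accs.values())

# Toy example: group "A" gets one of two right, group "B" gets both right.
records = [(1, 1, "A"), (0, 1, "A"), (1, 1, "B"), (1, 1, "B")]
print(per_group_accuracy(records))  # {'A': 0.5, 'B': 1.0}
print(accuracy_gap(records))        # 0.5
```

In practice the same pattern extends to any per-group metric (false-positive rate, keypoint error, and so on); accuracy is used here only because it keeps the example short.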
Getting started with FHIBE requires understanding both the technical integration (loading the images and their demographic annotations) and the evaluation methodology (how to slice results by group).
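On the integration side, the core task is reading the dataset's annotations and bucketing images by a demographic attribute so each bucket can be evaluated separately. The sketch below assumes a hypothetical JSON schema (a list of `{"image": ..., "attributes": {...}}` entries); FHIBE's actual file format and attribute names are not specified in this page, so treat the field names as placeholders to adapt.

```python
import json

def load_annotations(path):
    """Load an annotation file assumed to be a JSON list of
    {"image": <filename>, "attributes": {<name>: <value>, ...}} entries.
    (Hypothetical schema -- adapt to the dataset's real format.)"""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

def group_by_attribute(entries, attribute):
    """Bucket image filenames by one demographic attribute for per-group evaluation."""
    buckets = {}
    for entry in entries:
        key = entry["attributes"].get(attribute, "unknown")
        buckets.setdefault(key, []).append(entry["image"])
    return buckets

# Toy usage with in-memory entries instead of a real annotation file:
entries = [
    {"image": "a.jpg", "attributes": {"region": "X"}},
    {"image": "b.jpg", "attributes": {"region": "Y"}},
    {"image": "c.jpg", "attributes": {}},  # missing attribute -> "unknown"
]
print(group_by_attribute(entries, "region"))
```

Once images are bucketed this way, each bucket is run through the model under test and the per-group metrics are compared, which is the evaluation methodology half of the workflow.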
Published: 2024
Jurisdiction: Global
Category: Datasets and benchmarks
Access: Public access