Sony AI
FHIBE breaks new ground as the first publicly available, consensually collected, and globally diverse fairness evaluation dataset specifically designed for human-centric computer vision tasks. Created by Sony AI in 2024, this dataset addresses a critical gap in AI development: the lack of ethically sourced, diverse data for testing fairness across different populations. Unlike many existing datasets that raise consent and representation concerns, FHIBE provides researchers and developers with a clean, comprehensive benchmark for evaluating whether their computer vision systems perform equitably across different demographic groups.
The dataset stands out in three key ways that address longstanding issues in fairness evaluation: it is publicly available, it was collected with participants' consent, and it draws on a globally diverse population.
FHIBE enables several critical fairness evaluation scenarios, most directly the comparison of how a computer vision model performs across demographic groups; a minimal example of such a disaggregated evaluation is sketched below.
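The following is a minimal sketch of what a disaggregated evaluation might look like. Everything in it is illustrative: the record fields (`group`, `correct`) and the toy data are assumptions made for demonstration, not FHIBE's actual annotation schema or results.

```python
from collections import defaultdict

def disaggregated_accuracy(records):
    """Compute per-group accuracy and the largest accuracy gap.

    `records` is an iterable of dicts with illustrative keys:
      - "group":   a demographic attribute value for the image subject
      - "correct": whether the model's prediction was correct (bool)
    These field names are placeholders, not FHIBE's published schema.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        hits[r["group"]] += int(r["correct"])

    per_group = {g: hits[g] / totals[g] for g in totals}
    gap = max(per_group.values()) - min(per_group.values())
    return per_group, gap

# Toy example (not real FHIBE data):
records = [
    {"group": "A", "correct": True},
    {"group": "A", "correct": False},
    {"group": "B", "correct": True},
    {"group": "B", "correct": True},
]
per_group, gap = disaggregated_accuracy(records)
print(per_group)  # {'A': 0.5, 'B': 1.0}
print(gap)        # 0.5
```

Reporting both the per-group scores and the worst-case gap, rather than a single aggregate number, is what makes the evaluation a fairness check rather than an ordinary accuracy benchmark.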
Getting started with FHIBE requires understanding both the technical integration and the evaluation methodology; one possible workflow is sketched after this paragraph.
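The sketch below shows one way such a workflow could be wired together, assuming annotations ship as a CSV of image paths plus self-reported attributes and that the model under test is wrapped in a user-supplied `predict` callable. The file name, column names, and helper names are hypothetical; consult FHIBE's official documentation for the real data format and access terms.

```python
import csv

def load_annotations(csv_path):
    """Load a FHIBE-style annotation file.

    Hypothetically assumes a CSV with one row per image containing an
    image path plus self-reported demographic attributes; the column
    names used here ("image_path", "age_group", ...) are illustrative,
    not FHIBE's published schema.
    """
    with open(csv_path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def evaluate(annotations, predict, attribute):
    """Run a model over the dataset and collect per-group results.

    `predict` is any callable mapping an image path to a boolean
    "prediction correct" flag for the task under test (e.g. whether a
    face detector found the annotated face).
    """
    records = []
    for row in annotations:
        records.append({
            "group": row[attribute],
            "correct": predict(row["image_path"]),
        })
    return records

# Usage sketch (path, column name, and model wrapper are placeholders):
# annotations = load_annotations("fhibe_annotations.csv")
# records = evaluate(annotations, predict=my_face_detector_check,
#                    attribute="age_group")
# per_group, gap = disaggregated_accuracy(records)  # from the sketch above
```

Keeping the model wrapped in a single `predict` callable keeps the evaluation loop task-agnostic, so the same disaggregation logic can be reused for face detection, pose estimation, or other human-centric tasks.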
Published
2024
Jurisdiction
Global
Category
Datasets and Benchmarks
Access
Public access