Sony AI
FHIBE breaks new ground as the first publicly available, consensually collected, and globally diverse fairness evaluation dataset specifically designed for human-centric computer vision tasks. Created by Sony AI in 2024, this dataset addresses a critical gap in AI development: the lack of ethically sourced, diverse data for testing fairness across different populations. Unlike many existing datasets that raise consent and representation concerns, FHIBE provides researchers and developers with a clean, comprehensive benchmark for evaluating whether their computer vision systems perform equitably across different demographic groups.
The dataset stands out in three key ways that address longstanding issues in fairness evaluation: it is publicly available, it was collected with the consent of the people depicted, and its subjects are globally diverse.
FHIBE enables several critical fairness evaluation scenarios for human-centric computer vision systems.
Getting started with FHIBE requires understanding both the technical integration and the evaluation methodology.
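In practice, the evaluation methodology amounts to disaggregating a model's metric by annotated demographic group and comparing the results across groups. A minimal sketch, assuming predictions have already been scored against ground truth (the record format and function names here are illustrative, not FHIBE's actual schema or API):

```python
from collections import defaultdict

def disaggregated_accuracy(records):
    """Compute per-group accuracy from (group_label, is_correct) pairs.

    `records` stands in for model outputs joined with demographic
    annotations such as those FHIBE provides; the flat tuple format
    is a simplification for illustration.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        hits[group] += int(correct)
    return {group: hits[group] / totals[group] for group in totals}

def accuracy_gap(per_group):
    """Largest pairwise accuracy difference across groups.

    A gap near zero suggests the model performs equitably across
    the annotated groups; a large gap flags a fairness concern.
    """
    values = per_group.values()
    return max(values) - min(values)

# Example: a model that is right 2/3 of the time on group "A"
# but only 1/2 of the time on group "B".
records = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False)]
per_group = disaggregated_accuracy(records)
gap = accuracy_gap(per_group)
```

The same pattern generalizes to any scored metric (precision, landmark error, verification rate): compute it per annotated group, then report the disparity rather than a single aggregate number.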
Published
2024
Jurisdiction
Global
Category
Datasets and Benchmarks
Access
Public Access