Black Women's Health Imperative
Changing Black women's lives through advocacy and public policy, health education, research, and leadership development, the Black Women's Health Imperative works to advance health equity and social justice for Black women across the lifespan.