While we continue to “lean in” and fight for gender parity at work, it’s important to recognize the position of privilege American women have in the marketplace, simply by virtue of living in this country.
Women in the U.S. are rising in the workforce at every level. They’re starting businesses and working their way up to executive positions. As a result, according to the Council for Advancement and Support of Education, women control more wealth in the United States today than ever before.
To state the obvious: We hold immense financial resources in our hands, particularly in comparison to women in the developing world. Our wealth gives us more chances to give, to invest in fellow mothers, wives, and sisters by bringing them the opportunity to work and earn.
"I’m not sure Western women understand the power of restored dignity through work,” wrote Christian author, Jen Hatmaker. "We often disparage work, a luxury of the already ...1