How Women Have Changed the College-Educated Workforce

Once upon a time, a young, vibrant woman's lot in life was simply to stay home. After World War II, in the 1950s, traditional gender roles prevailed: a woman's primary role was to care for her children and her husband and to maintain a neat home.


There were always exceptions to the rule, of course, such as the occasional single woman working as a secretary, fetching coffee, filing, answering phones, and typing for her male boss.

That progressively began to change, however. The Women's Movement of the 1960s and '70s shook up everything that had been considered normal and standard.

Then came 1980, when for the first time women and men enrolled in college in equal numbers; the following year, more women than men earned degrees.

With more women earning college degrees than men, does this change the workforce? Absolutely, it changes the workforce dynamic.

Among women of the Silent Generation (born in 1945 or earlier), the majority, 58%, were not actively participating in the labor force during their college-age years.

Then came the Baby Boomers, for whom that figure dropped dramatically: only 29% were not in the labor force.

Going from 58% down to 29% of women not working represents a 50% drop in non-participation, or, looked at the other way, a rise in participation from 42% to 71%.

Those numbers have remained consistent among Millennials: today, 71% of young Millennial women are actively employed.

The increase of women in the labor force has had positive implications: women joining the workforce pushes wages up and brings diversity to leadership roles.

Working women help push wages up by making their communities and cities more productive.

Wages can rise for many reasons: women may replace less productive workers, and college-educated women may offer unique skills that increase competition.

College-educated women also bring diversity to the workplace, even more so when they hold senior-level leadership roles.

A woman in a leadership position brings a unique perspective that differs from a man's. Studies show that women can bring empathy and intuition to leadership roles, which has been shown to have a positive impact on workplace culture.

The role of women in the labor force, especially in leadership roles, will likely always be open to debate, speculation, and research.

The topic becomes even more heated when the discussion turns to working mothers. Ultimately, anything that increases one's self-worth will always be a benefit, and a woman obtaining a college education does exactly that.

You can click here to see all the positive and impactful ways a college degree brightens your future. Remember, even though getting your education may take a few years, its positive effects can last you a lifetime.
