Updated: Aug 27
Let me start off by saying that the title of this post is clearly snarky, but it's not what you think. The always great Wellable blog recently posted the results of a study done at The Well-being Lab at George Mason University that shared the following:
"wellness benefits alone cannot meaningfully change the health of employees" and...
"...leadership and culture have an incredible impact on well-being in the workplace."
Finally, "only 14% of respondents felt their manager regularly shows care, compassion, and appreciation for them."
Company culture is a topic I'm passionate about. Read that last bullet again--"only 14% of respondents felt their manager regularly shows care, compassion, and appreciation for them." That is an alarmingly low number, yet I'm not that surprised. It was my own poor experiences working for toxic companies that ultimately led me to forge my own path and launch my own company.
Executives set the tone and culture for their organization, but direct managers are the ones employees interact with most frequently. This is true whether you work at a Fortune 500 company or an organization with fewer than 50 employees. So why does this happen? I have a few thoughts:
Executives' behaviors don't mesh with a company's culture. I've said it before and I'll say it again--leadership sets the tone for a company's culture. The Marketing or Communications teams can come up with Company Values, but if leadership doesn't embody and model those values, it's a waste of time.
Benefits are offered because they're something "that has to be done." Benefits provided just to check off a box, such as wellness programs or tuition reimbursement, don't help if employees are discouraged from using them, or worse, punished for using them.
Managers aren't properly trained to manage. How many managers are promoted to their position but don't receive any formal training on how to manage people? In my experience, management training is a rarity. Managing people can be intuitive, but more often than not, people need to learn how to manage others. Companies should give people the skills to manage well and empower managers to be advocates for their teams.
Poor managers need to be held accountable. A bad manager can make your life hell. It's even worse if it's been documented that somebody is a bad manager but the company doesn't address it. This merely perpetuates toxic behavior and drives good people out of the company.
Let me climb down from my soapbox (I warned you this was a topic I'm passionate about). We spend enough time at work that it should be as enjoyable as it can be. To be clear, not all of my work experiences were bad. I've had some great bosses that I'd work for again in a heartbeat. Unfortunately, the list isn't very long.
What do you think of my reasons? Any that you would add?