
More Women In Healthcare Leadership Roles Could Improve Wellness
For years, women have played a prominent role in the healthcare profession. The vast majority of nurses and supporting medical staff in the US are women. Yet despite working in a predominantly female industry, relatively few women hold senior leadership roles. Data suggest that women account for fewer than 12 percent of CEOs at Fortune 500 […]