Monday, November 21, 2011

Gender

Traditionally, women in the medical field have been the nurses, while men have tended to be the doctors. Recently, though, I met a new friend who told me he aspires to be a nurse and is currently enrolled in the nursing program at Baylor. His career choice made me think about how gender roles are changing, especially now that men and women are entering both professions more equally.
