The Wall Street Journal, the cultural zeitgeist of urban metropolises and people who talk loudly on Starbucks lines, has been informed by the US Census that more women are working in awesome jobs that used to be occupied by men.
Women account for a third of the nation's lawyers and doctors, a major shift from a generation ago when those professions were occupied almost exclusively by men, new Census figures show.
Women's share of jobs in the legal and medical fields climbed during the past decade even as their share of the overall workforce stalled at slightly less than half. Women held 33.4% of legal jobs—including lawyers, judges, magistrates and other judicial workers—in 2010, up from 29.2% in 2000. The share of female physicians and surgeons increased to 32.4% from 26.8% during that time.
In 1970, women were 9.7% of the nation's doctors and just 4.9% of its lawyers, according to Census data.
But are there reasons behind this shift? Of course there are. Here are some highly factual (that's a thing people on the internet say to let you know that what follows will actually not be factual) theories as to how we let "THEM" take our jerbs.
A. Girls watched Grey's Anatomy, and then were very annoying during the morning hours of high school the next day.
B. Guys found some of these girls attractive.
C. Guys and girls went to a party, and the cool ones who "drank" hooked up.
D. Guys had to go on a date so they could "go far," and so girls could tell all the other girls "OMG what happened."
E. When they got back from Friendly's, they spent some time "downstairs."
F. The girl panicked and wanted to watch something to set the mood, so she flipped on Grey's Anatomy.
G. From that moment on, everything that went down that night was physically and emotionally scarring.
H. Therefore, fuck ever being a doctor.
Or, the actual reasons:

A. We're not as racist or sexist as we used to be.
B. Girls make flash cards when they study.