Saturday, July 20, 2013

"Feminism" Defined: A Detailed Look at One of the Most Controversial Labels Today

According to the dictionary, feminism is defined as: 1) the theory of the political, economic, and social equality of the sexes; 2) organized activity on behalf of women's rights and interests.

Read the entire article here.
