Society and its views on gender

I know there are a lot of people who think women are the weaker sex and that men should be in charge, but I'm not one of them.
I was brought up in a family where equality between men and women is vital. However, we recently started studying this topic in my R.S. class (or R.E., depending on where you're from), and I found that a lot of the boys in my class think men should "rule the world" and that women aren't as strong as men. That's only their view, and I'm not criticising it, but I was wondering about everyone else's views, as we need to do a case study on whether men and women should be equal in the eyes of society.
I would love to hear your takes on this topic.
(Please no rude or racist comments!)