No one should be told that they can’t love someone, whether that person is of another race, another religion, or the same sex. I believe that being gay or lesbian is not a sin, but even if it were, what makes that sin different from any other? Why must society tell lesbians and gays that they are wrong for being who they are when God made them that way? Homosexual people should have equal rights and should be able to get married and adopt kids like everyone else. Loving someone of the same sex does not hurt anyone in any way. People in America like to go off of what the Bible says, but the Bible does not actually say that homosexuality is a sin. And there are many sins the Bible does name that Americans commit every day. So why is one sin worse than another?
Just like others in the class, I enjoyed The Invisible War the most. It caught my attention. It is very wrong that women in the military have to deal with sexual assault, and it is even worse that the military does nothing for these women! These women fight for their country just like the men do. They do not deserve to be sexually assaulted while doing their job. They should be worried about defending this country, not defending themselves from their military coworkers. These women should be able to expect military officials to do the right thing and punish their assailants. Why would any woman want to defend a country that doesn’t want to defend her?
It hurt my heart to see these women at their weakest, wanting to commit suicide because of this issue. No one should feel that way. They should get justice, but for some reason military officials like to hide sexual assault. It is so wrong that it truthfully changes my view of the military.
In class, we started talking about how women are treated as sex objects in our culture. Everything being sold comes with a “sexy” woman attached, whether it’s food, clothes, or electronics. Even though some women say they do not like being treated as sex objects, they tend to get caught up in some of the “sexy” trends in our culture: a big butt, big boobs, a skinny waist but big everything else, or more revealing clothing. Women may think they are just making themselves happy or feeling pretty, but what they don’t notice is that they are becoming a part of a trend that America thinks is beautiful. Women, little girls, and teens need to realize that they are beautiful just the way they are. They do not need a trend to define their beauty. The definition of sexy should change. It shouldn’t be big butts and big boobs; it should be being comfortable in your own skin. It should be confidence. Women shouldn’t feel like they have to be sexy all the time to get acceptance from men and society.
I just think that a lot of things about how we define beauty and exploit women need to change.