This is a question I have been hella wondering about recently. Why is the media trying so hard to turn American men homosexual? Why do they want men to be homosexual? They want white men to stop having babies and to start having sex with other men, and I just don't understand it. They keep showing homosexual men on TV, in movies, on the internet, and in marketing. It is simply hella confusing to me.