I would just like to mention how professional sports leagues like the NFL, the NHL, Major League Baseball, and the NBA seem to promote prejudice, bigotry, and hatred.
I saw an interview the other day on WHDH Channel 7 News in Boston with a Patriots fan. She said she hates New Yorkers. What a thing to say. All this hatred just because a football team is playing against "your" team.
When will people realize that professional sports are not that important?