Personally, I've never thought that being an American was necessarily something to be "proud" of. Did I earn my citizenship? Was it something I worked to achieve? I've always thought that pride should stem from one's accomplishments, not mere chance. Someone who says "Proud to be an American" (or a citizen of any other country) strikes me a bit like an NBA star saying he's proud to be tall. In my book, nationalism is a modern form of incest: it only further divides people into factions, breeding violence and aggression. So, is patriotism a good thing, as we've been raised to believe, or does it need to be eradicated for the collective human good?