Question: How is American culture better today than people think? I've heard lots of depressing claims about the abysmal state of American culture lately, particularly since Obama won the election. You've disputed that, arguing that America is better in its fundamentals than many people think. What are some of those overlooked but positive American values? How can they be leveraged for cultural and political change?