American Kindness

I was just thinking about how genuinely kind Americans are.

You can go to France and see everyone being smug and angry, or go to Japan and see everyone being fake-nice out of a sense of social duty; but when you come to the United States (especially the Midwest), people look out for their fellow human beings because they want to and because it's the right thing to do. I think we should give more respect and appreciation to our social and moral economy.

Thanks for listening.

Edit: Do not conflate a nation's government with its people, and yes, sometimes hard decisions need to be made; collateral damage happens with every country.