Quite the opposite. My education skipped over all the bad stuff about the U.S. It made Christopher Columbus and the conquistadors seem like Indiana Jones. We never discussed Japanese internment. Slavery and Jim Crow were only mentioned so that we could learn how great we are for getting rid of them. My education was so focused on brainwashing me into unquestioning loyalty to this country that, when I grew up and started learning the truth, how could I be anything but disappointed in it?