And you likely never will. The trend is away from anything that represents America. Kids are taught that America has warts, that it occupies stolen ground, that it was founded on racism, and that it exploits others for the sake of a wealthy few. Kids are not taught that America is about principles of government of, by, and for the people. Instead they are taught that America must be emotionally satisfying.
The confirmation hearings of Ketanji Brown Jackson were a perfect illustration of this. Biden and millions of others said that appointing a black woman to the Supreme Court helps America fulfill the ideals of the Declaration of Independence, and they wildly applauded her confirmation. Yet that same appointee would not commit to the idea of inalienable rights.
So, which matters more to understanding what being an American means? Applauding the appointment of a black woman? Or understanding and believing in the idea of inalienable rights?