In elementary school they taught me that Native Americans loved the European settlers and welcomed them with open arms, celebrating the exchange of culture and ideas.
In high school they went, "Oh, by the way, that was a lie. Here's what really happened."
If everything we tell kids about history is sanitized and whitewashed to avoid admitting that things weren't always rosy, then why bother teaching them history at all?
Why not wait until we're willing to teach them the real history? Wouldn't it be easier to teach them what actually happened than to have to explain later which parts of what we taught them were real and which parts were just us making ourselves look good?