You know, something I've noticed: the annexation of Hawaii, and the fact that the US overthrew a country to do it, is really glossed over in schools, while they DO tend to mention the slaughter, deliberate infection with horrible diseases, and herding of the American Indians into tiny reservations. What happened in Hawaii was no less bad, but I don't remember school mentioning ANYTHING about it when I was younger.
I don't think it was even mentioned in my history classes, and I had some pretty liberal high school & college teachers. It's usually the victors' history that makes it into the history books and our classrooms.
I don't really remember it being mentioned in classes on the mainland either, and if it was, it was kind of like, "Oh, and Hawaii became the 50th state and blah blah," as though the islands just kind of materialised out of nowhere and became a state one day =p