Of course, in US schools the genocide of the indigenous inhabitants is usually whitewashed; the curriculum sort of leaves you with the impression that North America was some vast, sparsely populated land and that the white folks were just looking for some "elbow room". As for the European colonial period elsewhere, that's usually just colored blobs on a map. I'm curious how this is taught in European classrooms. Is there any reflection at all on how evil it was?
When I went to school in Denmark in the 1990s, the country's colonial history wasn't taught at all. We didn't learn anything about colonialism by other nations either, since we simply didn't cover non-Danish history. At best we learned that we used to have some islands in the West Indies that were sold to the US; oh, and there were some slaves there, but they were freed earlier than in many other places, so we're the good guys. We also learned that Greenland and the Faroe Islands were part of the Danish Realm, but never why.
It is my impression that things are better today, but not by much. There has been some begrudging recognition of Danish complicity in colonialism among certain parts of society, and that has also filtered into education. The general popular attitude, however, remains a grumpy, anti-woke unwillingness to even discuss the issue, let alone assume moral responsibility as a nation.