I know in my country (the USA) you were portrayed as the enemy and controlling. So I'm just curious what your history classes teach about that o-o or if they cover it at all. I wanna know what the other side says lol.
Honestly, it's been 11 years since I left school, and I don't remember being taught about the colonisation of the Americas at all. However, I didn't take history for my final two years.
It's largely viewed as a side note to more major European events and eras, such as the Enlightenment, the French Revolution, and the Napoleonic Wars, and often it isn't mentioned at all.
The colonies themselves, however, tend to be ignored or treated as just one of the many colonies Britain held during that period. The American Revolution is considered more significant mainly because it was a byproduct of the Enlightenment and fed into more major European events like the French Revolution, which ultimately led to the Napoleonic Wars. So the colonies are largely ignored or just a side note, lol.