(The North American one, not the European one).
Well?
I got taught one version that went pretty much like this:
The Americans took advantage of Britain's engagement against Napoleon in Europe by invading Canada. The Canadian militia, along with their Native allies and what few British regulars weren't being used elsewhere (e.g. fighting Napoleon), repelled that invasion so that the US gained no territory at all. Since they preserved all of Canada's territory against an invader, the Canadians, British, and their Native allies (but mostly the Canadians) won the War of 1812.
In the past few years, though, I've heard from my American friends that they got taught a different version:
The British continued to violate American sovereignty after the American Revolution, until British acts against the new country became so heinous that there was no choice but to go to war to defend American rights. In the end, the British agreed to respect American sovereignty, so the Americans won the War of 1812.
Lately, though, I've started to take a more nuanced view. I think that things actually shook out like this:
- Canada won. They fought off an invader.
- The United States won. They got what they were fighting for: British respect for the terms of the Treaty of Paris.
- Britain lost. They gained nothing and gave up what the US wanted.
- The Natives lost. They were instrumental in Canada's victory, but their own war continued after the War of 1812. They were decimated by the Americans, and their former allies turned their backs and refused to help.
Thoughts? Do you agree? Disagree?