Throughout the 19th century, Americans generally had a much better understanding of their anti-colonial origins than they do today. Even though the last official war between Britain and the USA occurred between 1812 and 1815, Britain's failure to destroy the United States militarily led British foreign policy to refocus its efforts on undermining […]