This alternate history related article is a stub. You can help by expanding it.

Britain is an isle north of France that was the heart of the British Empire before its annexation by the Imperial States of America. The British Isles have a long and varied history that appeared to culminate in the British Empire's rule over a third of the planet by the beginning of the 20th century. When Britain entered World War I against the German Empire, it marked the start of the Empire's eventual fall. Although Britain won the war, it lost much of its manpower and resources. The Great Depression worsened Britain's economic plight, leading to the rise of radical, communist-sympathizing labor movements that worked to promote revolution. When the United Kingdom entered World War II in 1939 against the German People's Republic, these movements were marginalized as the nation rallied around Winston Churchill in one of Britain's finest moments. Though Germany was defeated again, the British Empire neared collapse, and the United States took much of Britain's colonies and prestige, even as the British aided them abroad. When Britain entered World War III in 1953, the nation suffered even more than in the previous two world wars. The United Kingdom withdrew from her colonies and focused internally. The elected government soon collapsed, followed by a bitter, three-way civil war from which the National Socialists emerged victorious. Refusing to allow a hostile government to control Britain, the Imperial States of America invaded and annexed the isle in 1964, making the United Kingdom an American state, an ironic fate given that America had once been part of the British Empire prior to the American Revolution. Britain continues to be a key asset in what remains of the American Empire.
Decline of an Empire
Britain emerged from World War I heavily scarred, with nearly a third of its young men killed in action. The United Kingdom had fallen heavily into debt, which was partially repaid by the sale of its American territories to the U.S. in 1920. The United Kingdom appeared to be recovering from the war until the onset of the Great Depression.
In the early 1930s, the unemployment rate increased drastically, and economic conditions generally declined.