American Colonies

America has a remarkable history, but it is full of dark and ugly chapters. Remember when America broke free of Great Britain because it didn't want to be a colony anymore? Well, it seems American leaders forgot, because America started colonizing in the late 1800s. After the U.S. won the Spanish-American War, it took over Spain's territories in the Pacific as well as islands closer to home, like Cuba and Puerto Rico.

America made the Philippines an official territory and was reluctant to give it up until after Japan invaded and occupied the islands in WWII. After Japan surrendered, America finally let go and allowed the Philippines to become a full, sovereign nation in 1946. Yay, the Philippines became its own nation. But wait, there's more.

Remember Cuba? Yeah, that didn't turn out too well. The U.S. backed Batista, a cruel dictator who was aligned with American interests. The people, however, hated him, and before long the Cuban Revolution had taken place. This is why, today, we have a communist Cuba that sucks. They do have pretty cars, though.

Puerto Rico remains a territory of the U.S. today. Many people want it to become a state, while many don't. Puerto Ricans are U.S. citizens who can travel freely to the mainland, but the island has little power over federal legislation and policy: its residents can't vote in presidential elections and have no voting representation in Congress. On the other hand, if it became a state, residents would have to pay full federal income taxes and all of that fun stuff, which may not really be worth it. I mean, they already have the protection of the U.S. military, so why change things?

