What is "The West"?
We hear about "Western involvement", "Western values", and "Western interests" in the media. People say that the West is the best, or that the West is in decline. Some country is either Westernising or hates the Western way of life.

The West is the set of countries with democracies and free markets, right? Or the countries that are part of Western Civilisation? Then what about Latin America: is it Western? What does "The West" even mean, and what exactly is Western Civilisation?

