I only just found out that some people consider Japan a western country because of its democratic and capitalistic system, with separation of church and state, sound institutions that include due process and the presumption of innocence, freedom of the press, etc. I find this interesting because there was once a time when "the west" meant the English-speaking world, i.e. Britain and Ireland, the US, Australia, Canada and New Zealand. Then I began to notice that the west also meant western Europe. Now, seemingly, the west includes Japan. Weird, huh? Yet I've heard little mention of South Korea, Taiwan, Hong Kong or Singapore. This is very much reminiscent of the signing of the Treaty of Versailles, where Japan was the only non-western power present with any real influence.