I don’t really wonder why the West gets a bad reputation. The West has dominated the world for the last 500 years and has spent the last hundred years or so criticizing itself, which means criticism of Western society is widely available.
Other parts of the world have not been as keen on self-criticism. In non-democracies, it has often been more politically expedient to blame outsiders such as the West for their problems.
The West certainly played a major role, but the stories tend to be heavily skewed in an anti-Western direction.
You rarely hear about good things done by the West, because anyone writing about the good parts is afraid of looking like they condone Western imperialism or colonialism.
Thus the emergence of a very negative perception of the West is natural. Yet that does not mean it represents the objective truth, whatever that may mean.
You see much more objective analysis of old empires such as the Roman, Chinese, Mongol, Mughal, Ottoman, Russian, Persian, or Aztec empires, even though in many regards they were much worse than Western empires.