Most Americans realize the mainstream media are biased. According to recent polls, Americans’ trust in mass media has hit all-time lows.
So, why have the mainstream media abandoned all pretense of political neutrality in favor of pushing lies?
Well, it could certainly have something to do with the fact that the vast majority of those in the mainstream media (along with those in high places in academia, Hollywood, and corporate America) unashamedly hold leftist views.
This is not a hypothesis. This is a fact.
More HERE