White men need to see reality
By Tom Ehrich
Many white American men are upset, indeed raging. I get that. And I can see why.
Their nation is becoming more and more diverse. Within their lifetimes, whites could become a minority. Much that once belonged solely to white men now must be shared with many. They have been poorly prepared to deal with these sea changes.
Many were taught that whites are inherently better, that white skin is their ticket to the American Dream. They have been encouraged to see non-whites and women who outperform them as having been unfairly boosted by affirmative action and government-led conspiracies.
When they look at chronic unemployment and underemployment, they are encouraged to blame women and dark-skinned minorities for taking their jobs. They don't think to blame Wal-Mart for costing America 400,000 domestic jobs by buying merchandise overseas. They don't think to blame companies for hiding US-earned revenues overseas to avoid paying their fair share of US taxes. They don't think to blame US steelmakers and automakers for being so blinded by anti-union attitudes and so fearful of risk that they stopped innovating and gave industrial jobs away to Japan and Germany.
They see something of the ultra-privileged and ultra-lavish lives that a few whites enjoy. But they don't recognize how the 1% got there: by allowing public schools in their neighborhoods to languish, by turning higher education into havens for the entitled set, by minimizing their own tax burden and shifting it to the middle class, by paying huge salaries to corporate executives whose only game plan is to cut jobs and to prey on the vulnerable, and by setting professional fee structures that reward financial and legal professionals for doing little more than sitting beside the stream of cash wielding large buckets.
White men see relentless social turmoil as women, blacks, and immigrants push through glass ceilings at the top and assume a place at every classroom and table below the top. White men mistakenly believe that the old order benefiting them was a better order, when in fact it was an artificial system designed by commerce to sell product and reinforced by religion to protect religion's franchise.
White men see fewer of their faces in public life. They see more television shows starring women, more ads featuring blacks, more public acclaim for women and people of color in business and technology, more women in political office, and more professions like dentistry, veterinary medicine, general medicine, and education dominated by women. If they were taught, as many were, that women are inferior, then either American life is falling apart with all these "others" running things, or they themselves must be even more inferior.
Many white men were taught to believe in a world that was largely delusional, not normative. That world ended long ago. Their sense of loss is being fed by the very people who cut their jobs, tanked their economy, made off with the wealth, and now lord it over them, dismissing them as "losers."
Right-wing politicians are telling white men to go after immigrants, Muslims, uppity blacks, uppity women, and government. None of these had anything to do with the hard times that white men are experiencing.
They aren't being encouraged to see the truth: their allies aren't rich people or rich politicians. Their allies are other people trying to get ahead. And their benefactors won't be demagogues demonizing fellow citizens or promising simple macho solutions. They need a growing economy, better education for all, fair taxation, a responsive government, religion grounded in what God actually wants, and fewer predators allowed free rein to swindle them.