American attitudes toward socialism have shifted significantly since our nation fought the National Socialists of Germany in World War II.
For one thing, decades of indoctrination in the U.S. “education” system have led millions of Americans to believe Nazis weren’t even left-wing socialists and that, instead, they were “right wing.” This type of confusion about what socialism even means leads to a more favorable opinion of it.
(snip)
But, again, at a deeper level, there is rampant ignorance about the term. Most people, particularly young people, have no idea what socialism actually is, or they associate it with the European version that includes socialized medicine and social services nested in free-enterprise economies.
https://patriotpost.us/articles/63169-m ... ntent=body