How and when did the news media, the entertainment industry, academia, and science gain a liberal bias? It doesn't make sense that corporate types and celebrities would support a political agenda that considers their wealth, products, content, materialism, power, and influence "decadent". They've sold out to an ideology that, if it were fully in control, would send them to the gulag.
America's Founders had a liberal bias. But they also believed in capitalism, which is why they supported slavery and militaristic expansion. Over the following century, the political spectrum split along that fault line: liberal capitalists supported liberty, equality, organized labor, fairness, and social justice, while conservative capitalists supported slavery, imperialism, child labor, militarism, and organized crime.
The Depression made conservative capitalism extremely unpopular, so most people supported FDR's liberal New Deal, which got us out of the Depression. FDR's wise leadership then carried America through WW2, and it emerged from the war as a great superpower. Liberalism succeeded as a political ideal, but it got bogged down in the geopolitics of the Cold War: liberals were divided over the Vietnam War, which many of them saw as an example of U.S. imperialism.