“We didn’t build this business—somebody else did.”
So reads a sign outside a small roadside craft store in Utah. The message is clearly tongue-in-cheek. But if it hung next to the corporate offices of some of our nation’s big financial institutions or auto makers, there would be no irony in the message at all.
It shouldn’t surprise us that the role of American business is increasingly vilified or viewed with skepticism. In a Rasmussen poll conducted this year, 68% of voters said they “believe government and big business work together against the rest of us.”
Businesses have failed to make the case that government policy—not business greed—has caused many of our current problems. To understand the dreadful condition of our economy, look no further than mandates such as the Fannie Mae and Freddie Mac “affordable housing” quotas, directives such as the Community Reinvestment Act, and the Federal Reserve’s artificial, below-market interest-rate policy.
Far too many businesses have been all too eager to lobby for maintaining and increasing subsidies and mandates paid by taxpayers and consumers. This growing partnership between business and government is a destructive force, undermining not just our economy and our political system, but the very foundations of our culture.
With partisan rhetoric on the rise this election season, it’s important to remind ourselves of what the role of business in a free society really is—and even more important, what it is not.