The AI Optimism Gap: Executives Are All In, The Public Isn’t So Sure

According to Forbes, a new report from Just Capital, based on surveys conducted with The Harris Poll and others, reveals a stark optimism gap around AI. The survey of over 2,000 U.S. adults, 111 executives, and 98 investors found that while 93% of executives and 80% of investors believe AI will be a net positive for society in five years, only 58% of the general public agrees. Majorities across all groups want AI content watermarked and support protecting creators' IP. However, executives plan to spend just 1-5% of AI investment on safety, while the public and investors want more than 5%. And when allocating AI profits, executives prioritize R&D (30%) and shareholder returns (28%) over worker training (17%).

The C-Suite Bubble

Here’s the thing: those numbers are wild. When 93% of any group agrees on something, especially something as complex and disruptive as AI, it’s worth asking what bubble they’re living in. Executives are seeing the potential for massive efficiency gains, cost savings, and new products. Their view is from the top, looking down at spreadsheets and market opportunities. The public’s view, at 58% positive, is from the ground, looking up at potential job displacement, weird chatbot hallucinations, and an internet filling with AI-generated slop. It’s not that one side is right and the other is wrong. It’s that they’re having two completely different conversations based on completely different sets of risks and rewards.

Where The Money Isn’t Going

The most telling split is on safety spending. Execs say they’ll put 1-5% of their AI budget toward it. The public and investors want that number above 5%. Now, think about that. In any tech rollout, especially one this powerful, safety and ethics are often treated as a compliance checkbox, not a core engineering priority. Allocating a single-digit percentage basically confirms that fear. It says, “We’ll do the bare minimum to avoid PR disasters.” And then there’s the worker training bit. Only 17% of execs prioritized using AI profits to retrain their own people? That’s a stunning admission. Their plan is largely to hand gains to shareholders and plow money back into more AI R&D. So the very technology that might displace roles gets funded, while the humans who might be displaced get a footnote. No wonder the public is skeptical.

The Unexpected Consensus

But it’s not all division. The widespread agreement on watermarking AI content is a huge deal. When 78% of the public and 86% of execs want the same thing, that’s a clear signal to policymakers. The internet is becoming unmanageable because we can’t tell what’s real. Everyone, from corporate PR teams to everyday social media users, sees the need for some basic labeling. The consensus on compensating local communities for data center energy use is another fascinating area of agreement. People are directly connecting the AI hype to their power bills and environmental concerns, and they’re saying companies should foot that bill. That’s a pragmatic, bottom-line demand that cuts through the usual tech idealism.

The Road Ahead Is Rocky

Basically, this report paints a picture of a technology barreling ahead, driven by incredibly optimistic leaders, while the people it’s supposed to serve are tapping the brakes hard. The execs’ focus on shareholders over workers, and their minimal safety budgets, validate the public’s core fears. You can read the full report for more details. And look, I get the executive optimism. The potential is real. But if you’re wondering why there’s so much pushback, so much regulation talk, and so much public anxiety, this survey is your answer. The leadership class isn’t just on a different page; they’re reading a different book. Bridging that gap will require more than just promises. It’ll require actually spending money on safety and people, not just algorithms and shareholders.
