Workers Are Afraid AI Will Take Their Jobs. They're Missing the Bigger Danger. -- Journal Report

Dow Jones
Feb 16

By Matthew Call

Walk into any corporate office, and you'll hear the same anxious conversation: Will AI eliminate white-collar jobs?

The optimists insist that new jobs will emerge to replace the ones we lose -- after all, it has happened in previous tech revolutions. Pragmatists argue the workforce will simply become more productive with artificial intelligence, creating more value with minimal job cuts. And the pessimists fear entry-level knowledge workers will become obsolete altogether.

But this debate misses a crucial dynamic. Right now, workers are potentially training AI how to make them obsolete. And they often don't realize it.

The kind of AI used by companies, called an enterprise AI system, can capture everything you do at work and use that information to train itself. These systems can record your interactions within the platform -- the prompts you write, the documents you create, the queries you run.

In other words, the company can potentially track -- and claim ownership of -- every keystroke you make within the system, every idea you document there, every tool you build using that platform. It can identify what approaches worked best, what email language got responses and how you approached those clients. And all that knowledge can become part of the company AI, so it may eventually know, down to increasingly fine details, how you do your job.

Then comes the dangerous part for employees: The AI can pass that information along to anybody else who does your job, or in some cases just do the job itself. Over time, you could become a lot less valuable to your employer -- and a lot more replaceable.

This dynamic may fundamentally change the relationship between employer and employee. The stakes are so high and so urgent that both sides are rushing to position (or protect) themselves. Executives are rapidly implementing enterprise AI systems, seeking productivity gains and competitive advantage -- and they often aren't disclosing the implications for job security and privacy. Meanwhile, at least some employees are secretly adopting personal AI tools, sometimes violating corporate policies, so that their employers can't capture everything they know and do.

Capturing the essence

To understand what's coming, you need to understand what enterprise AI systems actually are. These are different from the interfaces you use at home. Enterprise AI systems are platforms that integrate directly into corporate workflows -- think of Microsoft Copilot embedded in Word, Excel and Outlook, or Salesforce's Einstein AI woven into customer-relationship management. These systems sit inside the tools where you already work. And they can potentially capture much of your work within the platform, learning from many interactions, and embedding that knowledge into company-owned infrastructure.

What once lived only in employees' heads, built through years of experience and hard-won expertise, is increasingly being institutionalized in real time. When you leave, at least some of your knowledge stays behind, embedded in systems that will be used by the AI and by your replacement (if a replacement is needed at all).

Imagine that you're a senior software engineer debugging a system crash. You run a bunch of tests to figure out the problem, and when you discover the solution isn't in the documentation, you develop a novel workaround. You share the solution with the company, obviously, but the expertise and techniques that you brought to the problem were all yours, in a fundamental way. You figured out the workaround because of what you know and how you work.

That is the way things used to be, anyway. When you do your work through enterprise AI, though, the system doesn't just record your solution. It can capture your problem-solving approach: which questions you asked first, how you refined the search when initial attempts failed, potentially even the logical steps that led you from symptom to cause. The next time junior engineers face a similar crash, the system may be able to guide them through elements of the methodology you used.

You haven't lost your expertise. But now the employer also has access to key aspects of that expertise, in a form it controls and can deploy to other employees without you. It has a partial blueprint for how you think, and some of the knowledge that once made you indispensable is now a reproducible company asset.

Making it personal

These revolutionary changes seem to put workers in a tight spot. But I believe employees have an alternative -- one that isn't easy, but could help move the power dynamic back in their favor. Specifically: Employees should consider avoiding their company AI systems when possible and use personal AI tools like ChatGPT, Claude, Gemini, Copilot or dozens of others.

These tools operate on completely different terms than enterprise AI. You access them directly. You own your prompts, your workflows, your customizations. The knowledge you create stays with you. Most critically, when you walk out the door, your AI-enhanced capabilities walk with you.

Maybe you're required to use your company's enterprise AI for client work. But all the strategic thinking you do before engaging with clients? Develop that using personal AI tools.

I spoke with a regional vice president at an energy company who does exactly that: He uses his firm's enterprise system for required compliance and documentation, but develops new analytical approaches and tests complex decisions in personal AI tools. The novel insights stay his.

What can be done?

Using personal AI tools is just the first step employees should take, however. To really change the power dynamic, they can act on other fronts.

-- Negotiate upfront. When joining a company, people should treat access to AI tools like intellectual-property ownership. Most employment agreements cover IP created on the job, but employees should dig further into a company's policies before signing on: What gets captured through enterprise AI? How long is that data retained? Can you use personal AI tools for skill development? Can you request deletion of your contributions if you leave?

Most companies haven't thought through these questions yet, which means there is room to establish reasonable boundaries before you're locked in.

-- Support collective action. Individual opt-out of AI is often impossible, so unions and professional associations need to pay attention. With collective bargaining, workers could demand transparency about the use of enterprise AI and fair compensation for the knowledge it gathers. Without collective power, individual employees will keep clicking "accept" on agreements that restructure their jobs simply because they have no alternative.

Concerted employee action may start to change the AI calculus. Employers may find that enterprise AI systems do capture knowledge, but at a steep cost: They may drive away the most talented employees, those who realize they can build more valuable, portable capabilities with personal tools.

AI is breaking the traditional model of employment in real time, faster than anyone realizes. The companies and employees who understand these dynamics will position themselves to capture AI's benefits. Those who don't may find themselves on the losing side of the biggest workplace shift in a generation.

Matthew Call is an associate professor in the department of management at Texas A&M University's Mays Business School. He can be reached at reports@wsj.com.


(END) Dow Jones Newswires

February 15, 2026 12:00 ET (17:00 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.
