In the latest federal snapshot of the US labor market, job openings slid to 6.5 million in December, down nearly 1 million from a year ago, while hires and quits barely moved. Employers, in other words, are holding on to the people they have, but they're not reaching for new ones the way they did.
This mood of cautious selectivity, hard to discern from industry headlines, helps explain why a major AI forecast may sound scarier than the facts warrant. A Gartner survey finds that while AI will have a "net neutral job impact through 2026," the trend "will drive significant redesign of work." Neutral here doesn't mean "relaxed" or "unbothered." It means plenty of churn beneath a flat headline number.
What does “neutral AI impact through 2026” really mean?
"Net neutral" means that roughly as many jobs are created as eliminated. Even so, it can be a brutal process if you're hunting in the wrong lane, because the same forecast implies a lot of internal turnover behind the flat total: tasks offloaded to software, teams reshaped, roles redefined.
Gartner’s advice to employers is blunt: restrain new hiring for low-complexity tasks and reposition people toward work that generates revenue. That is a clue for you, too: if you want stability in 2026, you’re trying to be hired into complexity, not out of it.
US data appear to follow the same narrative. Indeed's Hiring Lab sees 2026 as a cooler year, not a collapsing one, with openings stabilizing, unemployment rising slightly, and growth remaining positive while hiring becomes more selective.
The macro picture: low hire, low fire
A “low hire, low fire” market is exactly what it sounds like. There’ll be fewer new doors opening, but also fewer people being shoved out of existing ones. In the BLS Job Openings and Labor Turnover Survey (JOLTS), December showed openings were down, hires had little change at 5.3 million, quits remained unchanged at 3.2 million, and layoffs/discharges were steady at around 1.8 million.
The Conference Board's take is that the vacancy rate fell to 3.9%, its lowest level in almost a decade excluding the brief drop during the pandemic, without signaling a wave of mass layoffs. This is why the market can feel worse for job seekers than for people already employed: fewer openings, longer search times, and tougher screens.
Where hiring is still hot: AI-adjacent demand
Selective doesn't mean frozen. It means money and attention are concentrated where leaders think they'll get leverage. In January, Indeed reported that job postings mentioning AI were growing even as overall postings were flat or declining. Its AI Tracker put AI mentions at 4.2% of postings in December 2025. Even more striking, nearly 45% of data & analytics postings contained AI-related terms, versus about 15% in marketing and 9% in HR.
LinkedIn’s Economic Graph makes a similar point but from a different angle. It says US jobs requiring “AI literacy” grew 70% year over year, and that employers are hunting for blends—technical fluency plus the human skills that make tools useful in messy real workplaces.
So the practical takeaway is not “learn to code or else.” It’s: show you can use AI to ship outcomes, and show you still understand the domain you’re shipping them in.
Winners and at-risk roles (by tasks, not titles)
In 2026, job titles can be misleading. Two people with the same title may do very different mixes of work, with some of it ripe for automation and some stubbornly human. A better way to think is in terms of task buckets.
1) Building and/or coordinating AI (wins). These jobs grow as companies adopt the technology and realize they need people who can use it properly, effectively, and lawfully: ML/AI engineers, data engineers, AI product managers, model risk/governance analysts, forward-deployed engineers, and so on. LinkedIn also highlights fast-growing "new-collar" AI-related roles such as data annotators and forward-deployed engineers.
2) Using AI to amplify output (mixed-to-positive). These jobs don't go away; they change. The advantage comes from the speed AI provides and from judgment about what "good" looks like. Think marketing managers, financial analysts, sales operations, HR business partners, customer success, and paralegals. The skill is not in magic prompts; it's in converting drafts into decisions.
3) Routine, rules-based tasks (at risk). These are the jobs that get squeezed when the work is predictable, heavily templated, and easy to quality-check. That includes some entry-level tasks such as basic reporting, simple content rewrites, low-variance support, scheduling, and parts of back-office processing. Gartner explicitly warns employers against expanding hiring for "low-complexity" tasks.
If you're wondering where you sit, try a quick self-audit: if a task can be described as "copy, paste, reformat, repeat," it will probably be automated or shrink to a smaller share of someone's time.
Functions snapshot (quick hits)
So what do “tasks” (not titles) look like in four different functions?
- Marketing
  - More automated: first-draft copy, ad variations, basic SEO outlines.
  - Stays human-led: positioning, measurement strategy, brand risk decisions.
  - Evidence to watch: frequency of AI-related mentions in marketing job posts (about 15% in Indeed's data).
- Operations / analytics
  - More automated: recurring dashboards, data-cleaning patterns, standard forecasting templates.
  - Stays human-led: deciding which metrics matter, what controls to put in place, and how to communicate trade-offs to stakeholders.
  - Evidence to watch: AI-related terms appear in roughly 45% of data & analytics postings.
- Customer support
  - More automated: suggested responses, routing, case summaries.
  - Stays human-led: edge cases, emotional de-escalation, policy decisions.
  - Evidence: in a study of a generative AI assistant for customer support agents, productivity rose about 14% on average, with the largest gains among newer, less-experienced workers. (See NBER Working Paper 31161.)
- HR
  - More automated: job description drafts, resume filtering, interview questions.
  - Stays human-led: compensation philosophy, conflict resolution, performance judgment.
  - Evidence to watch: AI terms appear in about 9% of HR postings, and rising. (Indeed Hiring Lab)
Your 2026 action plan by segment
This market rewards proof. Your plan should produce artifacts such as work samples, metrics, and small case studies that reduce the risk a hiring manager sees in betting on you.
Entry-level (break in)
In a low-openings environment, entry-level candidates often lose out because they look interchangeable. Your job is to look specific.
Spend 30 days creating three small projects that are concrete and tie to one of the target job families (marketing operations, marketing analyst, customer support operations, or junior PM). Use resume templates to format the work so it looks like a professional deliverable and not a class assignment: clear headline, tools used, result, and what you’d do next.
Three portfolio ideas that travel well across industries:
- AI-assisted research brief: one-page competitive scan with sources, plus a “decision memo” recommending a next step.
- Workflow automation: a simple intake form → spreadsheet → summary report (include a screen recording and a short README; a minimal code sketch of this pattern follows below).
- Support knowledge upgrade: rewrite a messy FAQ into a searchable knowledge base outline, with a proposed tagging system.
Tie each project to a metric—even if it’s simulated. “Reduced time to draft from 2 hours to 20 minutes” is better than “used AI.”
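If you want a concrete starting point for the workflow-automation idea above, here is a minimal Python sketch. It assumes a hypothetical CSV export named intake_responses.csv with "category" and "status" columns; adjust the names to whatever your form actually produces. The point is the shape of the deliverable (input, transformation, readable summary), not this particular code.

```python
# Minimal sketch of the "intake form -> spreadsheet -> summary report" idea.
# Assumes a hypothetical CSV export named intake_responses.csv with
# "category" and "status" columns; rename to match your own form's output.
import csv
from collections import Counter

def summarize(path: str) -> str:
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    by_category = Counter(row["category"] for row in rows)
    open_items = sum(1 for row in rows if row["status"].lower() != "done")

    lines = [f"Total requests: {len(rows)}", f"Still open: {open_items}", ""]
    lines += [f"- {cat}: {count}" for cat, count in by_category.most_common()]
    return "\n".join(lines)

if __name__ == "__main__":
    print(summarize("intake_responses.csv"))
```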
Mid-career (edge up)
Your advantage mid-career is context. You know where the bottlenecks are. You've seen what happens when managers get too enthusiastic about shiny new tools.
What a workable 90-day plan might look like:
- Choose a process with pain you can measure (cycle time, rework rate, backlog, cost per ticket).
- Pilot an AI-assisted change with guardrails (what the tool can do, what it can’t, and who signs off).
- Report results in plain language—before/after, savings, risks, next steps.
This is also the year to angle toward “AI-adjacent” responsibilities inside your current company: training teammates, managing vendor rollouts, setting quality checks. LinkedIn’s data suggests employers are valuing blended profiles—AI fluency plus communication and problem-solving—because tools don’t manage themselves.
Career-changers (pivot)
Changing careers in 2026 works best through so-called bridge jobs, which reward the domain knowledge you already have while you add the new AI-adjacent skills.
Common bridges include implementation specialist, RevOps (revenue operations), QA/data labeling, compliance ops, and customer education. The learning plan doesn't have to be complicated; it just needs a timeline.
A simple 12-week curriculum:
- Weeks 1–4: AI literacy basics + one domain toolset (CRM, BI dashboards, ticketing systems).
- Weeks 5–8: build two artifacts that mirror job descriptions (a process map, a dashboard, an onboarding guide, an SOP with controls).
- Weeks 9–12: narrative and network: 10 targeted informational interviews, three referrals requested, and a short “how I’d improve X” memo for each company you apply to.
The story you’re telling is: “Same problems, new tools.” That’s far more believable than “new person, new everything.”
How to pass screens in a cooler market
When hiring tightens, companies lean even harder on filters: applicant tracking systems (ATS), structured rubrics, and keyword screens. You won't beat these filters by jamming keywords into your application; you'll pass them by mirroring the language of the work you'd actually be doing.
Three rules:
- Mirror the job description’s tools and outputs (not just skills). If they want “dashboards,” say dashboards and name the tool you used.
- Put AI where it belongs: as a method that improves speed/quality, not as a personality trait.
- Quantify one outcome per role: time saved, revenue influenced, errors reduced, backlog cleared.
Before you submit, run an ATS scan—use a resume checker to flag missing keywords, inconsistent dates, and formatting that may break parsing. Then write a short outreach note that connects your most relevant artifact to their most urgent problem.
A practical keyword list to pull from postings (only if you actually used them): “AI-assisted workflow,” “automation,” “prompting,” “quality assurance,” “governance,” “SQL,” “Python,” “CRM,” “dashboards,” “A/B testing,” “knowledge base,” “process redesign.”
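If you want to sanity-check a resume against that list before a formal ATS scan, a few lines of Python are enough. This is a rough gap-finder under assumed inputs (a plain-text resume export named resume.txt), not a model of how any real ATS scores candidates.

```python
# Rough self-check: which keywords from a posting appear in your resume text?
# Not how any real ATS scores candidates; just a quick gap-finder.
import re

KEYWORDS = [
    "AI-assisted workflow", "automation", "prompting", "quality assurance",
    "governance", "SQL", "Python", "CRM", "dashboards", "A/B testing",
    "knowledge base", "process redesign",
]

def keyword_gaps(resume_text: str, keywords=KEYWORDS):
    """Return (found, missing) keyword lists using case-insensitive matching."""
    found, missing = [], []
    for kw in keywords:
        if re.search(re.escape(kw), resume_text, re.IGNORECASE):
            found.append(kw)
        else:
            missing.append(kw)
    return found, missing

if __name__ == "__main__":
    with open("resume.txt", encoding="utf-8") as f:  # hypothetical plain-text export
        found, missing = keyword_gaps(f.read())
    print("Covered:", ", ".join(found) or "none")
    print("Missing:", ", ".join(missing) or "none")
```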
Avoid these pitfalls
The easiest way to waste time in 2026 is to collect generic credentials without evidence that you can apply them. A certificate is not a portfolio.
Other common traps:
- Ignoring domain context. Tools don’t replace knowing what matters in healthcare, finance, logistics, or education.
- Only searching for fully remote roles. In selective markets, location flexibility can expand your odds.
- Chasing low-complexity work. If the role is mostly templated output, expect tougher competition and more automation pressure—exactly the hiring category Gartner warns employers about. (Gartner)
What to watch each quarter in 2026
You don’t need to become an economist, but you do need a few indicators to tell you whether to double down or pivot.
Track:
- JOLTS job openings and the openings rate (labor demand). (Use the BLS JOLTS portal.)
- Hires, quits, layoffs (churn and confidence).
- Share of postings mentioning AI (skill demand). Indeed’s AI Tracker is a useful pulse, especially by job family. (Indeed Hiring Lab)
- AI literacy growth signals (LinkedIn’s reporting can help you spot where employers are broadening expectations beyond technical roles).
Simple thresholds for action (a toy code version follows the list):
- If openings keep falling quarter over quarter, widen your target roles and geographies.
- If AI mentions rise in your field, invest in one portfolio artifact that proves you can use the tools responsibly.
- If quits rise sharply, competition may loosen; increase applications and networking volume.
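To make those thresholds concrete, here is a toy decision helper in Python. The inputs are quarter-over-quarter percentage changes you pull yourself from JOLTS and the posting trackers, and the cutoffs (for example, treating a 5% jump in quits as "sharp") are illustrative assumptions, not official definitions.

```python
# Toy quarterly check-in built on the thresholds above.
# Inputs are quarter-over-quarter percent changes; cutoffs are illustrative.

def quarterly_actions(openings_change_pct: float,
                      ai_mention_change_pct: float,
                      quits_change_pct: float) -> list[str]:
    """Suggest actions from changes in openings, AI mentions in your field, and quits."""
    actions = []
    if openings_change_pct < 0:
        actions.append("Openings falling: widen your target roles and geographies.")
    if ai_mention_change_pct > 0:
        actions.append("AI mentions rising: build one artifact showing responsible tool use.")
    if quits_change_pct > 5:  # "sharply" is a judgment call; 5% is an assumed cutoff
        actions.append("Quits rising sharply: increase application and networking volume.")
    return actions or ["Hold course; re-check next quarter."]

if __name__ == "__main__":
    for line in quarterly_actions(-3.0, 8.0, 1.5):
        print(line)
```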
Bottom line: neutral is not do nothing
“Net-neutral” is not a promise that nothing changes. It’s a warning that change arrives unevenly, whether it be by task, team, seniority, or region.
The best bet for 2026 is a role where human judgment and AI complement each other: use the tools to write and analyze faster, but keep the decisions yours. Build your proof, measure your results, and speak in terms of the job you want, not the job you used to have.
If you’re feeling stuck, think of the next month as a mini project: one target role, three artifacts, five warm conversations, and one application package that you can make a case for line by line.
Meta title: 2026 Job Market Forecast: AI’s Net-Neutral Year
Meta description: AI won’t “erase jobs” in 2026—but hiring is selective. Here’s what’s growing, what’s risky, and how to compete.
References: Bureau of Labor Statistics (JOLTS); Gartner; Indeed Hiring Lab; LinkedIn Economic Graph; The Conference Board; National Bureau of Economic Research; The Guardian