Policy Snapshot
Giving citizens direct ownership in AI infrastructure via equity stakes
Scenario: Gradual Augmentation · All Scenarios · Rapid Automation
Scope: Near Term (Volatility Risks) · Medium Term (Transition Risks) · Long Term (Structural Risks)
Governance Level: Local · National · International
Target: Entrepreneurs · Displaced Workers
Primary Actor: Governments · Private Actors
Skill Gap Analyses
Standardized AI exposure measurement frameworks that direct workforce development funding toward the specific occupations and communities where task-level displacement data indicates greatest need
What it is:
Skill-gap analyses for AI apply task-based frameworks that break occupations down into their component tasks, then assess which tasks are susceptible to automation or augmentation by AI systems. Rather than predicting wholesale job elimination, this approach recognizes that AI affects work at the task level; some tasks within an occupation may be highly exposed while others remain resistant to automation.
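The task-level aggregation described above can be sketched in a few lines. In this toy example, an occupation's exposure score is the importance-weighted average of its tasks' exposure scores; the task names, weights, and scores are invented for illustration and are not drawn from O*NET or any published framework.

```python
# Toy task-based exposure calculation: an occupation's exposure is the
# importance-weighted share of its tasks rated as AI-exposed.
# All task names, weights, and scores below are hypothetical.

def occupation_exposure(tasks):
    """tasks: list of (importance_weight, exposure_score in [0, 1]) pairs."""
    total = sum(w for w, _ in tasks)
    return sum(w * e for w, e in tasks) / total

paralegal_tasks = [
    (0.4, 0.9),  # document review: highly exposed
    (0.3, 0.5),  # drafting routine filings: partially exposed
    (0.2, 0.2),  # client interviews: resistant
    (0.1, 0.1),  # courtroom support: resistant
]

print(f"occupation exposure: {occupation_exposure(paralegal_tasks):.2f}")
```

Note that even a high aggregate score here mixes exposed and resistant tasks, which is exactly why the task-level view resists reading exposure as wholesale job elimination.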
Skill-gap analyses offer policymakers a more precise alternative to broad-based workforce interventions. By identifying which tasks, occupations, and communities face the greatest exposure to AI, these frameworks can help direct retraining funding where it is most needed, enable community colleges to anticipate demand for emerging credentials, and allow employers to distinguish between roles that are candidates for augmentation and those likely to shrink. Importantly, high task exposure does not automatically translate into job displacement; the same analysis that identifies automatable tasks may also reveal opportunities for workers to shift toward higher-value activities within the same occupation.
The primary challenge is that task-based frameworks depend heavily on the quality and currency of the underlying occupational data. O*NET and similar occupational taxonomies are updated infrequently relative to the pace of AI capability development, meaning exposure estimates can quickly become outdated. Moreover, whether theoretical task exposure translates into actual labor-market outcomes depends on several factors, including occupational context, regulatory constraints, educational pipelines, and whether AI functions as a substitute for or complement to existing skills.
Recommended Reading:
David Autor and Neil Thompson
Beyond Job Displacement: How AI Could Reshape the Value of Human Expertise
December 2025
Autor and Thompson argue that standard AI exposure analyses miss a crucial dimension: whether AI automates an occupation's expert or inexpert tasks, which determines whether wages rise or fall. Using linguistic analysis of 303 occupations over four decades, they find that occupations whose expertise requirements rose by one standard deviation saw 18% wage increases but lower employment, caused by a shrinking pool of qualified workers, while those whose requirements fell saw wage declines but employment growth. They propose that skill-gap frameworks should ask not just "which jobs are exposed?" but "which tasks within each job are exposed, and are those the expert or inexpert tasks for that occupation?"
Anthropic
Anthropic Economic Index
February 2026
The Anthropic Economic Index provides a novel approach to measuring AI's economic impact by analyzing actual usage patterns from millions of conversations with Claude rather than relying on theoretical exposure estimates. The initial report found that roughly 36% of occupations showed AI being used for at least a quarter of their tasks. Usage remains concentrated in software development and technical writing, with a balance leaning toward augmentation (57%) over automation (43%). The January 2026 report introduced "economic primitives" to allow for more precise measurement. These standardized metrics include: task complexity, skill level (measured by education years required), purpose/use case, AI autonomy (degree of delegation), and success rate. The index also tracks geographic adoption patterns, finding that uneven worldwide adoption remains well-explained by GDP per capita, and lower-usage U.S. states are catching up relatively faster, suggesting potential equalization within two to five years.
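A usage-based split like the index's augmentation/automation breakdown can be sketched as a simple tally over labeled interaction records. The records and the mapping of interaction modes to the two categories below are invented for illustration; this is not Anthropic's classification pipeline.

```python
from collections import Counter

# Toy tally of augmentation vs. automation shares from labeled usage
# records, in the spirit of the index's reported 57%/43% split.
# The records and label-to-category mapping below are hypothetical.

AUGMENTATION = {"learning", "iteration", "validation"}  # human stays in the loop
AUTOMATION = {"directive", "feedback_loop"}             # task fully delegated

records = [
    "iteration", "directive", "learning", "validation",
    "directive", "iteration", "feedback_loop", "learning",
]

counts = Counter(
    "augmentation" if r in AUGMENTATION else "automation" for r in records
)
share = counts["augmentation"] / len(records)
print(f"augmentation share: {share:.0%}")
```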
MIT & Oak Ridge National Laboratory
Project Iceberg
December 2025
The "Iceberg Index," is a skills-centered metric that simulates 151 million U.S. workers as autonomous agents across 32,000 skills and 3,000 counties, powered by Oak Ridge's Frontier supercomputer. The research found that current AI systems can technically perform tasks representing 11.7% of total U.S. wage value (approximately $1.2 trillion), far exceeding visible tech-sector disruption (2.2%, or $211 billion). The index challenges assumptions that AI risk is confined to coastal tech hubs; simulations show exposed occupations spread across all 50 states, including inland and rural regions. The tool “enables policymakers and business leaders to identify exposure hotspots, prioritize training and infrastructure investments, and test interventions before committing billions to implementation.”
W.E. Upjohn Institute for Employment Research
AI Exposure and the Future of Work: Linking Task-Based Measures to U.S. Occupational Employment Projections
September 2025
Erik Vasilauskas and Michael Horrigan extended the Pew Research Center’s task-based framework to categorize occupations by AI exposure intensity. Using O*NET data to measure the relative importance of 16 key AI-susceptible work activities, they clustered occupations into high, middle, and low exposure tiers. Their analysis of Bureau of Labor Statistics (BLS) 2023–2033 projections found that high-exposure roles are concentrated in high-skill, high-education fields (e.g., analysts, professional services). Crucially, the study distinguishes outcomes within this high-exposure group: while analytical and financial roles are projected to grow via augmentation, clerical and administrative roles face decline due to direct automation.
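The tiering step can be sketched as scoring each occupation by the mean importance of AI-susceptible work activities, then cutting the distribution into terciles. The occupations and importance ratings (on an O*NET-style 1-5 scale) below are invented, and tercile cuts stand in for whatever clustering method the authors actually used.

```python
import statistics

# Toy exposure tiering: score occupations by the mean importance of
# AI-susceptible work activities, then bucket into low/middle/high
# tiers at the terciles. All occupations and scores are hypothetical.

scores = {
    "financial analyst": [4.5, 4.2, 3.9],
    "office clerk": [4.0, 3.8, 3.5],
    "graphic designer": [3.2, 2.9, 3.0],
    "electrician": [1.8, 2.1, 1.5],
}

means = {occ: statistics.mean(vals) for occ, vals in scores.items()}
cutoffs = statistics.quantiles(list(means.values()), n=3)  # two tercile cuts

def tier(mean_score):
    if mean_score <= cutoffs[0]:
        return "low"
    if mean_score <= cutoffs[1]:
        return "middle"
    return "high"

for occ, m in sorted(means.items(), key=lambda kv: -kv[1]):
    print(f"{occ}: {m:.2f} ({tier(m)} exposure)")
```

As the study's within-tier distinction suggests, a tier label alone does not predict outcomes: the high tier here would contain both augmentation-prone analytical roles and automation-prone clerical ones.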
Real-world precedents:
The U.S. Department of Labor's Occupational Information Network (O*NET) provides a foundational taxonomy, cataloging approximately 20,000 task statements across 900+ occupations. Researchers classify these tasks by their exposure to AI capabilities, producing occupational exposure scores that can inform workforce development priorities.
BLS's Employment Projections program incorporates AI-related impacts for occupations where high exposure to automation is deemed likely, and released a suite of skills data tables providing information about skill importance across occupations.
At the state level, skill-gap measurement is becoming embedded in workforce planning.
Tennessee's AI Workforce Action Plan draws directly on MIT's Iceberg Index to identify county-level exposure patterns and target training investments.
North Carolina State Senator DeAndrea Salvador, who worked closely with MIT on the project, notes the tool enables policymakers to examine county-specific data to identify skills likely to be automated or augmented.
Utah's Office of AI Policy is preparing a similar state-level analysis.
© 2026 Windfall Trust. All rights reserved.