Updated: February 2026


Fiduciary Mandates

Fiduciary mandates would impose legal duties of loyalty, care, and confidentiality on AI developers and deployers, requiring them to act in users' and society's interests rather than solely maximizing profit or engagement.

What it is:

In practice, this means AI systems would have to be designed and deployed to serve users' and society's genuine interests, not merely to optimize for profit or engagement metrics that can accelerate labor displacement and deepen inequality.

Companies face an inherent tension between maximizing efficiency gains and protecting public welfare. Fiduciary mandates would convert vague ethical commitments into legally enforceable obligations: for example, the duty of loyalty could prohibit deploying systems that knowingly displace workers faster than retraining can absorb them.

Who's working on it:
Claire Boine

September 2023

Boine argues that while information fiduciaries focus on data collection, "AI fiduciaries" must address broader harms arising from system behaviors. She contends that AI companies meet the classic fiduciary criteria: users entrust power to systems they cannot monitor, creating a vulnerability that requires legal protection. Under her framework, the duty of loyalty becomes "goal and value alignment": the capacity of AI systems to act in the same interests as their beneficiaries.

Real-world precedents:

Fiduciary duties have governed professional relationships for centuries—physicians' Hippocratic obligations, attorneys' duties to clients, and financial advisors' obligations under the Investment Advisers Act of 1940.


© 2026 Windfall Trust. All rights reserved.