Questions In-House Legal Needs to Ask When Developing an AI Mandate

By Tom Dunlop, chief executive officer, Summize

Earlier this year, Microsoft began telling employees that using AI is no longer optional. According to internal communications, managers are now expected to evaluate staff based on how they integrate AI into their work, with performance reviews likely to include AI adoption metrics. In the words of Julia Liuson, president of Microsoft’s Developer Division: “AI is now a fundamental part of how we work… it’s core to every role and every level.” 

This move by one of the world’s largest companies sent shockwaves through boardrooms everywhere, resulting in more companies introducing AI-first mandates. For organizations already experimenting with AI, a mandate marks the shift from ideation to execution. It signals to shareholders, employees, and customers alike that AI is necessary for growth and acceleration.   

Why Legal Must Lead on AI Mandates 

Just as the business world grapples with how to make the most of AI, the legal function itself is undergoing rapid transformation. The Legal Disruptors 2025 Report, which surveyed 250 legal professionals, found that 77% of in-house legal professionals said their role has evolved over the past two years, with more than a third reporting that it has evolved significantly. Today's general counsel are expected to go beyond their traditional responsibilities and become strategic partners who help drive business success. 

Gartner has listed AI governance as a top priority for legal leaders in 2025, making now the perfect opportunity to step fully into that role. Compliance, risk management, and corporate governance remain central to the job, but legal's influence now extends to shaping enterprise-wide strategies that balance innovation with accountability. As such, legal is better positioned than most functions to define how AI should be adopted across the business.  

Currently, more than half of companies lack a formal AI mandate or strategy, even as nearly 90% of legal professionals are using AI tools themselves. This leaves organizations exposed to shadow AI, compliance risks, and reputational damage. Legal faces an exciting challenge: shape mandates that not only protect the business, but also position AI as a strategic asset. 

The Critical Questions for Building an AI Mandate 

When it comes time to introduce a mandate, you need to give employees clarity not only on how and when to use AI, but also on when it's best not to. That requires defining clear use cases, establishing governance and compliance guidelines, and including plans for upskilling and training.   

With those foundations in place, in-house legal leaders can begin asking the critical questions that will shape a safe, structured, and strategic mandate:  

  1. Should company data be used for AI training?

Legal must decide if – and how – sensitive company information can be used to train AI systems. In most cases, proprietary or client data should be siloed to avoid exposing critical information to external models. 

  2. Do sensitivities require limits on use cases?

Some datasets, such as personal information or intellectual property, require stricter controls. Legal should define which categories of data can be safely processed by AI, and which must remain entirely off-limits.  

  3. What safeguards are needed to validate AI outputs?

As powerful as AI is, it remains fallible and prone to hallucinations. Tools often produce inaccurate or biased outputs, so legal teams should establish validation processes that ensure AI-generated content is reviewed before it influences decisions.  

  4. Should personal AI subscriptions be restricted?

Individual employees often experiment with consumer AI tools, sometimes pasting in sensitive information without realizing the risks. Legal should determine whether only company-sanctioned AI tools are permitted and establish clear policies around personal use. 

  5. What regulatory frameworks apply?

From the EU AI Act to the U.S. Executive Order on AI, organizations must be prepared for evolving compliance demands. Legal teams are best placed to assess obligations in every jurisdiction where the business operates. 

What Does an AI Mandate Actually Look Like? 

An AI mandate isn’t about restricting creativity or slowing innovation. Done right, it’s an enabler, giving employees clarity, building trust, and empowering teams to explore AI tools within safe boundaries.  

The Legal Disruptors report found that the biggest barriers to AI adoption include privacy and security concerns (45%) and limited training (37%). A well-crafted mandate tackles these barriers head-on by: 

  • Defining boundaries – outlining what’s allowed, what’s prohibited, and where extra caution is needed. 
  • Setting safeguards – creating processes for validating AI outputs to ensure accuracy, security, and compliance. 
  • Providing support – pairing rules with training and upskilling so employees know how to use AI confidently and responsibly. 

With those elements in place, employees can experiment with AI without the fear of crossing undefined lines. 

The impact goes far beyond compliance. Clear mandates free teams to realize AI’s full value: accelerating contract review, surfacing risks faster, and unlocking capacity for higher-value work. 

A Mandate to Act 

The results of the report highlight both the urgency and the opportunity at the feet of in-house legal teams. AI adoption is already widespread, yet the absence of mandates leaves organizations exposed to risk. At the same time, legal’s role is evolving rapidly from narrowly focused legal advisors to integral business partners who shape strategy, anticipate risks, and help unlock commercial value.  

That evolution is exactly why legal must lead the creation of AI mandates. By asking the right questions around data, safeguards, governance, and regulation, legal can provide the clarity their organization needs to use AI responsibly. With thoughtful AI frameworks in place, the legal function has the chance to redefine itself as a guardian of compliance and a catalyst for progress. 
