August 01, 2025

Red Light, Green Light: New State Guidelines for Using GenAI

Generative AI is here — and changing how we work. Tools like ChatGPT, Gemini, and Copilot are showing up across state government, helping teams summarize notes, draft content, and automate repetitive tasks.

But with new tech comes new responsibility. That’s why the Georgia Technology Authority (GTA) and the AI Advisory Council have published new standards and guidelines governing the use of generative AI (GenAI) in state government: a clear framework for using these tools safely, ethically, and effectively. Before using GenAI tools, please take the time to review and understand the guidelines. This blog summarizes the key points.

GenAI tools can boost efficiency and support decision-making. But they also come with risks — like biased outputs, data leaks, and misinformation. Georgia’s approach centers on public trust: we gain it by being careful, consistent, and transparent in how we use AI. The standards and guidelines build on the Georgia AI program’s Five Guiding Principles.

Guiding Principles

At the heart of the standards are five principles every agency should follow:

  1. Implement responsible systems: Involve users in the design process. Test before launch. Monitor over time.
  2. Ensure fairness and ethics: Check for bias. Make decisions explainable. Assign accountability.
  3. Protect data quality and privacy: Use strong data governance. Avoid unnecessary data collection.
  4. Keep AI use transparent: Label AI-generated content. Provide public documentation.
  5. Center human involvement: Keep people in control. Use AI to support — not replace — human judgment.

These guidelines apply to all forms of generative AI, including but not limited to text, image, video, and audio generation.

What Agencies Need to Do

Get approval first. Agencies must request conditional approval from GTA before adopting GenAI tools for regular use. This applies even to tasks like transcription, summarization, or note-taking. Use this form to make your request.

Use vetted tools. Only tools vetted and approved by GTA can be used. Approval is specific. Just because one version of a tool is cleared doesn’t mean newer versions are automatically allowed.

Use work accounts only. Log in with state-issued credentials. Never use personal email for state business involving GenAI.

Keep records. Save prompts, outputs, and who reviewed the content. These may be subject to public records laws.

Responsible AI Use in Practice

Review everything. GenAI tools can “hallucinate,” confidently presenting made-up facts as real. Always verify AI-generated content before using it officially.

Label AI outputs. If AI helped create or edit content, say so. Include the tool name, the prompt, and who reviewed the content. Example: "This summary was generated using ChatGPT and reviewed by J. Lee."

Keep data private. Never input personally identifiable information (PII) or protected health information (PHI) into GenAI tools unless approved by GTA. Even “test data” needs to be reviewed for risk.

What’s Allowed, What’s Not

The standards include a helpful traffic-light table of common use cases. Here’s a sample:

  • OK: Drafting internal summaries, brainstorming, formatting documents.
  • Use with care: Chatbots, accessibility tools, policy drafting — only with human review and bias checks.
  • Not allowed: Using AI for legal or disciplinary decisions, processing sensitive data, or deploying unverified code.

For High-Risk Use Cases: Use the Sandbox

GTA’s GenAI Sandbox — part of the Horizons Innovation Lab — is a safe place to test GenAI tools for higher-risk projects. Agencies can try out tools in a secure environment, share lessons, and determine whether a use case is ready for production.

To participate, agencies submit a brief proposal and agree to specific data use and reporting rules. GTA provides oversight, helping teams scale successful pilots statewide.

Transparency and Public Trust

AI use in government affects real people. That’s why disclosure, public access, and audit trails are required. GenAI outputs may be subject to Georgia’s records retention laws and open records requests. Agencies need to store them properly and be prepared to explain how AI was used.

The first draft of this blog was generated using ChatGPT and edited by DSGa staff.
