Non-Academic AI Usage Quick Guide

Artificial intelligence tools can support productivity, communication and operational efficiency across Clemson University. This quick guide outlines key considerations for using AI in administrative, operational and non-academic contexts, with a focus on data protection, security and policy compliance.

As a general rule, classify the data first, then confirm the tool is approved for that data type and use case. Approval of a platform does not automatically approve its plug-ins, connectors or add-ons. When in doubt, do not enter the data and consult CCIT or CheckIT.

If you are an instructor or researcher, refer to the Academic AI Usage Quick Guide. For more details, review Clemson’s Generative AI Guidelines and the Data Classification Policy.

Green — Allowed / Low Risk

These uses are generally acceptable when working with public information or approved University services and when following established security and data protection best practices. Continue to use sound judgment and comply with Clemson policies.

Approved Data and Tool Combinations

  • Public data with a Public AI tool (unapproved/public-facing)
  • Public data with a Clemson-approved, contract-protected AI service
  • Clemson Zoom AI Companion (approved configuration)

Safe Use Patterns

  • Brainstorming, outlining, rewriting for clarity (no sensitive inputs).
  • Summarizing publicly available documents.
  • Drafting general communications that do not include internal-use information.
  • Asking for explanations or examples using generic, non-identifying scenarios.

Examples

  • “Summarize this public news article.”
  • “Rewrite this paragraph on this public-facing webpage to be clearer.”
  • “Create a meeting agenda template (no names, no sensitive details).”
  • Using Zoom AI Companion to summarize a Clemson meeting in the approved Clemson configuration.

Yellow — Use with Caution

These uses may be appropriate in certain situations, but only after confirming required protections, approvals and safeguards are in place. Additional review, documentation or consultation may be necessary to reduce institutional and data risk.

When to Slow Down

  • If you are dealing with Internal Use or Confidential information, use only a Clemson-approved service that is approved for that data type. Refer to the Data and Tool Matrix for more information.
  • If you don’t know whether the tool stores prompts/outputs or uses them for training.
  • If you are offered plug-ins, connectors, bots or extensions (often the biggest leak risk).
  • If you need a high degree of accuracy, since AI can hallucinate incorrect facts, citations, calculations or code.

"Use Caution" Behaviors

  • Identify the data classification (Public / Internal / Confidential / Restricted).
  • Identify the tool category (public tool, Clemson-approved, third-party add-on).
  • Confirm protections: retention, training use, access controls, Clemson-managed login.
  • Minimize data: share least possible; remove identifiers; aggregate/de-identify.
  • Verify outputs via authoritative sources.
  • Document important use (tool, date, what was asked—without storing sensitive prompts).

Examples

  • Using a Clemson-approved AI service to draft an internal staff update, but only if the service is approved for Internal Use data.
  • Turning on a “connect to Google Drive/LMS/ticketing” feature—pause and confirm approval.
  • Using a browser “AI writing assistant” that can read email or typed text—assume it may transmit data.
  • Using AI output for a decision memo—verify facts and keep a record of verification steps.

Red — Prohibited

These uses create significant policy, security or data protection risks and are not permitted. Engaging in these activities may violate University policy, contractual obligations or legal requirements and could result in serious consequences.

Prohibited Data and Tool Combinations

  • Internal Use, Confidential or Restricted data with a Public AI tool (unapproved).
  • Restricted data with a Clemson-approved AI service, unless explicitly approved (which is rare).
  • Internal Use or Confidential data with a third-party add-on, bot or extension that has not been reviewed and approved.
  • Restricted data with a third-party add-on, bot or extension, unless explicitly approved (which is rare).

Access and Add-ons

Never Paste Secrets

  • Passwords, access tokens, private keys, credentials and security configurations must never be entered into AI prompts or used to train AI models.

Plug-in/Platform Trap

  • Key rule: Approval of the main platform does not automatically approve every add-on or connector.

Examples

  • Pasting student info, HR/finance/vendor details, contract/grant terms, or non-public IP into a public chatbot.
  • Uploading Restricted data (FERPA/PHI/security-sensitive) into any unapproved AI tool.
  • Enabling a third-party “meeting bot” to join a sensitive Zoom meeting.
  • Asking AI to debug by sharing production credentials or private keys.

Connect with the AI Initiatives Team

Get connected to stay current on upcoming events, opportunities and more!

Mitch Shue
Provost Fellow
Professor of Practice, School of Computing
Executive Director, AI Research Institute for Science and Engineering
mshue@clemson.edu

Nathan J. McNeese, PhD
Associate Vice President for Technology & Innovation
McQueen Quattlebaum Endowed Professor of Human-Centered Computing
mcneese@clemson.edu