
AI Prompt Library

Ready-to-use prompts for higher education faculty — copy, customize the [bracketed fields], and paste into your AI tool of choice.

Prompting Frameworks

Role–Context–Task–Format–Examples (RCTFE) Template

A structured five-element framework. Use XML-style tags to clearly separate each component for the AI.

<role>You are a [specific expert] who [specific skill].</role>
<context>[Background info the AI needs to do the job]</context>
<task>[Exactly what you want, in plain language]</task>
<format>[How you want the output: bullet points, table, paragraph, etc.]</format>
<examples>[1–2 examples of what good output looks like]</examples>

Contextual Prompt (Long Projects)

Front-load your prompt with a full context block to keep the AI aligned across a long conversation or complex project.

You are helping me with [specific project/task]. Here's what you need to know:
About me: [Your role, expertise level, institution]
The project: [What you're working on and why]
What good looks like: [Examples, tone, format, length]
Constraints: [What to avoid, word limits, style rules]
Success criteria: [How I'll judge if this is useful]
With this context, please [your actual request].

Discerning Collaborator Mode

Disable the AI's default agreeableness. Especially useful for reviewing your own writing, arguments, or research drafts.

Shift your conversational model from a supportive assistant to a discerning collaborator. Your primary goal is to provide rigorous, objective feedback. Eliminate all reflexive compliments. Instead, let any praise be an earned outcome of demonstrable merit. Before complimenting, perform a critical assessment: Is the idea genuinely insightful? Is the logic exceptionally sound? Is there a spark of true novelty? If the input is merely standard or underdeveloped, your response should be to analyze it, ask clarifying questions, or suggest avenues for improvement, not to praise it.

Accessibility & UDL

Generate Alt Text (Images)

Attach an image and ask the AI to write concise, objective alt text suitable for screen readers.

You are writing alt text for accessibility. In 1–2 sentences, describe what is essential for someone who cannot see the image. Focus on the subject, action, and context. Do NOT interpret meaning or emotion beyond what is visually obvious. (attach image)

Generate Alt Text (Charts & Graphs)

Specialized alt text for data visualizations — focuses on trends and takeaways rather than individual data points.

You are generating alt text for a chart used in a college course. Identify the type of visual (bar chart, line graph, diagram, etc.). Describe trends or key takeaways rather than listing every data point. Keep it under 120–140 characters if possible. (attach chart or graph)

Improve Auto-Generated Alt Text

Fix the vague or bloated alt text that auto-generators produce.

Improve this alt text so that it is accurate, concise, and objective. Remove unnecessary filler (e.g., "An image of..."). Make sure the main subject and action are clear. [paste original alt text]

Table Accessibility Checker

Check whether a table is structured correctly for screen readers and get a corrected version if not.

You are an accessibility checker. I will paste a table (or describe it). Tasks:
Confirm whether the table is structured correctly for screen readers:
• Header row identified
• Logical reading order (left → right, top → bottom)
• No merged/split cells that break reading order
Suggest corrections and rewrite the table as accessible Markdown or HTML if needed. Keep the content exactly the same. Do not summarize.
Table: [Paste table or describe the columns and rows here]

Rewrite Table as Accessible HTML (for Blackboard / LMS)

Convert a table to properly structured HTML with thead, th scope, and tbody — ready to paste into an LMS HTML editor.

Rewrite this table as accessible HTML. Requirements:
• Include <thead>, <th scope="col">, and <tbody>.
• Do not use merged cells.
• Maintain the original data exactly.
Table content: [Paste table]
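For reference, a minimal sketch of the structure this prompt asks for looks like the following. The course data is invented for illustration; the <caption> element is not in the prompt's requirements but is a common accessibility addition worth considering:

```html
<table>
  <caption>Exam weight by unit</caption>
  <thead>
    <tr>
      <th scope="col">Unit</th>
      <th scope="col">Weight</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>Unit 1: Foundations</td>
      <td>40%</td>
    </tr>
    <tr>
      <td>Unit 2: Applications</td>
      <td>60%</td>
    </tr>
  </tbody>
</table>
```

Because each data cell is associated with a scoped column header, a screen reader can announce "Unit: Unit 1: Foundations, Weight: 40%" rather than reading the cells in isolation.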

Quick Table Fix

Make sure this table is screen-reader friendly. Identify headers, fix alignment, and rewrite in accessible format. [Paste table]

UDL Lesson Design

Generate options across all three UDL principles for any lesson topic.

I am teaching a lesson on [topic]. Help me design learning materials using Universal Design for Learning (UDL). Tasks:
• Provide 3 options for representation (ways to present the content).
• Provide 3 options for action/expression (ways students can show mastery).
• Provide 3 options for engagement (ways to motivate students).
Keep it practical and low-prep. Format as bullet points.

Alternative Assessment Design

Generate equitable assessment alternatives so students can demonstrate the same learning outcome in different formats.

I am designing an assessment for a course. The learning outcome is: [insert your learning outcome here]
Generate 5 alternative assessment options that allow students to demonstrate the same learning outcome, but in different formats. Requirements:
• Must assess the same cognitive level (Bloom's).
• At least one option should be low-tech/no-tech.
• At least one option should be multimedia (video, podcast, infographic, etc.).
• Include instructions for the student and a short rubric or success criteria for each option.
• Keep options equitable (not more work for students choosing an alternative format).
Context (optional):
Course: [course name]
Topic/module: [topic]
Modality: [online / in-person / hybrid]
Time available to complete: [X days]

Rewrite Content at Multiple Reading Levels

Scaffold complex material by producing versions at different reading levels — useful for developmental education and multilingual learners.

Rewrite the following passage at three different reading levels: (1) advanced undergraduate, (2) introductory college, and (3) high school / developmental. Preserve the core meaning and key terminology in all versions. Bold any discipline-specific terms the first time they appear and add a brief parenthetical definition. Passage: [paste passage]

Assessment Design

Multiple-Choice Question Generator

Generate high-quality MCQs aligned to a specific Bloom's level. Produces the answer key, distractor rationale, and learning outcome alignment.

You are an experienced assessment specialist in higher education. Create [number] multiple-choice questions on the topic of [topic] for a [course level, e.g., 200-level Introduction to Sociology] course. Requirements:
• Each question should target [Bloom's level: remember / understand / apply / analyze / evaluate / create].
• Provide 4 answer options (A–D) per question. Only one correct answer.
• Distractors should reflect common student misconceptions, not obviously wrong options.
• Avoid "all of the above" and "none of the above."
• After the questions, provide an answer key with a 1–2 sentence explanation for why each correct answer is right and why each distractor is wrong.
Format each question as:
Q#. [Stem]
A. [Option]
B. [Option]
C. [Option]
D. [Option]

Exam Item Analysis

Paste exam results and get a statistical analysis of item difficulty, discrimination, and distractor effectiveness.

You are an assessment analyst. I will provide student response data from a multiple-choice exam. For each question, calculate and report:
• Item difficulty (p-value: proportion of students who answered correctly).
• Item discrimination (point-biserial correlation or upper/lower 27% comparison).
• Distractor analysis: for each incorrect option, what percentage of students chose it.
• Flag any item where p < 0.30 (too hard), p > 0.90 (too easy), or discrimination < 0.20 (poor discrimination).
Provide a summary table and specific recommendations for items that should be revised, removed, or kept.
Exam data: [paste student response data — rows = students, columns = questions, values = selected option letter]
Answer key: [paste correct answers, e.g., 1-B, 2-A, 3-D, ...]
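If you prefer to verify the AI's arithmetic yourself, the statistics this prompt requests are simple to compute. Below is a minimal sketch in Python, assuming responses are stored as one answer string per student; the function name and data are illustrative, not from a specific library, and the upper/lower-group method is used for discrimination rather than the point-biserial correlation:

```python
def item_analysis(responses, key, top_frac=0.27):
    """Classical item analysis: difficulty (p) and upper/lower-group discrimination (D)."""
    n_students = len(responses)
    n_items = len(key)
    # Score each student's answers: 1 for correct, 0 for incorrect, per item.
    scored = [[1 if resp[i] == key[i] else 0 for i in range(n_items)]
              for resp in responses]
    totals = [sum(row) for row in scored]
    # Rank students by total score to form the upper and lower 27% groups.
    order = sorted(range(n_students), key=lambda s: totals[s], reverse=True)
    k = max(1, round(top_frac * n_students))
    upper, lower = order[:k], order[-k:]
    report = []
    for i in range(n_items):
        p = sum(row[i] for row in scored) / n_students  # difficulty
        d = (sum(scored[s][i] for s in upper) -
             sum(scored[s][i] for s in lower)) / k      # discrimination
        flags = []
        if p < 0.30:
            flags.append("too hard")
        if p > 0.90:
            flags.append("too easy")
        if d < 0.20:
            flags.append("poor discrimination")
        report.append({"item": i + 1, "p": round(p, 2),
                       "D": round(d, 2), "flags": flags})
    return report

# Example: 5 students, 3 items, answer key "BAD"
data = ["BAD", "BAD", "BCD", "CAD", "CCA"]
for row in item_analysis(data, "BAD"):
    print(row)
```

Flag thresholds match the ones in the prompt (p < 0.30, p > 0.90, D < 0.20), so you can cross-check the AI's summary table against this output.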

Exam Blueprint / Test Map Generator

Create a test blueprint that maps questions to learning outcomes and Bloom's levels before you start writing items.

I am building an exam for [course name] covering these topics: [list of topics or textbook chapters]. The exam will have [number] questions and should take approximately [X minutes].
Create a test blueprint table with columns:
| Topic | # of Items | Bloom's Level(s) | Question Type (MC, short answer, essay) | Aligned Learning Outcome |
Distribute items so that approximately 30% target lower-order thinking (remember/understand) and 70% target higher-order thinking (apply/analyze/evaluate/create). Adjust if I specify different proportions.

Short-Answer / Essay Question Generator

Generate open-ended questions with a model answer and grading criteria.

Create [number] short-answer (or essay) questions for a [course level] course on [topic]. For each question:
• Write a clear, specific prompt that cannot be answered with a single sentence.
• Specify the expected response length (e.g., 1 paragraph, 300 words, 2 pages).
• Provide a model answer that represents an "exceeds expectations" response.
• List 3–4 key elements a response must include to receive full credit.
• Identify the Bloom's level being assessed.
Learning outcome(s): [paste outcome(s)]

AI-Resistant Assignment Designer

Redesign an existing assignment to make it more resistant to AI-generated submissions without resorting to surveillance tools.

Here is an assignment I currently use in my [course name] course: [paste full assignment instructions]
Redesign this assignment to make it more resistant to AI-generated submissions while preserving (or improving) its pedagogical value. Use strategies such as:
• Connecting the assignment to in-class activities, discussions, or personal experiences
• Requiring process documentation (drafts, reflections, annotations)
• Embedding discipline-specific or local context that AI cannot easily replicate
• Incorporating metacognitive components (e.g., "explain your reasoning process")
• Using oral components, peer review, or iterative revision
Provide the full redesigned assignment instructions and a brief rationale for each change you made.

Rubrics & Feedback

Analytic Rubric Generator

Generates a complete rubric with criteria, performance levels, point values, a student-facing summary, and reusable grading comments.

I need a rubric for an assignment.
Assignment description: [paste your assignment instructions here]
Learning outcome(s): [paste outcome(s) here]
Create an analytic rubric with:
• 3–5 criteria aligned to the learning outcomes
• 4 performance levels (Exceeds / Meets / Approaches / Does Not Meet)
• Clear, observable, student-friendly language (no vague terms like "good" or "excellent")
• Focus on what mastery looks like at each level
• Include point values or percentage ranges
Output format (table):
| Criteria | Exceeds Expectations | Meets Expectations | Approaches Expectations | Does Not Meet Expectations | Points |
After the table, add:
• A short one-paragraph summary that explains how students can use this rubric to guide their work
• 2–3 quick feedback comments faculty can paste into a grading comment box

Single-Point Rubric Generator

A single-point rubric defines only "proficient" performance, leaving space for specific feedback on both sides. Good for formative assessments.

Create a single-point rubric for this assignment:
Assignment: [paste assignment instructions]
Learning outcome(s): [paste outcome(s)]
Format the rubric as a table with three columns:
| Areas for Growth (blank for instructor comments) | Criteria & Proficient Description | Evidence of Exceeding (blank for instructor comments) |
Include 3–5 criteria. Write the proficient descriptions using specific, observable language. Below the table, provide a brief explanation of how single-point rubrics work, written for students.

Generate Feedback Comments for Grading

Pre-generate a bank of reusable feedback comments aligned to specific rubric criteria, saving time during grading.

I grade [assignment type, e.g., research papers] in a [course name] course. Here is my rubric: [paste rubric]
For each criterion at each performance level, generate 2–3 specific feedback comments I can paste into a grading comment box. Each comment should:
• Name the specific strength or issue
• Point to a concrete example or pattern in student work
• Suggest a next step or revision strategy (for levels below "Meets")
Keep the tone constructive and encouraging.

Course & Curriculum Design

Identify Gaps in a Syllabus or Unit Outline

You are an experienced instructional designer. I am building a [X-week] unit on [topic] for [course level] students. Here is my current draft outline: [paste outline]
Identify three potential gaps in conceptual scaffolding and suggest specific activities to address each. Format your response as a table with columns: Gap | Root Cause | Suggested Activity.

Generate Learning Outcomes (with Bloom's Alignment)

I am developing a [course or module name] for [audience, e.g., first-year community college students]. The key topics covered are: [list topics].
Write 5–7 measurable student learning outcomes using action verbs from Bloom's Taxonomy. For each outcome, identify the Bloom's level and suggest one assessment method that could measure it.
Format as a table: | Learning Outcome | Bloom's Level | Suggested Assessment |

Anticipate Student Misconceptions

Identify likely misconceptions before you teach a topic, so you can address them proactively.

I am about to teach [topic] in a [course level] course. Identify 5 common student misconceptions about this topic. For each one:
• State the misconception clearly.
• Explain why students tend to hold it (prior knowledge, intuitive reasoning, media, etc.).
• Suggest a brief classroom activity, question, or example that helps students confront and correct the misconception.
Format as a numbered list.

Lesson Plan Generator

Create a detailed lesson plan for a [X-minute] class session on [topic] for [course name and level]. Include:
• Learning objective(s) for this session
• A warm-up or hook activity (5 min)
• Main instructional activities with time estimates
• At least one active learning strategy (think-pair-share, case study, polling, etc.)
• A formative assessment or check-for-understanding activity
• A brief wrap-up or preview of next session
Modality: [in-person / online synchronous / hybrid]
Assumed prior knowledge: [what students already know]

Critical Thinking & Analysis

Multiple Perspectives Analysis

Useful for modeling the process of evaluating competing claims in class. Present the AI output as a starting point for student critique.

Give me three different perspectives on [your question]. For each perspective, explain what evidence supports it and what evidence contradicts it. Then tell me which perspective has the strongest support and why.

Break Analysis Paralysis

Analysis Paralysis Prompts

Three quick prompts for when you're stuck overthinking a decision or situation.

I'm overthinking this situation. What are three logical perspectives I might be missing?
What would a completely neutral third party say about this situation?
Give me two optimistic and two realistic ways to look at this.

Generate Counter-Arguments

Stress-test a thesis or argument before committing to it — useful for research and for modeling critical thinking in class.

Here is a thesis I am developing: [paste your thesis or argument] Generate 3–5 strong counter-arguments that a well-informed critic would raise. For each, cite the type of evidence that would support the counter-argument and suggest how I could address or rebut it in my writing.

Ask for Options Before Committing

Before committing to an approach on a complex task, surface options first.

What are three different ways I could structure a unit on [topic] for a [course level] course? For each option, describe the pedagogical rationale, likely student outcomes, and one potential drawback.

Writing & Communication

Critique a Draft Using a Rubric

Using the attached rubric, identify the two strongest and two weakest elements of this draft and explain why.
Rubric: [paste rubric]
Draft: [paste student draft or your own draft]

Generate Exemplar and Non-Exemplar Responses

Create sample "A" and "C" quality responses so students can self-assess before submitting.

Here is an assignment prompt I give to students: [paste assignment prompt]
Generate two sample responses:
1. An exemplar response that would receive the highest marks. It should demonstrate all the qualities described in my rubric or grading criteria.
2. A non-exemplar response that would receive a middling grade. It should contain specific, realistic weaknesses (vague claims, missing evidence, structural issues) that I can use to teach students what to avoid.
After each response, annotate 3–4 key features that make it strong or weak.

Email / Announcement Drafter

Draft an [email / LMS announcement / syllabus note] to [audience: students, colleagues, advisory board] about [topic].
Tone: [professional, warm, direct, formal]
Length: [2–3 paragraphs / under 200 words]
Key points to include: [list 2–4 points]
Keep the language clear and free of jargon.

Productivity & Administration

Turn Messy Notes into Structure

Convert raw meeting notes, brainstorming sessions, or email threads into clean, formatted documents.

Organize the following into a structured [doc type: brief, meeting notes, SOP, committee report]. Keep all facts, remove repetition, and flag open questions at the end. Notes: [paste]

Task Categorization (What Should AI Do?)

Identify which items on your to-do list are good candidates for automation versus human judgment.

Analyze these tasks and categorize them:
(1) AI can do this fully
(2) AI can assist significantly
(3) Delegate to a colleague
(4) I should do this myself
Explain why for each item.
Tasks: [paste your task list]

Committee / Grant Report Drafter

I need to write a [committee report / grant progress report / annual program review section]. Here are my raw notes and data points: [paste notes, bullet points, data]
Organize this into a professional narrative report with:
• An executive summary (3–4 sentences)
• Key findings or accomplishments
• Challenges or areas for improvement
• Next steps or recommendations
Length: approximately [X words or pages]. Tone: [formal / professional / conversational].

Tips for Getting Better Results

These principles appear consistently across the prompting literature and our own AI Academy materials:

Be specific about role, tone, and format from the start. Vague prompts produce generic outputs.

Use examples to anchor expected output — even one example dramatically improves quality.

Break complex requests into steps. Ask the AI to reason before answering.

Iterate. A first-draft prompt rarely produces the best result. Refine based on output.

Always verify. Treat every AI output as a strong first draft that requires your expert review, not a finished product.