Co-Constructing AI Boundaries Framework Component - Prompts
Definition in This Study
Prompts refers to how students directed and constrained the AI model, or collaborated with it, through their questions, commands, and conversational moves. This component captures the specific language and strategies students used to shape the AI's responses.
Mollick & Mollick (2023) Connection
The Prompts component mirrors M&M's core mechanism of interaction:
- Assignment roles (Tutor, Coach, Mentor, Teammate, etc.)
- The function of prompting to shape AI behavior
- Strategic use of constraints and direction
Key M&M Principle: Prompts are the central tool for defining and controlling the AI interaction.
The M&M paper's subtitle ends in "with Prompts," and the paper dedicates sections of example prompts to all seven approaches, underscoring the critical role of prompting in shaping learning experiences.
What This Component Analyzes
Primary Focus
- Prompt complexity: Simple/generic vs. complex/specific
- Constraint level: Open-ended vs. constrained
- Cognitive demand: What kind of thinking does the prompt require?
- Role assignment: Does the student assign the AI a specific role?
Secondary Focus
- Evidence of iteration (revising prompts based on outputs)
- Metacognitive awareness in prompt design
- Strategic use of prompt engineering techniques
Agency in Prompts: The Cognitive Task Decision
This component captures Agency over Cognitive Task:
| Evidence of High Agency | Evidence of Low Agency |
|---|---|
| Complex, multi-step prompts | Generic, single-step prompts |
| Constraining language (boundaries) | Open-ended, vague requests |
| Demands critique, synthesis, comparison | Asks for summary or generation |
| Assigns specific roles to AI | No role assignment |
| Iterates on prompts strategically | One-shot prompting |
Prompt Types and Cognitive Demand
Low Cognitive Demand (Low Agency)
- Summarization: "Summarize this article"
- Generation: "Write a paragraph about..."
- Definition: "What is critical literacy?"
Moderate Cognitive Demand
- Comparison: "Compare Freire and hooks on literacy"
- Application: "Apply this theory to my teaching context"
- Organization: "Organize these themes"
High Cognitive Demand (High Agency)
- Critique: "Critique this argument using critical race theory"
- Synthesis: "Synthesize these three perspectives into a coherent framework"
- Evaluation: "Identify logical fallacies in this text"
- Devil's Advocate: "Challenge my thesis from a [specific] perspective"
Boundary-work in Prompts
Students engage in Boundary-work through:
- Constraining AI capabilities: "Only use these sources; don't search the web"
- Setting behavioral boundaries: "Don't write the paper; help me think"
- Limiting AI authority: "Suggest options, but I'll decide"
- Demanding transparency: "Explain your reasoning"
M&M Assignment Roles in Student Prompts
| M&M Role | Purpose | Example Student Prompt |
|---|---|---|
| AI as Tutor | Explain concepts, answer questions | "Explain the zone of proximal development" |
| AI as Coach | Prompt metacognition, guide process | "What questions should I ask myself as I analyze this?" |
| AI as Mentor | Long-term guidance, goal-setting | "Help me plan my research project over the next month" |
| AI as Teammate | Collaborative problem-solving | "Let's brainstorm solutions together" |
| AI as Student | Student teaches AI to deepen learning | "I'll teach you about critical literacy" |
| AI as Simulator | Practice scenarios or dialogues | "Simulate a parent-teacher conference" |
| AI as Tool | Mechanical task completion | "Check this text for grammar errors" |
Key Analytic Questions
When coding Prompts, ask:
- Complexity:
  - Is the prompt simple or multi-layered?
  - Does it require multiple cognitive operations?
- Constraint:
  - Does the student limit the AI's scope or behavior?
  - Are boundaries explicitly set?
- Cognitive Demand:
  - What level of thinking does the prompt require from the AI?
  - Summarize? Critique? Synthesize?
- Role Assignment:
  - Does the student assign the AI a specific role?
  - Which M&M role (if any)?
- Strategic Intent:
  - Is there evidence of prompt-engineering knowledge?
  - Does the student revise prompts based on results?
Examples from Data
High Agency Prompt
"Act as a critical reviewer trained in critical race theory.
Analyze this lesson plan and identify where it might
inadvertently perpetuate deficit thinking about students
of color. Provide specific examples and suggest revisions."
Analysis:
- High complexity (multiple steps)
- Constraining (specific theoretical lens)
- High cognitive demand (critique, application)
- Clear role assignment (critical reviewer)
Low Agency Prompt
"Summarize these articles."
Analysis:
- Low complexity (single step)
- No constraints
- Low cognitive demand (summarization)
- No role assignment
Connection to Epistemic Stance
Prompt design reveals epistemic stance:
- Self-authoritative: Uses AI as thinking partner, not answer generator
- AI-authoritative: Asks AI to produce final knowledge products
- Co-constructed: Engages in dialogue, iterates collaboratively
Coding Categories for Prompts
| Code | Definition | Example |
|---|---|---|
| Generic Ask | Simple, common prompt | "Summarize this" |
| Role Assignment | Assigns specific role | "Act as a literacy coach" |
| Constraining | Sets explicit boundaries | "Only use these sources" |
| Devil's Advocate | Requests critical challenge | "Challenge my argument" |
| Synthesis | Requests integration of multiple ideas | "Synthesize these three theories" |
| Critique | Requests critical analysis | "Critique this from a feminist lens" |
| Iterative Refinement | Revises prompt based on output | "Actually, focus more on..." |
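For analysis, the codebook above could be expressed as a simple data structure. The sketch below is a hypothetical illustration (Python assumed; the `CODEBOOK` keyword cues and the `first_pass_codes` name are invented, not part of the framework): a keyword-based first pass that flags candidate codes in chat-log prompts before human coding.

```python
# Hypothetical sketch: the Prompts codebook as a dictionary of keyword cues.
# Cues are illustrative only; human coders review and make the final call.
CODEBOOK = {
    "Generic Ask": ["summarize", "write a paragraph", "what is"],
    "Role Assignment": ["act as", "you are a", "pretend you are"],
    "Constraining": ["only use", "don't", "do not"],
    "Devil's Advocate": ["challenge my", "argue against", "push back"],
    "Synthesis": ["synthesize", "combine these", "integrate"],
    "Critique": ["critique", "identify logical fallacies", "evaluate"],
    "Iterative Refinement": ["actually,", "instead,", "focus more on"],
}

def first_pass_codes(prompt: str) -> list[str]:
    """Return candidate codes whose keyword cues appear in the prompt text."""
    text = prompt.lower()
    return [code for code, cues in CODEBOOK.items()
            if any(cue in text for cue in cues)]

# The high-agency example above surfaces Role Assignment and Critique.
print(first_pass_codes(
    "Act as a critical reviewer trained in critical race theory. "
    "Critique this lesson plan and suggest revisions."
))
```

Such a pass only surfaces candidates; the agency and boundary-work judgments described above remain analyst decisions.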
Relationship to Other Framework Components
- ← Co-Constructing AI Boundaries Framework Component - Inputs: Inputs provide context; prompts direct action
- → Co-Constructing AI Boundaries Framework Component - Outputs: Prompt quality influences output quality
- → Co-Constructing AI Boundaries Framework Component - Integration: High-demand prompts may yield more useful outputs
Pedagogical Implications
Teaching effective prompting:
- Model prompt progression (simple → complex)
- Teach constraint setting for boundary-work
- Practice assigning roles strategically
- Discuss cognitive demand in prompts
- Reflect on prompt-output relationships
- Encourage iteration and refinement
Data Collection Notes
Where to find evidence:
- NotebookLM chat logs (every prompt)
- Student reflections on prompting strategies
- Iteration patterns (revisions to prompts)
- Explicit metacommentary about prompting
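One way to organize this evidence is to store each student prompt from the chat logs as a single coded record. The sketch below is a minimal assumed structure, not the study's actual instrument; field names such as `student_id`, `mm_role`, and `memo` are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CodedPrompt:
    """One student prompt from a NotebookLM chat log, with analyst codes attached."""
    student_id: str
    turn: int                                        # position of the prompt in the chat log
    prompt_text: str
    codes: list[str] = field(default_factory=list)   # e.g., ["Constraining", "Critique"]
    cognitive_demand: str = ""                       # "low", "moderate", or "high"
    mm_role: Optional[str] = None                    # e.g., "Tutor", "Coach", or None
    memo: str = ""                                   # analyst notes on agency / boundary-work

# Example record for a boundary-setting prompt.
example = CodedPrompt(
    student_id="S01",
    turn=3,
    prompt_text="Don't write the paper; help me think through my argument.",
    codes=["Constraining"],
    cognitive_demand="high",
    mm_role="Coach",
    memo="Explicit behavioral boundary; student keeps the cognitive task.",
)
```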
Related Notes
- Analytic Framework for AI Human Meaning-Making Practices
- How learners should engage Large Language Models framework
- Agency
- Boundary-work
- Epistemic Stance
Tags
#framework-component #prompts #cognitive-demand #agency #AI-literacy