ChirpAI: Augmenting Classroom Dialogue via AI-mediated Interaction


Transforming traditional "Exit Tickets" into a bridge for student-teacher dialogue—leveraging LLMs to provide actionable pedagogy for the US K-12 market.

[Image: ChirpAI product interface demo]
ChirpAI is an AI-powered educational platform designed for the US K-12 market. It enables teachers to collect real-time student feedback and translates it into visualized teaching insights and personalized learning suggestions.
Role
Product Designer (UR → UX → UI)
Domain
AI, EdTech, SaaS
Timeline
Apr 2024 – Jan 2025 (Concept to MVP)
Company
ViewSonic
Platform
Web (Responsive)
Team
1 PM, 3 AI Engineers, 4 Developers (FE/BE)

TL;DR

Strategy & Innovation

  • 0-to-1 Product Lead: Spearheaded the AI-native SaaS journey for real-time classroom interaction.
  • The Pivot: Redefined focus from "Post-class Testing" to Real-time Interaction to bridge learning gaps.
  • Insight Visualization: Designed the interface to transform complex LLM outputs into a glanceable Binary Tier system for immediate teacher action.
  • AI-UX Quality System: Developed a content evaluation scale and a tri-functional workflow (AI/UI/PM) to synchronize technical logic with user experience.

Quantifiable Impact

  • 100% (10/10) Pilot Intent: Secured commitment for trials from all interviewed US school administrators.
  • 86% (6/7) Purchase Advocacy: Strong intent from trial educators to advocate for departmental procurement.
  • 90% (9/10) Efficiency Gain: Users reported a significant reduction in feedback loop overhead.
  • Market Entry: Secured the first institutional deal in Taiwan following the Beta program.

The 0-1 Journey: Pioneering AI-First SaaS in a Hardware Legacy

As a Solo Product Designer, I led the 0-to-1 evolution of ChirpAI, ViewSonic’s first AI-native SaaS. This wasn’t just a design project; it was a digital transformation effort to bridge a hardware-centric legacy with the future of AI-driven classroom intelligence. I orchestrated the entire lifecycle—from decoding fragmented market policies to delivering a pilot-ready MVP.

Key Achievement: Successfully achieving a 100% pilot intent from US school administrators by solving a critical data-visibility problem.

The Challenge: Breaking the “One-Size-Fits-All” Barrier

A Silent Crisis in US K-12 Classrooms

In a typical US classroom of 30 students, teaching is often a "One-to-Many Broadcast." While the teacher delivers a lesson, the "silent majority" of students often fall through the cracks because their individual needs are invisible in real-time.

[Image: classroom photo via Unsplash]

Quantifying the Gap

  • Teacher Burnout: 33% of educators are too overwhelmed by administrative tasks to provide 1-on-1 mentorship.
  • Student Isolation: 19% of students navigate their entire learning journey in total silence, receiving zero personalized guidance.
  • The Trust Barrier: 25% of students feel no meaningful connection with any adult at school, making them less likely to ask for help.
Project image

First Concept: The AI-Driven Formative Assessment

To validate our core hypothesis, we architected a “Check for Understanding” (CFU) framework. The goal was to digitize the feedback loop using an AI-mediated system that transforms raw engagement into targeted intervention.

| Phase | AI-mediated Logic | Educational Value |
| --- | --- | --- |
| 1. Assessment Creation | Curriculum-Led Generation: automated assessment creation based on selected teaching standards. | Removes the manual "prompt engineering" burden for teachers. |
| 2. Student Interaction | Guardrail-First Chatbot: a "Topic-and-Level-Constrained" interface for student engagement. | Ensures pedagogical safety and prevents hallucinations. |
| 3. Insight & Intervention | 3x3 Matrix Analysis: automated clustering into 9 proficiency tiers for a second-round intervention. | Scales differentiated learning without increasing teacher workload. |
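
For illustration, here is a minimal TypeScript sketch of what the Phase 2 "Topic-and-Level-Constrained" guardrail could look like in code. Every identifier (GuardrailConfig, buildSystemPrompt) and the prompt wording are assumptions made for this sketch, not the production implementation.

```ts
// Illustrative sketch only: field names and prompt wording are assumptions,
// not ChirpAI's actual guardrail implementation.
type ProficiencyLevel = "emergent" | "developing" | "proficient";

interface GuardrailConfig {
  topic: string;           // the teacher-selected lesson topic
  gradeBand: string;       // e.g. "Grade 6-8"
  level: ProficiencyLevel; // constrains vocabulary and question depth
  blockedThemes: string[]; // themes the chatbot must redirect away from
}

// Compose a constrained system prompt so the chatbot stays on-topic,
// age-appropriate, and within the selected proficiency level.
function buildSystemPrompt(cfg: GuardrailConfig): string {
  return [
    `You are a classroom assistant for a ${cfg.gradeBand} lesson on "${cfg.topic}".`,
    `Only discuss this topic; politely redirect anything else back to it.`,
    `Match your questions and vocabulary to a "${cfg.level}" learner.`,
    `Never discuss: ${cfg.blockedThemes.join(", ")}.`,
    `If unsure, say so and suggest asking the teacher; never invent facts.`,
  ].join("\n");
}

console.log(buildSystemPrompt({
  topic: "Photosynthesis",
  gradeBand: "Grade 6-8",
  level: "emergent",
  blockedThemes: ["grading decisions", "personal data"],
}));
```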

Strategic Pivot: Failing Fast to Find PMF

Through early experimentation, we identified critical gaps in our initial direction. These iterations allowed us to refine our product-market fit before scaling.

| Iteration | Hypothesis | The Gap | Strategic Pivot |
| --- | --- | --- | --- |
| Trial 1 | Post-Class Assessment: a quiz tool to verify understanding after class. | High friction: students felt over-tested and teachers over-burdened; participation was low without instant feedback. | Shift from "Post-class Testing" to "In-class Interaction" to capture real-time engagement. |
| Trial 2 | Topic-Based Insights: a dashboard clustering performance by topic. | Low actionability: required student-platform ecosystem support outside the product scope. | Shift from "Data Visualization" to "Actionable Pedagogy," using LLMs to group students for immediate intervention. |
| Trial 3 | 3x3 Granular Matrix: a 9-grid system (High/Mid/Low) to categorize students. | Signal vs. noise: dialogue data proved too unstructured for accurate 3-tier grading. | Optimized to two levels: Priority Topics × Binary Tiers (Emergent Understanding vs. Deep Comprehension) to keep high-stakes intervention accurate. |
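
To make the pivoted insight model concrete, below is a hedged sketch of how "Priority Topics × Binary Tiers" might be represented and grouped for a teacher-facing view; the type names and helper are illustrative assumptions only.

```ts
// Sketch of the simplified insight model after the pivot: per priority topic,
// each student lands in one of two tiers. Names are illustrative assumptions.
type Tier = "emergent-understanding" | "deep-comprehension";

interface StudentInsight {
  studentId: string;
  tier: Tier;
  evidenceQuote?: string; // short dialogue excerpt that justifies the tier
}

interface TopicInsight {
  priorityTopic: string; // e.g. "Conservation of energy"
  students: StudentInsight[];
}

// Group an LLM-classified roster into the two buckets a teacher can act on
// immediately: who needs a second-round intervention vs. an extension task.
function splitByTier(topic: TopicInsight): Record<Tier, StudentInsight[]> {
  const buckets: Record<Tier, StudentInsight[]> = {
    "emergent-understanding": [],
    "deep-comprehension": [],
  };
  for (const s of topic.students) buckets[s.tier].push(s);
  return buckets;
}
```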

From Internal Insights to Market Strategy: Shaping the MVP Direction

The initial pivots were more than internal trial and error. Before committing to the final MVP build, we consolidated our internal vision with external market constraints, making sure we were solving the right problem with the right technology.

Constraints

  • Scenario Fragmentation: Confusion between pre-class, in-class, and after-class user scenarios.
  • Market Saturation: A crowded landscape of established AI "test" and "practice" tools.
  • Data Scalability: Extracting stable checkpoints is hindered by fragmented regional curricula, requiring extensive timelines for data crawling and RAG architecture development.

Enablers

  • Policy & Budget Shifts: Favorable changes in US government educational funding.
  • Refining "Insights": The demand for data-driven insights remains strong, provided the content is more actionable and integrated.

Lean Discovery: Rapidly Grounding the Core Value Proposition

To navigate the ambiguity of a new market, I spearheaded a series of Lean Discovery cycles. Rather than pursuing formal academic rigor, we focused on rapidly triangulating high-impact pain points from fragmented market signals. This agile approach allowed us to move fast while ensuring the 0-1 trajectory was anchored in objective US policy and stakeholder needs.

| Study | Methodology | Design Impact |
| --- | --- | --- |
| Policy & Ecosystem Mapping | Desktop research: synthesized complex US frameworks (MTSS, ESSA, SEL) and intervention systems. | Mitigated early-stage risk: bypassed low-viability directions and established a validated project foundation. |
| In-Class Activity Habits | Interviews & routine field observations: mapped activity workflows with teachers and US trainers to identify pain points. | Feature integration: standardized UI components and introduced "Exit Ticket" features aligned with curriculum standards. |
| B2B Stakeholder Mapping | Stakeholder interviews: analyzed AI perceptions and concerns across the "Class-School-District" hierarchy. | User vs. buyer strategy: visualized ecosystem priorities to distinguish buyers from end users and define core requirements. |
| Cross-Cultural Concept Testing | Comparative concept testing: evaluated 3 prototype variations to probe instructional mental models and expectations. | Strategic pivot: identified a critical cultural gap, shifting the US product strategy from "Control" to "Student Empowerment." |

From Concept to MVP

Information Architecture: The “Talk”-Centric Model

Constructed a "one-to-many" interaction model using "Talk" as the foundational unit. By referencing the state management of survey platforms, I designed a scalable flow to handle asynchronous interactions between teachers, students, and AI. This core logic unified the entire system, from sitemaps and permissions to the final UI layout.
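
As a rough illustration of the "Talk"-centric model, the sketch below encodes a Talk as the one-to-many unit with a survey-style lifecycle. State names and fields are assumptions, not the actual schema.

```ts
// Illustrative data model for the "Talk" unit. State names mirror the
// survey-platform lifecycle referenced above; all identifiers are assumptions.
type TalkState = "draft" | "live" | "paused" | "closed";

interface Talk {
  id: string;
  teacherId: string;          // one teacher owns the Talk
  studentIds: string[];       // many students join the same Talk
  scenarioTemplateId: string; // which classroom scenario it was created from
  state: TalkState;
  createdAt: Date;
}

// Legal transitions for the one-to-many session: drafts go live, live sessions
// can pause or close, and closed Talks are immutable.
const transitions: Record<TalkState, TalkState[]> = {
  draft: ["live"],
  live: ["paused", "closed"],
  paused: ["live", "closed"],
  closed: [],
};

function canTransition(from: TalkState, to: TalkState): boolean {
  return transitions[from].includes(to);
}
```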

Project image

Design Highlights

Create: Scenario-Driven Talk Creation

  1. Streamlined the AI setup process by offering context-specific templates based on core classroom scenarios. This “template-first” approach minimizes decision friction and ensures each session is powered by a tailored AI prompt structure, input logic, and insight goals.
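
A hypothetical sketch of what a "template-first" scenario could bundle: a prompt scaffold, an input mode, and an insight goal. Field names and values are placeholders, not the shipped template format.

```ts
// Hypothetical shape of a scenario template; field names are assumptions
// meant to mirror the "prompt structure + input logic + insight goal" bundle.
interface ScenarioTemplate {
  id: string;
  name: string;           // e.g. "Exit Ticket", "Warm-Up Check"
  promptScaffold: string; // pre-written AI instructions with slots
  inputMode: "chat" | "short-answer";
  insightGoal: "misconception-scan" | "prior-knowledge" | "reflection";
}

const exitTicket: ScenarioTemplate = {
  id: "exit-ticket",
  name: "Exit Ticket",
  promptScaffold:
    "Ask each student 2-3 short questions about {{topic}} and probe their reasoning.",
  inputMode: "chat",
  insightGoal: "misconception-scan",
};
```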

High-Efficiency Configuration Pipeline

  1. Asynchronous Updates: Teachers can upload materials and refine prompts without freezing the UI.
  2. AI-Assisted Metadata: The system automatically extracts educational standards and generates chatbot descriptions from uploaded content.
  3. Live Preview: Integrated a real-time testing panel for instant verification of the AI’s persona and logic.
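
Sketching the non-blocking flow described above: material is uploaded, metadata extraction runs in the background, and the editor stays responsive. The extractMetadata call is a stand-in for whatever model call the real pipeline makes.

```ts
// Sketch: upload material, then extract standards and a chatbot description in
// the background so the editor stays responsive. Names here are assumptions.
interface TalkMetadata {
  standards: string[];        // e.g. extracted curriculum-standard codes
  chatbotDescription: string;
}

// Placeholder for the model call; in this sketch it resolves with dummy data.
async function extractMetadata(fileText: string): Promise<TalkMetadata> {
  return {
    standards: ["CCSS.ELA-LITERACY.RI.6.2"],
    chatbotDescription: fileText.slice(0, 80),
  };
}

async function onMaterialUploaded(
  fileText: string,
  render: (m: TalkMetadata) => void,
): Promise<void> {
  // Fire-and-forget: the UI keeps accepting prompt edits while this resolves.
  extractMetadata(fileText)
    .then(render)
    .catch(() => render({ standards: [], chatbotDescription: "" })); // degrade gracefully
}
```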

Inclusive & Safe Student Interaction

  1. Real-time Safeguards: Integrated an AI-driven monitoring system that flags sensitive topics with labels, triggering immediate notifications for teacher intervention.
  2. Bilingual Support: A “Swift Switch” language toggle allows students to interact in their preferred language while maintaining a unified data view for the teacher.
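
A minimal sketch of the runtime safeguard path, assuming a hypothetical classifier and notifier: a sensitive message gets a label and immediately pings the teacher, while the message itself carries the student's chosen language for the bilingual view.

```ts
// Illustrative safeguard flow; the classifier and notifier are stand-ins,
// not ChirpAI's actual moderation service.
type SafetyLabel = "self-harm" | "bullying" | "sensitive-topic" | "ok";

interface StudentMessage {
  talkId: string;
  studentId: string;
  text: string;
  language: "en" | "es"; // the "Swift Switch" preference; teacher sees a unified view
}

async function classify(text: string): Promise<SafetyLabel> {
  // Stand-in for an LLM or classifier call.
  return /hurt myself/i.test(text) ? "self-harm" : "ok";
}

async function handleMessage(
  msg: StudentMessage,
  notifyTeacher: (m: StudentMessage, label: SafetyLabel) => void,
): Promise<void> {
  const label = await classify(msg.text);
  if (label !== "ok") {
    // Flag the message with a label and ping the teacher for intervention.
    notifyTeacher(msg, label);
  }
}
```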

Real-time Talk Results: Capturing Student Thinking Mid-Session

  1. Enabled teachers to capture the “pulse” of the classroom without ending the session. The dashboard provides a Live Activity Feed for instant feedback, while AI automatically summarizes safeguard triggers and extracts key student quotes to highlight learning gaps.
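
One plausible wiring for the live view, assuming a hypothetical event-stream endpoint: the dashboard subscribes to Talk events (responses, safeguard triggers, key quotes) and can unsubscribe without closing the session.

```ts
// Sketch of a live results subscription using the browser EventSource API;
// the endpoint and event payload are hypothetical, used only to illustrate
// "seeing results without ending the session".
interface TalkEvent {
  kind: "response" | "safeguard-trigger" | "key-quote";
  studentId: string;
  payload: string;
}

function subscribeToTalk(
  talkId: string,
  onEvent: (e: TalkEvent) => void,
): () => void {
  const source = new EventSource(`/api/talks/${talkId}/events`); // assumed endpoint
  source.onmessage = (msg) => onEvent(JSON.parse(msg.data) as TalkEvent);
  return () => source.close(); // stop watching without closing the Talk itself
}
```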

UX Writing Guide for Education

To ensure a consistent and expressive product voice, I established a UX writing framework based on NN/g guidelines, co-developed with a U.S. Education Ph.D. This guide serves as the foundation for our AI’s personality and safety:

  1. Pedagogical Tone Strategy: Strategic use of encouraging vs. authoritative tones to guide students effectively.
  2. User-Specific Adaptation: Tailored vocabulary and tone nuances for distinct teacher and student interactions.
  3. Interaction Safeguards: Defined communication boundaries and “taboos” to ensure AI-user safety and professionalism.
  4. Linguistic Standards: A centralized system of Voice & Tone adjectives and a product-wide Glossary.
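
To show how such a guide could be made machine-enforceable, here is a placeholder encoding of a voice profile folded into the student-facing prompt; the adjectives, taboos, and glossary entries are examples, not the actual guide.

```ts
// Placeholder encoding of the writing guide; the adjectives, taboos, and
// glossary entries below are examples, not the real ChirpAI style guide.
interface VoiceProfile {
  audience: "teacher" | "student";
  toneAdjectives: string[];         // e.g. encouraging, concise
  taboos: string[];                 // phrasings the AI must never use
  glossary: Record<string, string>; // canonical product terms
}

const studentVoice: VoiceProfile = {
  audience: "student",
  toneAdjectives: ["encouraging", "curious", "plain-spoken"],
  taboos: ["you failed", "grading decisions", "medical or legal advice"],
  glossary: {
    Talk: "a live classroom conversation",
    "Exit Ticket": "a short end-of-class check",
  },
};

// Fold the profile into the system prompt so every reply follows the guide.
function applyVoice(basePrompt: string, v: VoiceProfile): string {
  return `${basePrompt}\nTone: ${v.toneAdjectives.join(", ")}. Never say: ${v.taboos.join("; ")}.`;
}
```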

Product Market Fit Evaluation

Hypotheses

  1. Addressing Differentiated Learning Needs: “In the past, my teaching methods couldn’t effectively accommodate students who were falling behind. ChirpAI provides an excellent channel to bridge that gap.”
  2. Empowering Student-Centered Teaching to Boost Motivation: “The chatbot guides me on how to express my thoughts, which makes me feel truly understood.”

Recruiting and Study

  • Participants were recruited through online seminars, on-site teacher training sessions, product demos, and the ViewSonic US educator community.
  • From an initial pool of 49 participants, we collected 10 valid samples (3 online and 7 on-site), including both IT specialists and educators.

User Validation Results (3 Online + 7 On-site)

Through structured interviews and concept testing, we quantified the product’s impact across three key dimensions:

  • 10/10 Pilot Intent: Secured trial commitments from all interviewed US school administrators.
  • 6/7 Purchase Advocacy: High intent from trial educators to advocate for institutional procurement.
  • 9/10 Efficiency Gain: Users reported a significant reduction in feedback loop and administrative overhead.

The Core Differentiator

The primary driver behind these results is ChirpAI’s distinction from ChatGPT:

Unlike a standard chatbot, this platform empowers teachers to manage and monitor AI-student interactions in real-time—fostering a safe, pedagogically-sound environment that motivates learning.

Interview Quotes

  • Existing ViewSonic customers value the potential for student data integration and future cohesive solutions.
  • Non-ViewSonic customers see ViewSonic’s AI innovation as a strong branding opportunity.

Market Traction: Taiwan Pilot

Conversion: 100% (2 out of 2 schools) expressed intent to purchase or collaborate.

Target Audience: International and bilingual schools.

Real-world Adoption: After initial cold outreach and demos, teachers immediately adopted the product, incorporating it into their lesson plans the very next week.

Project image

Reflections: My Framework for AI-mediated Interaction

Building ChirpAI taught me that AI success is measured by the quality of human augmentation. I’ve distilled my learning into four pillars:

  1. Intelligence: Transforming raw data into context-aware insights. True value is created when the user perceives “intelligence” through actionable classroom-bound suggestions.
  2. Modality: The choice of interaction (chatbot vs. file-input) is a strategic one. It must be dictated by domain-specific scenarios, not default patterns.
  3. Human-in-the-Loop: What matters is the timing of AI intervention. The goal is to automate repetitive effort while preserving creative space for user customization.
  4. AI Ethics: Defining the limit of AI commitment. Trust is built by preventing over-reliance and keeping the AI strictly within pedagogical boundaries.

Final Acknowledgments

This 0-to-1 journey was a rare and rewarding challenge. I am grateful for the opportunity to have shaped the entire user experience from the research phase onward. Deep thanks to my PM, Abbie, for the late-night brainstorming and discussion.

US B2B Education: Full-Lifecycle Ecosystem Research


A comprehensive study mapping the US B2B education journey from procurement to classroom usage, synthesizing insights from all stakeholders: buyers, IT, teachers, students, and parents.

UNIVERSE: Architecting Spatial Collaboration


Defining interaction standards for the 3D virtual classroom - balancing technical constraints with intuitive usability to drive student engagement.