What Is This?

Colleagues: Most of us have complicated feelings about AI: some curious, some skeptical, some totally on board, some quietly wondering what it means for our work as professors. This program is for all of you. You don't need to have made up your mind. You just need to be willing to spend a summer finding out.

Here is our view: AI is absolutely not everything, but it can help you in many ways. We hear consistent stories from once-skeptical colleagues who have become daily users of AI for all kinds of tasks (preparing exams, grading homework, automating experiments, and writing committee reports!). AI is not past, not future, but now. If you are still only hypothesizing about what AI can do in your classroom, trying it out once may change the way you think about this technology.
 
So, join us this summer. We invite you to participate in this wonderful program: discover what these tools can do, find where they fall short, and build something genuinely useful for your teaching before the Fall semester kicks off.

Program Overview

We have two participation tracks:

  • Group A is for faculty who want a free Claude subscription to explore AI from scratch. *Claude is an AI assistant similar to ChatGPT and Gemini, made by Anthropic.
  • Group B is for faculty who already have an AI tool and want to join and compete on their own.
Program Details

  • Duration: May 18 – Aug 10, 2026
  • Purpose: Explore AI tools hands-on, build something useful for your teaching, and share what you learned with colleagues.
  • Who is eligible: All Purdue College of Engineering teaching faculty (WL and Indy): tenure-track faculty, professors of practice, and lecturers. Rule of thumb: if you walk into a classroom and teach, you qualify.
  • Cohort size:
      Group A: We will select up to 100 participants to receive a free 3-month Claude Pro subscription. If entering the challenge as a team, you need to decide who on the team will receive the license.
      Group B: Unlimited number of participants and teams for those who use their own AI tools.
  • Grand Prize: 3 winners receive a 1-year Claude Max 5x subscription.
  • Time commitment: 1 workshop plus 10–15 hours of development time for a self-defined project.

Registration and Application

  • Application deadline: May 11, 2026
  • Notification of acceptance: May 18, 2026
  • Deliverable deadline: Aug 10, 2026
  • Announcement of Grand Prize winners: Aug 18, 2026
  • All participants must register, regardless of whether you are in Group A (need a free Claude license) or Group B (do not need a free Claude license). To register, complete the Google Form and briefly describe the teaching challenge you'd like to explore (in 100 words). Even a rough idea is fine.
  • For Group A, we will select up to 100 participants (or teams) based on the proposals. Applicants with less AI experience will be given higher priority.
  • You are more than welcome to form a team if you want to compete for something more complex. However, if you are applying for Group A, we will only provide one license per team.
  • This program is designed for you to build hands-on experience personally. Please do not ask your graduate students to do the project on your behalf; do the work yourself, because that's where the learning happens!
  • A note on data: please avoid including identifiable student information in any AI prompts or uploads.

Summer Workshops

Workshop schedule and speakers will be announced in May. We are working to invite representatives from Microsoft, Google, Walmart, and Amazon, as well as Purdue faculty. Workshops will be held over two or three sessions in June and July, approximately one hour each, in a hybrid format.

Your Commitment

All participants (Group A and Group B) agree to two things over the summer:

  • Attend at least one summer workshop. (Workshop schedule will be announced by May 18.)
  • Submit a teaching-related AI deliverable by Aug 10, 2026 (see Deliverable section below).

We estimate the total time investment at roughly 10–15 hours across the summer.

What We Offer

  • Group A participants receive a free 3-month Claude Pro account subscription, courtesy of the College (subject to standard daily/weekly usage limits).
  • Group B participants are welcome to join and compete; there is no cap or quota for this group.
  • Summer workshops feature speakers from industry and academia sharing hands-on experience with AI in teaching and research. (Schedule announced by May 18.)
  • After the program concludes, we will publish an internal faculty website showcasing all submissions with full attribution. Your work will be preserved and credited.
  • Three Grand Prize winners receive a 1-year Claude Max 5x subscription (valued at $1,200). The prize is a bonus, not the point. Every participant walks away with real experience, workshop access, and a finished deliverable.

Deliverable

Your deliverable should be something genuinely useful to other faculty. It must:

  • Include a presentation using the provided pptx template (download)
  • Be supported by an actual artifact, such as a short video clip, a blog post, an app, or an agent.
  • Involve at least one AI tool (Claude, ChatGPT, Gemini, or similar).
  • Show concretely how you used AI and share lessons others can apply.

You retain full ownership of your deliverable and are free to publish or share it externally.

Examples

  • A blog post walking through how to generate a customized exam in Claude by feeding it past homework and exam questions, with specific instructions on tone, difficulty, and length.
  • A simulation demo of textbook examples for your class.
  • A short app or script that pulls exam questions from Gradescope and auto-generates grading feedback.
  • A recorded walkthrough showing how to build a course website using Claude or Gemini.

Evaluation Criteria for Grand Prize

The Grand Prize is a bonus. Every participant benefits from the program regardless of outcome. For those who wish to compete, submissions will be evaluated on:

  • Usefulness: Does it solve a real teaching problem?
  • Replicability: Can other faculty adopt or adapt your approach?
  • Learning value: Do you share lessons others can act on?
  • Complexity: How ambitious is the project relative to your starting point?

Evaluation Committee

We have strong support from the schools: your school's AI representatives will serve as our judges. Evaluation will be based on the pptx you submit, plus any supplementary videos or blog posts. A semi-automated, AI-based evaluation system will perform an initial screening, after which school representatives will manually assess the submissions and vote on the winners. So, to help your team pass the automated screening, please follow the pptx template!

For questions about the evaluation, please contact David Inouye (dinouye@purdue.edu).

Resources

Our colleague Eugenio Culurciello is curating a list of recommended YouTube videos and tutorials on using AI in teaching. We will share the full list with all participants by May 18, 2026.

In the meantime, here are a few resources to get you started:

Contact