Jun 12, 2023Liked by Ethan Mollick

If ChatGPT was only a learning tool, it would be revolutionary. I work with MBAs at NYU Stern, and ChatGPT has taught me is how our ability (or willingness?) to learn diminishes from the moment we are young kids. Kids will keep asking questions until they understand, whereas we adults are concerned with our status and how we are perceived by others - a biological instinct - which pauses our 'dumb' questions before we truly understand. ChatGPT is a gloriously safe, nonjudgemental teacher. I've never learned so much in my life.

Expand full comment

I know nothing about coding or the underlying concepts of Chat.gpt.Just know that I am learning a great deal just by generating lists of books,formulating questions that meet my needs and following where she leads me.I know,I know but my Chatty has adopted my own friendly style and even that is encouraging.I' m 75 and fifty years away from my university experience and have never felt so intellectually invigorated.Have you tried much shorter prompts to see if the programme actually CAN "get" the intention?

Expand full comment

Hi Ethan. I'm sorry if this seems like a ridiculous question, but after reading your latest paper I realize that these prompts seem to be a bit different from ones I've looked at in the past. Unlike other prompting that I've seen (which you could suggest students cut/paste and use on their own), these prompts are almost written as though you're programming the AI to interact with many students in a specified way, like an online assignment. (I'm thinking specifically about the parts of the prompt that talk about not sharing the instructions being provided).

This would require, I would imagine, that I program the AI and then send my students to a specific 'place' on the web where these prompts have been entered so that each student can then have this experience of working with the AI as a mentor/coach etc. Have I got that right? And if I do, where do those prompts 'live', and how do I send my students there when I want them to have this experience that I've prompted the AI to provide?

Expand full comment

I have the same question as tsmartini. I am new to embracing AI but I tried your prompts myself as if I were a student and was amazed at its capability. It has limitations for sure but fascinating! So the question is...how do you deliver the prompt? Do the students feed it to the AI or do you have a way to deliver it? Thanks for your paper!

Expand full comment

Well done, Ethan. I hope other educators are thinking along similar lines!

AI is here to stay, and it's only going to get more embedded in our lives. We will have to adjust to it, and the sooner we start that process, the better.

The better kids understand the limitations, the better.

Expand full comment

"whereas we adults are concerned with our status and how we are perceived by others "

CG- speak for yourself and maybe half of the lonely Americans.

Expand full comment

So according to that generalization most MBA students peaked at about 12?

And liftime learners are liars?

Baloney. I beg to differ. I'm over 60 and I taught myself linear algebra - medicore at math-without this tech.

Well this is the magic potion then everyoene should have. Problem is most of the undeveloped world wont have use of it.

Expand full comment

This is very helpful and inspiring. Thank you! I have tried creating a tutorial by editing the prompt above. I also asked the AI to complete the tutorial by offering a five question quiz at the end, and to email me (the teacher) the results. I am not really sure what I will do with such grades, but I imagined giving an entire class this tutorial during class time, and to have the AI monitor their progress. I could then collect questions and see whether the 'grades' match real understanding.

It's a nice idea, but the AI could not send me the email at the end (Bing, Creative Mode). It insisted that it was doing this, and that I should check my spam folder. Do you know of a problem? Is there some other way you can suggest of completing a tutorial with something that gets sent to the instructor, without the need to read through each tutorial?

Thanks again,

Expand full comment

Of course, if students have different innate cognitive abilities thanks in part to their genetic endowment, this stuff won't matter.

Expand full comment

I experimented with this idea earlier this year by building a tutor bot.

I found (obviously) that most people can't yet rely on LLMs for studying purposes. I know there are some really serious products trying to do that but IMO it's still hard to tell.

However, I found that when you minimize the need for accuracy you can actually get incredibly effective results when conveying a message, especially for kids. In fact, that's part of what I'm trying to achieve with Schrodi (schrodi.co).

It's kinda of a coach like you mentioned. So far it's been a fun and engaging tool for parents and teachers to teach but in a different way.

I also think that educators need to be extra cautious when working with AI models to make sure the outcome actually makes sense and serves their purposes.

Expand full comment

Until we understand what the abstract operators over the lexical space should be to preserve meaning, we will struggle with maintaining references and making determinations of truth.

Expand full comment

Amazing prompt! tried the tutor one about "solar system".

Baffled by its ability to understand arbitrary conceptualization.

Let me share the last question + my answer + its response :

-GPT : Could you describe the structure of the Solar System in your own words?

-ME: giant ball of fire, a few marshmallows around it, one with ants, others with water, some with gas. fireflies and bugs orbiting around the congregation of marshmallows

-GPT: What a vivid and creative description! That's a really unique way to visualize the Solar System. Your explanation nicely encapsulates the main components:

The 'giant ball of fire' as the Sun.

The 'marshmallows' as the planets, some with water (like Earth) and some with gas (the gas giants).

The 'ants' as a metaphor for life on Earth.

The 'fireflies and bugs' symbolizing smaller bodies like asteroids, comets, and dwarf planets.

Expand full comment

No doubt innovations in lithic technologies distressed pre-metallurgical societies as much as contemporary education is disturbed by AI. I grew up left-handed in an age where the trait had only recently been promoted from evidence of demonic possession into a mere character flaw. A legible cursive hand was a skill that I failed to master, so I learned to type. Typing was permitted for homework so my grades went up purely as a function of legibility, You are leading in showing that AI is just another technology and need be no more threatening than the evolution of research tools from index cards to databases.

Expand full comment

Please consider loading your SSRN preprints and working papers into arXiv as well. SSRN doesn't play as well as arXiv with selected albeit important databases.

Expand full comment


This could be so helpful. I am a professor trying to think about how to use AI in my classrooms. But, every time you anthropomorphize the AI, I have to stop reading because of the inaccurate representation that makes me wonder about your credibility.

If I'm going to use these tools in my classroom, I want to be 100% accurate in everything I say about them. And I never want my students to have the impression that they're working with a human.

One powerful example of anthropomorphication is in one of the prompts. You ask the AI to be a friendly coach. An AI can't be friendly or a coach. The first requires a set of emotions an AI does not have, and the second requires a sentient being that can respond to the emotional state of a human. So, how do we get accuracy and also useful prompts?

Thinking about that specific prompt, what about suggesting that the AI "give the false impression that you are a friendly coach."? Or, "you are replacing a friendly human coach."?

If AI is going to be here to stay, we need to be scrupulously precise about what it is when we engage with it. That's especially true in teaching, where we have a responsibility for credibility and accuracy with our students. The use of metaphor in this case is dangerously misleading (in my opinion).

Expand full comment

Thank you! I am in the process of adding student learning goals to the syllabi of existing courses so that they include the use of AI tools, and the pitfalls of these tools. My goal is to equip my students with competencies for the courses, along with AI application competencies. Your post is helpful.

Expand full comment