20 Comments

It’s a game changer. I just rewrote my syllabus for an upcoming course. What I appreciated most: perspective. My limits are my experience--ChatGPT provided ideas and direction I might have struggled to consider. The syllabus is now better as a result. My take: tools like this will become a “constant companion,” much like spell and grammar checks, or the technology that lets my phone automatically compensate for my inherently poor photography skills. Thanks for the inspiration!


I'm surprised by this and would love to hear more if you don't mind. My experience (both plugging in my own questions and my take on Ethan's prompts and responses) is that it's able to create passable syllabi and assignments, but they wouldn't be particularly useful. Were you using it to create template/reference language for class policies and assignments, to get ideas for class readings and topics, or something else?

In either case, was the advantage over googling to find other course syllabi (for example) that you could have a "conversation" with ChatGPT and refine your queries, or was it something else?

So far, I've gotten the sense that this is a really nice search engine, especially when you're not sure what you're looking for, but I haven't found a good use of the actual text it outputs so I'd love to hear more about how you're using it!


Jorgen - my prompt was (as I recall) “write a 15-session syllabus for a college-level course on the future of advertising.” I wasn’t expecting and didn’t receive highly nuanced ideas from ChatGPT. I did get 15 sessions outlined with 2-3 sentences each, generally covering topics I would expect. What surprised me were ideas about regulation and ethics 😆 which I hadn’t considered. And in 2 seconds! For me, the experience was like having a creative partner who quickly offers suggestions. Beats doing all the work by myself!


Thanks Tim! That makes sense. I had a similar experience with asking for a course outline for a class I've taught on intergenerational mobility with suggested readings--it gave a few suggested readings that I didn't know (some of which didn't exist but some of which did!)

The creative partner analogy is great and makes sense--it's useful for your own creative process to have someone/something to bounce off of and offer suggestions.

I could see this tech becoming really useful if it was designed primarily as a search engine/interface for specific resources. If I could have a conversation with jstor/google scholar about the type of papers I was looking for, for instance.


I think the most interesting takeaway from this is that those who already know their stuff will find it easier to use this in their workflow and get even more productive.

author

I think that is the most positive takeaway...


Yes, that does jibe with how I look at it.


I wasn't amazed _here_, but only because I had already been amazed experimenting myself in the last few days.

I'm able to use it to find info, like learning that PlaidML is the machine learning solution that will run on my graphics card, something older versions of GPT-3 couldn't tell me and which I'd had trouble finding even by searching online myself! Often it will come up with incorrect solutions as well, but if even one suggestion can lead me in the right direction, that is amazing!

I talked with someone else who gave it prompts, and it was able to summarise how to replace a part on a motorcycle I knew nothing about, and apparently did it well!

Furthermore, I was even able to give it a very badly formatted infodump of all the things I want to get done and might need to do in a given day, and it actually gave me a decent schedule. I might base my day on this now! And in general, I can give it problems in my life and it will offer decent solutions while being very patient.

I've also used it to write and edit messages for me, no problem, and the results are remarkably good. To be fair, I often edit these so heavily to match my style that it still takes a really long time, often longer than writing the message myself would (and I often like to work only by telling the AI what to change, so that the final message is written entirely by the AI), but still, very impressive! (On another note, I also tried editing this comment with AI, but it wasn't working well, so I didn't use it in the end. I think it may be having issues with the length, both of the comment itself and of the article when I give it for context, since it often cuts the answer off partway through long responses. The AI may also just be a bit overloaded right now; even simple questions have a huge delay, and in the past I've certainly been able to get much longer answers without them being cut off than I currently can.)

The one area where I find the AI does badly is writing book outlines given a certain plot and conditions. When I tried that, it kept falling into annoying cliches, and when I told it NOT to use the cliches it would produce inconsistent writing where it stated the cliche and then the thing against it. For example, after writing "Without using or being augmented by AI, people are often not competitive, but these augmentations ARE very cheap and widely available, with pretty much anyone who wants access to them being able to access them", it would write "However, not everyone has access to AI augmentations, which are necessary to remain competitive in the job market. These augmentations, based on artificial neural networks, are widely available and affordable, but not everyone is willing to undergo the modifications", and add that they were illegal for some reason. I had to do lots and lots of edits to turn it into a consistent story where anything made sense at all. It turned into a pretty interesting plot idea after that, though.

Weirdly, it seems to understand the world much, much less well when writing stories. On the other hand, one time when I asked it to summarise my prompt for an outline and pointed out an inconsistency, it was even able to tell me exactly why it did not make sense! And then rewrite it so it did make sense, once I asked it to.

I really wish OpenAI would save transcripts between browser restarts, because although I copy-pasted a lot and screenshotted even more, I still did not get everything I would have liked.


To me this example shows how robotic the work of many professors has become. As a math professor, I have always loathed making syllabi. Many of my colleagues in management do not seem to have this problem and dedicate a significant amount of time to the procedural aspects of learning. If we do not want humans to become cogs in a machine, we may want to avoid teaching our students as if they were cogs.


Count me as impressed. I don’t love the passive, rather bland voice, but as a starting point for a project, I can see it helping the author get beyond the blank page. I’ve written a lot in my professional career and always prefer the editing phase. Is there a citation protocol for when you’ve used an AI tool in your work? I also wonder how readers/students are influenced by knowing that an author has used an AI tool in their work.


However, there is a catch. I noticed that if you ask the AI to add a list of references on a subject, it often provides inaccurate output: paper titles and authors that do not exist, and even papers that are not in the journals the system reports them to be in. On one hand this is excellent, because when my students try to use it I can identify who did the work and who used it to create an automatic response. On the other hand it is frustrating, because it is not accurate and often produces results just to get something printed out.


A few days ago, I asked ChatGPT, “what is entrepreneurship?” The answer was straight out of a 1980s textbook. That’s scary for anyone assuming the advice is accurate.


Fascinating read, Ethan. I am curious: I replicated your queries in chat.openai.com and got slightly different answers. I also tried a few additional queries, where the chatbot's answers were "less creative" than I would have expected. To that end, I was curious whether you used the chatbot interface on the web, or whether you programmed queries in Python to get more robust answers. Specifically, do you know what the default "temperature" setting is for the chat.openai.com interface? I suspect the default may be "zero", hence the more limited answers in the open chat interface. (From your old classmate, Todd...)

author

I only used the actual web chatbot here. The different answers are normal; we can’t control the seed or temperature there. You can on the Playground, but that doesn’t keep state. I do think they have been dialing down the creativity over the past week.
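
For what it’s worth, the API behind the Playground does let you set temperature explicitly. Here is a minimal sketch using the Python client; the model name, prompt, temperature value, and the pre-1.0 openai package are illustrative assumptions, not what the web chatbot actually uses:

```python
# Minimal sketch (not the web chatbot): calling the completions API with an
# explicit temperature, assuming the pre-1.0 openai Python package.
# The model, prompt, and temperature below are illustrative only.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own key

response = openai.Completion.create(
    model="text-davinci-003",  # assumed model; use whichever you have access to
    prompt="Write a 15-session syllabus for a college-level course on the future of advertising.",
    temperature=0.9,   # higher = more varied output; 0 = nearly deterministic
    max_tokens=500,
)

print(response["choices"][0]["text"])
```

Setting temperature near 0 makes the completions close to deterministic, which would be consistent with the “less creative” answers Todd describes.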


Does anyone know the name of the artist who made the image at the bottom?

author

I created the picture in Midjourney. Prompt: **black and white drawing of a steampunk robotic professor in front of a class, cut away drawing shows the professor is full of gears, lots of detail --v 4**


lol. this AI is getting out of hand!

(I also run a steampunk website, neverwasmag.com, where - among other things - we feature steampunk artists; that's why I was curious. This really drives home the whole question of how to deal with AI-created art...)


Ask it what’s heavier - a pound of lead or a pound of butter.

author

"A pound of lead and a pound of butter both weigh the same amount because they both weigh one pound. The weight of an object is determined by its mass, not its composition, so a pound of lead and a pound of butter would have the same weight even though they are made of different materials"


I get that sometimes, and sometimes I get:

„A pound of lead weighs more than a pound of air.“

Or

„A pound of lead weighs more than a pound of air. This is because lead is a dense metal, while air is a mixture of gases that is much less dense. A pound of lead has a much smaller volume than a pound of air, but it has more mass due to its density.“

The latter two come up more often.
