Just Joshin' #109 (GPT)



1 Family Photo:
Generative Pre-Trained Transformers

My dad asked, "How do I use the AI?" Seems like a good prompt for the week.

What is AI? Artificial Intelligence is a general phrase we use for computers doing things that historically require human intelligence. Deciphering handwritten letters or recognizing speech are classic AI examples. Decision making is another. Netflix uses AI to recommend new shows. Google uses AI to put more profitable ads into your googling. If you've ever watched the next show or searched the internet, you've used the AI.

But 'the AI' everyone is talking about these days is GPTs and LLMs.

GPT stands for Generative Pre-trained Transformer. A GPT takes an input, uses some math to transform it, and generates an output. The equation y = 3x is a simple kind of transformer: I give it an input of x = 2, and it transforms that into an output of y = 6.
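If it helps to see that as code, here's a minimal sketch of the same idea in Python (the function and parameter names are mine, purely for illustration):

# A one-parameter "model": it transforms an input x into an output y.
def tiny_model(x, parameter=3):
    return parameter * x

print(tiny_model(2))  # input x = 2 -> output y = 6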

So how does the GPT know how to transform the input? It's pre-trained! As my friend Joris wrote in The Great Text Predictor, LLMs (Large Language Models) are big math equations designed to statistically predict what should come next. In training, they're fed lots of data, guess what should come next, see what actually comes next, and have their equation adjusted from there. In the end, the model itself doesn't contain the training data (I think), it just contains lots of parameters. My equation above had one parameter; GPT-4 (released in March of 2023) reportedly has around 1.8 trillion.
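Here's a toy version of that training loop, sticking with the one-parameter model from above: guess, compare the guess against what actually came next, and nudge the parameter. Real LLMs do this with gradient descent over mountains of text and trillions of parameters, but the shape of the loop is the same (all the numbers below are made up for illustration):

# Toy "pre-training": learn the parameter of y = w * x from example pairs.
examples = [(1, 3), (2, 6), (4, 12)]  # (input, what actually came next)

w = 0.0              # start with a bad guess for the parameter
learning_rate = 0.05

for _ in range(200):                     # many passes over the training data
    for x, actual in examples:
        guess = w * x                    # the model's prediction
        error = guess - actual           # how wrong was it?
        w -= learning_rate * error * x   # adjust the parameter to shrink the error

print(round(w, 2))  # ~3.0 -- the model ends up with a parameter, not a copy of the data

Notice that once training is done, the example pairs are gone; all that's left is the adjusted parameter.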

The first 10 minutes of this video by Scott Hanselman is a great non-technical demonstration of how this all works.


A lot of companies are training their own LLMs. Some notable examples:

So how do you use the AI?

As Scott Hanselman says in the video above, we're still trying to figure that out.

If you're interested, click on one of the LLM links above. Try out one of the free services. Think of a question you have or a problem you want to solve, something you'd normally try to figure out with some internet searches, and instead chat with the AI about it as if you were chatting with an intelligent friend who's knowledgeable about the question. For a lot of the services, you can use the microphone as your input instead of typing. See where that takes you.


1 Dad Joke:
Transformers


Highlights:
Prompts

Generative AI in a Nutshell by Henrik Kniberg (2024)

Basically, intelligence is now available as a service, kind of like a giant brain floating in the sky that anyone can talk to. It's not perfect, but it is surprisingly capable, and it is improving at an exponential rate. This is a big deal. It's going to affect just about every person and company on the planet, positively or negatively.

Here's a silly but useful mental model for this: you have Einstein in your basement.

You can talk to Einstein whenever you want. He has instant access to the sum of all human knowledge and will answer anything you want within seconds, never running out of patience. He can also take on any role you want, a comedian, poet, doctor, coach, and will be an expert within that field. He has some humanlike limitations, though. He can make mistakes, he can jump to conclusions. He can misunderstand you.

But the biggest limitation is actually your imagination and your ability to communicate effectively with him. This skill is known as prompt engineering, and in the age of AI, this is as essential as reading and writing.

I hope you weren't getting too comfortable by Ethan Mollick (2023)

To be clear: the Bing AI is not that different than ChatGPT, upon which it is built. Most importantly it doesn’t really solve the accuracy problem. The AI still lies, and results can be full of subtle inaccuracies and hallucinations. But just like ChatGPT was a massive qualitative increase over the very technically-similar GPT-3 AI, the new Bing AI offers the same improvements over the technically-similar ChatGPT.

The problems remain mostly the same: prompt engineering matters, it lies a lot, if you don’t spend effort you will get bad results, etc. But, again, I don’t think that is going to matter much because it can already do so much for us.

We can’t get comfortable knowing what generative AI is capable of, because capabilities are increasing very rapidly, and small changes can increase what they can do by a large amount. An approach to AI that worked last week won’t work this week.

Using GPT-3 to augment human intelligence by Henrik Olof Karlsson (2022)

Prompt engineering is the term AI researchers use for the art of writing prompts that make a large language model output what you want. Instead of directly formulating what you want the program to do, you input a string of words to tickle the program in such a way it outputs what you are looking for. You ask a question, or you start an essay, and then you prompt the program to react, to finish what you started.

Soon everyone can have an alien intelligence in their inbox.

A guide to prompting AI (for what it is worth) by Ethan Mollick

There are no secret prompts.
In fact, I think the emphasis on prompting as the key to using AI is a mistake for a number of reasons. But, I also have come to realize that there are some things people don't understand about prompts, which can help make the task of using AI easier. So I do want to share those.

Being “good at prompting” is a temporary state of affairs. The current AI systems are already very good at figuring out your intent, and they are getting better. Prompting is not going to be that important for that much longer. In fact, it already isn’t in GPT-4 and Bing. If you want to do something with AI, just ask it to help you do the thing. “I want to write a novel, what do you need to know to help me?” will get you surprisingly far.
--
The best way to use AI systems is not to craft the perfect prompt, but rather to use it interactively. Try asking for something. Then ask the AI to modify or adjust its output. Work with the AI, rather than trying to issue a single command that does everything you want. The more you experiment, the better off you are. Just use the AI a lot, and it will make a big difference - a lesson my class learned as they worked with the AI to create essays.
--
It can help to tell the system "who" it is, because that gives it a perspective. "Act as a teacher of MBA students" will result in different output than if you ask it to act as a circus clown. This isn't magical (you can't say "Act as Bill Gates" and get better business advice) but it can help make the tone and direction appropriate for your purposes.
--
Then add additional constraints. You can add styles like "write this in the style of the New Yorker" or "write this in a casual way." You can tell it to avoid repetition or make it accessible to a 10th grader. You will find some of these approaches work better than others in different contexts, and it can be hard to know which things will work in advance, so experiment.

The secret is practice.
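To tie those tips together, here's a hedged sketch of "give it a role, add constraints, then work with it interactively" using OpenAI's Python SDK. It assumes you've installed the openai package and set an OPENAI_API_KEY environment variable; the model name and prompt wording are just examples, not a recommendation:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [
    # "Who" the AI should be...
    {"role": "system", "content": "Act as a patient writing coach."},
    # ...plus constraints on length, style, and audience.
    {"role": "user", "content": "Draft a 100-word intro about learning to use AI, "
                                "written casually, accessible to a 10th grader."},
]
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(first.choices[0].message.content)

# Don't try to issue one perfect command: keep the conversation going and ask for adjustments.
messages.append({"role": "assistant", "content": first.choices[0].message.content})
messages.append({"role": "user", "content": "Good start. Make it shorter and avoid repetition."})
second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(second.choices[0].message.content)

The same back-and-forth works in any of the chat interfaces, no code required; the point is the iteration, not the API.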

iamJoshKnox Highlight:

I applied to the Roots of Progress Fellowship this week. I'd like to write more about AI and education.

I'll probably do it with or without the Fellowship...we'll see how that goes.

--
Also, my book is available on Amazon here or reply to this email and I'd love to gift you a copy.


Want to Talk?

Grab some time on my calendar to share a story from this week:

Let's Chat!

Book some time even if you don't know what you want to talk about:
https://calendly.com/iamjoshknox

Until next week,
iamJoshKnox​


Thoughts? Feedback?
😊Hit Reply and let me know😊

