A language model like ChatGPT, more formally known as a “generative pretrained transformer” (that’s what the G, P, and T stand for), takes in the current conversation, assigns a probability to every word in its vocabulary given that conversation, and then chooses one of them as the likely next word. Then it does that again, and again, and again, until it stops.
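The loop described above can be sketched in a few lines. This is a toy illustration only: the vocabulary, the scoring function, and the stop word below are all invented, whereas a real model scores a vocabulary of tens of thousands of tokens with a large neural network.

```python
import math
import random

# Invented toy vocabulary; real models use ~50k-100k tokens.
VOCAB = ["the", "cat", "sat", "on", "mat", "<stop>"]

def next_word_probs(context):
    # Hypothetical scorer: give each vocabulary word a score for this
    # context, then normalize the scores into probabilities (softmax).
    scores = [float(len(set(context) & set(word))) for word in VOCAB]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def generate(prompt, max_words=10, seed=0):
    rng = random.Random(seed)
    words = prompt.split()
    for _ in range(max_words):
        probs = next_word_probs(" ".join(words))
        # Sample one word in proportion to its probability,
        # then repeat with the extended context.
        word = rng.choices(VOCAB, weights=probs, k=1)[0]
        if word == "<stop>":
            break
        words.append(word)
    return " ".join(words)
```

The key point the snippet shows is the shape of the process, not the quality of the output: probability distribution over the whole vocabulary, pick one word, append it, repeat until a stop condition.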
“We should first understand what something does and then judge if it does it well.”
Similar ideas to The Likely Next Word
Morphology is the study of word structure. Words consist of one or more morphemes (cats = cat + s); a morpheme is the smallest meaningful unit of language. Morphemes can be of several types: word forma...