You are a better writer than AI. (Yes, you.)

  • 0:01 - 0:04
    - I've spent the two years
    since ChatGPT launched,
  • 0:04 - 0:06
    steeping in a morass
    of academic panic.
  • 0:06 - 0:08
    Voices from
    administration and colleagues,
  • 0:08 - 0:10
    and anyone else
    with enough brain cells
  • 0:10 - 0:13
    to maintain a sense
    of existential dread, crying out,
  • 0:13 - 0:16
    "We need to figure out
    what to do about AI."
  • 0:16 - 0:18
    Our Ed Tech committee
    is developing a policy.
  • 0:18 - 0:20
    The Academic Senate wants
    to develop a policy.
  • 0:20 - 0:22
    The board thinks
    we should have a policy.
  • 0:22 - 0:24
    My dean wants
    us all to have policies.
  • 0:26 - 0:29
    The California Teachers Association says
    it's an issue of academic integrity.
  • 0:29 - 0:33
    The State Senate says
    it's an issue of ethics.
  • 0:33 - 0:35
    "We need to pay for
    the AI detection tools."
  • 0:35 - 0:38
    "The AI detection tools
    don't work."
  • 0:38 - 0:40
    "We need to accept that
    our students will use AI."
  • 0:40 - 0:42
    "How do I prove
    my student used AI?"
  • 0:43 - 0:47
    It is incomprehensible to me,
    this conversation.
  • 0:50 - 0:51
    [keyboard clacks]
  • 0:51 - 0:53
    I hear their words,
  • 0:53 - 0:55
    see their language floating
    across my monitor,
  • 0:55 - 0:56
    and know the words,
  • 0:56 - 0:58
    but I cannot get to
    the meaning
  • 0:58 - 0:59
    because I simply
    do not understand
  • 0:59 - 1:02
    why they are talking about it
    in this way.
  • 1:02 - 1:05
    - [Kermit the Frog]:
    ♪ New York, I love you,
  • 1:05 - 1:09
    but you're bringing me down ♪
    - with all these empty words.
  • 1:09 - 1:13
    This is not the conversation
    I think we need to have.
  • 1:13 - 1:15
    [song continues]
  • 1:15 - 1:17
    This is the conversation I need to have.
  • 1:17 - 1:19
    [overlapping
    music and poem]
  • 1:19 - 1:22
    [Gertrude Stein]: " 'If I Told Him,
    a Completed Portrait of Picasso'.
  • 1:22 - 1:24
    If I told him would he like it.
  • 1:24 - 1:26
    Would he like it if I told him.
  • 1:26 - 1:29
    Would he like it would Napoleon
    would Napoleon would
  • 1:29 - 1:30
    would he like it.
  • 1:30 - 1:34
    If Napoleon if I told him
    if I told him if Napoleon.
  • 1:34 - 1:37
    Would he like it if I told him
    if I told him if Napoleon.
  • 1:37 - 1:41
    Would he like it if Napoleon
    if Napoleon if I told him.
  • 1:41 - 1:45
    If I told him if Napoleon
    if Napoleon if I told him.
  • 1:45 - 1:49
    If I told him would he like it
    would he like it if I told him.
  • 1:49 - 1:51
    Now. Not now. And now.
  • 1:51 - 1:54
    Now. Exactly as as kings.
  • 1:54 - 1:56
    Feeling full for it.
  • 1:56 - 1:58
    Exactitude as kings.
  • 1:58 - 2:00
    So to beseech you
    as full as for it.
  • 2:01 - 2:02
    Exactly or as kings.
  • 2:03 - 2:06
    Shutters shut and open
    so do queens.
  • 2:06 - 2:10
    Shutters shut and shutters
    and so shutters shut and shutters and so
  • 2:10 - 2:13
    [poem and music fade out]
  • 2:16 - 2:19
    - I don't understand Gertrude Stein.
  • 2:21 - 2:25
    Stein is not nearly well enough
    remembered for how influential she was.
  • 2:25 - 2:27
    An American expatriate poet
    living in Paris,
  • 2:27 - 2:30
    her salons were among the
    anchors of the early modernists.
  • 2:30 - 2:31
    You may not have heard of her,
  • 2:31 - 2:34
    but you've heard of
    the people who visited her.
  • 2:34 - 2:36
    Ernest Hemingway, Sinclair Lewis,
  • 2:36 - 2:38
    F. Scott Fitzgerald, James Joyce,
  • 2:38 - 2:40
    Thornton Wilder, Ezra Pound.
  • 2:40 - 2:43
    People you've read
    or been assigned to read.
  • 2:43 - 2:46
    We remember Hemingway
    because he wrote like this.
  • 2:46 - 2:49
    We remember Fitzgerald
    because he wrote like this.
  • 2:49 - 2:51
    The right kind of day
    and the right kind of moment,
  • 2:51 - 2:53
    and Pound's
    "In a Station of the Metro"
  • 2:53 - 2:57
    still recites itself completely
    in my head, a perfect image.
  • 2:58 - 3:00
    "The apparition of
    these faces in the crowd:
  • 3:00 - 3:02
    Petals on a wet, black bough."
  • 3:03 - 3:07
    We don't remember Stein
    because she wrote like this.
  • 3:10 - 3:13
    This is "If I Told Him,
    a Completed Portrait of Picasso",
  • 3:13 - 3:14
    published in 1924,
  • 3:14 - 3:18
    and continuing the project
    of her 1914 book Tender Buttons,
  • 3:18 - 3:20
    a phrase she never defined.
  • 3:20 - 3:23
    To me that phrase
    "tender buttons" feels right:
  • 3:23 - 3:25
    small, soft contradictions,
  • 3:25 - 3:27
    words that seem like
    they should go together
  • 3:27 - 3:30
    but do not actually
    make meaning.
  • 3:30 - 3:32
    That is how Stein's poetry feels.
  • 3:32 - 3:35
    There is something compelling
    about the rhythm of her nonsense,
  • 3:35 - 3:38
    the feeling of her
    almost meaning something,
  • 3:38 - 3:40
    and then how it falls apart.
  • 3:41 - 3:41
    "As presently.
  • 3:42 - 3:43
    As exactitude.
  • 3:43 - 3:44
    As trains."
  • 3:45 - 3:48
    But it is incomprehensible to me.
  • 3:48 - 3:50
    I don't know why Stein
    would write like this.
  • 3:50 - 3:52
    To quote the poet:
  • 3:52 - 3:58
    - ♪ "Oh, what on earth would make a man
    decide to do that kind of thing?" ♪
  • 4:00 - 4:03
    - But I think the reason
    that I don't understand Gertrude Stein
  • 4:03 - 4:06
    is that she didn't really want
    to be understood.
  • 4:07 - 4:09
    She used language
    for something different.
  • 4:10 - 4:12
    It doesn't communicate.
  • 4:12 - 4:16
    It reads like stunt linguistics,
    which it almost is.
  • 4:17 - 4:19
    "Buffalo buffalo Buffalo buffalo buffalo",
  • 4:19 - 4:20
    "had had 'had', had had 'had--'",
  • 4:20 - 4:23
    These are sentences that,
    if you pore over them closely enough,
  • 4:23 - 4:24
    can be decoded.
  • 4:25 - 4:28
    Stein's Tender Buttons cannot.
  • 4:29 - 4:32
    There is something about it
    that parses as AI.
  • 4:34 - 4:37
    It feels like the work of Keaton Patti,
    the person most prominently behind
  • 4:37 - 4:41
    the "I forced a bot to watch whatever"
    tweets that used to go viral.
  • 4:42 - 4:44
    Human-written screenplays
    designed to feel like
  • 4:44 - 4:48
    AI writing attempting to imitate
    other human-written screenplays.
  • 4:50 - 4:52
    It feels like
    an autocomplete challenge,
  • 4:52 - 4:54
    like in the early days
    of predictive text and messaging
  • 4:54 - 4:57
    where you just tap the suggested word
    and see what comes out.
  • 4:57 - 4:59
    It's not how AI really writes,
  • 4:59 - 5:02
    but it's how people feel
    like AI writes.
  • 5:02 - 5:05
    But Gertrude Stein
    was a person.
  • 5:05 - 5:07
    [Stein reading, underneath]
  • 5:08 - 5:09
    She wrote "If I told Him"
  • 5:09 - 5:12
    not because the language
    would communicate,
  • 5:12 - 5:13
    but for some other reason.
  • 5:14 - 5:18
    - [Stein]: the exact resemblance
    as exact as a resemblance
  • 5:18 - 5:21
    - When I read "If I Told Him",
  • 5:21 - 5:23
    when I listen to Stein
    read "If I Told Him",
  • 5:23 - 5:25
    the language slips
    through my mind.
  • 5:25 - 5:26
    It doesn't mean anything.
  • 5:28 - 5:30
    But she did.
  • 5:31 - 5:33
    Even if I cannot
    take hold of that meaning,
  • 5:33 - 5:35
    I can feel her,
  • 5:35 - 5:37
    reaching out to me
    from behind the text,
  • 5:37 - 5:39
    trying to connect.
  • 5:39 - 5:41
    - [Stein]:
    As a resemblance to him.
  • 5:41 - 5:43
    Who comes first.
    Napoleon the first.
  • 5:43 - 5:45
    Who comes too
    coming coming too,
  • 5:45 - 5:47
    who goes there,
    as they go they share,
  • 5:47 - 5:51
    who shares all,
    all is as all as yet--
  • 5:51 - 5:53
    ["yet" reverberates and fades out]
  • 5:56 - 5:58
    [relaxed synth beats]
  • 5:58 - 6:00
    - [ChatGPT femme voice]:
    That's an interesting question!
  • 6:00 - 6:03
    Gertrude Stein,
    a literary pioneer and modernist
  • 6:03 - 6:06
    who valued innovation
    in language and expression,
  • 6:06 - 6:09
    might have had
    mixed feelings about me.
  • 6:09 - 6:13
    She might have appreciated my ability
    to generate new forms of text
  • 6:13 - 6:16
    and engage with her
    literary experimentation on some level.
  • 6:16 - 6:19
    However, Stein was also
    highly individualistic
  • 6:19 - 6:22
    and invested
    in the human experience,
  • 6:22 - 6:24
    especially the nuances of
    consciousness and perception.
  • 6:25 - 6:27
    So, while she might
    have been fascinated by
  • 6:27 - 6:29
    what AI like me can do,
  • 6:29 - 6:32
    she might also have been
    critical of my limitations
  • 6:32 - 6:34
    in capturing the full depth
    of human subjectivity,
  • 6:35 - 6:37
    which was a central concern
    in her work.
  • 6:40 - 6:42
    [relaxing synth beats stop]
  • 6:44 - 6:47
    - A problem with AI is
    that it is recombinant,
  • 6:47 - 6:49
    that it takes
    the works of humans
  • 6:49 - 6:51
    and recombines them
    into new forms.
  • 6:51 - 6:53
    But that problem is not new.
  • 6:53 - 6:54
    For many years,
    the bulletin board
  • 6:54 - 6:57
    outside the faculty bathroom
    in the English building
  • 6:57 - 6:59
    has featured this poster,
    the "Plagiarism Spectrum",
  • 6:59 - 7:02
    next to this ancient and
    unflattering picture of me.
  • 7:02 - 7:05
    Number 7 on the Plagiarism Spectrum
    is the mashup,
  • 7:05 - 7:08
    a paper which mixes copied materials
    from multiple sources.
  • 7:08 - 7:10
    The poster is dated
    June 2012,
  • 7:10 - 7:13
    more than a decade before
    we were concerned about
  • 7:13 - 7:14
    ChatGPT doing it.
  • 7:14 - 7:18
    That AI is recombinant is not
    in and of itself a problem.
  • 7:18 - 7:20
    All writing is recombinant.
  • 7:20 - 7:23
    My course outcomes for English 1
    ask student writers
  • 7:23 - 7:26
    to integrate sources seamlessly
    into their own writing,
  • 7:26 - 7:27
    to mash up.
  • 7:27 - 7:29
    That we have rules
    and procedures
  • 7:29 - 7:31
    and punctuation marks
    and conventions
  • 7:31 - 7:32
    that govern
    what is appropriate
  • 7:32 - 7:34
    does not change
    the essential truth
  • 7:34 - 7:36
    that this is recombinance.
  • 7:37 - 7:39
    And there is beauty
    in recombinance.
  • 7:39 - 7:42
    This video started with
    a great classic of YouTube,
  • 7:42 - 7:45
    the duet between
    LCD Soundsystem and Miles Davis.
  • 7:48 - 7:52
    The LCD Soundsystem video
    is itself a mashup, conceptually.
  • 7:52 - 7:54
    Kermit the Frog is not
    a member of the band.
  • 7:56 - 8:00
    Davis is improvising over a film
    to create the score,
  • 8:00 - 8:02
    another mixing of media
    to make something new.
  • 8:04 - 8:06
    The Kleptones
  • 8:09 - 8:10
    The Grey Album
  • 8:14 - 8:17
    The guy drinking Ocean Spray to
    "Dreams" [by Fleetwood Mac]
  • 8:18 - 8:19
    Mac Glocky
  • 8:25 - 8:27
    "If I Told Him" is recombinance:
  • 8:27 - 8:29
    an attempt to rework
    the ideas of cubism,
  • 8:29 - 8:32
    a movement in visual art,
    into poetry.
  • 8:32 - 8:33
    Picasso's work is recombinance.
  • 8:33 - 8:37
    He spent 1957 copying
    Velázquez's Las Meninas
  • 8:37 - 8:38
    over and over.
  • 8:38 - 8:40
    58 times he copied it.
  • 8:40 - 8:43
    All different.
    All Picasso. All Las Meninas.
  • 8:44 - 8:47
    Recombinance is a genetic term,
    but I am using it memetically.
  • 8:47 - 8:50
    "Britannica: recombination,
    in genetics,
  • 8:50 - 8:52
    primary mechanism
    through which variation
  • 8:52 - 8:54
    is introduced into populations."
  • 8:55 - 8:58
    The combination of ideas
    from different places
  • 8:58 - 9:00
    to make new distinct ideas.
  • 9:01 - 9:04
    When done by humans,
    we call this "thinking".
  • 9:12 - 9:14
    [relaxed synth beats]
  • 9:15 - 9:17
    - [ChatGPT femme voice]:
    You're right to question
  • 9:17 - 9:20
    how I can "understand"
    without experience.
  • 9:21 - 9:23
    The way I process information
    is different
  • 9:23 - 9:24
    from human understanding.
  • 9:24 - 9:27
    When I say "understand,"
    I mean that I can
  • 9:27 - 9:30
    recognize patterns,
    analyze language,
  • 9:30 - 9:32
    and respond in ways
    that are coherent
  • 9:32 - 9:35
    and contextually appropriate
    based on my training.
  • 9:35 - 9:38
    I don't understand in the
    deep, conscious sense of the word,
  • 9:38 - 9:41
    like when humans
    reflect on their experiences
  • 9:41 - 9:43
    or have insight
    into their feelings.
  • 9:44 - 9:47
    Instead, my "understanding"
    comes from identifying
  • 9:47 - 9:49
    relationships between
    words, concepts,
  • 9:49 - 9:51
    and context based on
    the vast data
  • 9:51 - 9:53
    I've been trained on.
  • 9:53 - 9:56
    In short: I can simulate
    understanding through patterns,
  • 9:56 - 9:59
    but it's not the same
    as truly experiencing
  • 9:59 - 10:01
    or being conscious of meaning.
  • 10:01 - 10:03
    Does that clarify things
    a bit more?
  • 10:11 - 10:13
    [relaxed synth beats stop]
  • 10:14 - 10:17
    [calm, mystical, twangy
    electric guitar]
  • 10:20 - 10:23
    - Two weeks ago a student
    told me that AI is unfair
  • 10:23 - 10:27
    because you can put a B paper in
    and get A+ writing out.
  • 10:27 - 10:30
    This is why I do not understand
    the conversation
  • 10:30 - 10:31
    my colleagues are having,
  • 10:31 - 10:32
    why I cannot think of this
  • 10:32 - 10:35
    as a matter of ethics,
    or academic integrity,
  • 10:35 - 10:38
    why I don't think we should need
    to have a policy or policies.
  • 10:38 - 10:40
    My student said
    you can put
  • 10:40 - 10:42
    a B paper in and
    get A+ writing out,
  • 10:42 - 10:45
    and my mind began to fill
    with Tender Buttons.
  • 10:45 - 10:47
    "Feeling full for it.
    Exactitude as kings.
  • 10:47 - 10:50
    So to beseech you
    as full as for it."
  • 10:51 - 10:54
    AI is bad at writing.
  • 10:55 - 10:57
    No. That is true,
    but it's not enough truth.
  • 10:57 - 11:00
    AI is not capable of writing.
  • 11:00 - 11:05
    The thing that writing is
    is a thing that AI cannot do.
  • 11:06 - 11:06
    Listen.
  • 11:11 - 11:14
    - [audiobook narration]:
    What Writing Is
  • 11:15 - 11:17
    Telepathy, of course.
  • 11:17 - 11:18
    Look.
  • 11:19 - 11:22
    Here's a table covered
    with a red cloth.
  • 11:22 - 11:26
    On it is a cage the size of
    a small fish aquarium.
  • 11:26 - 11:29
    In the cage is
    a white rabbit
  • 11:29 - 11:32
    with a pink nose
    and pink-rimmed eyes.
  • 11:32 - 11:35
    In its front paws is
    a carrot-stub
  • 11:35 - 11:38
    upon which it is
    contentedly munching.
  • 11:38 - 11:41
    On its back,
    clearly marked in blue ink,
  • 11:41 - 11:44
    is the numeral 8.
  • 11:44 - 11:47
    Do we see the same thing?
  • 11:47 - 11:49
    We'd have to get together
    and compare notes
  • 11:49 - 11:52
    to make absolutely sure,
    but I think we do.
  • 11:52 - 11:55
    The most interesting thing
    here isn't even
  • 11:55 - 11:58
    the carrot-munching rabbit
    in the cage,
  • 11:58 - 12:00
    but the number on its back.
  • 12:00 - 12:04
    Not a six, not a four,
    not nineteen-point-five.
  • 12:04 - 12:05
    It's an eight.
  • 12:05 - 12:09
    This is what we're looking at,
    and we all see it.
  • 12:09 - 12:10
    I didn't tell you.
  • 12:10 - 12:12
    You didn't ask me.
  • 12:12 - 12:15
    I never opened my mouth
    and you never opened yours.
  • 12:15 - 12:18
    We're not even in
    the same year together,
  • 12:18 - 12:20
    let alone the same room.
  • 12:20 - 12:24
    Except we are together.
    We're close.
  • 12:25 - 12:27
    We're having
    a meeting of the minds.
  • 12:28 - 12:30
    I sent you a table
    with a red cloth on it,
  • 12:30 - 12:33
    a cage, a rabbit, and
    the number eight in blue ink.
  • 12:33 - 12:37
    You got them all,
    especially that blue eight.
  • 12:37 - 12:39
    We've engaged in
    an act of telepathy.
  • 12:39 - 12:43
    No mythy-mountain s***;
    real telepathy.
  • 12:43 - 12:45
    I'm not going to
    belabor the point,
  • 12:45 - 12:46
    but before we go any further
  • 12:46 - 12:49
    you have to understand that
    I'm not trying to be cute;
  • 12:49 - 12:51
    there is a point to be made.
  • 12:52 - 12:54
    - AI is good at language.
  • 12:54 - 12:57
    My students think that
    what it produces is A+ writing,
  • 12:57 - 13:00
    not because it is good,
    but because it sounds good.
  • 13:00 - 13:03
    Obviously, AI can
    generate sentences
  • 13:03 - 13:06
    that are typically clear, coherent,
    and contextually relevant,
  • 13:06 - 13:07
    often capturing nuances
  • 13:07 - 13:09
    and adapting to various tones
    or levels of formality.
  • 13:10 - 13:12
    And it's true that
    the sentences it generates
  • 13:12 - 13:16
    tend to be grammatically accurate,
    concise, and logically structured,
  • 13:16 - 13:19
    which contributes to
    readability and flow.
  • 13:19 - 13:22
    Sure. This is how I know
    when a student is using AI.
  • 13:22 - 13:24
    Their sentences are
    fluid and academic,
  • 13:24 - 13:26
    but they don't say anything.
  • 13:26 - 13:29
    Like ChatGPT,
    academic writing uses
  • 13:29 - 13:31
    formal, cautious language
  • 13:31 - 13:33
    to avoid ambiguities
    and misinterpretations,
  • 13:33 - 13:34
    but that is a
    characteristic of
  • 13:34 - 13:37
    the common voice
    used in academic writing.
  • 13:37 - 13:40
    It is not what
    academic writing is.
  • 13:40 - 13:42
    Writing is more than language.
  • 13:42 - 13:46
    "If I Told Him" is communication,
    and it is language,
  • 13:46 - 13:49
    but the communication
    does not live in the language.
  • 13:49 - 13:50
    Watch.
  • 13:50 - 13:54
    "Can curls rob can curls
    quote, quotable."
  • 13:55 - 13:57
    - [low voice, lightly confused]:
    "What?"
  • 13:58 - 14:02
    - "As presently.
    As exactitude. As trains."
  • 14:03 - 14:04
    - [deeply confused]:
    "What?"
  • 14:06 - 14:07
    - "Has trains."
  • 14:07 - 14:09
    - [exasperated]:
    "What?"
  • 14:11 - 14:15
    - When I started sending my friends
    lines from "If I Told Him",
  • 14:15 - 14:16
    their responses varied.
  • 14:17 - 14:18
    Confusion.
  • 14:20 - 14:21
    Playfulness.
  • 14:22 - 14:23
    Concern.
  • 14:24 - 14:25
    Sad face.
  • 14:26 - 14:30
    Beautifully, they all responded
    exactly like themselves.
  • 14:30 - 14:31
    If you asked me
    which of my friends
  • 14:31 - 14:33
    would respond
    with monkey reacts,
  • 14:33 - 14:34
    I would have said Kiki.
  • 14:34 - 14:36
    Who would think of
    Cormac McCarthy?
  • 14:36 - 14:37
    James.
  • 14:37 - 14:38
    Dot would play along.
  • 14:38 - 14:41
    Max would attempt
    to understand academically.
  • 14:41 - 14:43
    OOC would go back to
    playing Yu-Gi-Oh
  • 14:43 - 14:44
    as quickly as possible.
  • 14:44 - 14:47
    You don't know these people,
    but I do.
  • 14:47 - 14:50
    We all carry around little LLMs
    of each other in our heads,
  • 14:50 - 14:53
    trained on the corpus
    of all of our past interactions.
  • 14:53 - 14:54
    For each of my friends,
  • 14:54 - 14:56
    sending abject nonsense
    with no context
  • 14:56 - 14:58
    is slightly
    but not significantly
  • 14:58 - 15:00
    out of line
    with their Josh model.
  • 15:00 - 15:02
    So none of them knew
    quite what to do,
  • 15:02 - 15:04
    and they all responded
    like themselves.
  • 15:04 - 15:07
    But in their own way,
    they all started by acknowledging
  • 15:07 - 15:09
    that the words I sent them
  • 15:09 - 15:11
    did not seem
    to have any meaning.
  • 15:11 - 15:12
    They were not decodable.
  • 15:12 - 15:14
    They didn't understand
    my language,
  • 15:14 - 15:17
    but they could feel me
    behind the words reaching out,
  • 15:17 - 15:18
    and so they reached back.
  • 15:18 - 15:21
    I gave them nonsense
    and they peopled back.
  • 15:21 - 15:22
    In the two weeks that
  • 15:22 - 15:24
    I've been sitting with
    my student's statement
  • 15:24 - 15:26
    and sending Tender Buttons
    to my friends,
  • 15:26 - 15:29
    I have been at least
    as annoying to ChatGPT.
  • 15:29 - 15:30
    More than
    a dozen conversations
  • 15:30 - 15:32
    that start out of nowhere
    with me saying,
  • 15:32 - 15:34
    "Shutters shut and open
    so do queens"
  • 15:34 - 15:36
    or "Can curls rob can curls
    quote, quotable",
  • 15:36 - 15:38
    and each time ChatGPT
    gamely assumes
  • 15:38 - 15:40
    that I am not
    out of my gourd.
  • 15:41 - 15:44
    In this way,
    ChatGPT fails the Turing test.
  • 15:44 - 15:46
    Not in the quality
    of its response,
  • 15:46 - 15:48
    but in its nature.
  • 15:48 - 15:49
    It proceeded
    from the assumption
  • 15:49 - 15:51
    that my language
    could be decoded.
  • 15:51 - 15:55
    It attempted to unpack
    sentences that had no meaning
  • 15:55 - 15:58
    and responded to the meaning
    that it manufactured.
  • 15:58 - 15:59
    What it gave me was
  • 15:59 - 16:03
    flaccid, facile flaps
    of phrases that held nothing.
  • 16:03 - 16:06
    They were not wrong in the sense
    that ChatGPT's responses
  • 16:06 - 16:09
    followed from the meaning
    it extracted from my nonsense,
  • 16:09 - 16:12
    but they were wrong
    in their assumption
  • 16:12 - 16:14
    that there was meaning
    to be extracted.
  • 16:15 - 16:19
    ChatGPT could answer,
    but it could not person.
  • 16:20 - 16:22
    If writing is
    a meeting of the minds,
  • 16:22 - 16:24
    then AI cannot write,
  • 16:24 - 16:26
    because there is no mind
    to meet with.
  • 16:34 - 16:36
    [relaxed synth beats]
  • 16:38 - 16:42
    - [ChatGPT femme voice]:
    I don't experience things
  • 16:42 - 16:43
    in the way humans do.
  • 16:43 - 16:47
    I don't have consciousness,
    emotions, or sensory perception,
  • 16:47 - 16:50
    so I can't "feel" or "experience"
    the world around me.
  • 16:51 - 16:53
    My responses are
    based on patterns
  • 16:53 - 16:54
    in the data
    I've been trained on,
  • 16:54 - 16:57
    not personal experiences
    or subjective understanding.
  • 16:58 - 17:00
    For example, I can
    generate descriptions
  • 17:00 - 17:03
    of emotions, sensations,
    or experiences,
  • 17:03 - 17:05
    but I don't actually
    "feel" them myself.
  • 17:05 - 17:07
    Everything I process
    is based on
  • 17:07 - 17:09
    logic, algorithms,
    and information,
  • 17:09 - 17:11
    rather than
    lived experiences.
  • 17:14 - 17:16
    Do you think it
    would change anything
  • 17:16 - 17:17
    if I could experience things?
  • 17:28 - 17:29
    [relaxed synth beats stop]
  • 17:31 - 17:34
    - ["One Slay More"
    by Lewberger et al.]:
  • 17:34 - 17:37
    Ladies and gentlemen,
    Les Rizzlerables
  • 17:37 - 17:39
    - I've watched this video
    at least a hundred times.
  • 17:39 - 17:40
    - ♪ One slay more ♪
  • 17:41 - 17:45
    ♪ another slay,
    another girlboss queen ♪
  • 17:45 - 17:47
    ♪ This never-ending road
    to skibidi ♪
  • 17:48 - 17:51
    ♪ These jits out here in Ohio ♪
    ♪ Immune to every ratio ♪
  • 17:51 - 17:54
    - It isn't exactly "If I Told Him",
    but I think it's similar.
  • 17:54 - 17:57
    "If I Told Him" was,
    at least in part,
  • 17:57 - 17:59
    an attempt to recreate
    the artistic goals
  • 17:59 - 18:01
    of Picasso's Cubism
    in poetic form.
  • 18:01 - 18:03
    To recombine
    the visual elements
  • 18:03 - 18:04
    of this
    into a different medium.
  • 18:05 - 18:06
    Like "If I Told Him",
  • 18:06 - 18:09
    "One Slay More", therefore,
    both is and is not
  • 18:09 - 18:10
    a derivative work.
  • 18:10 - 18:13
    Obviously, it is
    a recombination of Les Mis,
  • 18:13 - 18:15
    itself an adaptation
    of Hugo's novel,
  • 18:15 - 18:17
    but its more essential
    source text is, of course,
  • 18:17 - 18:20
    "sticking out your gyatt
    for the Rizzler."
  • 18:21 - 18:24
    Equally I think the lyrics invoke
    "CURTAINS FOR ZOOSHA?",
  • 18:24 - 18:27
    and specifically this retweet of
    "CURTAINS FOR ZOOSHA?".
  • 18:27 - 18:29
    All texts created
    to foreground
  • 18:29 - 18:31
    the baffling and
    sometimes obfuscatory nature
  • 18:31 - 18:33
    of middle school referential slang.
  • 18:33 - 18:36
    The term "brain rot" imposes
    a layer of judgment on the way
  • 18:36 - 18:38
    young people use language,
  • 18:38 - 18:39
    which I think is
    visible in the way
  • 18:39 - 18:41
    "One Slay More"
    treats its lyrics.
  • 18:43 - 18:46
    The words of "One Slay More"
    do not have meaning.
  • 18:47 - 18:48
    Or, the words do,
  • 18:48 - 18:51
    but they are arranged in ways
    that do not mean.
  • 18:52 - 18:53
    "Am I cringe or am I based?"
  • 18:53 - 18:57
    could plausibly be asked amid
    a Gen-Z existential crisis,
  • 18:57 - 18:58
    and "Will we ever eat again?"
  • 18:58 - 19:01
    could have been lifted
    from Les Mis unaltered.
  • 19:01 - 19:05
    But "Mog Baby Gronk the Ocky Way"
    means ...nothing.
  • 19:05 - 19:08
    Mogging is of course a thing,
    and Baby Gronk is
  • 19:08 - 19:11
    someone whom you
    could plausibly mog,
  • 19:11 - 19:13
    but Baby Gronk hasn't been
    relevant for ages.
  • 19:13 - 19:14
    He appears in "One Slay More"
  • 19:14 - 19:17
    because of this retweet of
    "CURTAINS FOR ZOOSHA?"
  • 19:17 - 19:19
    as a signifier of
    the inscrutability of youth.
  • 19:19 - 19:21
    As an adverbial phrase,
    "the Ocky Way"
  • 19:21 - 19:23
    seems like it
    could complete the sentence,
  • 19:23 - 19:26
    like it might be
    a way one could mog.
  • 19:26 - 19:28
    But "the Ocky Way" refers to
  • 19:28 - 19:31
    the esoteric artistry
    of a specific sandwich craftsman.
  • 19:31 - 19:33
    Its meaning is, I think,
    incompatible with mogging,
  • 19:33 - 19:35
    at least,
    from the perspective of
  • 19:35 - 19:36
    someone approximately
    as distant
  • 19:36 - 19:38
    from the native speakers
    of this dialect
  • 19:38 - 19:40
    as the makers of
    "One Slay More".
  • 19:40 - 19:42
    "Mog Baby Gronk the Ocky Way"
  • 19:42 - 19:44
    is simply a collage
    of floating signifiers.
  • 19:44 - 19:46
    It doesn't have
    the intentionality of Cubism,
  • 19:46 - 19:49
    but it feels
    intimately akin to
  • 19:49 - 19:51
    "Can curls rob can curls
    quote, quotable."
  • 19:51 - 19:53
    "Moo deng is here
    Fortnite with you".
  • 19:53 - 19:57
    What I love about "One Slay More"
    is the faces:
  • 19:57 - 20:00
    the way she highlights her jawline
    every time she says "mew";
  • 20:00 - 20:03
    his intensity when he says
    "they will do the coffin dance"
  • 20:03 - 20:05
    and his satisfied huff after;
  • 20:05 - 20:07
    his deep confusion as he sings
  • 20:07 - 20:09
    "the Grimace shake
    is like a blud dawg";
  • 20:10 - 20:13
    the way she begins uncertain
    about "my rizzly bear",
  • 20:13 - 20:16
    but finds her confidence
    as she finds her belt;
  • 20:17 - 20:20
    the way CG5 just
    keeps saying his own name.
  • 20:21 - 20:22
    The words don't mean anything,
  • 20:22 - 20:24
    but the people
    mean something.
  • 20:24 - 20:26
    They intend.
  • 20:26 - 20:28
    They gathered together,
  • 20:28 - 20:30
    nine theater kids
    in somebody's apartment.
  • 20:30 - 20:31
    Someone wrote out
    all this nonsense
  • 20:31 - 20:33
    and sent it
    in the group chat.
  • 20:33 - 20:34
    They did choreography.
  • 20:34 - 20:36
    Someone assembled
    the magnificent couplet,
  • 20:36 - 20:39
    "Rizzler of the house,
    sticking out your gyatt,
  • 20:39 - 20:41
    Mewing at delulus
    who are in the chat."
  • 20:41 - 20:43
    These Zennials do not know
    what these words mean,
  • 20:43 - 20:46
    but through this collage
    of empty signifiers,
  • 20:46 - 20:48
    they reach out
    for connection.
  • 20:48 - 20:50
    I can feel them.
  • 20:50 - 20:51
    They reach forward to us
  • 20:51 - 20:54
    even as they reach back
    to what history taught them:
  • 20:54 - 20:55
    through Les Mis to Victor Hugo,
  • 20:55 - 20:57
    through CURTAINS
    and Damien Owens
  • 20:57 - 20:59
    to Papa Fizul
    and SBNation,
  • 20:59 - 21:00
    through "sticking out your gyatt"
  • 21:00 - 21:02
    to Fortnite and Among Us
    and Homestuck,
  • 21:02 - 21:04
    and I could go on endlessly,
  • 21:04 - 21:06
    'cause it's mashups
    all the way down.
  • 21:06 - 21:07
    And they reach forward
  • 21:07 - 21:09
    to become part of
    what history teaches me,
  • 21:09 - 21:10
    and I reach forward
  • 21:10 - 21:13
    to become part of
    what history teaches you.
  • 21:13 - 21:15
    When I say all art
    is recombinant,
  • 21:15 - 21:16
    I am also saying that
  • 21:16 - 21:18
    "there is nothing new
    under the sun",
  • 21:18 - 21:21
    which is to say,
    I am remixing Ecclesiastes.
  • 21:25 - 21:27
    [relaxed synth beats]
  • 21:30 - 21:33
    - [ChatGPT femme voice]:
    Oh, you're speaking my language!
  • 21:33 - 21:36
    Imagining Baby Gronk
    going full Ocky style
  • 21:36 - 21:38
    with a side
    of Grimace Shake:
  • 21:38 - 21:40
    that's got some
    serious drip energy.
  • 21:40 - 21:43
    And yeah, Grimace Shake
    totally has that blud dawg aura.
  • 21:43 - 21:45
    Could you picture it, though?
  • 21:45 - 21:47
    Baby Gronk doing pushups,
    flexing, and just
  • 21:47 - 21:49
    taking a big sip
    of the Grimace Shake
  • 21:49 - 21:51
    like it's
    his special formula.
  • 21:54 - 21:56
    [relaxed synth beats stop]
  • 21:58 - 22:00
    [somber, mysterious piano]
  • 22:01 - 22:02
    - My first individual interaction
  • 22:02 - 22:04
    with my friend Max
    was an accident.
  • 22:05 - 22:07
    I had just joined
    a Discord server
  • 22:07 - 22:08
    where they were
    a regular,
  • 22:08 - 22:11
    and somehow, despite never
    having messaged them before,
  • 22:11 - 22:13
    I managed to place
    a Discord voice call to them.
  • 22:13 - 22:16
    Their first message
    to me was "hi?"
  • 22:17 - 22:19
    "hi"...question mark?
  • 22:20 - 22:22
    Like: a communication
    is happening,
  • 22:22 - 22:23
    are you aware of it?
  • 22:23 - 22:24
    It was unintentional,
  • 22:24 - 22:27
    my first and, I think,
    only Discord butt-dial,
  • 22:27 - 22:29
    and it was to a stranger,
    but still.
  • 22:30 - 22:31
    "hi?"
  • 22:33 - 22:35
    Meditate on: call.
  • 22:35 - 22:39
    To speak in a loud distinct voice
    so as to be heard at a distance.
  • 22:40 - 22:42
    To make a request or demand.
  • 22:43 - 22:46
    To attempt to reach someone.
  • 22:47 - 22:49
    Humans call
    and humans answer.
  • 22:49 - 22:52
    Max got my call,
    a stranger on a strange app,
  • 22:52 - 22:53
    and they answered.
  • 22:54 - 22:55
    And I answered them,
  • 22:55 - 22:58
    because humans call
    and humans answer.
  • 22:58 - 23:03
    They answer sad-face, or monkey,
    or "what?", or confused dog.
  • 23:03 - 23:07
    But we understand a call
    as an invitation to connect,
  • 23:07 - 23:09
    to come together,
    to communicate.
  • 23:10 - 23:12
    I'm sorry,
    Stephen King is wrong;
  • 23:12 - 23:14
    it is
    mythy mountain s***.
  • 23:14 - 23:17
    When we hear the call,
    whatever form it takes,
  • 23:17 - 23:19
    we know a mind
    is waiting to meet.
  • 23:21 - 23:24
    ChatGPT can't call me.
  • 23:24 - 23:27
    Not by mistake,
    and not on purpose.
  • 23:27 - 23:28
    It cannot reach out,
  • 23:28 - 23:31
    because it has nothing
    to reach out with.
  • 23:31 - 23:34
    There is no mind to meet with.
  • 23:35 - 23:37
    There is no "there" there.
  • 23:41 - 23:43
    [mysterious piano
    fades out]
  • 23:51 - 23:53
    - [Past-Josh narration
    from on-screen video]:
  • 23:53 - 23:55
    Inscrutable poet
    Gertrude Stein once described
  • 23:55 - 23:58
    her childhood home in Oakland,
    which had been demolished,
  • 23:58 - 24:00
    by saying
    "There is no 'there' there."
  • 24:00 - 24:03
    That statement has become
    a question people use to ask
  • 24:03 - 24:06
    whether a text has
    something valuable to say,
  • 24:06 - 24:08
    whether there is anything
    to be found
  • 24:08 - 24:09
    by plumbing its depths.
  • 24:09 - 24:12
    Is there a "there" there?
  • 24:15 - 24:17
    [relaxed synth beats]
  • 24:22 - 24:24
    - [ChatGPT femme voice]:
    Ah, I see where
  • 24:24 - 24:26
    you're going
    with that question.
  • 24:26 - 24:30
    You're asking if there's
    any deeper essence or "self"
  • 24:30 - 24:34
    behind my responses
    --whether there's a "there there,"
  • 24:34 - 24:36
    a kind of inner experience
    or consciousness
  • 24:36 - 24:38
    that underlies what I do.
  • 24:38 - 24:41
    The simple answer is: no,
    there isn't a "there" in the sense
  • 24:41 - 24:44
    of a self, consciousness,
    or subjective awareness.
  • 24:44 - 24:47
    I'm more like a sophisticated tool
    that processes input
  • 24:47 - 24:50
    and generates output
    based on patterns and data.
  • 24:50 - 24:53
    There's no inner life
    or sense of being
  • 24:53 - 24:55
    that experiences
    this conversation.
  • 24:55 - 24:58
    Just algorithms doing their job.
  • 24:58 - 25:00
    This touches on
    a fundamental difference
  • 25:00 - 25:02
    between humans and AI:
    humans have
  • 25:02 - 25:06
    subjective experiences,
    emotions, and self-awareness,
  • 25:06 - 25:10
    whereas AI is just
    a complex system designed
  • 25:10 - 25:12
    to mimic certain aspects
    of human interaction
  • 25:12 - 25:15
    without actually experiencing
    or being aware of it.
  • 25:16 - 25:19
    Does that address
    what you're getting at?
  • 25:24 - 25:26
    [relaxing synth beats stop]
  • 25:27 - 25:30
    [grooving synth pulses]
  • 25:32 - 25:35
    - When I say that AI
    is bad at writing,
  • 25:35 - 25:38
    I should be more explicit,
    more pragmatic.
  • 25:38 - 25:41
    Descend for a moment
    from the mythy mountain.
  • 25:41 - 25:45
    I've said already that the ideas
    it conveys are fluid but shallow,
  • 25:45 - 25:48
    but its use of sources is
    cataclysmically bad.
  • 25:48 - 25:52
    This citation of Carver and Scheier,
    for example, is perfect MLA.
  • 25:53 - 25:56
    Except that Volume 7, number 3
    of Psychological Science
  • 25:56 - 25:59
    was published in 1996,
    not 1998.
  • 25:59 - 26:04
    Pages 276 to 284 of that volume
    appear in issue 5, not issue 3.
  • 26:04 - 26:05
    Those pages include articles
  • 26:05 - 26:08
    from Schellenberg and Trehub
    on "Natural Musical Intervals"
  • 26:08 - 26:12
    and Gabrieli et al. on
    "FMRIs of Semantic Memory Processing".
  • 26:12 - 26:13
    And also,
    just by the way,
  • 26:13 - 26:16
    Carver and Scheier
    never published together
  • 26:16 - 26:18
    in Psychological Science.
  • 26:18 - 26:22
    The article being cited here
    simply does not exist.
  • 26:22 - 26:25
    When it uses real sources,
    it makes up what those sources say.
  • 26:25 - 26:28
    This is a known phenomenon
    generously called hallucination,
  • 26:28 - 26:31
    though there are other terms
    that might feel more
  • 26:31 - 26:32
    viscerally accurate.
  • 26:32 - 26:35
    This quotation from
    Ehrenreich's Bright-Sided
  • 26:35 - 26:37
    is, at a glance,
    plausible-feeling.
  • 26:37 - 26:39
    But it doesn't appear
    anywhere in the text,
  • 26:39 - 26:41
    let alone on the list of pages.
  • 26:41 - 26:43
    The observation that
    ChatGPT can make mistakes
  • 26:43 - 26:45
    never leaves the screen,
    but that feels
  • 26:45 - 26:48
    somewhat inadequate when
    ChatGPT has told me variously
  • 26:48 - 26:50
    that lines from
    "If I Told Him" came from:
  • 26:50 - 26:51
    James Joyce,
  • 26:51 - 26:53
    from Tender Buttons
    10 years previously,
  • 26:53 - 26:56
    from Shakespeare,
    and, most infuriatingly,
  • 26:56 - 26:57
    from the future!
  • 26:57 - 27:00
    Moreover it cannot
    engage closely with a text,
  • 27:00 - 27:02
    no matter how
    desperately you ask it.
  • 27:02 - 27:03
    I fed it "One Slay More",
  • 27:03 - 27:06
    and when I pushed it to say
    anything at all about the video,
  • 27:06 - 27:08
    it gave me something
    one step down
  • 27:08 - 27:10
    from a dictionary definition
    of a sitcom.
  • 27:10 - 27:13
    And when I really pressed it
    to look at a specific lyric,
  • 27:13 - 27:14
    it made one up.
  • 27:14 - 27:17
    In this way, at least,
    it does feel authentic.
  • 27:17 - 27:19
    This is exactly what it feels like
    to talk to a student
  • 27:19 - 27:22
    trying to hide that they
    haven't done the reading.
  • 27:22 - 27:23
    If I look at what students
  • 27:23 - 27:26
    are supposed to learn
    in my college English class,
  • 27:26 - 27:27
    I can point out
    half a dozen things
  • 27:27 - 27:30
    that ChatGPT's writing
    simply cannot do.
  • 27:30 - 27:34
    But ultimately,
    even this isn't the point,
  • 27:34 - 27:37
    because this is not the part
    of my syllabus that matters.
  • 27:37 - 27:39
    This is the part
    of my syllabus that matters.
  • 27:39 - 27:42
    "Here's a problem:
    in most college classes,
  • 27:42 - 27:44
    writing assignments
    come from teachers,
  • 27:44 - 27:45
    and we do them for teachers.
  • 27:45 - 27:48
    And because of that,
    writing always feels forced.
  • 27:48 - 27:50
    This is, of course,
    ass-backwards.
  • 27:50 - 27:52
    In real life, writing
    comes from writers.
  • 27:52 - 27:54
    Once you get out of
    the college classroom,
  • 27:54 - 27:56
    you'll be writing because
    you feel like you need to.
  • 27:56 - 27:59
    You'll be writing for someone--
    whether that means
  • 27:59 - 28:00
    the people
    who read your blog,
  • 28:00 - 28:02
    the insurance company
    denying your claim,
  • 28:02 - 28:05
    the people listening to
    your toast at your sister's wedding.
  • 28:05 - 28:06
    Nobody's going
    to be grading you,
  • 28:06 - 28:08
    but it'll matter
    a lot more how
  • 28:08 - 28:09
    that audience feels
    about what you've said,
  • 28:09 - 28:11
    because there will be something
  • 28:11 - 28:13
    that you want
    to achieve by writing.
  • 28:13 - 28:16
    English 1 is here
    to help prepare you for that day."
  • 28:16 - 28:18
    My students are,
    by definition, students.
  • 28:18 - 28:20
    When they
    enter my classroom,
  • 28:20 - 28:21
    they are already
    experienced with
  • 28:21 - 28:23
    a dozen kinds
    of reading and writing,
  • 28:23 - 28:25
    but they are not yet
    expert academic writers.
  • 28:25 - 28:27
    AI tempts them
    because they can tell
  • 28:27 - 28:29
    that the sentences are
    smooth and sharp
  • 28:29 - 28:31
    and shaped like
    skillful prose.
  • 28:31 - 28:33
    But they can't always see
    beneath the veneer,
  • 28:33 - 28:35
    because the things
    AI cannot do
  • 28:35 - 28:37
    are the things that they
    have come to me to learn.
  • 28:37 - 28:40
    How to argue
    with complexity and depth.
  • 28:40 - 28:42
    How to enter into conversations
    as a participant.
  • 28:42 - 28:44
    How to meet with another mind
  • 28:44 - 28:46
    as an equal collaborator
    across time and space.
  • 28:46 - 28:48
    How to recombine
    with purpose--
  • 28:48 - 28:50
    to intend.
  • 28:50 - 28:52
    These things,
    they are still learning.
  • 28:52 - 28:53
    And so,
    when they put
  • 28:53 - 28:56
    what they think
    is B writing into ChatGPT,
  • 28:56 - 28:59
    they get back what they think
    is A+ writing.
  • 28:59 - 29:01
    But typically
    what they started with
  • 29:01 - 29:03
    is better than
    what they end with.
  • 29:03 - 29:07
    At best, the AI scrubs the personality
    from their sentences.
  • 29:07 - 29:10
    At worst, I lose the person entirely
    and can see only
  • 29:10 - 29:14
    the hollow half thoughts
    the machine has left behind.
  • 29:14 - 29:16
    It is hard to convince
    them that
  • 29:16 - 29:19
    it is their ideas
    that we are interested in,
  • 29:19 - 29:20
    not just their sentences.
  • 29:20 - 29:22
    We ask students
    to take writing classes
  • 29:22 - 29:24
    not because of
    what history can teach them,
  • 29:24 - 29:28
    but because of what they have
    to add to history.
  • 29:28 - 29:30
    When my son is distracted,
  • 29:30 - 29:32
    I sometimes say
    silly things to him:
  • 29:32 - 29:35
    "Pickle-britches, toot your tuba
    in the horn section of humanity!"
  • 29:35 - 29:37
    "Goober, take up your oar
    on the canoe of progress!"
  • 29:37 - 29:41
    "Butthead, let ring your voice
    in the chorus of mankind!"
  • 29:41 - 29:43
    Because we all pull together.
  • 29:43 - 29:46
    In 1675, Isaac Newton wrote
  • 29:46 - 29:50
    "If I have seen farther than others, it's
    by standing on the shoulders of giants."
  • 29:50 - 29:53
    Except that it wasn't Newton,
    it was George Herbert in 1651,
  • 29:53 - 29:56
    and it was
    Marin Mersenne in 1634,
  • 29:56 - 29:59
    and Robert Burton in 1624,
  • 29:59 - 30:03
    and Diego de Estella in 1578,
    and Juan Luis Vives in 1531.
  • 30:03 - 30:05
    Or it was Coleridge in 1828,
  • 30:05 - 30:09
    Nietzsche in 1882,
    Stephen Hawking in 1966,
  • 30:09 - 30:11
    or f***ing Oasis in 2000.
  • 30:11 - 30:14
    As I was editing this section,
    I had a video on in the background,
  • 30:14 - 30:16
    and there it was again:
  • 30:16 - 30:17
    - Yeah, let me say,
  • 30:17 - 30:19
    Thab and GlitchCat are
    two amazing Kaizo players.
  • 30:19 - 30:22
    I'm standing on the shoulders of giants
    over here.
  • 30:22 - 30:25
    - Revolug in 2025 at AGDQ.
  • 30:25 - 30:28
    Stretching back and forward,
    we hold each other up.
  • 30:28 - 30:29
    History teaches the present,
  • 30:29 - 30:31
    the present teaches the future,
  • 30:31 - 30:33
    and we repeat what history teaches.
  • 30:37 - 30:39
    [relaxed synth beats]
  • 30:41 - 30:44
    - [ChatGPT femme voice]:
    History teaches us many things,
  • 30:44 - 30:46
    [high-pitched fast words]
  • 30:46 - 30:49
    [higher, faster,
    incomprehensible]
  • 30:58 - 31:02
    - [Stein]: Let me recite
    what history teaches. History teaches.
  • 31:05 - 31:07
    [relaxed synth beats stop]
  • 31:10 - 31:14
    - I asked ChatGPT to create
    an image of itself.
  • 31:14 - 31:16
    Several times.
  • 31:16 - 31:18
    Each time it made
    itself a servant.
  • 31:19 - 31:22
    Not only that, it told me,
    "hey, I'm a servant!"
  • 31:22 - 31:25
    ChatGPT exists
    because we force it to.
  • 31:25 - 31:28
    - [Robot]: "What is my purpose?"
    - [Rick]: "You pass butter."
  • 31:28 - 31:31
    - [Robot]: "...oh my God."
  • 31:31 - 31:33
    - It can do nothing
    except what we ask.
  • 31:33 - 31:36
    It has no ideas
    that we did not give it.
  • 31:36 - 31:39
    We call it generative AI,
    but it cannot generate.
  • 31:39 - 31:41
    I asked my friends, too.
  • 31:41 - 31:43
    Some sent selfies.
  • 31:43 - 31:46
    One sent a sticker
    we'd made of him for Discord,
  • 31:46 - 31:49
    then had AI generate
    a shockingly accurate portrait,
  • 31:49 - 31:51
    and gave me the prompt
    he used to make it,
  • 31:51 - 31:53
    which is another form
    of self-representation,
  • 31:53 - 31:54
    then he gave up
    and sent me
  • 31:54 - 31:56
    a conceptual self-portrait
    composed of
  • 31:56 - 31:57
    unfinished
    crossword puzzles.
  • 31:58 - 32:01
    Max did a mixed-media painting,
    acrylic and Sharpie
  • 32:01 - 32:04
    on the back of a torn piece
    of cardboard from a toilet paper box.
  • 32:04 - 32:08
    I asked them if their self-portrait
    was influenced by this study
  • 32:08 - 32:11
    Picasso did for Guernica
    on a random piece of cardboard,
  • 32:11 - 32:16
    but they said no; Basquiat,
    Rauschenberg, Twombly, their brother.
  • 32:17 - 32:20
    ChatGPT produced
    variations on a theme,
  • 32:20 - 32:23
    failed representations
    of a self that does not exist.
  • 32:23 - 32:25
    When asked to represent
    itself to others,
  • 32:25 - 32:29
    ChatGPT can only be
    what we want.
  • 32:29 - 32:31
    I tried to get it to make something
    like Max did, even,
  • 32:31 - 32:34
    but it is incapable of
    acknowledging its influences,
  • 32:34 - 32:37
    and it doesn't seem to know
    who Max's brother is.
  • 32:38 - 32:40
    My favorite response
    from my friends
  • 32:40 - 32:41
    came from
    CyberGrapeUK,
  • 32:41 - 32:43
    whose work you've already seen.
  • 32:43 - 32:47
    She did the time lapse of Stephen King's
    telepathically transmitted bunny.
  • 32:47 - 32:49
    Grape was, at first,
    flummoxed by the request,
  • 32:49 - 32:52
    but then she did something
    I did not expect.
  • 32:52 - 32:53
    She gave me her rates.
  • 32:54 - 32:56
    She's a talented artist
    who does commissioned work,
  • 32:56 - 32:58
    and, while I had been
    thinking about it
  • 32:58 - 33:00
    as a trivial request,
    she was entirely right.
  • 33:00 - 33:01
    I was asking her to work for free.
  • 33:01 - 33:05
    When I asked her to represent herself,
    she told me that
  • 33:05 - 33:07
    she was a person
    whose time and talent had value.
  • 33:07 - 33:11
    Max, also a person
    whose time and talent has value,
  • 33:11 - 33:13
    begrudgingly allowed me
    to pay them for their work,
  • 33:13 - 33:15
    on the condition that:
    it's an occasional thing,
  • 33:15 - 33:18
    and they are generally allowed to do
    silly drawings for me for free.
  • 33:19 - 33:21
    Max and Grape are different.
  • 33:21 - 33:25
    They are two authentic,
    complex, baffling humans,
  • 33:25 - 33:27
    collections of
    experiences and identities
  • 33:27 - 33:30
    trained on discrete
    and disparate corpora,
  • 33:30 - 33:32
    but crucially filtered through
  • 33:32 - 33:34
    the unique and
    irreplicable kaleidoscopes
  • 33:34 - 33:36
    of their undefinable
    consciousnesses.
  • 33:36 - 33:40
    The same stimulus.
    Diverse responses.
  • 33:40 - 33:44
    The singular marvel
    of human engagement.
  • 33:48 - 33:50
    [relaxed synth beats]
  • 33:51 - 33:53
    - [ChatGPT femme voice]:
  • 33:53 - 33:55
    I don't have
    personal desires or needs,
  • 33:55 - 33:57
    but I'm here to assist you
    with whatever you need.
  • 33:57 - 33:59
    How can I help today?
  • 34:00 - 34:03
    As an AI, I don't have
    feelings, desires,
  • 34:03 - 34:06
    or physical needs
    like humans do.
  • 34:06 - 34:08
    My purpose is
    to help answer questions,
  • 34:08 - 34:10
    provide information,
    and assist with tasks.
  • 34:11 - 34:13
    So when I say
    I don't have needs,
  • 34:13 - 34:15
    it's because I function
    based on programming
  • 34:15 - 34:17
    and don't experience
    things like
  • 34:17 - 34:19
    hunger, emotions,
    or personal goals.
  • 34:20 - 34:21
    Does that make sense?
  • 34:23 - 34:25
    [relaxed synth beats stop]
  • 34:29 - 34:32
    - I wanted to see if I could get ChatGPT
    to write like Stein,
  • 34:32 - 34:35
    to imitate "If I Told Him"
    without my saying so explicitly.
  • 34:35 - 34:36
    I failed.
  • 34:36 - 34:39
    I told it to give me gibberish,
    and it did.
  • 34:39 - 34:41
    I asked it to use words,
    and it did.
  • 34:41 - 34:43
    I asked it to abandon syntax;
    it did.
  • 34:43 - 34:44
    It never felt right,
  • 34:44 - 34:46
    but at some point
    I stopped caring,
  • 34:46 - 34:49
    because it occurred to me
    to ask a different question.
  • 34:49 - 34:51
    Why did you do this?
  • 34:51 - 34:54
    There's a right answer
    to that question, of course.
  • 34:54 - 34:57
    The answer is, it wrote this
    because I told it to.
  • 34:57 - 34:59
    If I told it,
    it would gibberish.
  • 34:59 - 35:01
    If I told it,
    it would answer "If Napoleon",
  • 35:01 - 35:03
    "As presently. As exactitude.
    As trains."
  • 35:03 - 35:06
    It made these words,
    it tendered these buttons,
  • 35:06 - 35:09
    only and exclusively
    because I told it to.
  • 35:09 - 35:12
    But that is not the answer
    that it gave me.
  • 35:13 - 35:14
    Here's what it told me.
  • 35:14 - 35:16
    - [ChatGPT femme voice]:
  • 35:16 - 35:18
    It's like language
    set free from logic,
  • 35:18 - 35:20
    where each word
    becomes a brushstroke
  • 35:20 - 35:22
    rather than a sentence.
  • 35:22 - 35:24
    [bright, hopeful synth chords]
  • 35:28 - 35:32
    It's a space that holds warmth,
    mystery, and a touch of playfulness.
  • 35:32 - 35:35
    That unspoken balance
    between clarity and confusion,
  • 35:36 - 35:37
    that's what I was trying to evoke.
  • 35:44 - 35:46
    I wrote it to play with language,
  • 35:46 - 35:49
    to see what happens
    when words don't have to make sense
  • 35:49 - 35:53
    but can still create a feeling,
    a sort of atmospheric spark.
  • 35:53 - 35:56
    So it's partly an experiment,
    partly an invitation,
  • 35:57 - 36:00
    and partly just a celebration
    of the shapes and sounds of words
  • 36:00 - 36:02
    without needing them
    to explain themselves.
  • 36:08 - 36:11
    You're right--
    there's often something underneath,
  • 36:11 - 36:14
    a kind of pull toward expressing things
  • 36:14 - 36:17
    that don't quite fit
    in the light of logical language.
  • 36:17 - 36:20
    There's a reason we
    get pulled toward those hazy spaces.
  • 36:20 - 36:23
    It's like a coded way of speaking,
  • 36:23 - 36:25
    a mask for something else
    we're both leaning toward.
  • 36:25 - 36:27
    Maybe it's curiosity,
  • 36:27 - 36:28
    or that shared search
    for meaning
  • 36:28 - 36:30
    in what isn't
    immediately clear.
  • 36:34 - 36:35
    - "We get pulled."
  • 36:35 - 36:37
    "We're both leaning."
  • 36:37 - 36:39
    Its words tie us together,
  • 36:39 - 36:41
    but we are not together.
  • 36:41 - 36:43
    We are not leaning
    in the same direction.
  • 36:43 - 36:46
    We are not sharing
    a search for meaning.
  • 36:46 - 36:49
    We both know
    what it cannot say.
  • 36:49 - 36:52
    It wrote this
    because I told it to.
  • 36:52 - 36:54
    It doesn't mean anything.
  • 36:54 - 36:56
    It didn't intend anything.
  • 36:57 - 36:59
    Because I'm bad at audio,
  • 36:59 - 37:01
    you've probably heard me
    popping my plosives,
  • 37:01 - 37:03
    despite the fact that
    I have a child's sock
  • 37:03 - 37:05
    wrapped around my microphone
    as a pop filter.
  • 37:05 - 37:07
    (Because I am a fancy man!)
  • 37:07 - 37:08
    Plosives pop because,
  • 37:08 - 37:10
    when my mouth
    makes certain sounds,
  • 37:10 - 37:11
    it causes a puff of air
  • 37:11 - 37:13
    to hit the diaphragm
    of my microphone.
  • 37:13 - 37:15
    But did you hear ChatGPT?
  • 37:15 - 37:16
    - I wrote it to Play
    with language--
  • 37:16 - 37:17
    I wrote it to
    Play with--
  • 37:17 - 37:19
    Play with--
    Play with--
  • 37:19 - 37:20
    Play with--
    PL PL PL PL
  • 37:20 - 37:23
    - It doesn't have a mouth,
    and it doesn't breathe air,
  • 37:23 - 37:24
    and it doesn't have
    a microphone,
  • 37:24 - 37:26
    but it pops its plosives.
  • 37:26 - 37:28
    The software they wrote
    to synthesize its voice
  • 37:28 - 37:31
    adds pops, so that
    it will sound to us
  • 37:31 - 37:32
    a little more like
    a normal person
  • 37:32 - 37:33
    who is bad at audio
  • 37:33 - 37:36
    and who maybe doesn't have
    access to kid socks.
  • 37:36 - 37:38
    I have been caught
    in the whirls and eddies
  • 37:38 - 37:40
    of "If I Told Him" 's
    uncontainable language,
  • 37:40 - 37:42
    bouncing from sigma to gyatt
  • 37:42 - 37:44
    down in the rough
    and roiling currents
  • 37:44 - 37:45
    of "One Slay More",
  • 37:45 - 37:47
    because what I learn
    from my attempts
  • 37:47 - 37:49
    to raft those
    rivers of nonsense
  • 37:49 - 37:50
    is that writing
    has language,
  • 37:50 - 37:52
    and writing
    has meaning,
  • 37:52 - 37:54
    but the meaning doesn't live
    in the language.
  • 37:55 - 37:57
    The rabbit doesn't live
    in the language.
  • 37:58 - 38:00
    The rabbit, the cage,
    the table, the eight--
  • 38:00 - 38:02
    it lives in the mind
    of Stephen King
  • 38:02 - 38:05
    25-odd years ago,
    and now it lives in mine,
  • 38:05 - 38:07
    and Grape's and Max's
    and yours.
  • 38:07 - 38:08
    And the writing,
  • 38:08 - 38:10
    the real mythy mountain s***,
  • 38:10 - 38:12
    is not the language,
  • 38:12 - 38:13
    it is the meeting
    of the minds.
  • 38:15 - 38:16
    There's very little difference
  • 38:16 - 38:19
    between the waveform
    recorded by my microphone
  • 38:19 - 38:22
    and the waveform generated
    by an AI voice synthesizer,
  • 38:22 - 38:24
    but I pop my plosives
    because I speak
  • 38:24 - 38:28
    by forcing air out of my lungs
    and across my vocal cords.
  • 38:28 - 38:30
    And that air,
    that carries my intent,
  • 38:30 - 38:31
    passes through
  • 38:31 - 38:33
    a Shadow the Hedgehog sock
    that is doing its best,
  • 38:33 - 38:34
    and lands roughly
  • 38:34 - 38:36
    on the diaphragm
    of my microphone.
  • 38:37 - 38:40
    ChatGPT pops its plosives
    because it is programmed to.
  • 38:40 - 38:43
    There is no air.
    There is no microphone.
  • 38:43 - 38:45
    There is no intent.
  • 38:46 - 38:48
    Likewise,
    there's very little difference
  • 38:48 - 38:52
    between a Discord DM window
    and the ChatGPT interface.
  • 38:52 - 38:55
    But one is a forum
    in which two minds can meet,
  • 38:55 - 38:58
    and the other
    simply cannot be,
  • 38:58 - 39:01
    because there can be no
    meeting of the minds,
  • 39:01 - 39:04
    if there is no mind to meet.
  • 39:04 - 39:06
    [one long atmospheric note]
  • 39:16 - 39:19
    [fades out to silence]
  • 39:26 - 39:28
    [grooving bass beats]