
Psychological Research - Crash Course Psychology #2

  • 0:00 - 0:02
    Can week-old pizza cause
    psychedelic hallucinations?
  • 0:02 - 0:04
Does coffee make you smarter,
  • 0:04 - 0:06
    or does it just make you
    do dumb stuff faster?
  • 0:06 - 0:07
Like much of psychology itself,
  • 0:07 - 0:09
    questions like this can seem pretty intuitive.
  • 0:09 - 0:12
    I mean, people may not be the easiest
    organisms to understand, but...
  • 0:12 - 0:13
    You're a person, right?
  • 0:13 - 0:17
    So you must be qualified to draw, like,
    some conclusions about other people,
  • 0:17 - 0:18
    and what makes them tick.
  • 0:18 - 0:21
    But it's important to realize that
    your intuition isn't always right.
  • 0:21 - 0:25
    In fact, sometimes it is exactly wrong,
  • 0:25 - 0:28
    and we tend to grossly underestimate
    the dangers of false intuition.
  • 0:28 - 0:31
    If you have some idea about a person and
    their behavior that turns out to be right,
  • 0:31 - 0:34
    that reinforces your trust in your intuition.
  • 0:34 - 0:35
    Like if I warned my buddy Bob
  • 0:35 - 0:38
    against eating the deep-dish pizza
    that's been in the fridge for the past week,
  • 0:38 - 0:41
    but he eats it anyway and
    soon starts to wig out,
  • 0:41 - 0:43
    I'm gonna say: "Dude, I told you so!"
  • 0:43 - 0:45
    But if I'm wrong, and he's totally fine,
  • 0:45 - 0:48
I probably won't even
think about it ever again.
  • 0:48 - 0:52
    This is known as hindsight bias,
    or the "I-knew-it-all-along" phenomenon.
  • 0:52 - 0:55
    This doesn't mean that common sense is wrong,
  • 0:55 - 0:56
it just means that our intuitive sense
  • 0:56 - 1:00
    more easily describes what JUST happened
    than what WILL happen in the future.
  • 1:00 - 1:03
    Another reason you can't
    blindly trust your intuition
  • 1:03 - 1:05
    is your natural tendency toward overconfidence.
  • 1:05 - 1:09
    Sometimes, you just really, really
    feel like you're right about people
  • 1:09 - 1:12
    when actually, you're really, really wrong!
  • 1:12 - 1:13
    We've all been there...
  • 1:13 - 1:16
    We also tend to perceive order in random events,
  • 1:16 - 1:17
    which can lead to false assumptions.
  • 1:17 - 1:20
    For example, if you flip a coin five times,
  • 1:20 - 1:25
you have the same chance of getting all tails
as you do of getting alternating heads and tails,
  • 1:25 - 1:29
    but we see the series of five tails
    as something unusual, as a streak,
  • 1:29 - 1:33
and thus give that result some kind of
meaning that it very definitely does not have.
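
A quick simulation makes that claim concrete. Here's a minimal Python sketch (an illustration, not from the video) that counts how often each specific five-flip sequence turns up:

    import random

    trials = 100_000
    counts = {"TTTTT": 0, "THTHT": 0}  # the "streak" vs. a "random-looking" run
    for _ in range(trials):
        seq = "".join(random.choice("HT") for _ in range(5))
        if seq in counts:
            counts[seq] += 1

    # Both frequencies land near (1/2)**5 = 1/32, about 3.1%:
    # the streak is exactly as likely as the alternating run.
    for seq, n in counts.items():
        print(seq, n / trials)
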
  • 1:33 - 1:35
    That is why we have the methods and safeguards
  • 1:35 - 1:39
    of psychological research and experimentation,
  • 1:39 - 1:42
    and the glorious process of scientific inquiry.
  • 1:42 - 1:44
    They help us to get around these problems
  • 1:44 - 1:48
    and basically, save the study of our
    minds from the stupidity of our minds.
  • 1:48 - 1:53
    So I hope that it won't be a spoiler if I
    tell you now that pizza won't make you trip,
  • 1:53 - 1:56
    and coffee doesn't make you smart. Sorry.
  • 1:56 - 2:00
    [on-screen animations and
    ribbons of science sentences]
  • 2:04 - 2:06
    Title screen says "Episode 2:
    Research & Experimentation."
  • 2:06 - 2:08
    In most ways, psychological research
  • 2:08 - 2:10
is no different than in any
other scientific discipline.
  • 2:10 - 2:15
    Like, step one is always figuring out how to
    ask general questions about your subject,
  • 2:15 - 2:18
    and turn them into measurable,
    testable propositions.
  • 2:18 - 2:21
    This is called "operationalizing" your questions.
  • 2:21 - 2:23
    So you know how the scientific method works.
  • 2:23 - 2:25
    It starts with a question and a theory.
  • 2:25 - 2:28
    And I don't mean theory in the
    sense of like, a hunch that says
  • 2:28 - 2:30
    "a quad-shot of espresso
    makes you think better."
  • 2:30 - 2:36
    Instead, in science, a theory is what explains
    and organizes lots of different observations
  • 2:36 - 2:37
    and predicts outcomes.
  • 2:37 - 2:39
    And when you come up
    with a testable prediction,
  • 2:39 - 2:41
    that's your hypothesis.
  • 2:41 - 2:43
    Once your theory and hypothesis are in place
  • 2:43 - 2:46
    you need a clear and common
    language to report them with.
  • 2:46 - 2:48
    So, for example,
    defining exactly what you mean
  • 2:48 - 2:50
    by "thinking better" with
    your espresso hypothesis
  • 2:50 - 2:53
    would allow other researchers
    to replicate the experiment.
  • 2:53 - 2:55
    And replication is key.
  • 2:55 - 2:58
    You can watch a person exhibit
    a certain behavior once,
  • 2:58 - 3:00
    and it won't prove very much.
  • 3:00 - 3:04
    But if you keep getting consistent results
    even as you change subjects or situations,
  • 3:04 - 3:05
    you're probably onto something.
  • 3:05 - 3:09
    This is a problem with one popular
    type of psychological research:
  • 3:09 - 3:12
    case studies, which take an
    in-depth look at one individual.
  • 3:12 - 3:14
    Case studies can sometimes be misleading,
  • 3:14 - 3:16
    because by their nature,
    they can't be replicated;
  • 3:16 - 3:18
    so, they run the risk of over-generalizing.
  • 3:18 - 3:21
    Still, they're good at showing
    us what CAN happen,
  • 3:21 - 3:25
    and end up framing questions for more
    extensive and generalizable studies.
  • 3:25 - 3:26
    They're also often memorable
  • 3:26 - 3:30
    and a great story-telling device psychologists
    use to observe and describe behavior.
  • 3:30 - 3:34
    Like, say, the smell of coffee makes
    Carl suddenly anxious and irritable.
  • 3:34 - 3:37
    That obviously doesn't mean that it
    has the same effect on everyone.
  • 3:37 - 3:41
    In fact, Carl has terrible memories
    associated with that smell,
  • 3:41 - 3:43
    and so his case is actually quite rare.
  • 3:43 - 3:44
    Poor Carl... :(
  • 3:44 - 3:46
But you would still have to look at lots of
  • 3:46 - 3:48
    other cases to determine that conclusively.
  • 3:48 - 3:52
    Another popular method of psychological
    research is naturalistic observation,
  • 3:52 - 3:55
    where researchers simply watch
    behavior in a natural environment,
  • 3:55 - 3:58
    whether that's chimps poking
    anthills in the jungle,
  • 3:58 - 4:01
    kids clowning in a classroom,
    or drunk dudes yelling at soccer games.
  • 4:01 - 4:04
    The idea is to let the subjects
    just "do their thing"
  • 4:04 - 4:06
    without trying to manipulate
    or control the situation.
  • 4:06 - 4:09
    So yeah, basically just spying on people.
  • 4:09 - 4:12
    Like case studies, naturalistic observations
    are great at describing behavior,
  • 4:12 - 4:15
    but they're very limited in explaining it.
  • 4:15 - 4:18
    Psychologists can also collect behavioral
    data using surveys or interviews,
  • 4:18 - 4:21
    asking people to report
    their opinions and behaviors.
  • 4:21 - 4:24
    Sexuality researcher Alfred Kinsey
    famously used this technique
  • 4:24 - 4:28
    when he surveyed thousands of men
    and women on their sexual history
  • 4:28 - 4:29
and published his findings
  • 4:29 - 4:33
    in a pair of revolutionary texts:
    "Sexual Behavior in the Human Male"
  • 4:33 - 4:35
    and "Sexual Behavior in the Human Female."
  • 4:35 - 4:37
    Surveys are a great way to access people's
  • 4:37 - 4:39
    consciously held attitudes and beliefs,
  • 4:39 - 4:43
but how you ask the questions can be tricky;
    subtle word choices can influence results.
  • 4:43 - 4:47
    For example, more forceful
    words like "ban" or "censor"
  • 4:47 - 4:50
    may elicit different reactions
    than "limit" or "not allow."
  • 4:50 - 4:53
    Asking: "Do you believe in space aliens?"
    is a much different question than
  • 4:53 - 4:56
    "Do you think that there is intelligent
    life somewhere else in the universe?"
  • 4:56 - 4:59
It's essentially the same question, but in the first,
    the subject may assume that you mean
  • 4:59 - 5:02
    "aliens visiting the Earth
    and making crop circles
  • 5:02 - 5:04
    and abducting people and poking them."
  • 5:04 - 5:06
    And if how you phrase surveys is important,
  • 5:06 - 5:07
    so is who you ask.
  • 5:07 - 5:11
    I could ask a room full of students at a pacifist
    club what they think about arms control,
  • 5:11 - 5:14
    but the results wouldn't be a representative
    measure of where the students stand,
  • 5:14 - 5:17
    because there's a pretty clear
    sampling bias at work here.
  • 5:17 - 5:20
    To fairly represent a population,
I'd need a random sample
  • 5:20 - 5:22
    where all members of the target group
  • 5:22 - 5:26
    (in this case, students) had an equal chance
    of being selected to answer the question.
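
In code, an equal-chance sample is one call to a random sampler. A minimal Python sketch (the roster and numbers are invented for illustration):

    import random

    # Hypothetical roster of the whole target group, not just the pacifist club.
    all_students = [f"student_{i}" for i in range(5000)]

    # random.sample gives every student an equal chance of selection,
    # which is what makes the result representative of the population.
    respondents = random.sample(all_students, k=200)
    print(respondents[:5])
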
  • 5:26 - 5:28
    So, once you've described behavior
  • 5:28 - 5:30
    with surveys, case studies,
    or naturalistic observation,
  • 5:30 - 5:34
    you can start making sense out of it
    and even predict future behavior.
  • 5:34 - 5:36
    One way to do that is to look at
  • 5:36 - 5:39
    how one trait or behavior is related
    to another or how they correlate.
  • 5:39 - 5:41
    So let's get back to my buddy Bob,
  • 5:41 - 5:44
    who seems to think that his refrigerator
    is actually some kind of time machine
  • 5:44 - 5:46
    that can preserve food indefinitely.
  • 5:46 - 5:49
Let's say that Bob has just tucked into
    a lunch of questionable leftovers...
  • 5:49 - 5:52
    Pizza that may very well have
    had a little bit of fungus on it...
  • 5:52 - 5:55
    But he was hungry. And lazy.
    And so he doused it in sriracha.
  • 5:55 - 5:57
    Suddenly, he starts seeing things.
  • 5:57 - 6:00
    Green armadillos with laser-beam-eyes.
  • 6:00 - 6:03
    From here we can deduce that eating
    unknown fungus predicts hallucination.
  • 6:03 - 6:07
    That's a correlation;
    but correlation is not causation.
  • 6:07 - 6:11
    Yes, it makes sense that eating questionable
    fungus would cause hallucinations,
  • 6:11 - 6:14
    but it's possible that Bob was already
    on the verge of a psychotic episode
  • 6:14 - 6:17
and those fuzzy leftovers were actually benign!
  • 6:17 - 6:20
Or there could be an entirely
different factor involved,
  • 6:20 - 6:22
    like maybe he hadn't slept in 72 hours
  • 6:22 - 6:24
    or had an intense migraine coming on,
  • 6:24 - 6:26
    and one of those factors
    caused his hallucinations.
  • 6:26 - 6:28
    It's tempting to draw
    conclusions from correlations,
  • 6:28 - 6:30
    but it's super important to remember
  • 6:30 - 6:34
that correlations predict the POSSIBILITY
of cause-and-effect relationships;
  • 6:34 - 6:36
they cannot prove them.
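
One way to see the gap between correlation and causation is to fake data where a hidden third factor drives both behaviors. A minimal sketch with invented numbers (requires Python 3.10+ for statistics.correlation):

    import random
    from statistics import correlation  # Python 3.10+

    random.seed(42)
    # Hypothetical: hours without sleep drives BOTH reckless leftover-eating
    # and hallucination intensity; neither directly causes the other.
    hours_awake = [random.uniform(8, 72) for _ in range(200)]
    recklessness = [h + random.gauss(0, 8) for h in hours_awake]
    hallucinating = [h + random.gauss(0, 8) for h in hours_awake]

    # Prints a strong positive correlation (around 0.9) with zero causation.
    print(correlation(recklessness, hallucinating))
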
  • 6:36 - 6:39
    So we've talked about how to describe
    behavior without manipulating it,
  • 6:39 - 6:42
    and how to make connections and
    predictions from those findings,
  • 6:42 - 6:44
    but that can only take you so far.
  • 6:44 - 6:46
    To really get to the bottom of
    cause-and-effect behaviors,
  • 6:46 - 6:48
    you're gonna have to start experimenting.
  • 6:48 - 6:51
    Experiments allow investigators
    to isolate different effects
  • 6:51 - 6:53
    by manipulating an independent variable
  • 6:53 - 6:57
    and keeping all other variables constant
    (or as constant as you can).
  • 6:57 - 7:00
    This means that they need at least two groups:
  • 7:00 - 7:02
    the experimental group, which is
    gonna get "messed with";
  • 7:02 - 7:05
    and the control group, which is
    not going to get "messed with".
  • 7:05 - 7:07
    Just as surveys use random samples,
  • 7:07 - 7:10
    experimental researchers need to
    randomly assign participants to each group
  • 7:10 - 7:14
    to minimize potential confounding variables
    or outside factors that may skew the results.
  • 7:14 - 7:17
    You don't want all grumpy teenagers in one group
  • 7:17 - 7:19
    and wealthy Japanese servers
    in the other; they gotta mingle.
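
Random assignment amounts to a shuffle and a deal. A minimal Python sketch (participant names invented):

    import random

    def randomly_assign(participants, n_groups):
        """Shuffle, then deal round-robin, so traits like grumpiness
        end up spread roughly evenly across the groups."""
        pool = list(participants)
        random.shuffle(pool)
        return [pool[i::n_groups] for i in range(n_groups)]

    control, experimental = randomly_assign(
        ["ana", "bob", "carl", "dee", "edu", "flo"], n_groups=2)
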
  • 7:19 - 7:24
    Sometimes one or both groups are not
    informed about what's actually being tested.
  • 7:24 - 7:27
    For example, researchers can test
    how substances affect people
  • 7:27 - 7:30
    by comparing their effects to
    placebos, or inert substances.
  • 7:30 - 7:32
    And often, the researchers themselves
  • 7:32 - 7:35
    don't know which group is experimental
    and which is control,
  • 7:35 - 7:39
    so they don't unintentionally influence
    the results through their own behavior.
  • 7:39 - 7:41
    In which case, it's called...
  • 7:41 - 7:43
    You guessed it! A double-blind procedure.
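
The bookkeeping behind a double-blind procedure can be sketched in a few lines (names and labels invented): a third party holds the only map from coded cups to conditions, so neither subjects nor experimenters know who got what until the data are in.

    import random

    participants = ["ana", "bob", "carl", "dee"]
    conditions = ["placebo", "placebo", "caffeine", "caffeine"]
    random.shuffle(conditions)

    # Held by a third party; experimenters only ever see the cup codes.
    secret_key = {f"cup_{i}": cond for i, cond in enumerate(conditions)}
    handout = dict(zip(participants, secret_key))  # who receives which cup

    # Only after all the results are recorded is secret_key revealed.
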
  • 7:43 - 7:46
    So let's put these ideas into practice
    in our own little experiment.
  • 7:46 - 7:49
    Like all good work, it starts with a question.
  • 7:49 - 7:51
    So the other day, my friend
    Bernice and I were debating.
  • 7:51 - 7:54
    We were debating coffee
    and its effect on the brain.
  • 7:54 - 7:57
She's convinced that coffee
helps her focus and think better,
  • 7:57 - 8:00
but personally, I get all jittery, like a caged
meerkat, and can't focus on anything.
  • 8:00 - 8:04
And because we know that overconfidence
can lead to beliefs that are not true,
  • 8:04 - 8:06
    we decided to do some critical thinking.
  • 8:06 - 8:07
    So let's figure out our question:
  • 8:07 - 8:10
    "Do humans solve problems
    faster when given caffeine?"
  • 8:10 - 8:13
    Now we've got to boil that down
    into a testable prediction.
  • 8:13 - 8:17
    Remember: Keep it clear, simple, and
    eloquent so that it can be replicated.
  • 8:17 - 8:20
    "Caffeine makes me smarter"
    is not a great hypothesis.
  • 8:20 - 8:22
    A better one would be, say...
  • 8:22 - 8:27
Adult humans given caffeine will navigate
    a maze faster than humans not given caffeine."
  • 8:27 - 8:31
    The caffeine dosage is your independent
    variable (the thing that you can change).
  • 8:31 - 8:32
    So, you'll need some coffee.
  • 8:32 - 8:34
Your result, or dependent variable
  • 8:34 - 8:37
    (the thing that depends on
    the thing that you can change),
  • 8:37 - 8:40
    is going to be the speed at which the
    subject navigates this giant corn maze.
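
In other words, the experiment treats maze time as a function of dose. A toy Python sketch (the timing function is a stand-in, not real data):

    import random

    def maze_time_minutes(dose_mg: int) -> float:
        """Stand-in for one timed run; a real trial would return a
        stopwatch reading rather than a simulated number."""
        return max(5.0, random.gauss(15.0, 3.0))

    # Independent variable: dose_mg, the thing we set (0, 100, or 500).
    # Dependent variable: the returned time, the thing we measure.
    results = [(dose, maze_time_minutes(dose)) for dose in (0, 100, 500)]
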
  • 8:40 - 8:43
    Go out on the street, wrangle up a
    bunch of different kinds of people,
  • 8:43 - 8:45
    and randomly assign them
    into three different groups.
  • 8:45 - 8:48
    Also at this point, the American
    Psychological Association suggests
  • 8:48 - 8:51
    that you acquire everyone's
    informed consent to participate.
  • 8:51 - 8:54
    You don't want to force anyone
    to be in your experiment,
  • 8:54 - 8:55
    no matter how cool you think it is.
  • 8:55 - 8:58
    So the control group gets
    a placebo (in this case, decaf).
  • 8:58 - 9:01
    Experimental group 1 gets
    a low dose of caffeine,
  • 9:01 - 9:02
which we'll define as 100 mg
  • 9:02 - 9:05
    (just an eye opener,
    like a cup of coffee's worth).
  • 9:05 - 9:07
    Experimental group 2 gets 500 mg
  • 9:07 - 9:10
    (more than a quad-shot of espresso
    dumped in a Red Bull).
  • 9:10 - 9:13
    Once you dose everyone,
    turn them loose in the maze
  • 9:13 - 9:15
    and wait at the other end with a stopwatch.
  • 9:15 - 9:18
    All that's left is to measure your
    results from the three different groups
  • 9:18 - 9:20
and compare them to see if
there were any conclusive differences.
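
The comparison step might look like this, with invented times; a real analysis would follow with a significance test (an ANOVA, say) to check that the gaps aren't just noise:

    from statistics import mean, stdev

    # Hypothetical maze times in minutes for each group.
    times = {
        "placebo (decaf)": [15.2, 14.8, 16.1, 15.5],
        "100 mg":          [14.1, 13.7, 14.9, 14.3],
        "500 mg":          [12.0, 11.4, 12.8, 12.1],
    }

    for group, t in times.items():
        print(f"{group}: mean {mean(t):.1f} min, sd {stdev(t):.1f}")
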
  • 9:20 - 9:22
    If the highly-dosed folks got through it
  • 9:22 - 9:24
twice as fast as the low-dose and placebo groups,
  • 9:24 - 9:26
    then Bernice's hypothesis was correct
  • 9:26 - 9:29
    and she can rub my face in it,
    saying she was right all along,
  • 9:29 - 9:33
    but really, that would just be
    the warm flush of hindsight bias
  • 9:33 - 9:35
    telling her something she didn't
    really know until we tested it.
  • 9:35 - 9:38
    Then, because we've used clear
    language in defining our parameters,
  • 9:38 - 9:41
    other curious minds can easily
    replicate this experiment and
  • 9:41 - 9:44
    we can eventually pool all the data together,
  • 9:44 - 9:48
    and have something solid to say about what
    that macchiato was doing to your cognition.
  • 9:48 - 9:51
    Or at least the speed at which
    you can run through a maze.
  • 9:51 - 9:52
    Science!
  • 9:52 - 9:54
    Probably the best tool that you have
    for understanding other people.
  • 9:54 - 9:57
    Thanks for watching this episode
    of Crash Course Psychology!
  • 9:57 - 9:58
    If you've paid attention,
  • 9:58 - 10:01
    you've learned how to apply the scientific
    method to psychological research
  • 10:01 - 10:05
    through case studies, naturalistic observation,
    surveys and interviews, and experimentation.
  • 10:05 - 10:09
    You've also learned about different
    kinds of bias in experimentation,
  • 10:09 - 10:11
    and how research practices help us avoid them.
  • 10:11 - 10:13
Thanks especially to our Subbable subscribers,
  • 10:13 - 10:16
    who make this and all of Crash Course possible.
  • 10:16 - 10:20
    If you'd like to contribute to help us keep
    Crash Course going and also get awesome perks,
  • 10:20 - 10:25
like an autographed science poster, or even
being animated into an upcoming episode,
  • 10:25 - 10:28
    go to subbable.com/crashcourse to find out how.
  • 10:28 - 10:41
    [Host reads the credits]
Title:
Psychological Research - Crash Course Psychology #2
Description:

You can directly support Crash Course at http://www.subbable.com/crashcourse Subscribe for as little as $0 to keep up with everything we're doing. Also, if you can afford to pay a little every month, it really helps us to continue producing great content.

So how do we apply the scientific method to psychological research? Lots of ways, but today Hank talks about case studies, naturalistic observation, surveys and interviews, and experimentation. Also he covers different kinds of bias in experimentation and how research practices help us avoid them.
--
Table of Contents

The Scientific Method 2:06
Case Studies 3:05
Naturalistic Observation 3:48
Surveys and Interviews 4:15
Experimentation 6:35
Proper Research Practices 8:40
--
Want to find Crash Course elsewhere on the internet?
Facebook - http://www.facebook.com/YouTubeCrashCourse
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support CrashCourse on Subbable: http://subbable.com/crashcourse

Video Language:
English
Duration:
10:51
