Psychological Research - Crash Course Psychology #2
-
0:00 - 0:02Can week-old pizza cause
psychedelic hallucinations? -
0:02 - 0:04Does coffee make you smarter
-
0:04 - 0:06or does it just make you
do dumb stuff faster? -
0:06 - 0:07Like much of psychology itself,
-
0:07 - 0:09questions like this can seem pretty intuitive.
-
0:09 - 0:12I mean, people may not be the easiest
organisms to understand, but... -
0:12 - 0:13You're a person, right?
-
0:13 - 0:17So you must be qualified to draw, like,
some conclusions about other people, -
0:17 - 0:18and what makes them tick.
-
0:18 - 0:21But it's important to realize that
your intuition isn't always right. -
0:21 - 0:25In fact, sometimes it is exactly wrong,
-
0:25 - 0:28and we tend to grossly underestimate
the dangers of false intuition. -
0:28 - 0:31If you have some idea about a person and
their behavior that turns out to be right, -
0:31 - 0:34that reinforces your trust in your intuition.
-
0:34 - 0:35Like if I warned my buddy Bob
-
0:35 - 0:38against eating the deep-dish pizza
that's been in the fridge for the past week, -
0:38 - 0:41but he eats it anyway and
soon starts to wig out, -
0:41 - 0:43I'm gonna say: "Dude, I told you so!"
-
0:43 - 0:45But if I'm wrong, and he's totally fine,
-
0:45 - 0:48I probably won't even
think about it ever again. -
0:48 - 0:52This is known as hindsight bias,
or the "I-knew-it-all-along" phenomenon. -
0:52 - 0:55This doesn't mean that common sense is wrong,
-
0:55 - 0:56it just means that our intuitive sense
-
0:56 - 1:00more easily describes what JUST happened
than what WILL happen in the future. -
1:00 - 1:03Another reason you can't
blindly trust your intuition -
1:03 - 1:05is your natural tendency toward overconfidence.
-
1:05 - 1:09Sometimes, you just really, really
feel like you're right about people -
1:09 - 1:12when actually, you're really, really wrong!
-
1:12 - 1:13We've all been there...
-
1:13 - 1:16We also tend to perceive order in random events,
-
1:16 - 1:17which can lead to false assumptions.
-
1:17 - 1:20For example, if you flip a coin five times,
-
1:20 - 1:25you have equal chances of getting all tails
as you do getting alternating heads and tails, -
1:25 - 1:29but we see the series of five tails
as something unusual, as a streak, -
1:29 - 1:33and thus give that result some kind of
meaning that it very definitely does not have. -
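To make that concrete, here's a minimal Python sketch (an illustration added for this transcript, not something from the video) showing that any specific run of five fair coin flips is exactly as likely as any other:

    # Every specific sequence of five fair flips has probability (1/2)^5 = 1/32.
    from itertools import product

    sequences = list(product("HT", repeat=5))   # all 32 possible outcomes
    p_each = 1 / len(sequences)                 # 1/32 for any single specific sequence
    print(f"P(TTTTT) = {p_each}")               # 0.03125
    print(f"P(THTHT) = {p_each}")               # exactly the same probability

So a streak of five tails is no more "meaningful" than any other particular sequence.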
1:33 - 1:35That is why we have the methods and safeguards
-
1:35 - 1:39of psychological research and experimentation,
-
1:39 - 1:42and the glorious process of scientific inquiry.
-
1:42 - 1:44They help us to get around these problems
-
1:44 - 1:48and basically, save the study of our
minds from the stupidity of our minds. -
1:48 - 1:53So I hope that it won't be a spoiler if I
tell you now that pizza won't make you trip, -
1:53 - 1:56and coffee doesn't make you smart. Sorry.
-
1:56 - 2:00[on-screen animations and
ribbons of science sentences] -
2:04 - 2:06Title screen says "Episode 2:
Research & Experimentation." -
2:06 - 2:08In most ways, psychological research
-
2:08 - 2:10is no different than in any
other scientific discipline. -
2:10 - 2:15Like, step one is always figuring out how to
ask general questions about your subject, -
2:15 - 2:18and turn them into measurable,
testable propositions. -
2:18 - 2:21This is called "operationalizing" your questions.
-
2:21 - 2:23So you know how the scientific method works.
-
2:23 - 2:25It starts with a question and a theory.
-
2:25 - 2:28And I don't mean theory in the
sense of like, a hunch that says -
2:28 - 2:30"a quad-shot of espresso
makes you think better." -
2:30 - 2:36Instead, in science, a theory is what explains
and organizes lots of different observations -
2:36 - 2:37and predicts outcomes.
-
2:37 - 2:39And when you come up
with a testable prediction, -
2:39 - 2:41that's your hypothesis.
-
2:41 - 2:43Once your theory and hypothesis are in place
-
2:43 - 2:46you need a clear and common
language to report them with. -
2:46 - 2:48So, for example,
defining exactly what you mean -
2:48 - 2:50by "thinking better" with
your espresso hypothesis -
2:50 - 2:53would allow other researchers
to replicate the experiment. -
2:53 - 2:55And replication is key.
-
2:55 - 2:58You can watch a person exhibit
a certain behavior once, -
2:58 - 3:00and it won't prove very much.
-
3:00 - 3:04But if you keep getting consistent results
even as you change subjects or situations, -
3:04 - 3:05you're probably onto something.
-
3:05 - 3:09This is a problem with one popular
type of psychological research: -
3:09 - 3:12case studies, which take an
in-depth look at one individual. -
3:12 - 3:14Case studies can sometimes be misleading,
-
3:14 - 3:16because by their nature,
they can't be replicated; -
3:16 - 3:18so, they run the risk of over-generalizing.
-
3:18 - 3:21Still, they're good at showing
us what CAN happen, -
3:21 - 3:25and end up framing questions for more
extensive and generalizable studies. -
3:25 - 3:26They're also often memorable
-
3:26 - 3:30and a great story-telling device psychologists
use to observe and describe behavior. -
3:30 - 3:34Like, say, the smell of coffee makes
Carl suddenly anxious and irritable. -
3:34 - 3:37That obviously doesn't mean that it
has the same effect on everyone. -
3:37 - 3:41In fact, Carl has terrible memories
associated with that smell, -
3:41 - 3:43and so his case is actually quite rare.
-
3:43 - 3:44Poor Carl... :(
-
3:44 - 3:46But, you will still have to look at lots of
-
3:46 - 3:48other cases to determine that conclusively.
-
3:48 - 3:52Another popular method of psychological
research is naturalistic observation, -
3:52 - 3:55where researchers simply watch
behavior in a natural environment, -
3:55 - 3:58whether that's chimps poking
anthills in the jungle, -
3:58 - 4:01kids clowning in a classroom,
or drunk dudes yelling at soccer games. -
4:01 - 4:04The idea is to let the subjects
just "do their thing" -
4:04 - 4:06without trying to manipulate
or control the situation. -
4:06 - 4:09So yeah, basically just spying on people.
-
4:09 - 4:12Like case studies, naturalistic observations
are great at describing behavior, -
4:12 - 4:15but they're very limited in explaining it.
-
4:15 - 4:18Psychologists can also collect behavioral
data using surveys or interviews, -
4:18 - 4:21asking people to report
their opinions and behaviors. -
4:21 - 4:24Sexuality researcher Alfred Kinsey
famously used this technique -
4:24 - 4:28when he surveyed thousands of men
and women on their sexual history -
4:28 - 4:29and published his findings
-
4:29 - 4:33in a pair of revolutionary texts:
"Sexual Behavior in the Human Male" -
4:33 - 4:35and "Sexual Behavior in the Human Female."
-
4:35 - 4:37Surveys are a great way to access people's
-
4:37 - 4:39consciously held attitudes and beliefs,
-
4:39 - 4:43but how you ask the questions can be tricky;
subtle word choices can influence results. -
4:43 - 4:47For example, more forceful
words like "ban" or "censor" -
4:47 - 4:50may elicit different reactions
than "limit" or "not allow." -
4:50 - 4:53Asking: "Do you believe in space aliens?"
is a much different question than -
4:53 - 4:56"Do you think that there is intelligent
life somewhere else in the universe?" -
4:56 - 4:59It's essentially the same question, but in the first,
the subject may assume that you mean -
4:59 - 5:02"aliens visiting the Earth
and making crop circles -
5:02 - 5:04and abducting people and poking them."
-
5:04 - 5:06And if how you phrase surveys is important,
-
5:06 - 5:07so is who you ask.
-
5:07 - 5:11I could ask a room full of students at a pacifist
club what they think about arms control, -
5:11 - 5:14but the results wouldn't be a representative
measure of where the students stand, -
5:14 - 5:17because there's a pretty clear
sampling bias at work here. -
5:17 - 5:20To fairly represent a population,
I'd need to use a random sample -
5:20 - 5:22where all members of the target group
-
5:22 - 5:26(in this case, students) had an equal chance
of being selected to answer the question. -
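In code, "an equal chance of being selected" is just a simple random draw. A minimal sketch, assuming a hypothetical roster of student IDs (not data from the video):

    import random

    students = list(range(1, 1001))          # hypothetical target group of 1,000 students
    sample = random.sample(students, 50)     # every student has an equal chance of selection
    print(sample[:10])                       # the first few sampled IDs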
5:26 - 5:28So, once you've described behavior
-
5:28 - 5:30with surveys, case studies,
or naturalistic observation, -
5:30 - 5:34you can start making sense out of it
and even predict future behavior. -
5:34 - 5:36One way to do that is to look at
-
5:36 - 5:39how one trait or behavior is related
to another or how they correlate. -
5:39 - 5:41So let's get back to my buddy Bob,
-
5:41 - 5:44who seems to think that his refrigerator
is actually some kind of time machine -
5:44 - 5:46that can preserve food indefinitely.
-
5:46 - 5:49Let's say that Bob has just tucked into
a lunch of questionable leftovers... -
5:49 - 5:52Pizza that may very well have
had a little bit of fungus on it... -
5:52 - 5:55But he was hungry. And lazy.
And so he doused it in sriracha. -
5:55 - 5:57Suddenly, he starts seeing things.
-
5:57 - 6:00Green armadillos with laser-beam-eyes.
-
6:00 - 6:03From here we can deduce that eating
unknown fungus predicts hallucination. -
6:03 - 6:07That's a correlation;
but correlation is not causation. -
6:07 - 6:11Yes, it makes sense that eating questionable
fungus would cause hallucinations, -
6:11 - 6:14but it's possible that Bob was already
on the verge of a psychotic episode -
6:14 - 6:17and those fuzzy left-overs were actually benign!
-
6:17 - 6:20Or, there could be an entirely
different factor involved, -
6:20 - 6:22like maybe he hadn't slept in 72 hours
-
6:22 - 6:24or had an intense migraine coming on,
-
6:24 - 6:26and one of those factors
caused his hallucinations. -
6:26 - 6:28It's tempting to draw
conclusions from correlations, -
6:28 - 6:30but it's super important to remember
-
6:30 - 6:34that correlations predict the POSSIBILITY
of a cause-and-effect relationship; -
6:34 - 6:36they cannot prove it.
-
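One way to see why a correlation can't prove causation is to simulate a lurking variable. This is purely an added illustration (the variables and numbers are made up, not from the video): a hypothetical "hours awake" factor drives both the leftover-eating and the hallucination score, so the two correlate strongly even though neither causes the other.

    import random
    from math import sqrt

    random.seed(0)
    hours_awake = [random.uniform(0, 72) for _ in range(200)]            # the confounder
    leftovers   = [h / 24 + random.gauss(0, 0.3) for h in hours_awake]   # driven by fatigue
    hallucinate = [h / 10 + random.gauss(0, 1.0) for h in hours_awake]   # also driven by fatigue

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    print(round(pearson(leftovers, hallucinate), 2))   # strong correlation, no direct cause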
6:36 - 6:39So we've talked about how to describe
behavior without manipulating it, -
6:39 - 6:42and how to make connections and
predictions from those findings, -
6:42 - 6:44but that can only take you so far.
-
6:44 - 6:46To really get to the bottom of
cause-and-effect behaviors, -
6:46 - 6:48you're gonna have to start experimenting.
-
6:48 - 6:51Experiments allow investigators
to isolate different effects -
6:51 - 6:53by manipulating an independent variable
-
6:53 - 6:57and keeping all other variables constant
(or as constant as you can). -
6:57 - 7:00This means that they need at least two groups:
-
7:00 - 7:02the experimental group, which is
gonna get "messed with"; -
7:02 - 7:05and the control group, which is
not going to get "messed with". -
7:05 - 7:07Just as surveys use random samples,
-
7:07 - 7:10experimental researchers need to
randomly assign participants to each group -
7:10 - 7:14to minimize potential confounding variables
or outside factors that may skew the results. -
7:14 - 7:17You don't want all grumpy teenagers in one group
-
7:17 - 7:19and wealthy Japanese surfers
in the other; they gotta mingle. -
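Random assignment itself is easy to sketch. Here's a minimal illustration (hypothetical volunteer names, not the show's procedure) that shuffles recruits and splits them in two, which keeps any one kind of person from piling up in a single group:

    import random

    participants = [f"volunteer_{i}" for i in range(30)]   # hypothetical recruits
    random.shuffle(participants)                            # mix everyone together
    control      = participants[:15]                        # gets the placebo
    experimental = participants[15:]                        # gets "messed with"
    # For a double-blind setup, the experimenters would only see coded labels
    # ("group A" / "group B"), with the key revealed after the data are collected.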
7:19 - 7:24Sometimes one or both groups are not
informed about what's actually being tested. -
7:24 - 7:27For example, researchers can test
how substances affect people -
7:27 - 7:30by comparing their effects to
placebos, or inert substances. -
7:30 - 7:32And often, the researchers themselves
-
7:32 - 7:35don't know which group is experimental
and which is control, -
7:35 - 7:39so they don't unintentionally influence
the results through their own behavior. -
7:39 - 7:41In which case, it's called...
-
7:41 - 7:43You guessed it! A double-blind procedure.
-
7:43 - 7:46So let's put these ideas into practice
in our own little experiment. -
7:46 - 7:49Like all good work, it starts with a question.
-
7:49 - 7:51So the other day, my friend
Bernice and I were debating. -
7:51 - 7:54We were debating coffee
and its effect on the brain. -
7:54 - 7:57Personally, she's convinced that coffee
helps her focus and think better, -
7:57 - 8:00but I get all jittery, like a caged meerkat
and can't focus on anything. -
8:00 - 8:04And because we know that over-confidence
can lead to beliefs that are not true, -
8:04 - 8:06we decided to do some critical thinking.
-
8:06 - 8:07So let's figure out our question:
-
8:07 - 8:10"Do humans solve problems
faster when given caffeine?" -
8:10 - 8:13Now we've got to boil that down
into a testable prediction. -
8:13 - 8:17Remember: Keep it clear, simple, and
elegant so that it can be replicated. -
8:17 - 8:20"Caffeine makes me smarter"
is not a great hypothesis. -
8:20 - 8:22A better one would be, say...
-
8:22 - 8:27"Adults humans given caffeine will navigate
a maze faster than humans not given caffeine." -
8:27 - 8:31The caffeine dosage is your independent
variable (the thing that you can change). -
8:31 - 8:32So, you'll need some coffee.
-
8:32 - 8:34Your result or dependent variable--
-
8:34 - 8:37(the thing that depends on
the thing that you can change), -
8:37 - 8:40is going to be the speed at which the
subject navigates this giant corn maze. -
8:40 - 8:43Go out on the street, wrangle up a
bunch of different kinds of people, -
8:43 - 8:45and randomly assign them
into three different groups. -
8:45 - 8:48Also at this point, the American
Psychological Association suggests -
8:48 - 8:51that you acquire everyone's
informed consent to participate. -
8:51 - 8:54You don't want to force anyone
to be in your experiment, -
8:54 - 8:55no matter how cool you think it is.
-
8:55 - 8:58So the control group gets
a placebo (in this case, decaf). -
8:58 - 9:01Experimental group 1 gets
a low dose of caffeine, -
9:01 - 9:02which we'll define as 100 mg
-
9:02 - 9:05(just an eye opener,
like a cup of coffee's worth). -
9:05 - 9:07Experimental group 2 gets 500 mg
-
9:07 - 9:10(more than a quad-shot of espresso
dumped in a Red Bull). -
9:10 - 9:13Once you dose everyone,
turn them loose in the maze -
9:13 - 9:15and wait at the other end with a stopwatch.
-
9:15 - 9:18All that's left is to measure your
results from the three different groups -
9:18 - 9:20and compare them, just to see if
there were any conclusive results. -
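At its simplest, that comparison is just the average maze time per group. Here's a hedged sketch with made-up numbers, purely to show the shape of the analysis:

    from statistics import mean

    # Hypothetical times in seconds; real values would come from the stopwatch.
    times = {
        "placebo (decaf)":    [310, 295, 322, 301, 315],
        "low dose (100 mg)":  [288, 276, 301, 280, 290],
        "high dose (500 mg)": [250, 241, 263, 255, 248],
    }

    for group, secs in times.items():
        print(f"{group}: mean = {mean(secs):.1f} s")
    # A real analysis would also check that the differences are bigger than
    # chance (for example, with a significance test), not just eyeball the means.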
9:20 - 9:22If the highly-dosed folks got through it
-
9:22 - 9:24twice as fast as the low-dose and placebo groups
-
9:24 - 9:26then Bernice's hypothesis was correct
-
9:26 - 9:29and she can rub my face in it,
saying she was right all along, -
9:29 - 9:33but really, that would just be
the warm flush of hindsight bias -
9:33 - 9:35telling her something she didn't
really know until we tested it. -
9:35 - 9:38Then, because we've used clear
language in defining our parameters, -
9:38 - 9:41other curious minds can easily
replicate this experiment and -
9:41 - 9:44we can eventually pool all the data together,
-
9:44 - 9:48and have something solid to say about what
that macchiato was doing to your cognition. -
9:48 - 9:51Or at least the speed at which
you can run through a maze. -
9:51 - 9:52Science!
-
9:52 - 9:54Probably the best tool that you have
for understanding other people. -
9:54 - 9:57Thanks for watching this episode
of Crash Course Psychology! -
9:57 - 9:58If you've paid attention,
-
9:58 - 10:01you've learned how to apply the scientific
method to psychological research -
10:01 - 10:05through case studies, naturalistic observation,
surveys and interviews, and experimentation. -
10:05 - 10:09You've also learned about different
kinds of bias in experimentation, -
10:09 - 10:11and how research practices help us avoid them.
-
10:11 - 10:13Thanks especially to our Subbable subscribers,
-
10:13 - 10:16who make this and all of Crash Course possible.
-
10:16 - 10:20If you'd like to contribute to help us keep
Crash Course going and also get awesome perks, -
10:20 - 10:25like an autographed science poster or
even be animated into an upcoming episode -
10:25 - 10:28go to subbable.com/crashcourse to find out how.
-
10:28 - 10:41[Host reads the credits]
- Title:
- Psychological Research - Crash Course Psychology #2
- Description:
-
You can directly support Crash Course at http://www.subbable.com/crashcourse Subscribe for as little as $0 to keep up with everything we're doing. Also, if you can afford to pay a little every month, it really helps us to continue producing great content.
So how do we apply the scientific method to psychological research? Lots of ways, but today Hank talks about case studies, naturalistic observation, surveys and interviews, and experimentation. Also he covers different kinds of bias in experimentation and how research practices help us avoid them.
--
Table of Contents:
The Scientific Method 2:06
Case Studies 3:05
Naturalistic Observation 3:48
Surveys and Interviews 4:15
Experimentation 6:35
Proper Research Practices 8:40
--
Want to find Crash Course elsewhere on the internet?
Facebook - http://www.facebook.com/YouTubeCrashCourse
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support CrashCourse on Subbable: http://subbable.com/crashcourse
- Video Language:
- English
- Duration:
- 10:51