
Your own artificial intelligence | Marcelo Rinesi | TEDxRíodelaPlata

  • 0:11 - 0:16
    The best way to understand
    a magic trick is from this side.
  • 0:16 - 0:17
    From the magician's point of view.
  • 0:18 - 0:22
    So imagine you are creating
    a social network.
  • 0:23 - 0:25
    The business model is simple:
  • 0:25 - 0:29
    the more time people spend on your site,
  • 0:29 - 0:32
    the more ads you sell,
    and the more money you make.
  • 0:33 - 0:34
    You don't have to do anything.
  • 0:35 - 0:39
    People generate the content,
    upload the photos.
  • 0:39 - 0:43
    All you have to do
    is choose, at every moment,
  • 0:43 - 0:47
    what to show to whom so they stay.
  • 0:49 - 0:53
    Since it's the 21st century, you automate it:
    you use software.
  • 0:54 - 0:58
    Call it algorithms, big data,
    artificial intelligence.
  • 0:59 - 1:01
    It works, you make a fortune...
  • 1:01 - 1:04
    Congratulations! You invented Facebook.
  • 1:05 - 1:08
    Now, when psychologists and sociologists
  • 1:08 - 1:13
    start to observe what actually happens
    to people on the site,
  • 1:13 - 1:17
    they discover minor side effects:
  • 1:17 - 1:22
    depression, political polarization,
    incitement to violence.
  • 1:22 - 1:25
    None of that was in the business plan.
  • 1:25 - 1:26
    What happened?
  • 1:27 - 1:32
    What machines discover, or rediscover
    in their blind and mechanical way,
  • 1:33 - 1:35
    is that what catches our attention
  • 1:35 - 1:38
    is not what's useful or beautiful,
  • 1:39 - 1:45
    it's what entertains us,
    and even more effectively, what makes us angry.
  • 1:45 - 1:48
    Something good that someone attacks,
  • 1:48 - 1:51
    something bad that someone defends.
  • 1:52 - 1:56
    Instinctively we comment, share, discuss,
  • 1:57 - 2:00
    not because it makes us feel good.
  • 2:00 - 2:03
    It's rare that we log out
    of Facebook or Twitter
  • 2:03 - 2:06
    feeling emotionally better
    than when we entered.
  • 2:07 - 2:08
    But in that combination
  • 2:08 - 2:11
    of positive and negative emotions
    we get from scrolling Facebook,
  • 2:11 - 2:15
    there's something virtually addictive:
  • 2:16 - 2:20
    The more time we spend on a social network,
  • 2:21 - 2:23
    the worse we feel.
  • 2:23 - 2:28
    The worse we feel, the less energy we have
    to do anything else,
  • 2:28 - 2:30
    and the more time we spend
    on the social network.
  • 2:31 - 2:36
    Intelligence agencies, marketing agencies,
    professional trolls
  • 2:36 - 2:38
    --today there's no difference--
  • 2:38 - 2:44
    are constantly generating
    professionally designed content
  • 2:44 - 2:47
    to make us feel as bad as possible
  • 2:47 - 2:51
    about a person, a country, a brand,
  • 2:51 - 2:53
    a political party, whatever.
  • 2:54 - 2:58
    All that negativity is toxic,
  • 2:58 - 3:03
    individually and socially,
    but it's not the worst part.
  • 3:05 - 3:09
    That same technology, designed
    to catch our attention,
  • 3:09 - 3:12
    distracts us at the same time.
  • 3:13 - 3:15
    It distracts us from something important,
  • 3:15 - 3:18
    which is not our loved ones,
    nor the sunset.
  • 3:19 - 3:21
    It's the technology itself.
  • 3:21 - 3:25
    We know it's there,
    we talk about technology,
  • 3:25 - 3:27
    we know it's there to influence us.
  • 3:28 - 3:34
    It distracts us from the fact
    that we ourselves can use it
  • 3:34 - 3:37
    for our own purposes, today.
  • 3:39 - 3:41
    Every time I tell this
  • 3:41 - 3:45
    to someone who is not an entrepreneur,
    a scientist or a programmer,
  • 3:45 - 3:47
    they give me that look:
  • 3:47 - 3:50
    "Every word is in Spanish,
    but you speak to me in Swahili."
  • 3:50 - 3:51
    (Laughter)
  • 3:51 - 3:56
    Psychologists have a term for that:
    learned helplessness.
  • 3:57 - 4:02
    If you take a person and ask them
    to do something impossible,
  • 4:03 - 4:09
    like solving crosswords with no solution,
    they learn that they can't do it.
  • 4:10 - 4:14
    And, sooner or later,
    they stop trying for the rest of their lives.
  • 4:15 - 4:18
    It's a kind of conceptual blindness
  • 4:18 - 4:21
    where the very idea
    that you are able to do something
  • 4:21 - 4:24
    disappears from the mental map.
  • 4:25 - 4:27
    We all have things like that
  • 4:27 - 4:31
    --dancing, talking in public, math--
  • 4:31 - 4:35
    where we know with total certainty
    that something is impossible.
  • 4:37 - 4:40
    This happens with technology.
  • 4:40 - 4:42
    In particular, with these technologies.
  • 4:43 - 4:50
    We see them as so threatening,
    so foreign, so alienating,
  • 4:51 - 4:52
    not because of what they can do,
  • 4:53 - 4:57
    but because we see them
    under the exclusive control of scientists,
  • 4:57 - 5:01
    big companies and advanced governments.
  • 5:02 - 5:04
    It's true, yes, that they use them.
  • 5:06 - 5:09
    But it is also true that we,
    as individuals,
  • 5:09 - 5:15
    can develop, use and control them.
  • 5:15 - 5:21
    If your daily activity involves
    reviewing documents or messages
  • 5:22 - 5:24
    and categorizing them into piles or folders,
  • 5:24 - 5:27
    even if it's just:
    "Interested", "Not interested",
  • 5:27 - 5:30
    a program could very easily
    be looking over your shoulder,
  • 5:31 - 5:34
    learning from you how you do it
  • 5:34 - 5:37
    and slowly start automating the process.
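The "program looking over your shoulder" described here can be sketched as a toy classifier that counts which words appear in the documents you file under each label, then scores new documents against those counts. All the labels and example messages below are hypothetical, and real systems would use a proper learning library, but the idea is the same:

```python
from collections import Counter, defaultdict

def train(labeled_docs):
    """labeled_docs: list of (text, label) pairs you already sorted by hand.
    Returns per-label word counts learned from your own decisions."""
    counts = defaultdict(Counter)
    for text, label in labeled_docs:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose learned vocabulary best matches the new text."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

# Hypothetical messages the user has already filed into two piles.
examples = [
    ("quarterly budget report attached", "interested"),
    ("project budget review meeting", "interested"),
    ("win a free cruise click now", "not interested"),
    ("free prize click to claim", "not interested"),
]

model = train(examples)
print(classify(model, "budget meeting tomorrow"))    # → interested
print(classify(model, "claim your free prize now"))  # → not interested
```

The point of the talk holds in miniature here: the "intelligence" is nothing more than a record of your own past choices, which is exactly why it matters who owns that record.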
  • 5:38 - 5:42
    The effect of that on your income,
  • 5:42 - 5:46
    on your job stability,
    on your quality of life,
  • 5:47 - 5:50
    depends entirely on whether
    that same program
  • 5:50 - 5:54
    belongs to the company or the worker.
  • 5:54 - 5:57
    Whether it belongs to you or to someone else.
  • 5:58 - 6:01
    The same applies to creative activities.
  • 6:02 - 6:04
    It's another discussion
    whether or not this is art,
  • 6:05 - 6:08
    but a tool can be programmed to generate
  • 6:08 - 6:14
    at least drafts of drawings,
    blueprints, diagrams, music,
  • 6:15 - 6:20
    and you can use that to generate new ideas
    or to explore your own.
  • 6:20 - 6:25
    Learning what you like, what inspires you,
    what is useful to you.
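A tool that "generates drafts" from examples you like can be sketched, under very simplified assumptions, as a first-order Markov chain: it learns which word tends to follow which in texts you feed it, then proposes new word sequences in a similar style. The training lines below are made up for illustration:

```python
import random
from collections import defaultdict

def learn(texts):
    """Record, for each word, which words have followed it in the examples."""
    transitions = defaultdict(list)
    for text in texts:
        words = text.split()
        for a, b in zip(words, words[1:]):
            transitions[a].append(b)
    return transitions

def draft(transitions, start, length=8, seed=None):
    """Propose a new draft by repeatedly sampling a learned next word."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length - 1):
        options = transitions.get(words[-1])
        if not options:
            break
        words.append(rng.choice(options))
    return " ".join(words)

# Hypothetical lines the user marked as "things I like".
liked = [
    "the quiet river meets the open sea",
    "the open road meets the quiet town",
]

model = learn(liked)
print(draft(model, "the", seed=1))
```

The output is a rough draft, not a finished work; as the talk says, whether the result counts as art is another discussion, but the draft reflects what you fed it, which is the point.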
  • 6:27 - 6:30
    And such a tool can be programmed
  • 6:30 - 6:33
    to learn the specific strengths
  • 6:33 - 6:37
    of people with totally different
    skills and experiences,
  • 6:37 - 6:43
    then put together everything they know
    without prejudice, without discrimination
  • 6:43 - 6:49
    and make much better decisions
    than each person would make separately
  • 6:49 - 6:51
    or all of them together at the same table.
  • 6:52 - 6:56
    I'm not saying that,
    to have these tools
  • 6:56 - 6:58
    --that we can all have--
  • 6:58 - 7:01
    we all have to learn to program.
  • 7:01 - 7:05
    It's a useful skill, yes, it's a good job,
  • 7:05 - 7:09
    it's a hobby, it's an art
    that changes your life forever.
  • 7:10 - 7:14
    But there are other ways to get
    advanced software for personal use.
  • 7:15 - 7:17
    One is to pay someone.
  • 7:17 - 7:20
    It's the most obvious idea of the afternoon,
  • 7:20 - 7:22
    but very few people do it.
  • 7:22 - 7:24
    Even people who can afford it.
  • 7:24 - 7:26
    Another, more interesting way
  • 7:26 - 7:30
    is to look for someone who knows
    how to do it and ask them.
  • 7:31 - 7:34
    If what you are doing with your lives
  • 7:34 - 7:39
    is protecting someone, helping someone,
    or creating something beautiful,
  • 7:39 - 7:42
    it is very difficult not to find
    someone who says yes.
  • 7:43 - 7:46
    The software is artificial,
    but it's people who make it.
  • 7:46 - 7:51
    And people, in general,
    want to help other people.
  • 7:53 - 7:57
    Build it, buy it, or ask for it.
  • 7:58 - 8:03
    Having your own artificial intelligences
    under your control
  • 8:03 - 8:05
    has no trick beyond that.
  • 8:06 - 8:09
    But it's useless to tell someone
  • 8:09 - 8:14
    who "knows" that they can't do
    something, that they can.
  • 8:14 - 8:17
    You can give them examples,
    you can explain.
  • 8:18 - 8:20
    You can give a TED talk,
  • 8:20 - 8:22
    but they have to try.
  • 8:23 - 8:26
    If you only transmit the idea,
    and you only listen to the idea,
  • 8:26 - 8:27
    it's just a prologue.
  • 8:28 - 8:32
    If you stay at that first step, it's worse
    than doing nothing.
  • 8:32 - 8:37
    It's a very insidious distraction,
  • 8:37 - 8:41
    because you feel you are doing something.
  • 8:41 - 8:44
    In my experience, people and groups
  • 8:45 - 8:49
    that get rid of the idea
    --conscious or unconscious--
  • 8:49 - 8:52
    that some things are too advanced,
  • 8:52 - 8:58
    and simply risk the attempt
    to do an impossible thing,
  • 8:58 - 9:03
    quickly discover that the difference
    between the futuristic and the reasonable,
  • 9:03 - 9:05
    the realistic, the logical,
  • 9:05 - 9:09
    is not in the resources,
    but in the attitude.
  • 9:09 - 9:13
    And when you do that
    something happens to your attention.
  • 9:15 - 9:20
    You see the Internet less as a Happy Meal
    chosen from a menu,
  • 9:20 - 9:23
    and more as a box of Legos,
  • 9:23 - 9:26
    with pieces you can use
    to build your own things.
  • 9:27 - 9:32
    You care less about tweets, retweets,
    comments, followers...
  • 9:32 - 9:34
    You're too busy building
    your own machinery
  • 9:35 - 9:38
    to hook it up with someone else's.
  • 9:41 - 9:44
    I don't have much hope
    for the 21st century.
  • 9:44 - 9:46
    (Laughter)
  • 9:48 - 9:51
    There are many things not going well.
  • 9:51 - 9:54
    And there are very few signs
    that they will improve.
  • 9:55 - 9:56
    On the other hand,
  • 9:56 - 10:00
    we are fighting with one hand
    tied behind our back.
  • 10:01 - 10:05
    We face these problems, the big and small,
  • 10:05 - 10:09
    with tools owned by others,
    for other people's goals,
  • 10:09 - 10:11
    with their rules.
  • 10:12 - 10:14
    If that changes, who knows.
  • 10:15 - 10:18
    Helplessness can be unlearned.
  • 10:19 - 10:22
    It's an individual process,
    it's an uncomfortable process,
  • 10:23 - 10:25
    but it is one from which we all benefit.
  • 10:25 - 10:27
    Thank you.
  • 10:27 - 10:29
    (Applause)
Title:
Your own artificial intelligence | Marcelo Rinesi | TEDxRíodelaPlata
Description:

The combination of social networks with artificial intelligence is generating unanticipated effects in how we relate and how we access information, how we form our opinions and even how we vote for our political representatives. But can we use this same artificial intelligence in our favor? Marcelo Rinesi is a freelance data scientist, member of the Baikal Institute, and CTO of the Institute for Ethics and Emerging Technologies. In addition to creating and implementing models for the prediction and optimization of the behavior of individuals and organizations, he has been cited or published on the economic and political implications of emerging technologies in publications such as Rolling Stone, Quartz, The Verge, BoingBoing, Forbes, Wired, Huffington Post, Fast Company, VICE's Motherboard, La Nacion, Clarin, and Veintitres. He has published several collections of short science fiction stories online, the latest of which is TACTICAL AWARENESS.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx

Video Language:
Spanish
Team:
closed TED
Project:
TEDxTalks
Duration:
10:46

English subtitles
