Ruby Conf 2013 - Ruby On Robots Using Artoo by Ron Evans

  • 0:16 - 0:18
    RON EVANS: Good afternoon everybody.
  • 0:18 - 0:20
    AUDIENCE: Good afternoon!
  • 0:20 - 0:23
    R.E.: This is RubyConf 2013!
  • 0:23 - 0:24
    AUDIENCE: Whoo!
  • 0:24 - 0:28
    R.E.: Yeah! So before we get started, we just
  • 0:28 - 0:30
    want to say a very special thank you to
  • 0:30 - 0:33
    the organizers of Ruby Central and to all
    the
  • 0:33 - 0:35
    conference staff, the sponsors, and to all
    of you
  • 0:35 - 0:37
    for being here. Thank you so very much, let's
  • 0:37 - 0:39
    give a big round of applause for everybody.
  • 0:45 - 0:48
    So I am @deadprogram, aka Ron Evans in the
  • 0:48 - 0:52
    real world. I'm the ringleader of the @hybrid_group.
    This
  • 0:52 - 0:56
    other guy over here is @adzankich, aka, Adrian
    Zankich.
  • 0:56 - 0:59
    He is the serious programming guy at the @hybrid_group.
  • 0:59 - 1:01
    So he does all the work and I take
  • 1:01 - 1:04
    all the credit. Yeah.
  • 1:04 - 1:08
    I love how that gets applause.
  • 1:08 - 1:10
So we're with the HybridGroup. We are a software
  • 1:10 - 1:15
    development consultancy based in sunny Los
Angeles, California. And
  • 1:15 - 1:19
    among other things, we are the creators of
    KidsRuby.
  • 1:19 - 1:24
    How did you guys like my new boss this
  • 1:24 - 1:28
    morning? She's awesome, right. The funny part
    if you're
  • 1:28 - 1:31
    kid- you think I'm kidding. Yeah. So, but
    here
  • 1:31 - 1:34
    today we are here to talk to you on
  • 1:34 - 1:38
    Ruby on robots. Zzzz-zzzz.
  • 1:38 - 1:40
    This robot is not with us today.
  • 1:40 - 1:43
    So let me ask you, is innovation dead?
  • 1:43 - 1:44
    AUDIENCE: No! Yes!
  • 1:44 - 1:47
    R.E.: I mean, William Gibson said the future
    is
  • 1:47 - 1:51
    already here, it's just not very evenly distributed.
    Isn't
  • 1:51 - 1:54
    that really true? I mean, many of us have
  • 1:54 - 1:56
    been doing web development for years, and
    yet we've
  • 1:56 - 1:59
    been seeing a very interesting thing happening
    as we've
  • 1:59 - 2:01
    been creating all these different technologies.
  • 2:01 - 2:04
    We've discovered that innovating is really
    hard. I mean,
  • 2:04 - 2:07
    doing something genuinely different. And,
    in fact, it's especially
  • 2:07 - 2:12
    hard when you're dealing with hardware. So
    about six
  • 2:12 - 2:15
    years ago, my younger brother Damien and I
    started
  • 2:15 - 2:17
    building unmanned aerial vehicles using Ruby.
    A number of
  • 2:17 - 2:20
    people remember that. And we had to source
    parts
  • 2:20 - 2:23
    from literally all over the globe, AKA China.
    And
  • 2:23 - 2:26
    they would ship us these really amazing microcontrollers
    and
  • 2:26 - 2:27
    we would put them in blimps and they would
  • 2:27 - 2:29
    burn up and we would order more.
  • 2:29 - 2:31
    Nowadays though you can go to the Apple store
  • 2:31 - 2:33
    and buy several different kinds of robots.
    I mean,
  • 2:33 - 2:36
    the robot revolution is already here. So we're
    here
  • 2:36 - 2:39
    to introduce to you Artoo, which is Ruby on
  • 2:39 - 2:43
    robots. It is a Ruby framework for robotics
    and
  • 2:43 - 2:48
    physical computing. It supports multiple hardware
    devices, different hardware
  • 2:48 - 2:53
    devices, and multiple hardware devices at
    the same time.
  • 2:53 - 2:53
    In Ruby?
  • 2:53 - 2:56
    I mean, are we serious?
  • 2:56 - 2:59
    Yes! We are extremely serious, and the reason
    for
  • 2:59 - 3:02
    that is a remarkable piece of technology called
    Celluloid.
  • 3:02 - 3:05
    Tony, are you here, by any chance? You bailed
  • 3:05 - 3:07
    on my talk? What's up with that?
  • 3:07 - 3:08
    So anyway, a bunch of the committers from
    Celluloid
  • 3:08 - 3:11
    are here, and actually this is probably the,
    one
  • 3:11 - 3:13
    of the most important technologies to occur
    in the
  • 3:13 - 3:16
    entire Ruby community in years, and if you're
    not
  • 3:16 - 3:17
    paying attention to this, you need to be.
  • 3:17 - 3:19
    In fact, you're probably using it right now,
    if
  • 3:19 - 3:22
you use Sidekiq, which is another great project.
    So
  • 3:22 - 3:24
    it runs on the MRI Ruby, of course. It
  • 3:24 - 3:27
    runs far better, however, on JRuby, thanks
    to the
  • 3:27 - 3:30
    concurrency of the JVM - excellent piece of
    software.
  • 3:30 - 3:33
    You probably saw Charles and Tom's talk earlier.
    Great
  • 3:33 - 3:33
    work.
  • 3:33 - 3:35
    But, we're gonna be showing most of our demos
  • 3:35 - 3:38
    today on Rubinius. The Ruby of the future.
    If
  • 3:38 - 3:41
    you love any of these projects, please go
    help
  • 3:41 - 3:44
them. Brian Shirai is here. Brian, are you,
    are
  • 3:44 - 3:47
    you here? Oh, he's probably with his daughter.
  • 3:47 - 3:50
    Where are all my friends? Anyway, these -
  • 3:50 - 3:53
    AUDIENCE: I'll be your friend!
  • 3:53 - 3:55
    R.E.: Aww! Giant hugs!
  • 3:55 - 3:57
    Channeling my inner tenderlove.
  • 3:57 - 4:00
    So, anyway, this is an amazing project. It
    just
  • 4:00 - 4:02
    reached the 2 point 0 release, and Rubinius
    X
  • 4:02 - 4:05
    has been announced. There's really exciting
    things happening to
  • 4:05 - 4:07
    it, and it's an important part of, really
    a
  • 4:07 - 4:10
    pillar of the future of Ruby.
  • 4:10 - 4:13
    Anyway, back to Artoo. So Artoo is to robotics
  • 4:13 - 4:16
    like Rails is to web development. I'm gonna
    say
  • 4:16 - 4:19
    that again cause it's really, really important.
    Artoo is
  • 4:19 - 4:22
    to robotics like Rails is to web development.
    Actually,
  • 4:22 - 4:24
    it might be a little bit more like Sinatra,
  • 4:24 - 4:27
    as you can tell from this code example.
  • 4:27 - 4:29
    So you see, first of all, we require Artoo.
  • 4:29 - 4:32
    Then we're going to declare a connection to
    an
  • 4:32 - 4:35
    :arduino that's going to use an adapter called
    :firmata,
  • 4:35 - 4:37
    which is a serial protocol that you can use
  • 4:37 - 4:42
    to communicate with various arduinos and arduino-compatible
    microcontrollers. Then
  • 4:42 - 4:44
    we're gonna connect on a particular port.
    Then we're
  • 4:44 - 4:46
    gonna declare a device.
  • 4:46 - 4:48
    This device is an LED. It's gonna use the
  • 4:48 - 4:51
LED driver and be connected to pin thirteen.
  • 4:51 - 4:53
    Then the work that we're going to do is
  • 4:53 - 4:56
    every one second, we're going to led dot toggle,
  • 4:56 - 4:58
    which is going to turn the LED either on
  • 4:58 - 5:00
    or off. So this is kind of the canonical,
  • 5:00 - 5:03
    make an LED blink. So we'll show you that
  • 5:03 - 5:04
    in a minute.
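
For reference, here is a minimal sketch of the blink example just described, in Artoo's Sinatra-style DSL; the serial port path is an assumption, so substitute whatever port your board shows up on.

    require 'artoo'

    # Talk to an Arduino-compatible board over the Firmata serial protocol.
    # The port path is an assumption for illustration.
    connection :arduino, adaptor: :firmata, port: '/dev/ttyACM0'

    # An LED driver attached to pin 13.
    device :led, driver: :led, pin: 13

    work do
      # Toggle the LED on or off once per second.
      every(1.second) { led.toggle }
    end
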
  • 5:04 - 5:06
    So, Artoo's architecture. We have a couple
    of very
  • 5:06 - 5:09
    important design patterns that we're utilizing
    within Artoo. Here's
  • 5:09 - 5:12
    a little bit of a outline. So the main
  • 5:12 - 5:14
    entity in Artoo is, of course, the robot.
  • 5:14 - 5:17
    And we have two things underneath that. We
    have
  • 5:17 - 5:19
    the connections, as you saw before, and we
    have
  • 5:19 - 5:22
    devices. Now, we are using the adapter pattern
    in
  • 5:22 - 5:25
    both of these cases. So connections use an
    adapter,
  • 5:25 - 5:28
    we can use, similar to the way ActiveRecord
    or
  • 5:28 - 5:30
    other ORMs work, we can actually use this
    adap-
  • 5:30 - 5:33
    these different adapters to talk to different
    kinds of
  • 5:33 - 5:33
    hardware.
  • 5:33 - 5:36
    So where connections control how we actually
    communicate, whatever
  • 5:36 - 5:40
    protocols with the device, then the devices
    control behaviors.
  • 5:40 - 5:43
    LEDs know how to blink, drones know how to
  • 5:43 - 5:44
    fly, et cetera.
  • 5:44 - 5:45
And then we are also using the publish and
  • 5:45 - 5:50
    subscribe pattern via events. Devices, by
    their drivers, can
  • 5:50 - 5:53
    detect events and then tell the robot about
    them.
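
As a small illustration of that publish and subscribe style, here is a hedged sketch of a robot subscribing to an event its driver publishes; the Sphero collision handler shown later in the talk follows this same shape.

    work do
      # Subscribe a handler to the :collision event published by the Sphero driver.
      on sphero, :collision => proc { |*data| puts "bump! #{data.inspect}" }
    end
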
  • 5:53 - 5:55
    So it also has an API in Artoo. I
  • 5:55 - 5:57
    mean, what good is a robot unless you can
  • 5:57 - 6:00
    control it via an API across the intertubes,
    right?
  • 6:00 - 6:03
    So here's a, an example of both a restful
  • 6:03 - 6:05
    API and a web sockets API that could be
  • 6:05 - 6:08
    used by two different applications to talk
    to the
  • 6:08 - 6:11
    MCP or the master control program, which will
    then
  • 6:11 - 6:14
    control all of the different robots. And there
    you
  • 6:14 - 6:16
    have it.
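
To give a feel for that, here is a hedged sketch of another Ruby process querying a running Artoo API over plain HTTP. The port, the /api/robots route, and the field names are assumptions for illustration; check the Artoo API documentation for the real host, port, and paths.

    require 'net/http'
    require 'json'

    # Hypothetical endpoint served by the robot's master control program.
    uri = URI('http://localhost:4321/api/robots')

    robots = JSON.parse(Net::HTTP.get(uri))
    robots.each { |robot| puts robot['name'] }   # field name is an assumption
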
  • 6:16 - 6:19
    Of course, test-driven robotics is very, very
    important. I
  • 6:19 - 6:21
    mean, we are Rubyists and we test first, right.
  • 6:21 - 6:24
    Well, traditionally in robotics, the way you
    do testing
  • 6:24 - 6:25
    is you would turn on the robot and jump
  • 6:25 - 6:27
    back really fast.
  • 6:27 - 6:29
    I have scars.
  • 6:29 - 6:32
    However, this is Ruby, and we can do a
  • 6:32 - 6:34
    little better. Here's an example of TDR, or
    test-driven
  • 6:34 - 6:37
    robotics, as we call it. In this case, we're
  • 6:37 - 6:40
actually using minitest spec, we're using mocha
    and we are
  • 6:40 - 6:43
    using timecop. So let's take a quick look.
  • 6:43 - 6:46
    First, we're gonna declare :start right now.
    The robot
  • 6:46 - 6:48
    is going to be the main robot, which is,
  • 6:48 - 6:51
    as you'll remember, very similar to the Sinatra
    syntax,
  • 6:51 - 6:55
    and then before this test, we're going to
    Timecop.travel
  • 6:55 - 6:58
    to the start. Then we start our robot's work.
  • 6:58 - 7:00
    It must roll every three seconds. So we travel
  • 7:00 - 7:03
    to three seconds after the start. We're going
    to
  • 7:03 - 7:05
    expect a roll command, and then we process
    messages
  • 7:05 - 7:08
    to give Celluloid's mailboxes a chance to
    catch up
  • 7:08 - 7:09
    with its actors.
  • 7:09 - 7:10
    So this way we do not have to wait,
  • 7:10 - 7:12
    just a little over three seconds to test something
  • 7:12 - 7:15
    that takes three seconds. Because otherwise
    if we wanted
  • 7:15 - 7:17
    to test something like turn the sprinklers
    on once
  • 7:17 - 7:19
    a week, we would have to wait a week.
  • 7:19 - 7:22
    And that's not good. You think your CI is
  • 7:22 - 7:23
    bad, try it with robots.
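
Here is a hedged sketch of that test-driven robotics shape using minitest spec, mocha, and timecop. The RollingRobot stub stands in for a real Artoo robot so the example is self-contained; the names are illustrative rather than the exact code from the slide.

    require 'minitest/autorun'
    require 'mocha/minitest'
    require 'timecop'

    # Stand-in for an Artoo robot: work queues up a roll, and process_messages
    # flushes it, the way Celluloid mailboxes are flushed in the real test.
    class RollingRobot
      def work; @pending = :roll; end
      def process_messages; send(@pending) if @pending; end
      def roll; end
    end

    describe 'rolling robot' do
      let(:start) { Time.now }
      let(:robot) { RollingRobot.new }

      before { Timecop.travel(start) }
      after  { Timecop.return }

      it 'must roll every three seconds' do
        robot.work
        Timecop.travel(start + 3)   # jump ahead three seconds instead of sleeping
        robot.expects(:roll)        # expect the roll command
        robot.process_messages      # let the pending work catch up
      end
    end
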
  • 7:23 - 7:26
    So, of course, Artoo also has a command line
  • 7:26 - 7:29
    interface, because, two, well one of the important
    patterns
  • 7:29 - 7:32
    that we've discovered, as we call it RobotOps,
    is
  • 7:32 - 7:34
    that you definitely want to use commands to
    connect
  • 7:34 - 7:36
    to all the devices. You do not want to
  • 7:36 - 7:37
    do these sort of things manually.
  • 7:37 - 7:42
    Anyway, though, I've done enough talking.
    How about a
  • 7:42 - 7:45
demo? You guys want to see a demo?
  • 7:45 - 7:46
    All right.
  • 7:46 - 7:48
    So, the - oh, first thing I'm gonna take
  • 7:48 - 7:51
    a look at is the Digispark microcontroller.
    So the
  • 7:51 - 7:55
    Digispark is what we might call the minimum
viable
  • 7:55 - 7:59
    microcontroller. Oh yeah, please. It's very
    small. We have
  • 7:59 - 8:01
    to get it very close. Oh. It would help
  • 8:01 - 8:06
    if you had the camera. All right.
  • 8:06 - 8:10
    So this is it. It's extremely small. Oh, can't
  • 8:10 - 8:14
    see it. I can see it.
  • 8:14 - 8:20
    Hey. There it is.
  • 8:20 - 8:22
    So this is it. It's a, it's a rather
  • 8:22 - 8:25
    small device, as you can tell. It actually,
    let's
  • 8:25 - 8:28
take the shield off. This is the Digispark
    itself.
  • 8:28 - 8:32
    It's a very small, well, thank you, it is
  • 8:32 - 8:37
    very small, itty-tiny powered microcontroller
    that actually uses another
  • 8:37 - 8:40
protocol called Little Wire, similar to Firmata
    but it runs
  • 8:40 - 8:42
    on even smaller microcontrollers.
  • 8:42 - 8:46
    We're good? All right, we're in focus. And
    we're
  • 8:46 - 8:49
    going to use this LED shield that plugs into
  • 8:49 - 8:53
    it, it. It's better when it's over here. All
  • 8:53 - 8:55
    right, so the program that we're gonna run
    -
  • 8:55 - 9:02
    well, that's not it. That's definitely not
    it.
  • 9:02 - 9:04
    OK, so the program we're gonna run here is,
  • 9:04 - 9:06
    the first thing we're gonna do is we're gonna
  • 9:06 - 9:09
connect to the Digispark using the Little
Wire adaptor
  • 9:09 - 9:11
with the vendor ID since it's a USB
  • 9:11 - 9:13
device. We're gonna connect to the board to
    retrieve
  • 9:13 - 9:15
its device info, which we're gonna display
    on this
  • 9:15 - 9:17
    screen that you won't be able to see. Then
  • 9:17 - 9:20
    the LED device, we're going to, again, toggle
    it
  • 9:20 - 9:21
    every second.
  • 9:21 - 9:23
    So you see it's exactly the same code as
  • 9:23 - 9:26
    we were using with the arduino. See a pattern
  • 9:26 - 9:27
    forming? All right. Let's run this.
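
A hedged sketch of that Digispark program, using the Little Wire adaptor named in the talk; the device-info driver, the pin number, and the way the USB vendor id gets passed are assumptions that may differ from the real artoo-digispark options.

    require 'artoo'

    # Little Wire speaks to the Digispark over USB; the talk passes the USB
    # vendor id here, which this sketch leaves out.
    connection :digispark, adaptor: :littlewire

    device :board, driver: :device_info   # exposes the board's device info
    device :led,   driver: :led, pin: 1   # pin number is an assumption

    work do
      puts board.inspect                  # dump whatever device info the driver reports
      every(1.second) { led.toggle }
    end
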
  • 9:27 - 9:32
    Oh, right. The video.
  • 9:32 - 9:39
    Executing code. So it should start to flash.
    A
  • 9:40 - 9:42
    three thousand dollar lamp!
  • 9:42 - 9:48
    All right. And, and you're applauding. Ha
    ha ha
  • 9:48 - 9:51
    ha! I should be Apple. All right.
  • 9:51 - 9:53
    So moving on. So what do you do with
  • 9:53 - 9:56
    a flashing LED? Well, we are software developers,
    and
  • 9:56 - 9:58
    of course what we do is we check our
  • 9:58 - 10:03
    Travis build status notifications. Yes, the,
    the build notifier
  • 10:03 - 10:06
    is to physical computing like the blog engine
    was
  • 10:06 - 10:10
    to website development. That's the thing you
    do. All
  • 10:10 - 10:10
    right.
  • 10:10 - 10:12
    So let's take a look at some code, real
  • 10:12 - 10:13
    fast.
  • 10:13 - 10:16
    All right. So in this case, we're gonna require
  • 10:16 - 10:18
    Artoo, we're gonna require Travis. We're gonna
    connect to
  • 10:18 - 10:21
    the Digispark and its different LEDs. Then
    we're gonna
  • 10:21 - 10:23
    connect to a broken repo that we've called
    broken
  • 10:23 - 10:27
    arrow, in the tradition of flying things that
    don't
  • 10:27 - 10:29
    necessarily work. We're gonna connect to the
    Travis repo,
  • 10:29 - 10:32
    then every ten seconds we're gonna check the
    repo.
  • 10:32 - 10:33
    If the repo is green, we're gonna turn on
  • 10:33 - 10:36
the green LED. While we're checking it, it's
    going
  • 10:36 - 10:38
to turn blue, when either the
    build
  • 10:38 - 10:40
    is running or we're checking the Travis status.
    And
  • 10:40 - 10:43
    then if the build fails we're gonna turn red.
  • 10:43 - 10:44
    And then last we have a couple of functions.
  • 10:44 - 10:47
    One turns on one particular LED and the other
  • 10:47 - 10:50
    turns them all off. All right.
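
A hedged sketch of that build-status notifier. The travis gem calls, the pin numbers, and the repository name are illustrative assumptions; the structure mirrors the description above, with one helper to light a single LED and turn the rest off.

    require 'artoo'
    require 'travis'

    connection :digispark, adaptor: :littlewire
    device :red,   driver: :led, pin: 0
    device :green, driver: :led, pin: 1
    device :blue,  driver: :led, pin: 2

    REPO = 'hybridgroup/broken_arrow'   # hypothetical repository name

    # Light one LED and switch the others off. Defined at the top level so the
    # work block below can call it.
    def light_only(on_led, all_leds)
      all_leds.each(&:off)
      on_led.on
    end

    work do
      every(10.seconds) do
        leds = [red, green, blue]
        light_only(blue, leds)                              # blue while we check
        build = Travis::Repository.find(REPO).last_build    # travis gem call; an assumption
        if build.running?
          light_only(blue, leds)
        elsif build.passed?
          light_only(green, leds)
        else
          light_only(red, leds)
        end
      end
    end
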
  • 10:50 - 10:57
    So if this actually works, across the internet,
    it
  • 10:58 - 10:59
    will turn blue that we're checking the Travis
    build
  • 10:59 - 11:02
    status, and it will turn red since broken
    arrow
  • 11:02 - 11:09
    is a broken build. Working. Working. Working.
    Fail.
  • 11:15 - 11:19
    Now, we could go in and fix the build
  • 11:19 - 11:20
    but in the interest of time let's just move
  • 11:20 - 11:23
    on to the next thing. All right, so what
  • 11:23 - 11:25
    is the next thing?
  • 11:25 - 11:30
    Oh yes. So one of the greatest things on
  • 11:30 - 11:35
    the internet are cats. And the only thing
    better
  • 11:35 - 11:40
    than cats are internet enabled cats. For example,
    internet
  • 11:40 - 11:44
    enabled cat toys. So in this case, we have
  • 11:44 - 11:48
    a cool little device that we've made, kind
    of
  • 11:48 - 11:51
    homebrew, but we like it. It's got two servos
  • 11:51 - 11:54
    and it plugs into the Digispark, and then
    is
  • 11:54 - 11:57
    connected to this fun little toy. Whoops,
    get the
  • 11:57 - 11:58
    right angle.
  • 11:58 - 12:01
    Can you guys see this OK?
  • 12:01 - 12:02
    We don't have a cat. They wouldn't let us
  • 12:02 - 12:06
    bring one in. We have a robot cat but
  • 12:06 - 12:07
    it's not the same.
  • 12:07 - 12:08
    All right, so let's take a look at the
  • 12:08 - 12:14
    code. Wait, that's something else. Where is
    the code?
  • 12:14 - 12:15
    I forgot to load it. All right. Well, in
  • 12:15 - 12:21
    any case, we're using this leap motion. Yes.
    This
  • 12:21 - 12:24
    leap motion is going to allow us to, with
  • 12:24 - 12:29
    nothing more than his hand waves, control
    these servos,
  • 12:29 - 12:32
    moving this cat toy to the invisible internet
    cat
  • 12:32 - 12:34
    on the other side.
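
Since the code did not make it on screen, here is a heavily hedged sketch of what that Leap Motion to servo bridge might look like. The leapmotion adaptor and :frame event, the hand and palm accessors, the servo pins, and the move call are all assumptions for illustration, not the demo's actual code.

    require 'artoo'

    connection :leapmotion, adaptor: :leapmotion, port: '127.0.0.1:6437'
    device :leapmotion, driver: :leapmotion

    connection :digispark, adaptor: :littlewire
    device :pan,  driver: :servo, pin: 0
    device :tilt, driver: :servo, pin: 1

    work do
      # On each Leap Motion frame, map the first hand's palm position onto the
      # two servo angles that wave the cat toy around.
      on leapmotion, :frame => proc { |*, frame|
        hand = frame.hands.first
        next unless hand
        pan.move((hand.palm_x.to_i + 90).clamp(0, 180))
        tilt.move(hand.palm_y.to_i.clamp(0, 180))
      }
    end
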
  • 12:34 - 12:35
    Oh, wait.
  • 12:35 - 12:38
    K, let's see the toy.
  • 12:38 - 12:45
V.O.: What is this. There it goes.
  • 12:49 - 12:52
    R.E.: All right.
  • 12:52 - 12:55
    AUDIENCE: Oh! Whoa!
  • 12:55 - 13:00
    R.E.: Ah, we broke it.
  • 13:00 - 13:05
    I don't know how long this would last with
  • 13:05 - 13:08
    a real cat, but it's still cool.
  • 13:08 - 13:12
    Look ma! Just hands!
  • 13:12 - 13:16
    Thank you.
  • 13:16 - 13:22
    OK. So, so now let's switch to something else.
  • 13:22 - 13:25
    The Beaglebone Black. So one of the important
    robot
  • 13:25 - 13:27
    ops patterns that we want to share with you
  • 13:27 - 13:29
    is, you do not want to think you are
  • 13:29 - 13:33
    going to develop robotics on your notebook
    computer unless
  • 13:33 - 13:36
    you plan on duct taping it to a drone,
  • 13:36 - 13:38
    which you might try. It might work. For you.
  • 13:38 - 13:42
    On the other hand, there are amazing single
    board
  • 13:42 - 13:45
    system on chip, or SOC Linux computers that
    are
  • 13:45 - 13:49
    very, very inexpensive. The Raspberry Pi is
    one. Another
  • 13:49 - 13:51
    one, though, that is a little bit more powerful
  • 13:51 - 13:55
but is also open source hardware, is the Beaglebone
    Black.
  • 13:55 - 13:58
    Where is my video? There we go. So the
  • 13:58 - 14:01
    Beaglebone Black is a very, very cool, also
ARM
  • 14:01 - 14:05
Cortex-powered single board computer. It
    has a one
  • 14:05 - 14:09
    gigahertz processor and 512 megabytes of RAM.
    In this
  • 14:09 - 14:11
    particular case, we are running an Arch Linux
    distro
  • 14:11 - 14:14
    that we have built that is also available
    on
  • 14:14 - 14:15
    the Artoo, a link from the Artoo dot IO
  • 14:15 - 14:18
    website where it includes everything you're
    going to need,
  • 14:18 - 14:22
    software wise, to turn this into a complete,
    full
  • 14:22 - 14:25
    physical computing and robotics platform.
  • 14:25 - 14:28
    Can you see this OK? There, we need it
  • 14:28 - 14:29
    on the other side.
  • 14:29 - 14:33
    OK. So let's take a, a closer look.
  • 14:33 - 14:38
    It's, it's naked. We have cases, but.
  • 14:38 - 14:42
    So, you see that it has a lot of
  • 14:42 - 14:45
    different pins that you can plug into for
    digital
  • 14:45 - 14:49
IO, for analog IO, for pulse width modulation
    and
  • 14:49 - 14:52
    for I2C. Now, I might mention, you saw before
  • 14:52 - 14:56
    that we had difference between drivers, and
    connections. Well,
  • 14:56 - 15:00
    we have generic drivers for general purpose
    IO and
  • 15:00 - 15:03
    for I2C devices. So you can actually use these
  • 15:03 - 15:07
    same drivers on arduino, on a Raspberry Pi,
    on
  • 15:07 - 15:09
    the Beaglebone Black, on the Digispark, or
    on any
  • 15:09 - 15:13
    other platform that supports Linux GPIO.
  • 15:13 - 15:15
    Think about that. Kind of fun.
  • 15:15 - 15:18
    So what we're gonna do here is we're going
  • 15:18 - 15:23
    to show our - there we go. The blink
  • 15:23 - 15:27
    program that we showed before with a slightly
    different
  • 15:27 - 15:29
    syntax, but same idea. Except in this case,
    we're
  • 15:29 - 15:31
    gonna use the connection to the Beaglebone
    using the
  • 15:31 - 15:34
    Beaglebone's adaptor. And we have a slightly
    different pin
  • 15:34 - 15:37
    numbering scheme, because the Beaglebone Black's
    pins are different
  • 15:37 - 15:39
in a similar fashion to the Raspberry Pi's pins.
  • 15:39 - 15:41
    This is actually what the pin is labeled on
  • 15:41 - 15:44
    the device. That way you're not trying to,
    go
  • 15:44 - 15:46
    to a lookup table. Man, there's software for
    that.
  • 15:46 - 15:47
    It's called Artoo.
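
A hedged sketch of that Beaglebone Black flavor of blink; the adaptor name and the :P9_12 pin label are assumptions, but the point stands that only the connection and the pin label change from the Arduino version.

    require 'artoo'

    # Same blink program, different board: drive the Beaglebone Black's own GPIO.
    connection :beaglebone, adaptor: :beaglebone
    device :led, driver: :led, pin: :P9_12   # pin named the way it is printed on the board

    work do
      every(1.second) { led.toggle }
    end
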
  • 15:47 - 15:50
    All right. Back to the camera. So and now
  • 15:50 - 15:53
we're going to - he's actually SSH'd in to
  • 15:53 - 15:56
    this Unix computer and is going to make our
  • 15:56 - 16:02
    gigantic LED start flashing.
  • 16:02 - 16:09
    If all goes well.
  • 16:16 - 16:20
    Blink on the Beaglebone. Yes. It's real.
  • 16:20 - 16:27
    All right. So that was cool, but can we
  • 16:29 - 16:33
    get a little more exciting? Like, yes. So
    let's
  • 16:33 - 16:40
    bring in another toy. Another robotic device.
    The Sphero.
  • 16:40 - 16:44
So the, the Sphero is from Orbotix. Oh, yeah.
  • 16:44 - 16:48
    The camera. Oh, camera.
  • 16:48 - 16:51
    So the Sphero is a small robotic sphere from
  • 16:51 - 16:56
Orbotix, based out of Boulder, Colorado. Fantastically
    interesting toy.
  • 16:56 - 16:59
    It might be the minimum viable robot, because
    it
  • 16:59 - 17:03
    actually possesses input. It has accelerometers
    that can detect
  • 17:03 - 17:07
    collisions. It has output. It can change its
    color.
  • 17:07 - 17:09
    And it can move around on its own volition.
  • 17:09 - 17:13
    It is a bluetooth device. So we're gonna connect
  • 17:13 - 17:15
    up to it using another Artoo program.
  • 17:15 - 17:21
    Which, let me show you the code for that.
  • 17:21 - 17:25
    All right. So in this case, we're gonna require
  • 17:25 - 17:27
    Artoo. We're gonna make a connection to the
    Sphero
  • 17:27 - 17:32
    using the Sphero adaptor on a particular IP
    address.
  • 17:32 - 17:34
    Another one of the lessons from the robot
    ops
  • 17:34 - 17:37
    toolbook is you definitely want to use serial
    to
  • 17:37 - 17:39
    socket connections. You don't want to try
    to connect
  • 17:39 - 17:41
    directly to the serial ports. You know, well,
    that
  • 17:41 - 17:44
way we can use nice TCP and UDP style
  • 17:44 - 17:45
    software development.
  • 17:45 - 17:47
    In this case, the work that we're gonna do
  • 17:47 - 17:50
    is every one second, we're going to display
    a
  • 17:50 - 17:51
    little message, and then we're gonna set the
    Sphero's
  • 17:51 - 17:54
    color to a random color - RGB color -
  • 17:54 - 17:56
    and then we're going to roll at the speed
  • 17:56 - 18:00
    of 90 in a random direction. Crazy Sphero.
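
That Sphero program comes out roughly like the hedged sketch below. The TCP address stands for a serial-to-socket bridge to the Sphero's Bluetooth serial port, per the RobotOps advice above; the actual address is an assumption.

    require 'artoo'

    # The port is a serial-to-socket bridge address, not a local serial device.
    connection :sphero, adaptor: :sphero, port: '192.168.1.50:4560'
    device :sphero, driver: :sphero

    work do
      every(1.second) do
        puts 'rolling...'
        sphero.set_color(rand(255), rand(255), rand(255))   # random RGB color
        sphero.roll(90, rand(360))                          # speed 90, random heading
      end
    end
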
  • 18:00 - 18:07
    So let's go and see what happens.
  • 18:08 - 18:11
    So blue means we are connected to the bluetooth
  • 18:11 - 18:17
    device. Oh, by the way, we are running this
  • 18:17 - 18:24
    off of the Beaglebone Black as well. Go, go,
  • 18:24 - 18:26
    Sphero!
  • 18:26 - 18:33
    It is alive.
  • 18:38 - 18:42
    So one thing we did want to mention before
  • 18:42 - 18:45
    we go any further is, choose your own hardware
  • 18:45 - 18:48
    adventure! What good is this stuff if you
    don't
  • 18:48 - 18:50
    have some hardware? Well, luckily we have
    a lot
  • 18:50 - 18:53
    of wonderful friends, and these friends said
    please give
  • 18:53 - 18:56
    away hardware to the awesome Ruby community.
    So you
  • 18:56 - 18:58
    get to choose your own hardware adventure!
  • 18:58 - 19:01
    Now, not everyone is gonna get hardware today.
    Only
  • 19:01 - 19:04
    those who go to the Twitterverse and appeal
    to
  • 19:04 - 19:07
the magnificence of the robotic overlords,
    and say please
  • 19:07 - 19:11
    give me a microcontroller. So, if you Tweet
    @digistump
  • 19:11 - 19:14
    and @artooio you can win one of our Artoo
  • 19:14 - 19:18
    Digispark starter kits that comes with the
    microcontroller, it
  • 19:18 - 19:21
    comes with an RGB LED shield. It comes with
  • 19:21 - 19:23
    all the little connectors that you will need
    to
  • 19:23 - 19:25
    connect it to motors or servos or other things.
  • 19:25 - 19:30
    So @digistump and @artooio to win the Digispark.
  • 19:30 - 19:32
    If you want to win a Beaglebone Black starter
  • 19:32 - 19:35
    kit that includes a Beaglebone Black, SD card,
    jumpers,
  • 19:35 - 19:37
perf board - everything you need to build
    your
  • 19:37 - 19:41
    own robot - Tweet @beagleboardorg - kind of
    long,
  • 19:41 - 19:44
    sorry - and @artooio to win that.
  • 19:44 - 19:46
    And if you want to win a Sphero 2
  • 19:46 - 19:49
    point 0, the hot new item, then you Tweet
  • 19:49 - 19:53
    @gosphero and @artooio. All right, so to run
    through
  • 19:53 - 19:54
    that again.
  • 19:54 - 19:59
    @digistump and @artooio to win that. @beagleboardorg
    and @artooio
  • 19:59 - 20:01
    if you go in that direction. And if you
  • 20:01 - 20:03
    go north, you get to try to win a
  • 20:03 - 20:04
    @gosphero and @artooio.
  • 20:04 - 20:07
    So again our criteria is whichever Tweet we
    like
  • 20:07 - 20:10
    most, so, beg. It's OK.
  • 20:10 - 20:13
    All right. Onto the demo!
  • 20:13 - 20:16
    So Conway's Game of Life. Who knows about
    Conway's
  • 20:16 - 20:20
    Game of Life? A decent percentage. But let's
    just
  • 20:20 - 20:22
do a quick mathemagical review. So John Conway
    was
  • 20:22 - 20:27
    a mathemagician who invented something that
    we call cellular
  • 20:27 - 20:30
    automata. It basically says by using very,
    very simple
  • 20:30 - 20:33
algorithms we can get interesting emergent behaviors.
    It's absolutely
  • 20:33 - 20:36
kind of like a swarm of robots.
  • 20:36 - 20:38
    And we thought, let's just do the, do the
  • 20:38 - 20:40
    rules here real fast. So it's usually played
    on
  • 20:40 - 20:43
    graph paper using a paper and pencil. By the
  • 20:43 - 20:45
    way, I highly recommend graph paper for doing
    creative
  • 20:45 - 20:47
    work. It's fantastic.
  • 20:47 - 20:50
    All right. So with graph paper, you would
    draw
  • 20:50 - 20:52
    some cells, which are the dots, and then the
  • 20:52 - 20:56
    rules are, if a cell has less than two
  • 20:56 - 20:59
    neighbors, it dies on the next turn. If a
  • 20:59 - 21:04
space has exactly three neighbors around it, a
  • 21:04 - 21:06
    new cell is born into it. And if where
  • 21:06 - 21:08
    a cell is there are more than three neighbors,
  • 21:08 - 21:11
    it dies from over population. So that would
    be
  • 21:11 - 21:15
    the second move. So first move. Second move,
    and
  • 21:15 - 21:15
    so on.
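
In plain Ruby, the classic update rule for a single cell works out to the short sketch below, independent of any hardware: a live cell survives with two or three neighbors, an empty space with exactly three neighbors gets a new cell, and everything else dies or stays empty.

    # Standard Conway update for one cell.
    # alive:     true if the cell is currently live
    # neighbors: count of live cells among its eight neighbors
    def next_state(alive, neighbors)
      if alive
        neighbors == 2 || neighbors == 3
      else
        neighbors == 3
      end
    end
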
  • 21:15 - 21:17
    Well, we thought it would be really cool to
  • 21:17 - 21:21
    do Conway's Game of Life with robots. But
    we
  • 21:21 - 21:23
    realized we'd have to do things a little tiny
  • 21:23 - 21:24
    bit differently.
  • 21:24 - 21:26
    One of the differences is that the Sphero
    does
  • 21:26 - 21:28
    not possess the ability - you might want to
  • 21:28 - 21:30
    start connecting - the Sphero does not possess
    the
  • 21:30 - 21:34
    ability to see other Spheros. However, it
does have an
  • 21:34 - 21:37
    accelerometer to detect collisions. So by
    doing a little
  • 21:37 - 21:40
bit of an inverse Fourier transform, we can
    basically
  • 21:40 - 21:43
    turn the collisions into an estimation of
    proximity within
  • 21:43 - 21:46
    a slice of time. And thereby we can make
  • 21:46 - 21:49
    a decision about whether or not this is actually
  • 21:49 - 21:52
    collided and whether it should live or die.
  • 21:52 - 21:56
    So let's watch artificial life with Artoo
    and Sphero.
  • 21:56 - 21:57
    And it begins.
  • 21:57 - 22:04
    They have become alive. Now they're wandering
    around looking
  • 22:11 - 22:18
    for love in all the wrong places. They need
  • 22:23 - 22:29
    just a little contact. Not human. Sphero.
  • 22:29 - 22:32
    Actually human contact would probably work,
    but. Oh. Two
  • 22:32 - 22:32
    died.
  • 22:32 - 22:38
    I feel traumatized even when artificial life
    loses it.
  • 22:38 - 22:42
    They can come back to life. We call that
  • 22:42 - 22:43
    zombie mode.
  • 22:43 - 22:49
    Well, so, and eventually they might all die,
    or
  • 22:49 - 22:52
    it might just go on for long periods of
  • 22:52 - 22:54
    time. It's very hard for me to kill off
  • 22:54 - 22:58
    any life form, artificial or natural. So let's
    give
  • 22:58 - 23:05
    it a brief moment. Oh. It's so lonely. The
  • 23:06 - 23:09
    last, the last Sphero.
  • 23:09 - 23:11
    There's something kind of epic. Maybe someone
    will compose
  • 23:11 - 23:13
    a ballad. Ooh!
  • 23:13 - 23:16
    Anyway, I think you guys kind of get the
  • 23:16 - 23:20
    idea. Let's take a quick look at some code.
  • 23:20 - 23:23
    So in this case, we're actually using Artoo's
    modular
  • 23:23 - 23:25
    mode, where we're declaring a class, which
    is the
  • 23:25 - 23:29
Conway Sphero robot, with a connection to
    a Sphero,
  • 23:29 - 23:30
    the device is a Sphero. The work it's gonna
  • 23:30 - 23:33
    do is first it's born, then on the Sphero's
  • 23:33 - 23:35
    collision - and here we see an example of
  • 23:35 - 23:38
    Artoo's event syntax - every, on a collision
    we're
  • 23:38 - 23:40
    gonna call the contact method.
  • 23:40 - 23:42
    Every three seconds we're gonna make a move,
    and
  • 23:42 - 23:44
    if we're alive, and every ten seconds a birth
  • 23:44 - 23:47
    day if we're still alive. Life is short, hard,
  • 23:47 - 23:51
    and colorful in Sphero land.
  • 23:51 - 23:52
    So then you see some of our helpers, check
  • 23:52 - 23:56
    if we're alive, rebirth. If we actually follow
    the
  • 23:56 - 24:00
rules, we can have birth, life, and death. So
  • 24:00 - 24:01
    you kind of get the idea.
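
Pulling that together, here is a hedged sketch of the modular-mode robot just described, including the per-Sphero start-up that the talk turns to next. The helper bodies, the Sphero names and addresses, the connection option keys, and the work! call are assumptions rather than the exact demo code.

    require 'artoo'

    class ConwaySphero < Artoo::Robot
      connection :sphero, adaptor: :sphero
      device :sphero, driver: :sphero

      work do
        born
        on sphero, :collision => :contact          # Artoo's event syntax
        every(3.seconds)  { move     if alive? }
        every(10.seconds) { birthday if alive? }
      end

      # Illustrative helper stubs, not the demo's real rules.
      def born;       @alive = true;                  end
      def alive?;     @alive;                         end
      def contact(*); @contacts = @contacts.to_i + 1; end
      def move;       sphero.roll(60, rand(360));     end
      def birthday;   @alive = false if @contacts.to_i > 3; @contacts = 0; end
    end

    # One robot per Sphero, all sent to work at the same time: a swarm.
    SPHEROS = { 'sphero-one' => '192.168.1.51:4560', 'sphero-two' => '192.168.1.52:4560' }
    robots = SPHEROS.map do |name, port|
      ConwaySphero.new(name: name, connections: { sphero: { port: port } })
    end
    ConwaySphero.work!(robots)
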
  • 24:01 - 24:03
    Oh, wait, there's one last thing that's kind
    of
  • 24:03 - 24:08
    important here. So then we declare a hash
    with
  • 24:08 - 24:10
    all of our different IP addresses and the
    names
  • 24:10 - 24:12
    of Spheros, and then each Sphero we create
    a
  • 24:12 - 24:14
    new one and then tell all of them to
  • 24:14 - 24:16
    go to work at the same time. A swarm
  • 24:16 - 24:17
    of Spheros.
  • 24:17 - 24:20
    All right.
  • 24:20 - 24:24
    So now let's do something completely different.
  • 24:24 - 24:26
    This is the time to put on your protective
  • 24:26 - 24:28
    gear if you have some.
  • 24:28 - 24:35
    So we're gonna demo the ARDrone and we're
    moving
  • 24:36 - 24:41
    over there so we have some space. Oh, yeah,
  • 24:41 - 24:45
    we forgot to set this up before. Whoops! Oh,
  • 24:45 - 24:46
    the Sphero.
  • 24:46 - 24:51
    I mean, sorry, the ARDrone, yeah. That thing.
  • 24:51 - 24:54
    AUDIENCE: Sphero is funnier.
  • 24:54 - 24:57
R.E.: Many people got to see Jim Weirich's
Argus
  • 24:57 - 24:59
    gem. We're actually using the Argus gem wrapped
    up
  • 24:59 - 25:03
    inside of the Artoo ARDrone adapter and we've
    done
  • 25:03 - 25:05
    a few contributions to it ourselves. It's
    very, very
  • 25:05 - 25:12
    cool. Thank you Jim. We really appreciate
    it.
  • 25:14 - 25:15
    Standing on the shoulders of giants is awesome.
  • 25:15 - 25:18
    All right, so what we're gonna do here is
  • 25:18 - 25:22
    take a quick look at some, some code. All
  • 25:22 - 25:26
    right. So in this case, we're gonna make require
  • 25:26 - 25:28
    Artoo, we're gonna make a connection to the
    ARDrone
  • 25:28 - 25:30
    via its adapter. The device is gonna be the
  • 25:30 - 25:33
    drone - you seeing a pattern forming?
  • 25:33 - 25:34
    So the work we're gonna do is first we're
  • 25:34 - 25:37
    gonna start the drone. Then the drone's gonna
    take
  • 25:37 - 25:40
    off. After fifteen seconds, it's going to
    hover and
  • 25:40 - 25:43
    land, and then after twenty seconds stop.
    So a
  • 25:43 - 25:48
    little bit of automated drone flight.
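
That automated flight is close to the hedged sketch below; the drone's address and the after timer syntax are assumptions, while the start, take off, hover, land, and stop steps come straight from the description above.

    require 'artoo'

    connection :ardrone, adaptor: :ardrone, port: '192.168.1.1:5556'   # address is an assumption
    device :drone, driver: :ardrone

    work do
      drone.start
      drone.take_off
      after(15.seconds) { drone.hover; drone.land }   # hover and land at fifteen seconds
      after(20.seconds) { drone.stop }                # stop at twenty seconds
    end
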
  • 25:48 - 25:55
    So this is the drone.
  • 26:06 - 26:11
    Hello.
  • 26:11 - 26:17
    ARDrone in Artoo!
  • 26:17 - 26:20
    So now, for this next demonstration we're
    going to
  • 26:20 - 26:25
    need a courageous volunteer from our studio
    audience. I
  • 26:25 - 26:28
    mean courageous, like this is kind of dangerous.
    And
  • 26:28 - 26:30
    you have to be tall.
  • 26:30 - 26:34
Oh, yeah, let's just use one of the HybridGroup
  • 26:34 - 26:38
    members cause I believe we are insured for
    them.
  • 26:38 - 26:40
    And if not I can just drive them home.
  • 26:40 - 26:42
    Daniel Fisher, HybridGroup strongman!
  • 26:42 - 26:43
    All right.
  • 26:43 - 26:45
    DANIEL FISHER: (indecipherable - 00:26:47)
  • 26:45 - 26:49
    R.E.: Oh, OK. Yes, thank you. All right. So
  • 26:49 - 26:51
    what we're going to do - recently, we added,
  • 26:51 - 26:55
    there we go. Recently we added openCV support
    to
  • 26:55 - 26:57
    Artoo. If you're not familiar with it, openCV
    is
  • 26:57 - 27:01
    probably the most important computer vision
    library. It's open
  • 27:01 - 27:03
    source. It has, it's, it's a very deep and
  • 27:03 - 27:06
    rich platform, and so what we're gonna do
    here
  • 27:06 - 27:10
    is we're gonna make a connection to the capture
  • 27:10 - 27:12
    device, then we're gonna make a connection
    to the
  • 27:12 - 27:14
    video device, and we're gonna make a connection
    to
  • 27:14 - 27:16
    the ARDrone.
  • 27:16 - 27:20
    We're gonna use a facial recognition set of
    data
  • 27:20 - 27:23
    and then the work that we're gonna do is
  • 27:23 - 27:26
    we're going to capture each frame and display
    it
  • 27:26 - 27:27
    on a window, which we'll see in a moment.
  • 27:27 - 27:30
    We're gonna start the drone and take off.
    After
  • 27:30 - 27:32
    eight seconds it's gonna boost up to about
    face
  • 27:32 - 27:36
    hugger level. After ten seconds it will hover
    again
  • 27:36 - 27:39
    and then at the mysterious thirteen second
    mark it
  • 27:39 - 27:42
    will begin its facial recognition mission.
  • 27:42 - 27:45
    It should detect Daniel's face and then, as
    he
  • 27:45 - 27:48
    tries to evade it, it should follow him.
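
A heavily hedged sketch of the shape of that face follower. The timed flight steps come from the description above; the capture, window, and haar-cascade calls follow the ruby-opencv gem's usual API rather than the artoo-opencv driver names, which were not shown, and the actual steering is left as a comment.

    require 'artoo'
    require 'opencv'   # ruby-opencv gem

    connection :ardrone, adaptor: :ardrone, port: '192.168.1.1:5556'
    device :drone, driver: :ardrone

    capture  = OpenCV::CvCapture.open                    # default capture device
    window   = OpenCV::GUI::Window.new('artoo')
    detector = OpenCV::CvHaarClassifierCascade::load('haarcascade_frontalface_alt.xml')

    work do
      drone.start
      drone.take_off
      # Around eight seconds the demo boosts altitude to face-hugger level; the
      # exact driver call for that was not shown, so it is omitted here.
      after(10.seconds) { drone.hover }
      after(13.seconds) { @tracking = true }   # begin the facial recognition mission

      every(0.5.seconds) do
        frame = capture.query
        window.show(frame)
        next unless @tracking
        faces = detector.detect_objects(frame)
        puts "faces seen: #{faces.size}"
        # In the demo, the offset of faces.first from the frame centre is used to
        # steer the drone so it follows the volunteer as he tries to evade it.
      end
    end
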
  • 27:48 - 27:51
    I think you see now why we chose our
  • 27:51 - 27:52
    own volunteer.
  • 27:52 - 27:58
    All right. So without further risk to us,
    cause
  • 27:58 - 28:01
    it's gonna be over there - get behind me
  • 28:01 - 28:05
    man. All right.
  • 28:05 - 28:07
    This is how you know it's real.
  • 28:07 - 28:10
    We swear.
  • 28:10 - 28:17
    All right.
  • 28:17 - 28:21
    Wanna enlarge the window please?
  • 28:21 - 28:28
    V.O.: Why would do this?
  • 28:38 - 28:45
    R.E.: Well, when we say customer service drone,
    we
  • 28:47 - 28:54
    mean customer service drone. Evasive.
  • 29:00 - 29:06
    We put in code to stop it before it
  • 29:06 - 29:12
    got too dangerous.
  • 29:12 - 29:19
    So thank you Daniel. I owe you a drink,
  • 29:19 - 29:19
    man.
  • 29:19 - 29:22
    All right. So we promise new hardware every
    time
  • 29:22 - 29:26
    we do a show and basically we cannot disappoint.
  • 29:26 - 29:29
    So why is it getting dark?
  • 29:29 - 29:30
    AUDIENCE: (indecipherable - 00:29:30)
  • 29:30 - 29:33
    R.E.: Oh. That explains it. I'm like, it's
    all
  • 29:33 - 29:38
    getting dark in here. Around the edges especially.
  • 29:38 - 29:40
    So we have a, some really awesome new hardware
  • 29:40 - 29:43
    - where is it? It's really small, so it's
  • 29:43 - 29:44
    hard to find.
  • 29:44 - 29:46
    AUDIENCE: Are we gonna do the- (indecipherable
    - 00:29:47)
  • 29:46 - 29:49
    R.E.: Oh, OK, yeah let's do that first. All
  • 29:49 - 29:51
    right. Actually we do have two kinds of new
  • 29:51 - 29:54
    hardware today. So the first thing is, many
    of
  • 29:54 - 29:57
    you might have seen us fly the ARDrone around
  • 29:57 - 30:00
    with a Wii classic controller using an arduino.
    But
  • 30:00 - 30:02
    even Nintendo has stopped dealing with Wii.
  • 30:02 - 30:05
    So we thought, hey, it's time to get in
  • 30:05 - 30:08
    the modern generation. So we now support the
    PS3
  • 30:08 - 30:11
controller and the Xbox 360 controller.
    So
  • 30:11 - 30:15
Adrian, who is the serious programming guy and
    test pilot,
  • 30:15 - 30:19
    is going to use this generic GameStop PS3
    style
  • 30:19 - 30:23
    controller to fly this ARDrone around. Let's
    see if
  • 30:23 - 30:26
    - take a look at the code here.
  • 30:26 - 30:32
    No, I went the wrong way. There we go.
  • 30:32 - 30:35
    So we can see that we're gonna declare connection
  • 30:35 - 30:37
    to the ARDrone. We're gonna declare the device
    of
  • 30:37 - 30:40
    the drone, a connection to the joystick, and
    then
  • 30:40 - 30:44
    we see that we're gonna - whoops.
  • 30:44 - 30:46
    We're gonna handle a bunch of these controller
    events
  • 30:46 - 30:48
    - for example, when he hits the square button
  • 30:48 - 30:51
    it will take off, the triangle button, it
    will
  • 30:51 - 30:53
    hover. The x button it will land - et
  • 30:53 - 30:54
    cetera.
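
A hedged sketch of that controller mapping. The joystick adaptor and driver names and the button event names are assumptions; the square / triangle / x mapping itself comes from the description above.

    require 'artoo'

    connection :joystick, adaptor: :joystick
    device :joystick, driver: :ps3                 # driver name is an assumption

    connection :ardrone, adaptor: :ardrone, port: '192.168.1.1:5556'
    device :drone, driver: :ardrone

    work do
      drone.start
      # Button event names below are assumptions; see artoo-joystick for the real ones.
      on joystick, :button_square   => proc { drone.take_off }
      on joystick, :button_triangle => proc { drone.hover }
      on joystick, :button_x        => proc { drone.land }
    end
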
  • 30:54 - 30:57
    And so now if all goes as expected -
  • 30:57 - 31:04
    oh yes, reset the drone.
  • 31:04 - 31:11
    If it comes really close to you, please duck.
  • 31:11 - 31:13
    This is human-powered flight, so blame him.
  • 31:13 - 31:18
ADRIAN: I just work for this guy.
  • 31:18 - 31:25
    R.E.: No, it all comes here. Blame me. I
  • 31:27 - 31:34
    have band aids. I think.
  • 31:37 - 31:40
    Standing by.
  • 31:40 - 31:42
    This is Tower. How you doing control?
  • 31:42 - 31:44
ADRIAN: (indecipherable - 00:31:45)
  • 31:44 - 31:46
    R.E.: Still trying to connect to the wifi
    on
  • 31:46 - 31:52
    the drone. We're, we're standing by ground
    control. There's
  • 31:52 - 31:54
    a certain cadence to this.
  • 31:54 - 31:56
ADRIAN: There we go.
  • 31:56 - 31:57
    R.E.: If you don't do it right, it literally
  • 31:57 - 31:59
    won't take off. I mean if you don't say
  • 31:59 - 32:03
    it right. If there's anybody from Rocket City
    here,
  • 32:03 - 32:10
    please correct my English. American.
  • 32:10 - 32:16
ADRIAN: All right, ready?
  • 32:16 - 32:20
    R.E.: Standing by.
  • 32:20 - 32:27
    Yeah, get some altitude. Get some altitude!
  • 32:29 - 32:33
    ARDrone, PS3 controller!
  • 32:33 - 32:38
    Ah, you're not gonna buzz the back row? No,
  • 32:38 - 32:42
    don't do it. Don't do it. If I want
  • 32:42 - 32:43
    you to.
  • 32:43 - 32:46
    It's tempting, but no. Not today.
  • 32:46 - 32:52
    OK. So now the grand finale. What you've all
  • 32:52 - 32:58
    been waiting for.
  • 32:58 - 33:02
The Crazyflie, ARDrone, and PS3 controller.
    So, what is
  • 33:02 - 33:06
    the Crazyflie? Are we crazy? We are. Extremely.
    If
  • 33:06 - 33:09
    you hadn't noticed that. It's probably the
    minimum viable
  • 33:09 - 33:13
quadcopter. This is the Crazyflie from Bitcraze
    out
  • 33:13 - 33:18
    of Sweden. It's a, it - it's gonna hurt,
  • 33:18 - 33:20
    yeah. It's really small - how bad can it
  • 33:20 - 33:21
    hurt?
  • 33:21 - 33:23
    So this is actually a very, very impressive
    piece
  • 33:23 - 33:27
of technology. It also has an ARM Cortex
    processor
  • 33:27 - 33:32
    running a realtime operating system. It's
got an accelerometer
  • 33:32 - 33:34
- a three-axis accelerometer. It's got a magnetometer,
    a
  • 33:34 - 33:37
kind of a compass, and it also has a barometer
  • 33:37 - 33:41
    for altitude detection. It's actually quite
    an acrobatic drone.
  • 33:41 - 33:47
    It's very, very hard to control.
  • 33:47 - 33:49
    Luckily it's very small. Also luckily it only
    has
  • 33:49 - 33:52
    about a seven minute life span on that battery.
  • 33:52 - 33:55
    So there's that. If you can get away from
  • 33:55 - 33:59
    seve- six and a half minutes, you're fine.
    Actually,
  • 33:59 - 34:01
    in about five minutes, the sensors start going
    off.
  • 34:01 - 34:04
    So if you throw some, like, tin foil and
  • 34:04 - 34:04
    run that way.
  • 34:04 - 34:08
    All right. So what we're going to do, the
  • 34:08 - 34:11
    first time ever anywhere seen, is we're actually
    gonna
  • 34:11 - 34:14
    control both of these drones with the same
    code.
  • 34:14 - 34:16
    Which, let's take a look at it.
  • 34:16 - 34:21
    All right, so first we're going to require
    Artoo.
  • 34:21 - 34:23
    Then we're gonna make a connection to the
    Crazyflie
  • 34:23 - 34:26
    using its adapter, and we're going to then
    connect
  • 34:26 - 34:29
    to the joystick. Then we're gonna connect
    to the
  • 34:29 - 34:30
    ARDrone.
  • 34:30 - 34:32
    So the work that we're gonna actually do here
  • 34:32 - 34:35
    is we're gonna use the controller to control
    the
  • 34:35 - 34:38
    Crazyflie, and then we're gonna use the ARDrone
    to
  • 34:38 - 34:41
    do automated take-off. And if this all goes
    as
  • 34:41 - 34:45
    expected, the ARDrone should take off and
    hover, and
  • 34:45 - 34:46
then Adrian should be able to kind of fly
  • 34:46 - 34:49
    around it manually using the Crazyflie.
  • 34:49 - 34:53
    Should be interesting.
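
The two-drones-one-script idea then looks roughly like this hedged sketch. The Crazyflie adaptor options, the joystick event name, and the thrust call are assumptions, and the real demo maps far more of the controller than is shown here.

    require 'artoo'

    connection :crazyflie, adaptor: :crazyflie     # radio options omitted; they are assumptions
    device :crazyflie, driver: :crazyflie

    connection :joystick, adaptor: :joystick
    device :joystick, driver: :ps3                 # driver name is an assumption

    connection :ardrone, adaptor: :ardrone, port: '192.168.1.1:5556'
    device :drone, driver: :ardrone

    work do
      # Automated ARDrone flight: take off and hold a hover on its own.
      drone.start
      drone.take_off
      after(10.seconds) { drone.hover }

      # Manual Crazyflie flight from the controller; event name and call are illustrative.
      on joystick, :update => proc { |*, data| crazyflie.thrust(data) }
    end
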
  • 34:53 - 35:00
    Let's do it. Standing by. Multidrone. Two
    drones, one
  • 35:13 - 35:16
    code.
  • 35:16 - 35:23
    V.O.: Watch out!
  • 35:25 - 35:32
    R.E.: They live!
  • 35:33 - 35:36
    We so crazy.
  • 35:36 - 35:43
Adrian, our test pilot!
  • 35:47 - 35:54
    So was that fun? I love this stuff.
  • 35:56 - 35:59
    AUDIENCE: ( indecipherable - 00:35:58)
  • 35:59 - 36:03
    R.E.: Yeah. Let's, the question was, is it
    possible
  • 36:03 - 36:05
    to control them both with one controller?
    The answer
  • 36:05 - 36:09
    is yes, however, because there's a significantly
    different vector
  • 36:09 - 36:13
    of thrust in the Crazyflie versus the ARDrone,
    we
  • 36:13 - 36:15
    didn't really have time to get that perfected
    with
  • 36:15 - 36:16
    the amount of space we had in the hotel
  • 36:16 - 36:20
    room here. And we kind of didn't want to
  • 36:20 - 36:22
    spoil all the surprise, cause as soon as you
  • 36:22 - 36:24
    start flying something around, people start
    coming in and
  • 36:24 - 36:26
    swarming on it.
  • 36:26 - 36:28
    So, we'll get to that. Come and see us
  • 36:28 - 36:32
    at Robots Conf in December, here back in Florida.
  • 36:32 - 36:36
    But wait, there's more!
  • 36:36 - 36:39
    There's always more. We, we heard that, you
    know,
  • 36:39 - 36:42
    some people really, really like JavaScript
    these days, and
  • 36:42 - 36:43
    so we thought, we'd like to put some robots
  • 36:43 - 36:46
    on JavaScript. So Cylon dot JS is a project
  • 36:46 - 36:48
    we just announced last month. And it lets
    you
  • 36:48 - 36:52
    use CoffeeScript or JavaScript with node.js
    to do basically
  • 36:52 - 36:54
    the same exact thing that you just saw, except
  • 36:54 - 36:56
    in those languages.
  • 36:56 - 36:58
    And so it's actually available now. It doesn't
    have
  • 36:58 - 37:01
all twelve platforms supported that Artoo
does,
  • 37:01 - 37:04
    but we're getting there. And then, today we
    announce
  • 37:04 - 37:06
    GoBot!
  • 37:06 - 37:13
    Because we heard that the Go programming language
    was
  • 37:13 - 37:14
    something you guys were kind of interested
    in, too,
  • 37:14 - 37:18
    and we sort of liked it ourselves. And we
  • 37:18 - 37:20
    thought, Go, Go Robot! so, actually GoBot
    is this
  • 37:20 - 37:25
    month's project announcement. It's literally
    very, very hot and
  • 37:25 - 37:28
    fresh. In other words, it barely works, but
    it
  • 37:28 - 37:29
    kind of does.
  • 37:29 - 37:30
    And I-
  • 37:30 - 37:33
ADRIAN: It runs Conway's Game of Life.
  • 37:33 - 37:36
    R.E.: Exactly. We have artificial life with
    GoBot already.
  • 37:36 - 37:41
    So, so check it out. Join the Robot Evolution!
  • 37:41 - 37:43
    Because we need all of you to help us
  • 37:43 - 37:47
    build this future. So artoo dot io, or follow
  • 37:47 - 37:51
    us @artooio on Twitter. cylonjs dot com or
    follow
  • 37:51 - 37:55
    us @cylonjs on Twitter. Or gobot dot io and
  • 37:55 - 37:58
    follow us @gobotio on Twitter. So once again
    -
  • 37:58 - 37:59
    those numbers again.
  • 37:59 - 38:04
    Artoo dot io, cylonjs, and gobotio.
  • 38:04 - 38:08
    All right. So I, for one, say welcome to
  • 38:08 - 38:12
    the machines. But there are a few questions
    that
  • 38:12 - 38:14
    we have. One of them, perhaps, has to do
  • 38:14 - 38:18
    with robot economics. So when machines are
    doing the
  • 38:18 - 38:21
    jobs that humans do now, what will we do?
  • 38:21 - 38:23
    Kurt Vonnegut in Player Piano kind of posited
    a
  • 38:23 - 38:27
    future where the satisfaction that we have
    would be
  • 38:27 - 38:29
greatly lacking because of the meaning and purpose
    that
  • 38:29 - 38:32
we need in our lives. And what about the
  • 38:32 - 38:34
    pay that people need for jobs?
  • 38:34 - 38:38
    Well, and what about robot ethics? Do robots
    actually
  • 38:38 - 38:40
    have ethics, or do they only have the ethics
  • 38:40 - 38:43
    that we give to them? So what is going
  • 38:43 - 38:46
    to happen? The answer is - I don't know.
  • 38:46 - 38:51
    However, I know some actual professional futurists,
    and Daniel
  • 38:51 - 38:52
    Rasmus is a very dear friend of mine. He
  • 38:52 - 38:54
    wrote a book called Listening to the Future
    where
  • 38:54 - 38:57
    he talks about something called scenario analysis.
  • 38:57 - 38:59
    So we're gonna do a little scenario analysis
    of
  • 38:59 - 39:01
    what we think's gonna happen, and we're gonna
    use
  • 39:01 - 39:04
two axes. The first one is robot sentience
    -
  • 39:04 - 39:07
will they become intelligent or not? And the
    other
  • 39:07 - 39:09
    one is robot friendliness - will they be friendly
  • 39:09 - 39:10
    or hostile?
  • 39:10 - 39:12
And because we are a Los Angeles-based company
  • 39:12 - 39:14
    at HybridGroup, we look at everything in terms
    of
  • 39:14 - 39:21
Hollywood movies. So, if the robots are
    non-intelligent
  • 39:21 - 39:23
    and they're not friendly, we get the movie
    Brazil.
  • 39:23 - 39:27
    In other words, life today.
  • 39:27 - 39:31
    Now, if the robots become intelligent but
    they're not
  • 39:31 - 39:34
    friendly, we get Terminator. Enough said.
  • 39:34 - 39:39
    Now, if the robots are not sentient but friendly,
  • 39:39 - 39:45
    we get Power Rangers. And then if the robots
  • 39:45 - 39:49
    are both sentient and friendly, we get singularity.
  • 39:49 - 39:52
    So this guy spent a lot of time thinking
  • 39:52 - 39:54
    about what was gonna happen in the future.
    I,
  • 39:54 - 39:56
I hope everyone knows that this is Isaac
  • 39:56 - 39:56
    Asimov.
  • 39:56 - 39:58
    AUDIENCE: Oh yeah.
  • 39:58 - 40:00
    R.E.: You didn't know Conway's Game of Life,
    but
  • 40:00 - 40:01
you do know Asimov.
  • 40:01 - 40:04
    So he wrote the three laws of robotics, and
  • 40:04 - 40:08
    I'm gonna read them to you now. Number one:
  • 40:08 - 40:11
    a robot may not injure a human being or,
  • 40:11 - 40:13
    through inaction, allow a human being to come
    to
  • 40:13 - 40:15
    harm.
  • 40:15 - 40:18
    Number two: a robot must obey the orders given
  • 40:18 - 40:21
    to it by human beings, except where such orders
  • 40:21 - 40:23
    would conflict with the first law.
  • 40:23 - 40:26
    And then number three: a robot must protect
    its
  • 40:26 - 40:29
    own existence, as long as such protection
    does not
  • 40:29 - 40:31
    conflict with the first or second laws.
  • 40:31 - 40:37
    So, how's that been working out for us guys?
  • 40:37 - 40:39
    But I don't think it's fair to blame the
  • 40:39 - 40:40
    robots, because this in fact is not a robot,
  • 40:40 - 40:42
    it is a drone, and it is being controlled
  • 40:42 - 40:45
    by a person in a hidden bunker, where they
  • 40:45 - 40:47
    are far, far away from the battlefield where
    these
  • 40:47 - 40:51
    weapons are actually being utilized against
    real humans.
  • 40:51 - 40:55
    So we propose a small change - just a
  • 40:55 - 40:59
    little patch revision to Asimov's first law.
    Version one
  • 40:59 - 41:03
    point one. A human may not injure a human
  • 41:03 - 41:06
    being or, through inaction, allow a human
    being to
  • 41:06 - 41:08
    come to harm.
  • 41:08 - 41:15
    Imagine that future. Let's make that future.
    Thank you.