Return to Video

INET Washington DC: Surveillance, Cybersecurity and the Future of the Internet

  • 0:01 - 0:05
    >> Good afternoon again
    and welcome to GW.
  • 0:05 - 0:08
    My name is David Dolling and
    I'm the Dean of the School
  • 0:08 - 0:11
    of Engineering and
    Applied Science here
  • 0:11 - 0:13
    at the George Washington
    University.
  • 0:13 - 0:16
    I don't think it would
    be news to anybody
  • 0:16 - 0:20
    that there's been a lot of
    documents out there recently
  • 0:20 - 0:22
    with respect to cyber security.
  • 0:22 - 0:25
    We've seen the White
    House's executive order aimed
  • 0:25 - 0:28
    at protecting our
    critical infrastructure.
  • 0:28 - 0:31
    We've seen a report from the
    Security Company Mandiant
  • 0:31 - 0:35
    that exposed a Chinese
    military hacking group.
  • 0:35 - 0:38
    And very much in the news
    now although it's faded
  • 0:38 - 0:42
    after the press conference
    yesterday afternoon have been
  • 0:42 - 0:45
    documents related
    to NSA's PRISM
  • 0:45 - 0:48
    and telephone surveillance
    programs.
  • 0:48 - 0:50
    Well today, we are
    very fortunate
  • 0:50 - 0:52
    to have several
    experts with us
  • 0:52 - 0:56
    to discuss their various
    viewpoints on surveillance,
  • 0:56 - 0:59
    on cyber security and the
    future of the internet.
  • 0:59 - 1:04
    As we try to be secure
    while protecting privacy,
  • 1:04 - 1:09
    we do need to be guided by
    knowledge from many disciplines
  • 1:09 - 1:12
    and in fact, this was stated
    quite explicitly last December
  • 1:12 - 1:16
    at the launch of the GW
    Cyber Security Initiative.
  • 1:16 - 1:18
    The initiative's chairman,
  • 1:18 - 1:23
    former DHS Secretary Michael
    Chertoff spoke of the need
  • 1:23 - 1:26
    for universities
    like ours and others
  • 1:26 - 1:30
    to give students an
    understanding of the triad
  • 1:30 - 1:34
    of technology, business,
    and policy, and how they interact
  • 1:34 - 1:38
    to create difficult cyber
    security challenges.
  • 1:38 - 1:41
    While I'm on the podium here,
    I'd like to just take advantage
  • 1:41 - 1:45
    of a captive audience because I
    know you're all too polite to jeer
  • 1:45 - 1:49
    or get up and move out to say
    a few things about the School
  • 1:49 - 1:52
    of Engineering and
    Applied Science here at GW
  • 1:52 - 1:56
    which is a very thriving
    and growing enterprise.
  • 1:56 - 1:58
    Our Department of
    Computer Science
  • 1:58 - 2:02
    in fact anticipated
    Secretary Chertoff's remarks
  • 2:02 - 2:06
    by launching last fall
    the Master of Science
  • 2:06 - 2:10
    in Cyber security in our
    Computer Science Department.
  • 2:10 - 2:11
    It's the first program
  • 2:11 - 2:14
    in Washington DC designed
    specifically to respond
  • 2:14 - 2:17
    to the large and
    fast growing need
  • 2:17 - 2:21
    for technical cyber
    security experts locally,
  • 2:21 - 2:24
    nationally, and internationally.
  • 2:24 - 2:29
    Underscoring our commitment
    here to cyber security education
  • 2:29 - 2:32
    and public service are our NSF-
  • 2:32 - 2:36
    and DHS-sponsored CyberCorps
    scholarship programs.
  • 2:36 - 2:40
    These programs fund students who
    are going to be future leaders
  • 2:40 - 2:43
    in the federal government
    on cyber security
  • 2:43 - 2:46
    and indeed some of
    them already are.
  • 2:46 - 2:50
    Each week, the CyberCorps
    students meet in class often
  • 2:50 - 2:53
    with current or former
    federal employees as well
  • 2:53 - 2:55
    as industry experts to learn
  • 2:55 - 2:59
    about the major cyber security
    issues so they're prepared
  • 2:59 - 3:01
    to meet the challenges that
    will certainly greet them
  • 3:01 - 3:04
    after they graduate
    here and take jobs.
  • 3:04 - 3:07
    This course which was
    previously only available
  • 3:07 - 3:11
    to CyberCorps students is now
    available to all GW students
  • 3:11 - 3:15
    as a hybrid course
    conducted mostly online
  • 3:15 - 3:16
    with a short, intensive
  • 3:16 - 3:19
    in-person learning
    experience at the beginning.
  • 3:19 - 3:23
    Starting next year,
    some doctoral students
  • 3:23 - 3:24
    in the graduate school
    of education
  • 3:24 - 3:27
    and human development
    here at GW, focusing on human
  • 3:27 - 3:31
    and organizational learning,
    will also be taking this class as a
  • 3:31 - 3:35
    required part of their
    cyber security focus.
  • 3:35 - 3:37
    Well, let me say that
    today's event which looks
  • 3:37 - 3:39
    to be very well populated
    and looks like it's going
  • 3:39 - 3:42
    to be very interesting
    is sponsored
  • 3:42 - 3:45
    by the Cyber Security Policy
    and Research Institute.
  • 3:45 - 3:48
    We abbreviate that around
    here to CSPRI of the School
  • 3:48 - 3:50
    of Engineering and
    Applied Science.
  • 3:50 - 3:54
    In addition to overseeing
    the CyberCorps program,
  • 3:54 - 3:58
    it facilitates and executes
    interdisciplinary research
  • 3:58 - 4:02
    and education in cyber
    security across GW.
  • 4:02 - 4:05
    One example of this is its
    work with the Graduate School
  • 4:05 - 4:08
    of Education and Human
    Development supporting the
  • 4:08 - 4:11
    National CyberWatch
    Center in a joint project
  • 4:11 - 4:15
    to develop the nation's
    cyber security workforce.
  • 4:15 - 4:17
    More information on that program
  • 4:17 - 4:21
    on CSPRI, cyber security
    scholarships, and the School
  • 4:21 - 4:23
    of Engineering and Applied
    Science can be found
  • 4:23 - 4:27
    in the material available
    at the registration desk.
  • 4:27 - 4:29
    We're happy to have
    as a cosponsor
  • 4:29 - 4:34
    of this event today the internet
    society and it's my pleasure
  • 4:34 - 4:37
    to now introduce Paul
    Brigner, Regional Director
  • 4:37 - 4:42
    of the North American Bureau at
    the Internet Society to come up
  • 4:42 - 4:44
    and tell you a few
    words about it
  • 4:44 - 4:47
    and its role
    in today's event.
  • 4:47 - 4:49
    So, on behalf of GW, the
    School of Engineering
  • 4:49 - 4:52
    and Applied Science,
    CSPRI and the Department
  • 4:52 - 4:54
    Of Computer Science,
    welcome to GW
  • 4:54 - 4:56
    and I hope you have
    a great afternoon.
  • 4:56 - 4:58
    Thank you.
  • 4:58 - 5:01
    [ Applause ]
  • 5:01 - 5:10
    [ Pause ]
  • 5:11 - 5:13
    >> Thank you Dr.
    Dolling and thank you
  • 5:13 - 5:16
    to the George Washington
    Cyber Security Policy
  • 5:16 - 5:17
    And Research Institute
    for partnering
  • 5:17 - 5:19
    with us on this event today.
  • 5:19 - 5:22
    And thanks to all of
    you for joining us
  • 5:22 - 5:26
    and that means not only everyone here
    in the room but also to those
  • 5:26 - 5:29
    who are at a remote
    hub in New York City.
  • 5:29 - 5:30
    That remote hub is
    being sponsored
  • 5:30 - 5:32
    by the Internet Society's
    New York chapter.
  • 5:32 - 5:34
    I know we don't have
    a good view of them
  • 5:34 - 5:36
    with the camera right
    now but what's going
  • 5:36 - 5:38
    to happen is they
    will walk up to
  • 5:38 - 5:40
    that camera and ask questions.
  • 5:40 - 5:43
    And by the way, they've
    been meeting
  • 5:43 - 5:44
    on this topic already
    this morning.
  • 5:44 - 5:46
    They've had their
    own meetings today,
  • 5:46 - 5:49
    so maybe that gives you an idea
    of how passionate that group
  • 5:49 - 5:52
    of individuals is about
    the topics we will be
  • 5:52 - 5:53
    discussing today.
  • 5:53 - 5:56
    So, thank you to the New York
    chapter for getting together
  • 5:56 - 5:59
    to join us as a remote hub.
  • 5:59 - 6:02
    And definitely, thank you
    to all those on live stream.
  • 6:02 - 6:04
    We have a very good crowd
    already gathering on live stream
  • 6:04 - 6:09
    to watch us from
    all over the world.
  • 6:09 - 6:12
    My first order of business today
    is to make sure that you're aware
  • 6:12 - 6:13
    of the internet society.
  • 6:13 - 6:16
    If this is the first time you're
    joining from one of our events,
  • 6:16 - 6:19
    we are a global nonprofit
    cause-driven organization
  • 6:19 - 6:21
    with a very
    straightforward vision.
  • 6:21 - 6:24
    That is the internet
    is for everyone.
  • 6:24 - 6:27
    We work to achieve that vision
  • 6:27 - 6:29
    by promoting the open
    development, evolution, and use
  • 6:29 - 6:30
    of the internet for the benefit
  • 6:30 - 6:33
    of all people throughout
    the world.
  • 6:33 - 6:37
    Some of our key functions
    involve facilitating the open
  • 6:37 - 6:40
    development of standards,
    protocols, administration
  • 6:40 - 6:43
    and the technical infrastructure
    of the internet.
  • 6:43 - 6:47
    Supporting internet education
    in developing countries.
  • 6:47 - 6:50
    Promoting professional
    development and community meetings
  • 6:50 - 6:53
    to foster greater
    participation and leadership
  • 6:53 - 6:56
    in areas important to the
    evolution of the internet
  • 6:56 - 6:59
    and fostering an environment
    for international cooperation,
  • 6:59 - 7:00
    community and a culture
  • 7:00 - 7:04
    that enables internet
    self-governance.
  • 7:04 - 7:07
    While ISOC staff lead specific
    projects on these topics,
  • 7:07 - 7:10
    much of our work is achieved
    through the work of our members
  • 7:10 - 7:12
    and chapters around the world.
  • 7:12 - 7:14
    And speaking of chapters,
  • 7:14 - 7:16
    I would like to give
    special recognition
  • 7:16 - 7:19
    to the local DC chapter.
  • 7:19 - 7:21
    I am very fortunate that our
    headquarters is just not far
  • 7:21 - 7:25
    away in Reston, Virginia so
    I'm able to interact personally
  • 7:25 - 7:28
    with our DC chapter
    on a regular basis.
  • 7:28 - 7:29
    And I can attest to the fact
  • 7:29 - 7:31
    that they are really a
    first class group of people
  • 7:31 - 7:34
    that regularly hold very
    interesting meetings related
  • 7:34 - 7:36
    to the Internet Society's
    mission.
  • 7:36 - 7:39
    So, if you're a local
    and you're not involved,
  • 7:39 - 7:42
    you're really missing out, I
    do hope you will get involved.
  • 7:42 - 7:44
    And if the leaders who are here
  • 7:44 - 7:47
    from the internet society DC
    chapter would stand briefly,
  • 7:47 - 7:50
    I just want to make sure
    that you're recognized.
  • 7:51 - 7:53
    So, great.
  • 7:53 - 7:54
    So, you know who they are.
  • 7:54 - 7:56
    Please make a special
    effort to talk
  • 7:56 - 7:57
    to them today during
    this event if you'd
  • 7:57 - 8:00
    like to get involved
    with the chapter.
  • 8:00 - 8:03
    [ Pause ]
  • 8:03 - 8:06
    One of my key functions
    at the internet society is
  • 8:06 - 8:09
    to work closely with our
    chapters in the region not only
  • 8:09 - 8:12
    to plan iNET events like this
    but to assist with their events
  • 8:12 - 8:17
    and to have an open
    dialogue on policy issues.
  • 8:17 - 8:19
    That dialogue in turn
    helps me and my colleagues
  • 8:19 - 8:23
    at the internet society to form
    our policy positions and help
  • 8:23 - 8:25
    to set the agenda for
    our regional meetings
  • 8:25 - 8:27
    like the one we are
    attending today
  • 8:27 - 8:29
    and that brings me
    to today's agenda.
  • 8:29 - 8:32
    The topic of this iNET
    has really been driven
  • 8:32 - 8:33
    by the interest of our members
  • 8:33 - 8:36
    in the North American
    region and around the world.
  • 8:36 - 8:40
    It so happens today that one of
    our panelists is the President
  • 8:40 - 8:41
    and Chief Executive Officer
  • 8:41 - 8:45
    of the internet society
    that's Lynn St. Amour.
  • 8:45 - 8:47
    So, she will have
    the opportunity
  • 8:47 - 8:50
    to address the concerns
    of our constituency today.
  • 8:50 - 8:53
    Now, without further delay,
  • 8:53 - 8:56
    I would like to introduce
    Professor Lance Hoffman.
  • 8:56 - 9:01
    He is the Director of the Cyber
    Security Research and Policy--
  • 9:01 - 9:03
    I'm sorry, the Cyber
    Security Policy
  • 9:03 - 9:06
    and Research Institute here at
    George Washington University.
  • 9:06 - 9:10
    I first met Professor Hoffman
    when he sponsored a debate
  • 9:10 - 9:14
    on PIPA and SOPA and
    I knew that he was,
  • 9:14 - 9:17
    based on that experience, I
    knew that he was game to talk
  • 9:17 - 9:19
    about some very controversial
    topics.
  • 9:19 - 9:22
    So, I guessed right, he
    was interested in talking
  • 9:22 - 9:25
    about this one as well and
    it's been really a pleasure
  • 9:25 - 9:26
    to partner with him
    on this event.
  • 9:26 - 9:30
    So, I want to sincerely
    thank you for that.
  • 9:30 - 9:32
    It's really been a
    great experience.
  • 9:32 - 9:35
    I'd also like to thank all
    of our fantastic panelists
  • 9:35 - 9:37
    who have agreed to
    speak here today.
  • 9:37 - 9:39
    We really have an all
    star group of people
  • 9:39 - 9:42
    who have taken their time out
    to share their views with you.
  • 9:42 - 9:45
    I'd like them all to come up
    and take their seats on stage now
  • 9:45 - 9:47
    if they would and I
    will turn this event
  • 9:47 - 9:49
    over to Professor Hoffman.
  • 9:49 - 9:53
    One thing I forgot to mention
    is that if you want to Twitter
  • 9:53 - 9:55
    or tweet about this event,
  • 9:55 - 9:59
    please use the hashtag
    iNET DC, iNET DC.
  • 9:59 - 10:02
    Thank you all very much.
  • 10:02 - 10:07
    [ Applause ]
  • 10:07 - 10:16
    [ Noise ]
  • 10:19 - 10:28
    [ Inaudible Discussion ]
  • 10:36 - 10:37
    >> Good afternoon everybody.
  • 10:37 - 10:43
    Let me add my welcome to that of
    Dean Dolling and to Paul Brigner
  • 10:43 - 10:46
    and let me also thank the
    internet society, our cosponsor
  • 10:46 - 10:48
    for this event for
    their programmatic
  • 10:48 - 10:51
    and financial assistance
    in making it possible.
  • 10:51 - 10:56
    I also want to thank from
    CSPRI, my Associate Director,
  • 10:56 - 10:59
    Dr. Costis Toregas and our
    interns and research assistants
  • 10:59 - 11:03
    who have worked on this and are
    helping here today: Trey Herr
  • 11:03 - 11:04
    [phonetic] Dustin
    Benanberg [assumed spelling],
  • 11:04 - 11:05
    Jaime Moore [assumed spelling],
  • 11:05 - 11:07
    and Greg Ziegler
    [assumed spelling].
  • 11:07 - 11:11
    My role today is to set the
    stage for our excellent panel
  • 11:11 - 11:14
    and to moderate that panel
    discussion on cyber security
  • 11:14 - 11:18
    and the future of the internet
    given the recent revelations
  • 11:18 - 11:21
    about PRISM and other
    surveillance programs.
  • 11:21 - 11:24
    The panelists have each agreed
    to summarize their thoughts
  • 11:24 - 11:27
    in a four-minute statement
    so we could have lots of time
  • 11:27 - 11:30
    for questions from the
    audience including the audience
  • 11:30 - 11:32
    in cyber space.
  • 11:32 - 11:36
    After the audience question and
    answer session and a short break,
  • 11:36 - 11:39
    participants will return for
    a very interesting session,
  • 11:39 - 11:43
    a round table discussion where
    we will see their world views
  • 11:43 - 11:45
    on this, where their
    world views agree,
  • 11:45 - 11:51
    and where there are tensions, and
    talk about this in a round table
  • 11:51 - 11:52
    which will be informed by
    their previous statements
  • 11:52 - 11:55
    and the audience
    questions beforehand.
  • 11:55 - 11:57
    I'm delighted that
    for this event,
  • 11:57 - 12:00
    we will have an experienced
    moderator, Professor Steve
  • 12:00 - 12:03
    Roberts leading the
    round table discussion.
  • 12:03 - 12:06
    Many of the students and
    faculty here may recognize Steve
  • 12:06 - 12:08
    as the Shapiro Professor
    of Media
  • 12:08 - 12:11
    and Public Affairs here at GW.
  • 12:11 - 12:13
    But others including
    those viewing this
  • 12:13 - 12:15
    on the internet may be
    more familiar with him
  • 12:15 - 12:17
    from his appearances
    as a commentator
  • 12:17 - 12:20
    on many Washington
    based television shows,
  • 12:20 - 12:23
    as a political analyst
    on the ABC radio network,
  • 12:23 - 12:26
    and as a substitute host
    on NPR's Diane Rehm Show.
  • 12:26 - 12:30
    Another role where he takes this
    material from experts
  • 12:30 - 12:34
    and clarifies it to facilitate
    informed public discussion.
  • 12:34 - 12:36
    But before the round table,
  • 12:36 - 12:38
    let me identify the
    participants on stage.
  • 12:38 - 12:41
    We will not have lengthy
    introductions here
  • 12:41 - 12:44
    since their bio sketches have
    been made available to you
  • 12:44 - 12:49
    in the brochure which you
    have gotten or can get
  • 12:49 - 12:51
    at the registration desk.
  • 12:51 - 12:56
    But anyway, let me
    introduce very briefly
  • 12:56 - 12:59
    and you see their name
    tags in front of you,
  • 12:59 - 13:02
    Danny Weitzner is the
    Director and Cofounder
  • 13:02 - 13:04
    of the MIT Computer Science
  • 13:04 - 13:07
    and Artificial Intelligence
    Laboratory.
  • 13:07 - 13:10
    >> Not the whole lab.
  • 13:10 - 13:13
    >> Not the whole lab?
  • 13:13 - 13:15
    >> I'll explain.
  • 13:15 - 13:18
    >> OK. Lynn St. Amour
    is the President and CEO
  • 13:18 - 13:20
    of the Internet Society.
  • 13:20 - 13:23
    Let's see, who's next.
  • 13:23 - 13:26
    Laura, Laura DeNardis
    is a professor in the School
  • 13:26 - 13:29
    of Communication at
    American University.
  • 13:29 - 13:33
    Melissa Hathaway is President of
    Hathaway Global Strategies.
  • 13:33 - 13:34
    Let's see.
  • 13:34 - 13:36
    Who's down there?
  • 13:36 - 13:38
    I don't have you on my list.
  • 13:38 - 13:40
    >> Oh no.
  • 13:40 - 13:41
    >> I apologize.
  • 13:41 - 13:42
    >> That would be me.
  • 13:42 - 13:43
    >> Oh Randy, of course.
  • 13:43 - 13:46
    Randy from-- I don't have here.
  • 13:46 - 13:48
    I mean, OK.
  • 13:48 - 13:50
    Randy-- where are you from?
  • 13:50 - 13:51
    I'll let you introduce yourself.
  • 13:51 - 13:52
    I'm sorry.
  • 13:52 - 13:53
    [Inaudible Remark]
  • 13:53 - 13:54
    >> I'm the University
    IT Security Officer
  • 13:54 - 13:55
    at Virginia Tech.
  • 13:55 - 13:56
    >> I didn't want
    to get it wrong
  • 13:56 - 13:58
    and say something
    wrong-- wrong university.
  • 13:58 - 14:00
    Sorry Randy.
  • 14:00 - 14:02
    OK, let's see.
  • 14:02 - 14:05
    Leslie Harris is from Center
    for Democracy and Technology.
  • 14:05 - 14:08
    And last but not the least,
    John Curran, President and CEO
  • 14:08 - 14:11
    of American Registry
    for Internet Numbers.
  • 14:11 - 14:16
    OK, so without further ado, I am
    going to suggest that we each go
  • 14:16 - 14:19
    down the-- I have to
    decide, I'm going to leave it
  • 14:19 - 14:25
    to I guess either Danny or
    John on who wants to go first.
  • 14:25 - 14:25
    >> No, no, no.
  • 14:25 - 14:26
    Go ahead.
  • 14:26 - 14:26
    >> John?
  • 14:26 - 14:27
    >> OK.
  • 14:27 - 14:27
    >> OK?
  • 14:27 - 14:28
    >> OK John, go on.
  • 14:28 - 14:29
    Four minutes please.
  • 14:29 - 14:30
    >> Well, thank you.
  • 14:30 - 14:33
    I'm happy to be here and I
    want to say thanks for the opportunity
  • 14:33 - 14:36
    to present at this panel.
  • 14:36 - 14:39
    Keeping my remarks short,
    I will say that we're
  • 14:39 - 14:43
    in an interesting period of
    time with respect to the future
  • 14:43 - 14:46
    of the internet and
    how it's governed.
  • 14:46 - 14:49
    People may not realize
    it but the fact is,
  • 14:49 - 14:52
    there is a coordination
    function that's existed
  • 14:52 - 14:56
    since the earliest days of the
    internet which is necessary
  • 14:56 - 14:59
    so that we can actually
    use the internet
  • 14:59 - 15:00
    so that it actually functions.
  • 15:00 - 15:02
    For example, computers
  • 15:02 - 15:05
    on the internet have
    unique identifiers.
  • 15:05 - 15:08
    Several, actually; one of
    them is the IP address,
  • 15:08 - 15:09
    which is part of what
    ARIN is
  • 15:09 - 15:10
    involved in.
  • 15:10 - 15:13
    We can't have computers
    using the same IP addresses.
  • 15:13 - 15:15
    Every computer needs
    its own IP address.
  • 15:15 - 15:19
    Likewise, there's coordination
    of things like domain names
  • 15:19 - 15:20
    to make sure that a
    domain name is assigned
  • 15:20 - 15:23
    to one organization,
    et cetera, et cetera.
  • 15:23 - 15:26
    This is called technical
    coordination
  • 15:26 - 15:30
    and it's been a function of the
    internet, necessary function
  • 15:30 - 15:32
    for the protocols
    to work as designed
  • 15:32 - 15:34
    since the very beginning.
  • 15:34 - 15:37
    People may not realize that
    this used to all be done
  • 15:37 - 15:42
    by one gentleman, Dr. Jon Postel
    who we no longer have with us
  • 15:42 - 15:44
    but was an amazing individual.
  • 15:44 - 15:47
    And in the late '90s, we
    actually went about setting
  • 15:47 - 15:51
    up infrastructure to enable
    this technical coordination
  • 15:51 - 15:56
    so that IP addresses and
    domain names could be available
  • 15:56 - 16:00
    to everyone globally for
    their use on the internet
  • 16:00 - 16:02
    but coordinated so that
    they actually interoperated
  • 16:02 - 16:06
    so that they work as expected.
  • 16:06 - 16:08
    This structure involves
    a lot of organizations.
  • 16:08 - 16:12
    ARIN is one; there are five
    regional registries throughout
  • 16:12 - 16:13
    the globe.
  • 16:13 - 16:15
    ARIN is the one that handles
    North America, so Canada,
  • 16:15 - 16:18
    the United States and about
    half of the Caribbean.
  • 16:18 - 16:20
    Then there are four
    other regional registries
  • 16:20 - 16:23
    and then there is the DNS
    coordination that goes
  • 16:23 - 16:26
    on through organizations
    such as ICANN.
  • 16:26 - 16:27
    The Internet
    Cor-- [laughs].
  • 16:27 - 16:32
    Sorry. The Internet
    [Noise] --
  • 16:32 - 16:35
    Corporation for Assigned
    Names and Numbers.
  • 16:35 - 16:39
    So, the fact is that the work these
    bodies all do in technical
  • 16:39 - 16:41
    coordination is very important.
  • 16:41 - 16:46
    But, they've been doing it based
    on a historical trajectory.
  • 16:46 - 16:50
    A trajectory that originated
    with projects and programs
  • 16:50 - 16:52
    out of the US government
    which have
  • 16:52 - 16:55
    since been increasingly
    moved out to independence
  • 16:55 - 16:57
    but we're not quite there yet.
  • 16:57 - 17:03
    In fact, some of you may be aware
    that these organizations were
  • 17:03 - 17:08
    under supervision somewhat
    indirect but still supervision
  • 17:08 - 17:13
    of programs such as NTIA in
    the Department of Commerce.
  • 17:13 - 17:17
    And so, one of the things that
    we now find ourselves facing is
  • 17:17 - 17:20
    that there is a unique
    role in the US government
  • 17:20 - 17:21
    with respect to the internet.
  • 17:21 - 17:26
    And this unique role is one that
    is based on-- its experience.
  • 17:26 - 17:30
    It's a steady hand,
    behind the scenes,
  • 17:30 - 17:31
    helping the internet
    mature and grow.
  • 17:31 - 17:35
    Recently however, we've now
    seen that there's another hand
  • 17:35 - 17:37
    that may be busy doing
    other things
  • 17:37 - 17:41
    such as surveillance
    and cyber security.
  • 17:41 - 17:45
    And so, the question that comes
    up is how do we reconcile those?
  • 17:45 - 17:51
    How do we reconcile an open
    global transparent internet run
  • 17:51 - 17:54
    for everyone with
    the possibility
  • 17:54 - 17:58
    that there is also surveillance
    going on and other things
  • 17:58 - 18:01
    that may not meet
    the expectations
  • 18:01 - 18:02
    of global internet users.
  • 18:02 - 18:05
    And so, this is the
    challenge in front of us.
  • 18:05 - 18:09
    It's particularly
    highlighted because of the fact
  • 18:09 - 18:12
    that the US government's
    unique oversight role in this
  • 18:12 - 18:14
    and it's brought
    to the forefront
  • 18:14 - 18:20
    by the recent discussions where
    we now say that it is not simply
  • 18:20 - 18:22
    for the benefit of
    all the internet,
  • 18:22 - 18:25
    but there's also some
    national priorities
  • 18:25 - 18:29
    and national initiatives that
    the US government also pursues
  • 18:29 - 18:31
    over the same infrastructure.
  • 18:31 - 18:34
    I think that's the challenge
    that we will now face more
  • 18:34 - 18:36
    so than ever in light
    of the revelations
  • 18:36 - 18:40
    for the last few months and
    the net result would be a lot
  • 18:40 - 18:42
    of the discussion in
    forums such as this.
  • 18:42 - 18:44
    Thank you.
  • 18:44 - 18:45
    >> Thank you and I'll try
  • 18:45 - 18:47
    to make this iPhone
    behave a little better--
  • 18:47 - 18:47
    >> OK.
  • 18:47 - 18:48
    >> Although--
  • 18:48 - 18:49
    >> I thought I was over on time.
  • 18:49 - 18:50
    >> Yes.
  • 18:50 - 18:51
    >> You were right on
    time, you were great.
  • 18:51 - 18:52
    >> OK.
  • 18:52 - 18:53
    >> Leslie?
  • 18:53 - 18:55
    >> It's time, I guess.
  • 18:55 - 18:59
    So, I also want to get us out
    of the DC bubble and the focus
  • 18:59 - 19:02
    on the rights of people
    in the United States.
  • 19:02 - 19:04
    I've been talking about
    that for the last six weeks.
  • 19:04 - 19:08
    And Congress is about to
    consider an amendment that's
  • 19:08 - 19:10
    going to rein in
    the NSA's ability
  • 19:10 - 19:13
    to collect our metadata
    from phones.
  • 19:13 - 19:16
    I want to pivot away from that
    'cause I think there're three
  • 19:16 - 19:19
    other things that we ought
    to be considering at least.
  • 19:19 - 19:23
    What-- and the reactions of
    governments and internet users
  • 19:23 - 19:26
    around the world and what this
    means with the architecture,
  • 19:26 - 19:29
    not just the governance, the
    architecture of the internet
  • 19:29 - 19:32
    and also for the
    protection of human rights
  • 19:32 - 19:34
    of global internet
    users and that turns
  • 19:34 - 19:36
    out to be a very
    thorny question.
  • 19:36 - 19:37
    Certainly for governments,
  • 19:37 - 19:41
    I think this entire kerfuffle
    has reinforced the concerns
  • 19:41 - 19:44
    that the global internet has
    undermined their sovereignty
  • 19:44 - 19:46
    and control over their citizens.
  • 19:46 - 19:49
    And it's further
    illuminated this sort
  • 19:49 - 19:53
    of privileged position
    of the United States.
  • 19:53 - 19:55
    Certainly, with respect
    to ICANN and some
  • 19:55 - 19:57
    of the other critical resources,
  • 19:57 - 20:00
    we continue to have a
    disproportionate share
  • 20:00 - 20:03
    of internet traffic, although
    that's declining, and obviously,
  • 20:03 - 20:06
    we have a dominance in our
    global internet companies.
  • 20:06 - 20:11
    For some, and the EU is, I
    think, a good example;
  • 20:11 - 20:13
    this illuminates
    an inability
  • 20:13 - 20:15
    to protect the human
    rights of their own citizens
  • 20:15 - 20:20
    and we have a major fight coming
    up in the data directive there
  • 20:20 - 20:22
    about whether they're
    going to allow data to come
  • 20:22 - 20:24
    to the United States at all.
  • 20:24 - 20:30
    At the same time, it's an
    extraordinary opportunity
  • 20:30 - 20:34
    for authoritarian governments
    to exert more control
  • 20:34 - 20:39
    over the people and data in their
    own borders and it's, you know,
  • 20:39 - 20:43
    Russia is now dusting off
    legislation that has to do
  • 20:43 - 20:44
    with national servers.
  • 20:44 - 20:46
    You're going to see
    that elsewhere as well
  • 20:46 - 20:49
    and I think it's fair to worry
    whether this carefully honed
  • 20:49 - 20:55
    narrative as US narrative of the
    why is trusted neutral stored
  • 20:55 - 20:58
    of the internet can
    really hold and it's fair
  • 20:58 - 21:01
    to ask whether we're moving
    towards the balkanization
  • 21:01 - 21:05
    of the internet as
    government sees these moment
  • 21:05 - 21:08
    to impost local server
    requirements or worse
  • 21:08 - 21:12
    and we saw some of these
    proposals last year.
  • 21:12 - 21:15
    I know we're supposed to be
    acronym free, but at the WCIT--
  • 21:15 - 21:18
    I'll explain-- routing
    requirements
  • 21:18 - 21:22
    to either avoid the
    United States or literally
  • 21:22 - 21:23
    to direct the routing of traffic
  • 21:23 - 21:27
    and the worst outcome
    being literally ring-fenced
  • 21:27 - 21:30
    national or regional
    networks.
  • 21:30 - 21:33
    If I were in Latin America and
    notice that there was 80 percent
  • 21:33 - 21:36
    of my traffic still coming
    from the United States,
  • 21:36 - 21:40
    I may start to wonder why
    we're still relying on Miami
  • 21:40 - 21:41
    to reach most of the world.
  • 21:41 - 21:44
    Some of that might be
    a weird salutary effect
  • 21:44 - 21:48
    that we finally get internet
    exchange points in some places
  • 21:48 - 21:51
    that have not had them.
  • 21:51 - 21:54
    But I think governments becoming
    involved directly in the routing
  • 21:54 - 21:58
    of internet data would
    pose a profound challenge
  • 21:58 - 22:00
    to the open end-to-end internet
  • 22:00 - 22:03
    and I think all these things
    are going to be on the table.
  • 22:03 - 22:06
    I think you can also kind of
    expect to see a reexamination
  • 22:06 - 22:08
    of the world's relationships
  • 22:08 - 22:11
    with our cloud providers,
    we're already seeing that.
  • 22:11 - 22:13
    You know, there's a Netherlands
    provider advertising no PRISM,
  • 22:13 - 22:16
    no surveillance, no
    government backdoors.
  • 22:16 - 22:22
    And finally, I think we may have
    lost some of the rapprochement
  • 22:22 - 22:26
    that happened in the governance
    wars in the last year.
  • 22:26 - 22:29
    There are a number of important
    forums and discussions coming
  • 22:29 - 22:31
    up about various treaties
  • 22:31 - 22:34
    and how much various treaty
    bodies ought to be able
  • 22:34 - 22:36
    to impose their--
  • 22:36 - 22:40
    impose themselves into internet
    debates, I think we're going
  • 22:40 - 22:41
    to lose some ground there
  • 22:41 - 22:44
    and it could be I
    think quite troubling.
  • 22:44 - 22:49
    Last point, global
    internet users are furious.
  • 22:49 - 22:52
    They believe their human
    rights have been violated
  • 22:52 - 22:54
    and I think they
    don't yet realize
  • 22:54 - 22:56
    that it's not entirely clear
    whether they have any recourse
  • 22:56 - 22:57
    for that.
  • 22:57 - 22:59
    There's a lot of ambiguity
    in human rights law.
  • 22:59 - 23:01
    Human rights law
    pretty much applies
  • 23:01 - 23:05
    if countries make commitments
    to protect the human rights
  • 23:05 - 23:07
    of people within their
    borders or their control,
  • 23:07 - 23:10
    I don't think we've begun to
    understand or have any clarity
  • 23:10 - 23:13
    on how-- who's responsible
    for human rights
  • 23:13 - 23:15
    where there's non-physical
    action
  • 23:15 - 23:17
    such as electronic surveillance.
  • 23:17 - 23:20
    And they're not going to be
    happy when they understand
  • 23:20 - 23:22
    that FISA and all
    the conversations
  • 23:22 - 23:26
    in the United States about
    safeguards and minimization are
  • 23:26 - 23:29
    to protect our rights but
    have absolutely nothing
  • 23:29 - 23:30
    to do with them.
  • 23:30 - 23:32
    I don't think congress
    is going to care.
  • 23:32 - 23:35
    But I think there's a
    massive loss of respect
  • 23:35 - 23:38
    and when we're talking about
    five zeta [inaudible] whatever
  • 23:38 - 23:40
    that is, of non-US
    person data going
  • 23:40 - 23:45
    into a new Utah data center, I
    think that the irony at the end
  • 23:45 - 23:48
    of the day will be that one
  • 23:48 - 23:52
    of the FISA permissible
    activities is collecting
  • 23:52 - 23:55
    intelligence for foreign
    affairs and foreign policy.
  • 23:55 - 23:59
    And I think at the end of
    the day, we may see people
  • 23:59 - 24:02
    of the world uniting with
    their governments around kinds
  • 24:02 - 24:06
    of restrictions and ring fencing
    of the internet that will come
  • 24:06 - 24:08
    to both undermine rights
  • 24:08 - 24:10
    and undermine the
    openness of the internet.
  • 24:10 - 24:14
    So, that is my anxious
    person's guide to the--
  • 24:14 - 24:16
    to what's been happening.
  • 24:16 - 24:18
    >> Thank you Leslie.
  • 24:18 - 24:19
    Randy?
  • 24:19 - 24:24
    >> How many of you noticed
    or counted how many surveillance
  • 24:24 - 24:28
    devices you've passed
    to get to this room?
  • 24:28 - 24:34
    Yeah. I just walked in from the
    restaurant a couple blocks away.
  • 24:34 - 24:41
    I counted 22 surveillance style
    devices just in the walk outside
  • 24:41 - 24:44
    of this building and there are
    at least six in this room right now
  • 24:44 - 24:48
    and that's not counting your
    smart phones and all that.
  • 24:48 - 24:51
    I hope you all can find the six
  • 24:51 - 24:54
    because they're pretty
    obvious as to what they are.
  • 24:54 - 24:58
    My point on all of this
    is that as a practitioner,
  • 24:58 - 25:00
    I'm the CISO for Virginia Tech.
  • 25:00 - 25:04
    My office is responsible for
    monitoring and responding
  • 25:04 - 25:07
    to any attacks against
    our network infrastructure
  • 25:07 - 25:08
    at Virginia Tech.
  • 25:08 - 25:10
    We have our main campus down
    in Blacksburg, Virginia.
  • 25:10 - 25:12
    It's about four and a
    half hours from here.
  • 25:12 - 25:13
    We have a Northern
    Virginia campus right
  • 25:13 - 25:16
    across the river in Ballston.
  • 25:16 - 25:18
    And so my charge,
    my office is charged
  • 25:18 - 25:21
    to monitor any attacks
    from there.
  • 25:21 - 25:24
    Anybody who has ever managed
    an internet infrastructure has
  • 25:24 - 25:26
    known since the very beginning
  • 25:26 - 25:29
    that the internet has never
    been anonymous in the sense
  • 25:29 - 25:31
    of tracking a machine.
  • 25:31 - 25:34
    From day 1, we've
    always been able to track
  • 25:34 - 25:36
    where our machine was.
  • 25:36 - 25:39
    We were not able to track
    who is at the machine
  • 25:39 - 25:41
    with any reasonable
    amount of accuracy
  • 25:41 - 25:44
    but we've always been able
    to track where machines are.
  • 25:44 - 25:48
    The big thing was of course
    data storage capabilities.
  • 25:48 - 25:50
    We didn't have enough
    data storage capabilities,
  • 25:50 - 25:51
    disk drives.
  • 25:51 - 25:54
    In 1992, when I got involved--
  • 25:54 - 25:57
    first involved with the computer
    security stuff, you know,
  • 25:57 - 26:00
    we had a one gigabyte drive and
    we thought that was, you know,
  • 26:00 - 26:03
    the entire disc storage
    in the entire free world.
  • 26:03 - 26:07
    And, you know, nowadays, with
    the huge disk farms that are
  • 26:07 - 26:10
    out there, that is allowing
    this collection of data.
  • 26:10 - 26:13
    So, from our standpoint,
    that's always been the case.
  • 26:13 - 26:17
    So, my challenge is,
    you all let this happen.
  • 26:17 - 26:21
    You all knew that this was
    happening and you let it go on.
  • 26:21 - 26:24
    There is nothing new about the
    PRISM stuff and all of this.
  • 26:24 - 26:26
    It's kind of ironic for people
  • 26:26 - 26:29
    in my world 'cause we all
    sit back and we go, you know,
  • 26:29 - 26:33
    in 2010, the Washington
    Post's Dana Priest
  • 26:33 - 26:35
    and William Arkin wrote an
    excellent series called Top
  • 26:35 - 26:38
    Secret America where
    they're talking about all
  • 26:38 - 26:40
    of the Beltway companies
  • 26:40 - 26:42
    that are building the
    surveillance technologies
  • 26:42 - 26:45
    that the federal government
    and other entities are using.
  • 26:45 - 26:47
    It was all out there-- an excellent
    series in the newspaper
  • 26:47 - 26:49
    that I read back then.
  • 26:49 - 26:52
    Newsweek came out in
    2008 where they talked
  • 26:52 - 26:56
    about a whistleblower
    who mentioned
  • 26:56 - 27:00
    NSA's warrantless
    wiretapping, so to speak.
  • 27:00 - 27:01
    It was all out there.
  • 27:01 - 27:04
    If you go back and you look
    through the press things,
  • 27:04 - 27:07
    there was always something
    there and nobody reacted to it.
  • 27:07 - 27:10
    So, it's ironic from
    my viewpoint
  • 27:10 - 27:13
    that everybody is having
    this big flap about it now
  • 27:13 - 27:16
    because it's been there
    and you let it happen.
  • 27:16 - 27:20
    So, my charge to you is don't
    let it happen again, OK?
  • 27:20 - 27:22
    When you do read
    about these things,
  • 27:22 - 27:25
    you need to influence your
    legislators about what's going
  • 27:25 - 27:28
    on 'cause they do not
    understand the technology.
  • 27:28 - 27:31
    One quick example, in Virginia,
  • 27:31 - 27:33
    there are data breach
    notification laws
  • 27:33 - 27:36
    if a social security
    number is disclosed.
  • 27:36 - 27:38
    You know, somebody
    gets a spreadsheet
  • 27:38 - 27:40
    and it explodes out there.
  • 27:40 - 27:44
    Yet I can go to a county
    courthouse website and look
  • 27:44 - 27:47
    up public records-- deeds,
    divorce decrees, whatever--
  • 27:47 - 27:49
    and see what's on those documents.
  • 27:49 - 27:52
    Yet the county clerks can't
    redact that information
  • 27:52 - 27:55
    because until recently,
    the law forbade it, OK?
  • 27:55 - 27:59
    Everybody was in a rush for
    eGovernment but never thought
  • 27:59 - 28:02
    about what's in those
    documents that are going there.
  • 28:02 - 28:04
    So, we need to be
    able to influence
  • 28:04 - 28:06
    and educate the legislators
    who are--
  • 28:06 - 28:08
    who are building the policy.
  • 28:08 - 28:10
    So, this is the whole thing.
  • 28:10 - 28:13
    I'm a technologist and everybody
    in here, the computer scientists
  • 28:13 - 28:16
    and all that, you're all
    lumped in here with me.
  • 28:16 - 28:18
    We helped build this
    infrastructure.
  • 28:18 - 28:21
    We may have had a
    misgiving where we thought
  • 28:21 - 28:24
    that as builders we could
    control the people who use it.
  • 28:24 - 28:27
    But as, you know,
    previous history has shown
  • 28:27 - 28:29
    with the atom bomb development
    and all these other things,
  • 28:29 - 28:30
    that's not always the case.
  • 28:30 - 28:34
    Builders do not control
    the controllers.
  • 28:34 - 28:35
    That's my first point.
  • 28:35 - 28:38
    The second point is that, again,
    you all have smart phones.
  • 28:38 - 28:39
    Actually I don't.
  • 28:39 - 28:41
    I have kind of a dumb phone
    and people always laugh.
  • 28:41 - 28:44
    They go, "You know,
    you're a technologist,"
  • 28:44 - 28:46
    and I just have a little,
    you know, Samsung Intensity
  • 28:46 - 28:48
    with the nice little
    flippy thing.
  • 28:48 - 28:50
    And I said, "Because the
    guys in my lab have busted
  • 28:50 - 28:51
    into the smart phones
  • 28:51 - 28:54
    and tracked everybody
    all over the place."
  • 28:54 - 28:55
    But actually the reason
    why I don't is
  • 28:55 - 28:58
    because the battery life is not
    long enough for my taste, OK?
  • 28:58 - 29:01
    But a great quote, Lyn Paramore
    [assumed spelling] wrote a great
  • 29:01 - 29:05
    article, and in it wrote of the
    things that are supposed
  • 29:05 - 29:07
    to make our lives easier:
  • 29:07 - 29:11
    "Smartphones, Gmail,
    Skype, GPS, Facebook,
  • 29:11 - 29:14
    they have become
    tools to track us.
  • 29:14 - 29:17
    And we've been happily
    shopping for the bars
  • 29:17 - 29:22
    to our own prison, one bar at
    a time, one product at a time."
  • 29:22 - 29:25
    So, we're in the-- we're
    to blame for all of this.
  • 29:25 - 29:27
    We are allowing that to happen.
  • 29:27 - 29:29
    The last thing I want to talk
  • 29:29 - 29:31
    about is this Federal
    word play thing.
  • 29:31 - 29:33
    Whenever I confront a student
  • 29:33 - 29:37
    or a professor who's been
    violating a university policy,
  • 29:37 - 29:39
    you let something
    go out on the net.
  • 29:39 - 29:40
    And I go, "Why did you do that?"
  • 29:40 - 29:43
    And it's kind of like watching,
    you know, a six-year-old.
  • 29:43 - 29:46
    "I don't know, you know, it
    wasn't my fault," you know.
  • 29:46 - 29:47
    That type of stuff.
  • 29:47 - 29:50
    And you get this
    word play, you know.
  • 29:50 - 29:53
    Richard Pryor, a long time
    ago had an excellent thing
  • 29:53 - 29:57
    when his son broke something
    and he asked his son what happened.
  • 29:57 - 30:00
    And all the word play that his
    son came out with, you know,
  • 30:00 - 30:01
    "Some invisible man came
    out and broke the thing."
  • 30:01 - 30:04
    That's what we're seeing now.
  • 30:04 - 30:07
    We're seeing this when they
    say we're not collecting data
  • 30:07 - 30:08
    on you.
  • 30:08 - 30:10
    Well, OK, you're not collecting it
    technically but you're buying it
  • 30:10 - 30:13
    from people who are
    collecting it from us.
  • 30:13 - 30:15
    So, that's-- that's
    one thing there.
  • 30:15 - 30:18
    The network companies that say,
    you know, the data providers,
  • 30:18 - 30:21
    the Verizons, the Googles,
    they say, "Hey, you know,
  • 30:21 - 30:24
    we're not willingly giving it
    to the federal government."
  • 30:24 - 30:25
    Well, of course not.
  • 30:25 - 30:26
    You're being subpoenaed
    for the document.
  • 30:26 - 30:29
    You're not giving it
    willingly to the-- to them.
  • 30:29 - 30:33
    You know, Zack Holman
    wrote a great little tweet.
  • 30:33 - 30:36
    It says, "We don't give
    direct database access
  • 30:36 - 30:38
    to government agencies."
  • 30:38 - 30:44
    That quote has become the
    new "I didn't inhale," OK?
  • 30:44 - 30:46
    And so the key word is direct.
  • 30:46 - 30:49
    Marketers have been collecting
    information on us all the time.
  • 30:49 - 30:51
    And, in fact, that was
    one of the big things
  • 30:51 - 30:52
    that helped us with 9/11.
  • 30:52 - 30:55
    On a positive side,
    that did help us
  • 30:55 - 30:58
    in 9/11 identify the hijackers
    when the marketing companies
  • 30:58 - 31:01
    and the credit card companies
    realized that they could help
  • 31:01 - 31:03
    because they had that data.
  • 31:03 - 31:08
    So, we need to be sure that
    things are done correctly.
  • 31:08 - 31:10
    Everybody says we're
    doing it legally.
  • 31:10 - 31:11
    And that's correct.
  • 31:11 - 31:13
    The laws stated they can do it.
  • 31:13 - 31:16
    It's the creation of
    that law that's the flaw.
  • 31:16 - 31:21
    And that's my surveillance guide
    telling me it's time to quit.
  • 31:21 - 31:22
    Those are my points.
  • 31:22 - 31:23
    >> Thanks, Randy.
  • 31:23 - 31:24
    Melissa?
  • 31:24 - 31:25
    >> Thank you.
  • 31:25 - 31:30
    Well, I think it's
    important to look
  • 31:30 - 31:34
    to our past to inform
    our future.
  • 31:34 - 31:36
    And I'm going to take a
    little bit different direction
  • 31:36 - 31:38
    than my colleagues.
  • 31:38 - 31:41
    The very first transmission of
    the internet was October 29th,
  • 31:41 - 31:45
    1969 and it was an e-mail
    between two universities.
  • 31:45 - 31:50
    And today, more than 204 million
    e-mails are sent per minute.
  • 31:50 - 31:53
    More than 1,300 new mobile
    users are added per minute
  • 31:53 - 31:56
    to the internet, 47,000
    applications are downloaded,
  • 31:56 - 32:02
    100,000 tweets, 1.3 million
    videos are uploaded to YouTube,
  • 32:02 - 32:05
    two million searches to Google,
    and six million Facebook views.
  • 32:05 - 32:08
    The internet is part of
    every part of our life
  • 32:08 - 32:12
    and over the last
    almost 45 years,
  • 32:12 - 32:17
    we have embedded the internet
    in every part of our society.
  • 32:17 - 32:21
    And it is the backbone of our
    core infrastructure of every--
  • 32:21 - 32:24
    every country's core
    infrastructure.
  • 32:24 - 32:29
    It represents e-government,
    e-banking, e-health, e-learning,
  • 32:29 - 32:32
    the next generation of
    air traffic control,
  • 32:32 - 32:34
    the next generation
    of power grids
  • 32:34 - 32:37
    and every other essential
    service has been concentrated
  • 32:37 - 32:41
    onto one infrastructure,
    the internet.
  • 32:41 - 32:43
    And that is putting
    our businesses
  • 32:43 - 32:46
    and our national
    security at risk.
  • 32:46 - 32:49
    And I-- when I say
    "our," I'm really speaking
  • 32:49 - 32:50
    as a global citizen.
  • 32:50 - 32:52
    If I were in France I would be
    speaking about it in France.
  • 32:52 - 32:55
    If I were in Iran or
    Israel, it's the same way.
  • 32:55 - 32:58
    And so I think that
    that has really begun
  • 32:58 - 33:00
    to change the conversation
  • 33:00 - 33:04
    that we can't have any one
    single point of failure,
  • 33:04 - 33:08
    economic failure and/or national
    security failure to any one
  • 33:08 - 33:11
    of our-- of our core
    businesses or infrastructures.
  • 33:11 - 33:14
    And so when we're talking about
    cyber security and we're talking
  • 33:14 - 33:17
    about different government
    actions and we wrap it all
  • 33:17 - 33:20
    into one conversation, I ask
    you to start to think about it
  • 33:20 - 33:23
    as multiple conversations.
  • 33:23 - 33:26
    And it's not helpful
    to bundle it into one term
  • 33:26 - 33:28
    of cyber security
    and/or surveillance.
  • 33:28 - 33:31
    So, I'd-- I'd like
    you to think about it
  • 33:31 - 33:34
    in six different ways
    of why our governments
  • 33:34 - 33:37
    and why our industry are
    talking past each other.
  • 33:37 - 33:40
    The first is when we're
    talking about cyber security,
  • 33:40 - 33:43
    we're sometimes talking
    about political activism.
  • 33:43 - 33:47
    Those who would like to bring
    transparency to policies
  • 33:47 - 33:50
    and to our initiatives
    that they don't agree with.
  • 33:50 - 33:54
    In the United States, one could
    say that that was WikiLeaks
  • 33:54 - 33:57
    that brought about a great
    amount of transparency
  • 33:57 - 34:00
    to US policies as they
    leaked our information
  • 34:00 - 34:01
    into the internet.
  • 34:01 - 34:04
    All right, one could argue
    that that was also Snowden.
  • 34:04 - 34:08
    But in other countries,
    political activism is being waged
  • 34:08 - 34:13
    using Twitter and/or
    Facebook to organize people
  • 34:13 - 34:18
    in these squares, like Taksim
    in Turkey, or in Egypt,
  • 34:18 - 34:22
    to express political
    discontent with their government
  • 34:22 - 34:24
    with the intent to
    overthrow the government.
  • 34:24 - 34:26
    Political instability.
  • 34:26 - 34:28
    And so when our governments
    are starting to talk
  • 34:28 - 34:31
    about surveillance on the
    internet and/or filtering
  • 34:31 - 34:36
    of the internet, some believe in
    political democracy and freedom
  • 34:36 - 34:38
    of speech on the
    internet and others do not.
  • 34:38 - 34:41
    And it has different
    mechanisms of how they're using
  • 34:41 - 34:43
    that surveillance
    or the technology
  • 34:43 - 34:45
    around political activism.
  • 34:45 - 34:48
    Now that should not be
    confused with organized crime
  • 34:48 - 34:51
    on the internet, the real modern
    day bank robber who's stealing
  • 34:51 - 34:54
    ones and zeros which
    is real dollars
  • 34:54 - 34:58
    out of your credit accounts
    or out of your real banks,
  • 34:58 - 35:01
    and is being passed on as
    a cost to our citizens.
  • 35:01 - 35:03
    We have many of our
    governments who are talking
  • 35:03 - 35:05
    about the importance
    of organized crime.
  • 35:05 - 35:07
    When I was just in Europe
    just a few weeks ago,
  • 35:07 - 35:09
    it was 30 million dollars stolen
  • 35:09 - 35:12
    out of 45 different
    cities in 30 minutes.
  • 35:12 - 35:14
    That's a real problem
    for our banks.
  • 35:14 - 35:16
    It's a real problem
    for our credit cards.
  • 35:16 - 35:19
    And we're having to deal
    with that organized crime
  • 35:19 - 35:21
    and real theft of
    ones and zeroes.
  • 35:21 - 35:25
    Now that should not be confused
    with intellectual property theft
  • 35:25 - 35:27
    and industrial espionage.
  • 35:27 - 35:28
    That's very different.
  • 35:28 - 35:31
    And many of our government
    leaders in the United States
  • 35:31 - 35:34
    and many government leaders
    in Europe are talking
  • 35:34 - 35:37
    about the unprecedented theft
    of intellectual property.
  • 35:37 - 35:40
    Theft, meaning illegally
    copying the plans,
  • 35:40 - 35:44
    processes and/or next
    generation technologies
  • 35:44 - 35:47
    out of our corporations
    for the economic advance
  • 35:47 - 35:50
    of their companies
    and/or countries.
  • 35:50 - 35:54
    And so the intellectual property
    theft is then not the same
  • 35:54 - 35:56
    as espionage.
  • 35:56 - 35:58
    And there are governments
    that are conducting espionage.
  • 35:58 - 36:01
    Most governments do, to steal
    the plans and intentions
  • 36:01 - 36:05
    of other governments and
    know their capabilities.
  • 36:05 - 36:07
    In the United States, we
    sort of bundled the two,
  • 36:07 - 36:11
    IP theft and espionage.
  • 36:11 - 36:14
    When we're talking about it, in
    fact, we're talking about it
  • 36:14 - 36:19
    as Pearl Harbor and other
    very exaggerated terms.
  • 36:19 - 36:22
    If we are going to bundle
    intellectual property theft
  • 36:22 - 36:24
    with espionage, then we have
    to be willing to say
  • 36:24 - 36:27
    that we would walk away from
    it as a government.
  • 36:27 - 36:28
    And I don't see any
    government willing
  • 36:28 - 36:30
    to walk away from espionage.
  • 36:30 - 36:32
    So, why don't we talk about
    what's really the problem
  • 36:32 - 36:35
    and that's intellectual property
    theft and/or the protection
  • 36:35 - 36:38
    of intellectual property
    and patents, et cetera.
  • 36:38 - 36:41
    There are two other areas that
    are becoming more concerning
  • 36:41 - 36:46
    for most companies and
    countries that are--
  • 36:46 - 36:47
    the first is disruption
    of service.
  • 36:47 - 36:50
    And this is distributed
    denial-of-service attacks actually
  • 36:50 - 36:53
    degrading real services
    in your e-banking
  • 36:53 - 36:57
    and your e-infrastructures
    that are preventing our banks
  • 36:57 - 37:00
    from allowing you to actually
    access those infrastructures
  • 37:00 - 37:02
    and/or capabilities.
  • 37:02 - 37:04
    And we just had a--
    a significant one--
  • 37:04 - 37:05
    in the United States,
  • 37:05 - 37:07
    we're having distributed
    denial-of-service attacks
  • 37:07 - 37:08
    against our financial
    institutions
  • 37:08 - 37:11
    and so are many others
    in Asia and Europe.
  • 37:11 - 37:14
    And then finally the
    destruction of property.
  • 37:14 - 37:17
    There was just recently a
    malware that was released
  • 37:17 - 37:19
    against Saudi Aramco
  • 37:19 - 37:22
    which destroyed 30,000
    of their computers.
  • 37:22 - 37:24
    And when we start to actually
    think about destruction
  • 37:24 - 37:27
    of property and how one
    might recover from that
  • 37:27 - 37:30
    that is a different set of
    capabilities than you would deal
  • 37:30 - 37:34
    with from organized crime
    and/or political activism.
  • 37:34 - 37:36
    So, as we talk about this
    on our panel over the course
  • 37:36 - 37:39
    of the next hour, I ask
    you to start to think
  • 37:39 - 37:41
    about which problem
    are you talking
  • 37:41 - 37:43
    about because we're not
    talking about the same thing
  • 37:43 - 37:46
    in each of our conversations.
  • 37:46 - 37:48
    I'd like to wrap up by noting that
  • 37:48 - 37:51
    over 100 countries have
    these capabilities,
  • 37:51 - 37:53
    and they are using these
    capabilities to deal
  • 37:53 - 37:56
    with these different problems
    whether it's political activism
  • 37:56 - 37:59
    to overthrow government
    or it's political activism
  • 37:59 - 38:01
    to bring transparency
    to policies they don't
  • 38:01 - 38:04
    like is different
    than organized crime
  • 38:04 - 38:06
    or intellectual property theft.
  • 38:06 - 38:08
    And there are three strategic
    things that are happening
  • 38:08 - 38:10
    in the global order of things.
  • 38:10 - 38:15
    First, some are using disruptive
    technology like a Stuxnet
  • 38:15 - 38:17
    or a Shamoon to bring
    down core infrastructures
  • 38:17 - 38:20
    or core businesses
    around the world
  • 38:20 - 38:24
    or some are implementing
    surveillance tools to bring
  • 38:24 - 38:28
    about transparency or to
    show the vulnerabilities
  • 38:28 - 38:30
    for those infrastructures
    to be brought down.
  • 38:30 - 38:32
    So, disruptive technologies
    are being used.
  • 38:32 - 38:35
    Second, strategic alliances
    are also being wielded
  • 38:35 - 38:39
    to actually gather that power
    and control over the internet.
  • 38:39 - 38:44
    And that's playing out in
    the UN and the ITU and NATO
  • 38:44 - 38:47
    and ACION [assumed spelling]
    and other of the forum
  • 38:47 - 38:49
    where countries can
    align against each other
  • 38:49 - 38:50
    for particular motives.
  • 38:50 - 38:53
    And then finally there
    are strategic properties.
  • 38:53 - 38:57
    And I mean that in the
    very sense of what it is.
  • 38:57 - 38:59
    There are 25 internet
    service providers
  • 38:59 - 39:00
    that control 90 percent
  • 39:00 - 39:02
    of information flow
    on the internet.
  • 39:02 - 39:04
    There are internet
    exchange points
  • 39:04 - 39:08
    that actually control the
    flow of technology and/or ones
  • 39:08 - 39:12
    and zeroes from continent to
    continent or within a continent.
  • 39:12 - 39:14
    And there are data aggregators
  • 39:14 - 39:18
    who actually have more
    information on us, like a Google
  • 39:18 - 39:20
    or a Facebook than any
    foreign intelligence service
  • 39:20 - 39:21
    of any other country.
  • 39:21 - 39:23
    So, you have to think
  • 39:23 - 39:24
    about where are the
    strategic properties
  • 39:24 - 39:25
    and how they're being used
  • 39:25 - 39:28
    by all governments not
    just by one government.
  • 39:28 - 39:30
    Thank you.
  • 39:30 - 39:35
    >> Thank you, Melissa.
  • 39:35 - 39:36
    Laura?
  • 39:36 - 39:37
    >> Good afternoon everyone.
  • 39:37 - 39:38
    I'm very delighted to be here.
  • 39:38 - 39:41
    And I wish to thank the Internet
    Society for the invitation
  • 39:41 - 39:43
    and for GW for hosting
    it as well.
  • 39:43 - 39:47
    I view PRISM as an
    opportunity to draw attention
  • 39:47 - 39:50
    to the implications of broader
    global internet governance
  • 39:50 - 39:53
    conflicts that have
    implications for economic
  • 39:53 - 39:55
    and expressive liberty.
  • 39:55 - 40:00
    I recently-- over the
    last two or three years-- spent
  • 40:00 - 40:01
    that time researching
  • 40:01 - 40:03
    and writing a new book
    called the Global War
  • 40:03 - 40:04
    for Internet Governance.
  • 40:04 - 40:07
    And it's going to be
    published later this year
  • 40:07 - 40:08
    by Yale University Press.
  • 40:08 - 40:12
    Now in this book, there are
    approximately four pages
  • 40:12 - 40:14
    of acronyms at the end.
  • 40:14 - 40:16
    So, what I've done is I've
    challenged myself today
  • 40:16 - 40:18
    to not use any acronyms.
  • 40:18 - 40:21
    So, what I would like to ask you
    to do is to pound on the table
  • 40:21 - 40:24
    if I use an internet governance
    acronym, so pay attention
  • 40:24 - 40:27
    to that, and I know some
    of you will do that.
  • 40:27 - 40:28
    But what I tried to do
  • 40:28 - 40:31
    in the book is describe
    the various layers
  • 40:31 - 40:35
    of how the internet is
    already governed and what some
  • 40:35 - 40:38
    of the current debates are that
    I expect to shape the future
  • 40:38 - 40:41
    of freedom and innovation
    in the coming years.
  • 40:41 - 40:43
    Now what is internet governance?
  • 40:43 - 40:44
    This panel has already
    described it.
  • 40:44 - 40:46
    If I had to give one definition,
  • 40:46 - 40:51
    I would say internet governance
    is the design and administration
  • 40:51 - 40:53
    of the technologies
    that are necessary
  • 40:53 - 40:56
    to keep the internet operational
    and then the enactment
  • 40:56 - 40:59
    of substantive policy
    around those technologies.
  • 40:59 - 41:02
    But there is no single system
    of internet governance.
  • 41:02 - 41:05
    John said it best when he was
    describing the various roles:
  • 41:05 - 41:07
    names and numbers
    administration,
  • 41:07 - 41:11
    standard setting, private
    interconnection arrangements
  • 41:11 - 41:14
    between telecommunication
    companies, the privacy policies
  • 41:14 - 41:17
    that are enacted by social
    media, by search engines
  • 41:17 - 41:19
    and other information
    intermediaries.
  • 41:19 - 41:23
    And, of course, cyber security
    governance not necessarily
  • 41:23 - 41:25
    enacted by governments
    but by entities
  • 41:25 - 41:28
    such as certificate
    authorities that are handing
  • 41:28 - 41:30
    out digital signatures
    and things like that.
  • 41:30 - 41:32
    So, those are just
    a few examples.
  • 41:32 - 41:35
    So, we need to take this
    conversation outside
  • 41:35 - 41:38
    of discussions about
    just governance.
  • 41:38 - 41:41
    However, one of the themes that
    I do take up in the book
  • 41:41 - 41:44
    and in my work in general is
  • 41:44 - 41:48
    that internet governance
    conflicts are the new spaces
  • 41:48 - 41:52
    where political and economic
    power is working itself
  • 41:52 - 41:54
    out in the 21st century.
  • 41:54 - 41:57
    We see this with PRISM,
    we see this with Stuxnet,
  • 41:57 - 42:00
    we see this with the
    turn to intellectual--
  • 42:00 - 42:01
    to infrastructure
  • 42:01 - 42:04
    for intellectual property
    rights enforcement,
  • 42:04 - 42:07
    with governments cutting off
    access during political turmoil.
  • 42:07 - 42:09
    And as Melissa said,
  • 42:09 - 42:11
    we have denial-of-service
    attacks often used
  • 42:11 - 42:14
    to suppress human
    rights and expression.
  • 42:14 - 42:18
    So, internet governance points
    of control are really not just
  • 42:18 - 42:20
    about keeping the internet
    operational although
  • 42:20 - 42:22
    that is absolutely vital.
  • 42:22 - 42:26
    But they are also a proxy
    for broader political
  • 42:26 - 42:27
    and economic conflicts.
  • 42:27 - 42:30
    So, the fact that PRISM
    draws attention to some
  • 42:30 - 42:34
    of these broader global
    internet governance issues is
  • 42:34 - 42:35
    an opportunity.
  • 42:35 - 42:39
    But keep in mind that government
    surveillance and censorship
  • 42:39 - 42:43
    for that matter which is in my
    opinion an even greater problem
  • 42:43 - 42:47
    around the world
    is not something
  • 42:47 - 42:49
    that happens in a vacuum.
  • 42:49 - 42:51
    So, it's delegated and
    it is made possible
  • 42:51 - 42:55
    by certain arrangements
    of technical architecture
  • 42:55 - 42:57
    and by private ordering.
  • 42:57 - 43:01
    So, infrastructure governance
    just to give you a few examples,
  • 43:01 - 43:06
    infrastructure governance is
    directly tied to privacy issues.
  • 43:06 - 43:09
    So, I have an information
    engineering background.
  • 43:09 - 43:12
    I am also a social scientist
    who studies the politics
  • 43:12 - 43:13
    of technical architecture.
  • 43:13 - 43:16
    And I can say-- you
    know, you probably could say,
  • 43:16 - 43:17
    to use a Harry Potter analogy:
  • 43:17 - 43:20
    There are some dark arts
    of internet governance
  • 43:20 - 43:25
    that have a good intention but
    they can be put to other uses.
  • 43:25 - 43:28
    So, one of these, for example,
    is deep packet inspection.
  • 43:28 - 43:30
    I didn't use the acronym
    so no pounding on the table
  • 43:30 - 43:34
    which is a capability that
    allows network providers
  • 43:34 - 43:38
    to inspect the actual
    content of packets sent
  • 43:38 - 43:40
    over the internet rather
    than just the packet headers.
  • 43:40 - 43:42
    So, this can be used
    for a variety
  • 43:42 - 43:47
    of very important functions
    such as network management,
  • 43:47 - 43:49
    detecting viruses and worms.
  • 43:49 - 43:51
    It could also be used for
    customized advertising,
  • 43:51 - 43:53
    now getting outside of
    the operational role,
  • 43:53 - 43:56
    or for surveillance
    or for throttling
  • 43:56 - 43:57
    and blocking of traffic.
  • 43:57 - 44:01
    So, this is a very significant
    development made possible only
  • 44:01 - 44:04
    by advances in processing
    power and storage.
  • 44:04 - 44:07
    And we need transparency
    and accountability
  • 44:07 - 44:09
    in issues like this as well.
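To make the header/payload distinction concrete, here is a minimal Python sketch (not part of the talk; the addresses and payload are invented) that splits a hand-built IPv4/TCP packet into the header fields ordinary routing uses and the content that deep packet inspection additionally reads:

```python
import struct

def split_headers_and_payload(packet: bytes):
    """Split a raw IPv4/TCP packet into the header fields ordinary
    routing looks at and the application payload that deep packet
    inspection additionally examines."""
    ihl = (packet[0] & 0x0F) * 4              # IPv4 header length in bytes
    src, dst = struct.unpack("!4s4s", packet[12:20])
    tcp = packet[ihl:]                        # TCP segment follows IP header
    sport, dport = struct.unpack("!HH", tcp[:4])
    data_offset = (tcp[12] >> 4) * 4          # TCP header length in bytes
    headers = {
        "src_ip": ".".join(str(b) for b in src),
        "dst_ip": ".".join(str(b) for b in dst),
        "src_port": sport,
        "dst_port": dport,
    }
    return headers, tcp[data_offset:]         # the payload: what DPI reads

# A hand-built sample packet: 20-byte IP header, 20-byte TCP header,
# then the application data (the addresses here are illustrative).
ip_hdr = bytes([0x45]) + bytes(11) + bytes([10, 0, 0, 1]) + bytes([192, 168, 1, 2])
tcp_hdr = struct.pack("!HH", 51000, 80) + bytes(8) + bytes([0x50]) + bytes(7)
packet = ip_hdr + tcp_hdr + b"GET / HTTP/1.1"

headers, payload = split_headers_and_payload(packet)
```

Conventional forwarding only ever needs `headers`; a DPI-capable device is one that also reads `payload`, which is what makes the same capability usable for network management and for surveillance alike.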
  • 44:09 - 44:12
    So, another area of
    infrastructure related
  • 44:12 - 44:16
    to privacy is the hidden
    identity infrastructure
  • 44:16 - 44:19
    that makes possible business
    models that are based
  • 44:19 - 44:20
    on online advertising.
  • 44:20 - 44:23
    So, this is a good thing
    for freedom of expression
  • 44:23 - 44:26
    because there are free products
    that we are able to use,
  • 44:26 - 44:29
    but we are almost at the
    point where the prospect
  • 44:29 - 44:33
    for anonymous speech, considering
    this identity infrastructure, is
  • 44:33 - 44:35
    almost impossible.
  • 44:35 - 44:38
    We have technical identifiers
    at the level of hardware
  • 44:38 - 44:40
    like Ethernet cards at the level
  • 44:40 - 44:45
    of virtual identifiers
    locationally via cellphone
  • 44:45 - 44:51
    location, wireless fidelity
    AKA Wi-Fi, it's my one acronym,
  • 44:51 - 44:53
    or global positioning
    system and through things
  • 44:53 - 44:57
    like platform mediation and real
    identification requirements.
  • 44:57 - 45:00
    So, if you put this all
    together, this is at the heart
  • 45:00 - 45:03
    of business models that
    we need at the heart
  • 45:03 - 45:07
    of online advertising, at the
    heart of having free software.
  • 45:07 - 45:11
    But it also is the
    technical capability
  • 45:11 - 45:13
    that can enable new
    forms, even newer forms
  • 45:13 - 45:15
    of surveillance in the future.
  • 45:15 - 45:18
    So, these are new
    opportunities for surveillance.
  • 45:18 - 45:22
    Two other infrastructure issues
    I'll mention
  • 45:22 - 45:26
    quickly include a current
    rethinking and redesign of the
  • 45:26 - 45:29
    "who is" protocol
    which keeps track
  • 45:29 - 45:33
    of who is registering a
    domain name and, of course,
  • 45:33 - 45:36
    also the issue as has
    already been mentioned
  • 45:36 - 45:38
    of internet exchange points.
  • 45:38 - 45:40
    How these are governed and
    how they are distributed
  • 45:40 - 45:42
    around the world is something
  • 45:42 - 45:44
    that is very related
    to civil liberties.
  • 45:44 - 45:48
    The Internet Society, which we'll
    hear from next, has really been
  • 45:48 - 45:50
    at the forefront of
    this area which--
  • 45:50 - 45:52
    having more internet
    exchange points,
  • 45:52 - 45:55
    looking at the criticality
    of them for human rights
  • 45:55 - 45:56
    for infrastructure development
  • 45:56 - 46:00
    and as concentrated points
    of information flows.
  • 46:00 - 46:04
    So, the truth is that global
    internet choke points,
  • 46:04 - 46:05
    of course, exist.
  • 46:05 - 46:09
    And the internet is governed
    not by any one entity
  • 46:09 - 46:12
    but by multi-stakeholder governance
    which I'm sure we will take
  • 46:12 - 46:14
    up in the roundtable later.
  • 46:14 - 46:18
    But that this governance
    is not fixed any more
  • 46:18 - 46:20
    than architecture is fixed.
  • 46:20 - 46:22
    So, the architecture
    is constantly changing
  • 46:22 - 46:26
    and the governance is
    constantly changing as well.
  • 46:26 - 46:28
    The basic theoretical
    or conceptual framework
  • 46:28 - 46:30
    of my own work is
    that arrangements
  • 46:30 - 46:34
    of technical architecture are
    also arrangements of power.
  • 46:34 - 46:38
    So, it's critical for the public
    to be engaged in these debates
  • 46:38 - 46:40
    because the future of
    internet architecture
  • 46:40 - 46:43
    and governance is
    directly related
  • 46:43 - 46:44
    to the future of
    internet freedom.
  • 46:44 - 46:47
    So, I appreciate the opportunity
    to discuss that here today.
  • 46:47 - 46:49
    Thank you very much
    for listening.
  • 46:49 - 46:50
    >> Thank you, Laura.
  • 46:50 - 46:53
    Lynn?
  • 46:53 - 46:54
    >> Good afternoon.
  • 46:54 - 46:56
    This has been a very
    comprehensive set
  • 46:56 - 46:59
    of speaking points, I
    have to say to date.
  • 46:59 - 47:02
    So, the Internet Society is
    a cause-based organization.
  • 47:02 - 47:05
    We advocate for an
    open global internet.
  • 47:05 - 47:07
    The recent revelations
  • 47:07 - 47:09
    about the mass-scale
    interceptions not only
  • 47:09 - 47:12
    by the US, the UK, but
    many, many other countries
  • 47:12 - 47:15
    around the world have
    serious implications
  • 47:15 - 47:18
    for the open global internet.
  • 47:18 - 47:20
    ISOC is an international
    organization.
  • 47:20 - 47:22
    We have members, org
    members and chapters
  • 47:22 - 47:24
    in virtually every country
  • 47:24 - 47:27
    of the world perhaps even
    every country of the world.
  • 47:27 - 47:31
    We have headquarters just
    outside of D.C. here
  • 47:31 - 47:34
    in Northern Virginia and
    Geneva, Switzerland.
  • 47:34 - 47:39
    And we have a very senior policy
    and technical staff in a little
  • 47:39 - 47:41
    over 20 countries of
    the world, oftentimes
  • 47:41 - 47:43
    in the same individual
    which is more
  • 47:43 - 47:46
    and more the future in any case.
  • 47:46 - 47:48
    I spent 27 years in
    Europe, just moved back
  • 47:48 - 47:50
    to the US a little
    over a year ago.
  • 47:50 - 47:53
    And the one thing I'd really
    like to do is to make sure
  • 47:53 - 47:56
    that here in D.C., in
    this country's capital
  • 47:56 - 47:58
    that we recognize that this
    is not just about US citizens
  • 47:58 - 48:01
    and it's not just
    about the foreigners
  • 48:01 - 48:04
    that the US is surveilling.
  • 48:04 - 48:07
    This affects every
    individual in the world.
  • 48:07 - 48:10
    It affects some of
    them very directly
  • 48:10 - 48:12
    but it will affect the
    internet we all have access
  • 48:12 - 48:17
    to going forward to tomorrow
    and to generations to come.
  • 48:17 - 48:22
    If we are not careful, we will
    actually rob both individuals
  • 48:22 - 48:25
    today, tomorrow and future
    generations of all the freedom
  • 48:25 - 48:27
    and the benefit and
    the innovation
  • 48:27 - 48:30
    that the internet has brought.
  • 48:30 - 48:33
    The Internet Society actually
    deals an awful lot in principles
  • 48:33 - 48:36
    and the principles that make the
    internet what the internet is.
  • 48:36 - 48:38
    First and foremost,
    it's a platform.
  • 48:38 - 48:43
    It allows everybody to go out
    and develop what they choose to,
  • 48:43 - 48:47
    to access what they choose to,
    to innovate on material that's
  • 48:47 - 48:49
    out there and make
    that available.
  • 48:49 - 48:54
    If we're not careful,
    we'll lose all of that.
  • 48:54 - 48:57
    So, the-- in particular
    recently,
  • 48:57 - 49:00
    the unwarranted collection,
    storage, and the ease
  • 49:00 - 49:02
    of correlation amongst all
    the data that's collected.
  • 49:02 - 49:05
    And I actually don't
    differentiate a lot
  • 49:05 - 49:08
    between metadata and content.
  • 49:08 - 49:10
    You can get so much
    information from metadata
  • 49:10 - 49:12
    that we shouldn't kid
    ourselves by saying,
  • 49:12 - 49:15
    "It's just metadata we're
    collecting, it's OK."
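The point that "just metadata" is still revealing can be illustrated with a small sketch (the call records below are hypothetical, not from the talk): even with no content at all, call metadata yields who talks to whom, how often, and at what hours.

```python
from collections import Counter

# Hypothetical call-detail records: (caller, callee, hour of day).
# No message content at all -- metadata only.
records = [
    ("alice", "bob", 23), ("alice", "bob", 23), ("alice", "bob", 0),
    ("alice", "clinic", 9), ("bob", "clinic", 9),
    ("carol", "alice", 14),
]

def contact_profile(records, person):
    """Summarize who `person` calls and when, from metadata alone."""
    calls = [(callee, hour) for caller, callee, hour in records
             if caller == person]
    contacts = Counter(callee for callee, _ in calls)
    late_night = sum(1 for _, hour in calls if hour >= 22 or hour <= 5)
    return {"contacts": dict(contacts), "late_night_calls": late_night}

profile = contact_profile(records, "alice")
# Repeated late-night calls to one person, plus calls to a clinic,
# sketch a picture of alice's life without reading a single message.
```

Correlating a few such profiles across phone, email, and location data is exactly the "ease of correlation" the speaker is warning about.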
  • 49:15 - 49:20
    That sort of collection
    will undermine many
  • 49:20 - 49:22
    of the key principles
    and relationships.
  • 49:22 - 49:27
    And in particular some of those
    natural conclusions will start
  • 49:27 - 49:29
    to impact the physical
    infrastructure
  • 49:29 - 49:32
    of the internet itself whether
    it's using some of the IXPs
  • 49:32 - 49:34
    as choke points or
    whether it's using some
  • 49:34 - 49:36
    of the technical
    capabilities that exist
  • 49:36 - 49:39
    to help with surveillance.
  • 49:39 - 49:42
    Those are all things
    we want to I think--
  • 49:42 - 49:45
    think through very,
    very carefully.
  • 49:45 - 49:46
    One of the principles we argue
  • 49:46 - 49:48
    for is multistakeholder
    dialogue.
  • 49:48 - 49:51
    And it's because so much of what
    we're all facing whether it's
  • 49:51 - 49:53
    in a policy or a technical
  • 49:53 - 49:55
    or a social environment
    has never been done
  • 49:55 - 49:56
    in the world before.
  • 49:56 - 49:59
    We're breaking barriers
    every single day and we need
  • 49:59 - 50:02
    to bring everybody to the
    table for a discussion
  • 50:02 - 50:03
    and move forward
    thoughtfully and carefully.
  • 50:03 - 50:10
    That is even more so when we
    look to governments particularly
  • 50:10 - 50:13
    in their role in
    protecting citizens.
  • 50:13 - 50:17
    We believe that the internet
    must be a channel for secure,
  • 50:17 - 50:19
    reliable, private communication
  • 50:19 - 50:22
    between entities
    and individuals.
  • 50:22 - 50:26
    And surveillance without due
    process is simply unacceptable.
  • 50:26 - 50:30
    And as some other articles
    have said recently, frankly,
  • 50:30 - 50:31
    it's very creepy as well.
  • 50:31 - 50:34
    And if that makes it more
    personal, this is good
  • 50:34 - 50:39
    because we need everybody to
    care about what's happening now.
  • 50:39 - 50:40
    We also challenge the
    view that policies
  • 50:40 - 50:42
    to ensure security
    must always come
  • 50:42 - 50:44
    at the cost of users' rights.
  • 50:44 - 50:47
    I'd also push back on the argument
    that we all knew it was coming,
  • 50:47 - 50:49
    so what are we concerned about?
  • 50:49 - 50:51
    Due process wasn't followed.
  • 50:51 - 50:53
    That's what we're
    concerned about.
  • 50:53 - 50:54
    Did most people that
    pay attention
  • 50:54 - 50:57
    to this field understand that
    you could do all these sorts
  • 50:57 - 50:58
    of things with the data?
  • 50:58 - 51:01
    Yes. Did we believe our
    governments were doing it
  • 51:01 - 51:04
    without due process
    and certainly
  • 51:04 - 51:08
    without an adequate
    level of transparency?
  • 51:08 - 51:10
    I might answer yes
    but hopefully--
  • 51:10 - 51:12
    hopefully we didn't
    and [inaudible]
  • 51:12 - 51:15
    to happen as Frank said.
  • 51:15 - 51:17
    One of the things we'd actually
  • 51:17 - 51:20
    like to do is really
    get everybody to focus
  • 51:20 - 51:22
    on the multistakeholder approach.
  • 51:22 - 51:23
    A lot of the principles
  • 51:23 - 51:28
    that have given us the internet
    find new forms, structures
  • 51:28 - 51:30
    and processes to address that.
  • 51:30 - 51:33
    Don't revert back to we
    need a new institution.
  • 51:33 - 51:38
    I don't think that is the answer
    in almost any situation.
  • 51:38 - 51:43
    Some of the recent proposals
    that have been put forward
  • 51:43 - 51:45
    to address some of these
    aspects would call for treaties.
  • 51:45 - 51:47
    Treaties are largely
    intergovernmental.
  • 51:47 - 51:49
    They don't allow for
    the private sector.
  • 51:49 - 51:51
    They don't allow
    for civil society.
  • 51:51 - 51:53
    And honestly I don't
    know how you get some
  • 51:53 - 51:55
    of those private
    sector companies
  • 51:55 - 51:57
    to sign on to a treaty.
  • 51:57 - 51:59
    So, I don't think treaties
    are the answer either.
  • 51:59 - 52:03
    We're going to need to create
    new processes and new forums
  • 52:03 - 52:05
    and certainly at the
    core of all that ought
  • 52:05 - 52:07
    to be thoughtful,
    informed dialogue.
  • 52:07 - 52:10
    So, I think, you
    know, it's our hope
  • 52:10 - 52:13
    that as these discussions
    continue across the world
  • 52:13 - 52:17
    that we recognize and
    come to agree again
  • 52:17 - 52:19
    on some other high
    level principles.
  • 52:19 - 52:22
    Some of the ones I'd throw
    out for further debate:
  • 52:22 - 52:25
    that unwanted surveillance,
    even in the furtherance
  • 52:25 - 52:28
    of national security
    is not acceptable.
  • 52:28 - 52:31
    Unwarranted surveillance
    is not acceptable.
  • 52:31 - 52:34
    That disproportionate
    surveillance is also
  • 52:34 - 52:36
    not acceptable.
  • 52:36 - 52:39
    That surveillance
    without accountability is
  • 52:39 - 52:41
    not acceptable.
  • 52:41 - 52:43
    And further, turning
  • 52:43 - 52:45
    to something a little
    more positive,
  • 52:45 - 52:47
    that there should be
    transparency with respect
  • 52:47 - 52:49
    to policy and its
    implementation,
  • 52:49 - 52:53
    that we should be harnessing the
    expertise of all stakeholders
  • 52:53 - 52:56
    to discover better ways
    to protect citizens
  • 52:56 - 52:58
    in a global community.
  • 52:58 - 53:03
    And most importantly,
    core to everything we do,
  • 53:03 - 53:05
    that we uphold human rights.
  • 53:05 - 53:07
    And so I look forward to
    the discussion for the rest
  • 53:07 - 53:09
    of the day, and thank you.
  • 53:09 - 53:10
    >> Thank you, Lynn.
  • 53:10 - 53:11
    Danny?
  • 53:11 - 53:12
    >> Thanks, Lance.
  • 53:12 - 53:17
    So, thanks to the Internet
    Society and GW, Lynn and Lance,
  • 53:17 - 53:19
    for getting-- Paul for
    getting us together.
  • 53:19 - 53:24
    And at the end of the
    panel, I'm always reminded
  • 53:24 - 53:26
    of what a very distinguished
    member of Congress said
  • 53:26 - 53:28
    near the end of a very
    excruciatingly long hearing:
  • 53:28 - 53:32
    everything's been said but
    not everyone has said it.
  • 53:32 - 53:37
    So, I subscribe to much
    of what has been said
  • 53:37 - 53:39
    by many of my fellow panelists.
  • 53:39 - 53:42
    I want to just make three points
  • 53:42 - 53:47
    about what I think we've
    been experiencing as a result
  • 53:47 - 53:49
    of this surveillance
    debate over the last month.
  • 53:49 - 53:54
    I think fundamentally what
    we've had is a certain degree
  • 53:54 - 53:58
    of a crisis in confidence
    and a crisis in trust
  • 53:58 - 54:00
    about the internet environment.
  • 54:00 - 54:02
    The question is, should
    you trust it or not?
  • 54:02 - 54:07
    And as Mellissa noted,
    it's, I think, useful to try
  • 54:07 - 54:11
    to break down to some extent
    our current sources of trust
  • 54:11 - 54:14
    in the internet in particular,
    in society in general.
  • 54:14 - 54:17
    I want to just make three
    points about the institutional,
  • 54:17 - 54:22
    legal and political levels of
    trust that we tend to look to.
  • 54:22 - 54:25
    To start with the
    institutional, you heard from--
  • 54:25 - 54:30
    you heard from John and
    from Lynn about some
  • 54:30 - 54:32
    of the institutions that
    make the internet work.
  • 54:32 - 54:34
    You know, I think
    it's interesting
  • 54:34 - 54:36
    when you look purely
    technically at the internet.
  • 54:36 - 54:40
    People pretty much trust that
    the packets are going to arrive more
  • 54:40 - 54:42
    or less in the right order.
  • 54:42 - 54:44
    And enough of them
    will get there
  • 54:44 - 54:46
    that you could get your
    message through.
  • 54:46 - 54:47
    You can watch video.
  • 54:47 - 54:49
    We don't-- we don't have a
    lot of debates about that.
  • 54:49 - 54:55
    I will note just as a kind of
    an observation of the sociology
  • 54:55 - 54:57
    in a certain way of people
  • 54:57 - 54:59
    in the internet technical
    community.
  • 54:59 - 55:02
    There is a sense in some
    ways as Lynn suggested
  • 55:02 - 55:07
    that if any third party
    can get in the middle
  • 55:07 - 55:11
    of a communication stream
    between two parties,
  • 55:11 - 55:14
    that is, in a sort of
    very idealized sense, a
  • 55:14 - 55:15
    technical failure.
  • 55:15 - 55:17
    It's the internet
    not working properly.
  • 55:17 - 55:18
    Now that's a narrow
    technical view
  • 55:18 - 55:20
    of the internet environment.
  • 55:20 - 55:23
    We, of course, have a
    broader legal and social
  • 55:23 - 55:27
    and political view of our
    societies, and we do recognize
  • 55:27 - 55:30
    that that surveillance
    and espionage will happen.
  • 55:30 - 55:34
    But part of the challenge, I
    think, in closing the trust gap
  • 55:34 - 55:36
    that we have is to articulate
    better what those sorts
  • 55:36 - 55:38
    of expectations are.
  • 55:38 - 55:39
    We do have administrative
    institutions
  • 55:39 - 55:41
    that run the
    domain name system
  • 55:41 - 55:43
    and IP address assignment.
  • 55:43 - 55:44
    Thanks to people like John.
  • 55:44 - 55:45
    They just kind of worked--
  • 55:45 - 55:47
    people grumble about
    ICANN.
  • 55:47 - 55:50
    But so far, you know, people
    are getting new domain names.
  • 55:50 - 55:51
    They are being maintained.
  • 55:51 - 55:55
    Nothing has completely
    fallen apart at that point.
  • 55:55 - 55:57
    What I think is more complicated
  • 55:57 - 56:00
    on the institutional trust
    front is that in many cases,
  • 56:00 - 56:02
    we're used to looking
    to government
  • 56:02 - 56:04
    to establish trustworthiness
    in society.
  • 56:04 - 56:06
    And certainly governments
    believe their job is
  • 56:06 - 56:09
    to establish trustworthiness
    in society.
  • 56:09 - 56:12
    But as you've heard, many
    of our sources of trust
  • 56:12 - 56:14
    in the internet are
    actually not governmental.
  • 56:14 - 56:18
    They are working perfectly well
    largely without governments.
  • 56:18 - 56:20
    And I'll come back
    and talk about that.
  • 56:20 - 56:23
    But again because of that
    somewhat unusual circumstance
  • 56:23 - 56:27
    that Lynn alluded to, these
    multistakeholder processes,
  • 56:27 - 56:31
    we probably have to understand
    that a little bit better
  • 56:31 - 56:34
    so that people can trust
    those environments.
  • 56:34 - 56:37
    You know, on the-- when we
    look at legal institutions,
  • 56:37 - 56:39
    we rely a lot on our
    legal institutions
  • 56:39 - 56:42
    to establish trust in society.
  • 56:42 - 56:44
    We hope that our legal
    institutions make it
  • 56:44 - 56:48
    so that most people and most
    institutions mostly do the right
  • 56:48 - 56:49
    thing most of the time.
  • 56:49 - 56:50
    We don't expect perfection
  • 56:50 - 56:53
    but we do expect our legal
    institutions to set standards
  • 56:53 - 56:55
    and that there are consequences
    when they're not followed.
  • 56:55 - 56:58
    I think when we look at
    the privacy issues raised
  • 56:58 - 57:01
    by the current surveillance
    practices
  • 57:01 - 57:04
    that had been revealed, the
    problem that we have, I believe,
  • 57:04 - 57:08
    is that everyone would like
    there to be a sense of privacy
  • 57:08 - 57:10
    in our communications
    environment.
  • 57:10 - 57:12
    But I think we have a real--
  • 57:12 - 57:17
    a lot of confusion about
    just what that ought to mean.
  • 57:17 - 57:19
    We tend to think about
    privacy particularly
  • 57:19 - 57:22
    in the computer network
    environment as being able
  • 57:22 - 57:24
    to keep things secret.
  • 57:24 - 57:26
    Well, I think we all understand
    now that we don't have a lot
  • 57:26 - 57:30
    of secrets, and we rely
    on lots of third parties
  • 57:30 - 57:33
    to maintain our information.
  • 57:33 - 57:38
    So, we don't keep secrets as
    well as we may be used to.
  • 57:38 - 57:41
    And my own view is that that
    means we have to start thinking
  • 57:41 - 57:43
    about privacy more as a question
  • 57:43 - 57:47
    of whether information is used
    properly whether it's misused,
  • 57:47 - 57:48
    whether it's used
  • 57:48 - 57:50
    to discriminate unfairly
    against people.
  • 57:50 - 57:54
    But that's going to require
    a certain amount of discussion.
  • 57:54 - 57:55
    Lots of people have
    mentioned accountability.
  • 57:55 - 57:58
    We're going to need better
    accountability mechanisms
  • 57:58 - 58:00
    for these more complex
    privacy rules.
  • 58:00 - 58:03
    When privacy is not
    a binary phenomenon
  • 58:03 - 58:08
    that is, either it's secret or
    it's not, we need mechanisms
  • 58:08 - 58:12
    to assure trust in the way
    personal information is handled.
  • 58:12 - 58:14
    I think there's a very
    simple analogy actually
  • 58:14 - 58:17
    that we can draw from
    the financial world.
  • 58:17 - 58:20
    Huge parts of our economy,
  • 58:20 - 58:24
    huge parts of the
    world's economy run based
  • 58:24 - 58:26
    on a pretty well-understood
    set of accounting rules.
  • 58:26 - 58:29
    We're used to looking at
    balance sheets for corporations
  • 58:29 - 58:31
    and having some sense of trust
  • 58:31 - 58:33
    that those balance sheets
    reflect what's actually going
  • 58:33 - 58:35
    on in the financial life
    of the corporations.
  • 58:35 - 58:38
    We don't expect to see
    all the transactions
  • 58:38 - 58:40
    in the general ledger in order
    to look and get a picture
  • 58:40 - 58:43
    of whether the corporation is
    profitable, not profitable,
  • 58:43 - 58:46
    paying its taxes
    correctly, not, et cetera.
  • 58:46 - 58:49
    Now there are-- this doesn't
    always work perfectly,
  • 58:49 - 58:51
    but I think the analogy, particularly
  • 58:51 - 58:57
    to the NSA surveillance
    situation is quite strong.
  • 58:57 - 58:59
    We do as citizens of the
    United States want to be able
  • 58:59 - 59:01
    to have a sense of confidence
  • 59:01 - 59:05
    that our intelligence agencies
    are following the rules
  • 59:05 - 59:07
    that they say they're following.
  • 59:07 - 59:10
    My guess is they probably do
    about 90 percent of the time.
  • 59:10 - 59:12
    But we want to know that
    there's some accountability
  • 59:12 - 59:13
    to those rules.
  • 59:13 - 59:16
    I think we understand that
    we're not going to be able
  • 59:16 - 59:23
    to send auditors, independent
    auditors inside classified
  • 59:23 - 59:25
    environments and have
    them come back and report
  • 59:25 - 59:27
    on everything they found.
  • 59:27 - 59:29
    But if we follow this balance
    sheet model, if we follow
  • 59:29 - 59:32
    a methodology of assessing
    how information is used,
  • 59:32 - 59:35
    we can get that sense
    of trust back.
  • 59:35 - 59:40
    And finally, as a
    matter of political trust
  • 59:40 - 59:42
    and by political,
    I don't mean kind
  • 59:42 - 59:44
    of small-p Washington politics.
  • 59:44 - 59:46
    I really mean politics
    in the sense
  • 59:46 - 59:49
    of how we organize
    ourselves as a society.
  • 59:49 - 59:54
    As people on this
    panel have noted,
  • 59:54 - 59:57
    our sources of political
    trust are complicated
  • 59:57 - 60:00
    in the internet environment
    and unusual.
  • 60:00 - 60:02
    We are used to the
    idea of states
  • 60:02 - 60:06
    of governments exercising
    authority directly
  • 60:06 - 60:11
    on institutions often
    on intermediaries,
  • 60:11 - 60:13
    but in the internet
    world and we're--
  • 60:13 - 60:15
    and if you considered
    the analogy
  • 60:15 - 60:19
    between telephone
    networks in the past
  • 60:19 - 60:22
    and internet service providers
    today, telephone networks,
  • 60:22 - 60:25
    broadcast networks were
    really creatures of the state.
  • 60:25 - 60:29
    They were authorized by
    the actions of legislatures
  • 60:29 - 60:33
    and therefore controlled in that
    way at the local, state, federal
  • 60:33 - 60:36
    and even international
    legal level.
  • 60:36 - 60:39
    The internet doesn't--
    has not happened that way,
  • 60:39 - 60:42
    it was not a creature of the
    state ever, it was really
  • 60:42 - 60:47
    in many ways a creature of
    individual and voluntary action
  • 60:47 - 60:50
    by some people on this panel,
    by others all around the world
  • 60:50 - 60:54
    who participate in making these
    internet institutions work.
  • 60:54 - 60:57
    But now when we're
    nervous about how they work
  • 60:57 - 61:00
    and how the state works,
    we have to rethink some
  • 61:00 - 61:01
    of these relationships.
  • 61:01 - 61:06
    So, we're going to have
    to get used to the fact
  • 61:06 - 61:10
    that governments-- I
    would submit cannot reach
  • 61:10 - 61:12
    into these internet institutions
  • 61:12 - 61:15
    like the Internet Engineering
    Task Force, like ICANN,
  • 61:15 - 61:18
    like the World Wide
    Web Consortium
  • 61:18 - 61:20
    and achieve exactly
    the result they want.
  • 61:20 - 61:23
    But of course at the same time
    governments will make laws
  • 61:23 - 61:28
    and rules about how individuals
    and corporations act whether
  • 61:28 - 61:29
    for intellectual
    property protection
  • 61:29 - 61:33
    or privacy protection
    or anything else.
  • 61:33 - 61:34
    So, I think that what this--
  • 61:34 - 61:38
    the whole surveillance
    experience has revealed is
  • 61:38 - 61:43
    that the rules and expectations
  • 61:43 - 61:45
    that we have are quite
    a bit more nuanced
  • 61:45 - 61:48
    than the very binary
    technical behavior
  • 61:48 - 61:50
    of the internet environment.
  • 61:50 - 61:52
    And I think our challenge
    is now to do a better job
  • 61:52 - 61:54
    of articulating just
    what we expect of
  • 61:54 - 61:56
    all these different
    institutions at all these levels
  • 61:56 - 61:58
    and how we're going
    to find accountability
  • 61:58 - 62:01
    to those expectations.
  • 62:01 - 62:02
    Thanks.
  • 62:02 - 62:05
    >> Thank you Danny and thank
    you everybody on the panel.
  • 62:05 - 62:09
    I think what we'll do now,
    I'm going to open up first
  • 62:09 - 62:11
    to the panelists for
    five or 10 minutes
  • 62:11 - 62:15
    so they can ask questions
    of their fellow panelists,
  • 62:15 - 62:20
    make further, you know,
    comments, go some back and forth
  • 62:20 - 62:21
    that way for five or 10 minutes.
  • 62:21 - 62:24
    In the meantime if
    you in the audience,
  • 62:24 - 62:28
    if you have a question would
    you please if you're physically
  • 62:28 - 62:31
    in the audience here
    at GW step up to one
  • 62:31 - 62:35
    of these two microphones in
    either of the aisles, form a line
  • 62:35 - 62:39
    and when your turn comes please
    state your name and affiliation
  • 62:39 - 62:42
    and then ask your question.
  • 62:42 - 62:45
    If you are in the internet
    world, if you're off
  • 62:45 - 62:49
    in the clouds somewhere and
    want to communicate, send it in,
  • 62:49 - 62:52
    Paul any other directions
    on that
  • 62:52 - 62:54
    or you have some
    questions already?
  • 62:54 - 62:57
    While you're waiting, I want to
    give the panelists a chance first
  • 62:57 - 63:00
    but any other protocol?
  • 63:00 - 63:07
    >> So, they type the
    questions into live stream
  • 63:07 - 63:09
    and I'll be reading them
    here at one of these mics.
  • 63:09 - 63:12
    So, that's for the
    live stream panelists.
  • 63:12 - 63:15
    In New York they can just line
    up at their microphone there.
  • 63:15 - 63:16
    >> In New York they can
    line up at their microphone
  • 63:16 - 63:18
    and we'll see them
    live up there?
  • 63:18 - 63:18
    >> We'll see them,
    yes, that's right.
  • 63:18 - 63:19
    >> OK.
  • 63:19 - 63:20
    >> Very good.
  • 63:20 - 63:20
    >> All right.
  • 63:20 - 63:21
    This will be interesting.
  • 63:21 - 63:23
    OK, but first let's get to the--
  • 63:23 - 63:28
    give the panelists a chance
    to ask questions, Leslie?
  • 63:28 - 63:32
    >> So, I want to ask a question
    of Danny because talking
  • 63:32 - 63:36
    about the concern or persuading
    people that, you know,
  • 63:36 - 63:38
    weaning people off of
    government is sort
  • 63:38 - 63:42
    of the governance structure
    for the internet and trying
  • 63:42 - 63:44
    to educate them that it's
    not really government,
  • 63:44 - 63:47
    it's all of these other
    institutions and we do them
  • 63:47 - 63:50
    in a multi-stakeholder
    way and many people
  • 63:50 - 63:53
    up here have spent
    considerable time trying
  • 63:53 - 63:56
    to move people to that model.
  • 63:56 - 63:58
    It just seems to
    me that the rest
  • 63:58 - 64:01
    of the world right now is going
    to think we've been involved
  • 64:01 - 64:02
    in a sleight of hand
  • 64:02 - 64:06
    that basically we're
    saying governments stay out,
  • 64:06 - 64:10
    governments don't get
    too involved here.
  • 64:10 - 64:13
    All these other multi-stakeholder
    institutions are
  • 64:13 - 64:15
    really the core of
    the internet and then
  • 64:15 - 64:18
    at the same time
    building an environment
  • 64:18 - 64:22
    where the United States
    uses historically--
  • 64:22 - 64:25
    historical dominance
  • 64:25 - 64:28
    to basically trump all
    of that governance.
  • 64:28 - 64:32
    So, I hear what you're saying
    but it was a hard argument
  • 64:32 - 64:34
    to make to the rest of
    the world beforehand.
  • 64:34 - 64:36
    I'm just curious
    whether you're continuing
  • 64:36 - 64:38
    to say it with a straight face.
  • 64:38 - 64:39
    But I'm having trouble.
  • 64:39 - 64:41
    >> It's a habit.
  • 64:41 - 64:47
    I guess what I would say
    is the ability to make
  • 64:47 - 64:50
    that argument really
    depends on--
  • 64:50 - 64:52
    as several people
    have said the layer
  • 64:52 - 64:53
    at which you're making
    the argument.
  • 64:53 - 65:00
    I don't believe that the NSA
    surveillance changes one bit the
  • 65:00 - 65:07
    question of who should be
    setting standards for TCP/IP
  • 65:07 - 65:10
    or who should be determining
    how domain names get assigned.
  • 65:10 - 65:15
    I do believe that as you
    said Leslie that the question
  • 65:15 - 65:20
    of government espionage
    activities whether it's the NSA
  • 65:20 - 65:26
    or GCHQ or MI5 or, you know,
    the German Intelligence Agency,
  • 65:26 - 65:28
    the French Intelligence
    Agency, any one of them,
  • 65:28 - 65:31
    all of which are doing
    exactly the same thing.
  • 65:31 - 65:36
    I do believe that we now have
    to have a more public discussion
  • 65:36 - 65:38
    about what our expectations are
  • 65:38 - 65:40
    for surveillance
    including espionage
  • 65:40 - 65:42
    in the internet environment.
  • 65:42 - 65:48
    But that shouldn't get confused
    with the question of whether all
  • 65:48 - 65:50
    of a sudden we need
    a treaty about how
  • 65:50 - 65:52
    to set internet technical
    standards.
  • 65:52 - 65:55
    And I believe that, you know,
    I'm highly uncomfortable talking
  • 65:55 - 65:57
    about the rest of the world.
  • 65:57 - 66:02
    But I think that what we saw in
    the internet governance debate
  • 66:02 - 66:09
    at the United Nations at the--
    in Dubai where the proposition
  • 66:09 - 66:16
    from "democratic" countries like
    Iran and China and Russia was
  • 66:16 - 66:17
    to exert greater control
  • 66:17 - 66:21
    over the internet environment
    was rejected not just
  • 66:21 - 66:26
    by the usual suspects, that
    is the 34 OECD countries
  • 66:26 - 66:28
    who you could somewhat
    expect to do that.
  • 66:28 - 66:33
    But by another 20 countries,
    from Africa and South Asia
  • 66:33 - 66:36
    and different parts of
    the world, that I think
  • 66:36 - 66:43
    and Brazil, have recognized that
    while we could always do better
  • 66:43 - 66:45
    with the current internet
    governance arrangements
  • 66:45 - 66:48
    that they're working and
    messing with them has a cost
  • 66:48 - 66:49
    for all those countries.
  • 66:49 - 66:52
    So, I think as long as we
    keep these issues distinct
  • 66:52 - 66:55
    and don't confuse them
    I think there's a way
  • 66:55 - 66:59
    to have both discussions
    in a coherent way.
  • 66:59 - 67:00
    >> Well, so I agree with you.
  • 67:00 - 67:01
    >> I know you do.
  • 67:01 - 67:02
    [laughs]
  • 67:02 - 67:06
    >> But it seems to me
    there's an opportunity here
  • 67:06 - 67:09
    for many countries to
    combine them together
  • 67:09 - 67:11
    to not make them distinct.
  • 67:11 - 67:15
    The opportunity to
    reclaim control
  • 67:15 - 67:18
    by basically bringing
    this together.
  • 67:18 - 67:19
    >> Sure. So, what
    did Russia say?
  • 67:19 - 67:23
    What did Russia say this-- you
    know, Russia said in response
  • 67:23 - 67:25
    to the NSA we need more
    government control--
  • 67:25 - 67:26
    >> Right.
  • 67:26 - 67:30
    >> -- of both the
    internet infrastructure
  • 67:30 - 67:32
    and the internet companies.
  • 67:32 - 67:34
    It's a-- I mean, I think the
    question is how does anyone say
  • 67:34 - 67:36
    that with a straight face
  • 67:36 - 67:39
    when the concern was undue
    government intrusion?
  • 67:39 - 67:42
    But, yeah, I mean it--
  • 67:42 - 67:45
    people who want to make those
    arguments will find ways
  • 67:45 - 67:47
    to make them.
  • 67:47 - 67:50
    >> I worry that many of those
    in between are suddenly going
  • 67:50 - 67:57
    to become more persuadable
    because of the outrage.
  • 67:57 - 68:00
    >> So, I have a question
    for my fellow panelists.
  • 68:00 - 68:04
    And it's sort of following
    what Danny just raised.
  • 68:04 - 68:07
    In December we had an
    interesting conference,
  • 68:07 - 68:10
    the World Conference
  • 68:10 - 68:12
    on International
    Telecommunications.
  • 68:12 - 68:15
    I'll use the acronym
    WCIT not 'cause I want
  • 68:15 - 68:17
    to have the table pounded.
  • 68:17 - 68:18
    I explain the acronym first.
  • 68:18 - 68:19
    >> You did.
  • 68:19 - 68:20
    >> But you hear the
    term WCIT so I want
  • 68:20 - 68:22
    to pronounce it as
    you might hear it.
  • 68:22 - 68:23
    But the World Conference
  • 68:23 - 68:26
    on International
    Telecommunications was looking
  • 68:26 - 68:28
    at some treaty arrangements.
  • 68:28 - 68:29
    So, some tariffs.
  • 68:29 - 68:33
    Regarding interconnection
    and there were a number
  • 68:33 - 68:36
    of very interesting proposals
    that showed up in Dubai.
  • 68:36 - 68:41
    And I'm struck by the
    timing because that happened
  • 68:41 - 68:47
    and now we now have a lot of
    events regarding surveillance
  • 68:47 - 68:50
    and then I say and
    so on and so forth.
  • 68:50 - 68:52
    And I guess with my panelists
    I'd ask the question,
  • 68:52 - 68:57
    does anyone care to speculate
    if the order had been reversed?
  • 68:57 - 68:58
    What the outcome
    would have been?
  • 68:58 - 69:01
    Because I've had a
    few people suggest
  • 69:01 - 69:04
    that that's an interesting
    exercise.
  • 69:04 - 69:08
    I am not sure whether or not
    the outcome that we had in Dubai
  • 69:08 - 69:13
    in December is what
    we would be seeing
  • 69:13 - 69:16
    if in fact it was being
    done instead next month.
  • 69:16 - 69:18
    >> Hold on, that was my concern.
  • 69:18 - 69:21
    That's my concern.
  • 69:21 - 69:26
    Well, I think that if you
    look at the World Conference
  • 69:26 - 69:30
    on International
    Telecommunications,
  • 69:30 - 69:35
    it was an important treaty
    negotiation, renegotiation
  • 69:35 - 69:41
    that hadn't been renegotiated
    since 1988, is that right?
  • 69:41 - 69:46
    '88. And as Americans we look
    at these negotiations as one--
  • 69:46 - 69:53
    as a one negotiation and I think
    that that's a poor perspective
  • 69:53 - 69:56
    because the WCIT was really
  • 69:56 - 70:01
    about how does one monetize
    the internet and to pay
  • 70:01 - 70:03
    for core infrastructure.
  • 70:03 - 70:09
    And it is one of a multi-series
    of negotiations, the world--
  • 70:09 - 70:14
    there's another
    telecommunications conference
  • 70:14 - 70:17
    on policy, WTPF, it
    just happened in Geneva
  • 70:17 - 70:20
    and they'll be the world
    summit on information society
  • 70:20 - 70:25
    that will be culminating
    in 2014 I think, right?
  • 70:25 - 70:26
    2014, '15.
  • 70:26 - 70:27
    >> So on, yup.
  • 70:27 - 70:31
    >> And there will be 24
    negotiations to go between now
  • 70:31 - 70:36
    and then that will ultimately
    be where things will land.
  • 70:36 - 70:37
    And then from-- the world summit
  • 70:37 - 70:39
    on information society
    will be the strategy
  • 70:39 - 70:43
    which will be executed
    in the WTPF for policy.
  • 70:43 - 70:48
    The WCIT for regulation and
    in the internet standards--
  • 70:48 - 70:50
    in the international
    standards organization
  • 70:50 - 70:53
    for the overall technology.
  • 70:53 - 70:57
    So, I think if you are a
    person in the corporate world
  • 70:57 - 71:00
    and you're worried about these
    things you shouldn't just look
  • 71:00 - 71:02
    at one of these forums
    as one off.
  • 71:02 - 71:03
    And if you're worried about it
  • 71:03 - 71:05
    from a government perspective
    it's not just the policy forum
  • 71:05 - 71:07
    or the regulatory forum,
    it's all of these forums
  • 71:07 - 71:09
    and they're all interconnected.
  • 71:09 - 71:13
    So, I would say that yes, I
    think you actually suggested
  • 71:13 - 71:16
    that it was a positive outcome
    at WCIT in Dubai and I think
  • 71:16 - 71:19
    that the United States lost
    in that negotiation but--
  • 71:19 - 71:23
    and I think that the United
    States and many are going
  • 71:23 - 71:26
    to continue to lose and it's
    going to be a quick erosion
  • 71:26 - 71:31
    of our stance, not a slow
    erosion of our stance.
  • 71:31 - 71:34
    >> I'll take one more
    question from the panel.
  • 71:34 - 71:37
    Any other panelist has
    a question before we go
  • 71:37 - 71:39
    out to the audience?
  • 71:39 - 71:42
    Going once?
  • 71:44 - 71:45
    OK.
  • 71:45 - 71:47
    >> Mike Nelson with
    Bloomberg Government
  • 71:47 - 71:49
    and with Georgetown University.
  • 71:49 - 71:52
    I want to commend the internet
    society for a great panel,
  • 71:52 - 71:54
    we got the lawyers,
    we've got the techies,
  • 71:54 - 71:57
    we've got the scholars,
    we've got the activists,
  • 71:57 - 71:59
    and that's great but we
    tend here in Washington
  • 71:59 - 72:01
    to talk about the policy.
  • 72:01 - 72:04
    And Danny I tweeted your--
  • 72:04 - 72:07
    a [inaudible] of your talk
    which I thought was exceptional.
  • 72:07 - 72:09
    And that was--
  • 72:09 - 72:11
    >> You got it to 140 characters?
  • 72:11 - 72:13
    >> Less than that.
  • 72:13 - 72:18
    But you basically said, data
    will flow, we can't really focus
  • 72:18 - 72:20
    as we used to on
    controlling that flow,
  • 72:20 - 72:25
    we have to control the
    misuse of that data.
  • 72:25 - 72:28
    Policy makers are starting
    to understand that.
  • 72:28 - 72:30
    But I think we also
    have to figure out a way
  • 72:30 - 72:34
    that techies can start
    implementing systems
  • 72:34 - 72:35
    that reflect that.
  • 72:35 - 72:38
    And the first way to
    do that is to make sure
  • 72:38 - 72:42
    that systems are more
    transparent so that we can see
  • 72:42 - 72:44
    where data is being misused.
  • 72:44 - 72:48
    And I guess I'd challenge the
    audience to think about privacy--
  • 72:48 - 72:51
    or rather, challenge the audience
    and the panel to think
  • 72:51 - 72:54
    about transparency by design.
  • 72:54 - 72:58
    We've heard about privacy by
    design but has anybody thought
  • 72:58 - 73:01
    of examples of where we're
    building in the transparency
  • 73:01 - 73:03
    so it's in the technology
    and other places
  • 73:03 - 73:05
    where we could do that better?
  • 73:05 - 73:08
    Just an open question.
  • 73:08 - 73:09
    >> Laura?
  • 73:09 - 73:13
    >> OK. I think that's a really
    great question and a good point.
  • 73:13 - 73:17
    I'll give one example of where
    I think the transparency is
  • 73:17 - 73:19
    excellent and then
    a couple of examples
  • 73:19 - 73:21
    where I think we need
    more transparency.
  • 73:21 - 73:24
    What is one of the oldest and
    most venerable institutions
  • 73:24 - 73:26
    of internet governance?
  • 73:26 - 73:29
    The internet engineering
    taskforce jumps to mind.
  • 73:29 - 73:30
    So, this is the-- one
  • 73:30 - 73:33
    of the standard setting
    organizations for the internet.
  • 73:33 - 73:35
    There are many others
    but they have set many
  • 73:35 - 73:40
    of the core standards so they
    have a tradition of being open
  • 73:40 - 73:41
    in three different ways.
  • 73:41 - 73:44
    They're open in the
    development of a standard
  • 73:44 - 73:46
    in that anyone can participate.
  • 73:46 - 73:49
    Now, granted there are a lot
    of barriers to participation,
  • 73:49 - 73:51
    it requires a lot of
    technical knowledge,
  • 73:51 - 73:55
    it requires in many
    cases money to go to some
  • 73:55 - 73:57
    of the events, and time.
  • 73:57 - 74:00
    But it is basically
    open to anyone.
  • 74:00 - 74:02
    They are also open
    and transparent
  • 74:02 - 74:05
    in the actual
    specification.
  • 74:05 - 74:07
    So, a standard is
    not really software
  • 74:07 - 74:09
    or hardware-- it's specifications
    that are written down
  • 74:09 - 74:12
    and people can go
    online and view them.
  • 74:12 - 74:15
    So, I would differ with some
    of the panelists and say that,
  • 74:15 - 74:18
    "Oh the technology
    is-- it is political."
  • 74:18 - 74:22
    So, the technology designers
    make political decisions
  • 74:22 - 74:26
    in the design whether they
    like to call it that or not.
  • 74:26 - 74:28
    So, sometimes privacy
    is designed in.
  • 74:28 - 74:30
    Think about encryption
    standards for example.
  • 74:30 - 74:34
    Think about unique identifiers
    and the privacy implications.
  • 74:34 - 74:37
    Yet the specification is
    open so there is some degree
  • 74:37 - 74:39
    of accountability where
    people can view it.
  • 74:39 - 74:41
    It's also open in
    the implementation
  • 74:41 - 74:44
    because it results in
    multiple competing products
  • 74:44 - 74:46
    that are based on that standard.
  • 74:46 - 74:48
    So, that's an example.
  • 74:48 - 74:51
    In other cases we don't have a
    lot of transparency and I agree
  • 74:51 - 74:52
    with you that we need more.
  • 74:52 - 74:56
    So, here is an example, how do
    we look at interconnections?
  • 74:56 - 74:59
    So, this is an area where
    I'm worried because of all
  • 74:59 - 75:02
    of the calls for greater
    government regulation
  • 75:02 - 75:03
    of interconnection in an area
  • 75:03 - 75:06
    that has worked fairly
    well up until now.
  • 75:06 - 75:09
    Well, part of the reason
    we're seeing these calls
  • 75:09 - 75:11
    for regulation is that we
    can't really see what the
  • 75:11 - 75:13
    agreements are between
    these private companies.
  • 75:13 - 75:17
    So, I think it would be more
    helpful to have transparency
  • 75:17 - 75:20
    in an area such as that as well
    as other infrastructure areas.
  • 75:20 - 75:26
    >> So, I want to go
    back to Danny's premise
  • 75:26 - 75:29
    which you apparently
    support which seems to be--
  • 75:29 - 75:33
    data will flow, everybody
    will collect it
  • 75:33 - 75:34
    and we should only
    focus on the uses.
  • 75:34 - 75:37
    And that's certainly a
    discussion we've had on sort
  • 75:37 - 75:39
    of the consumer side
    of the ledger?
  • 75:39 - 75:41
    You know, Google will
    collect it and where we have
  • 75:41 - 75:44
    to focus our attention
    is how they're using it.
  • 75:44 - 75:48
    I have to submit the government
    is collecting information
  • 75:48 - 75:52
    and certainly the US government
    collecting information really
  • 75:52 - 75:56
    to-- is subject to-- I mean in
    our own country it's subject
  • 75:56 - 76:00
    to this little thing called
    the Fourth Amendment and I know
  • 76:00 - 76:04
    of no case and I stand-- I'm
    willing to stand corrected
  • 76:04 - 76:07
    that says, you can go in--
  • 76:07 - 76:10
    I mean, this is essentially
    what the governments is arguing
  • 76:10 - 76:11
    in NSA collection.
  • 76:11 - 76:16
    And they're calling it
    acquisition and not collection
  • 76:16 - 76:20
    that acquiring the
    information is not collection.
  • 76:20 - 76:24
    And that any rights that
    might attach don't happen
  • 76:24 - 76:26
    until you open and
    look at the packets.
  • 76:26 - 76:29
    I don't know, you know, I've
    never heard anybody come
  • 76:29 - 76:32
    into somebody's house, walk
    away with their desk drawer
  • 76:32 - 76:34
    and say, "We're acquiring."
  • 76:34 - 76:38
    "We'll let you know if we ever
    get around to looking at it."
  • 76:38 - 76:41
    And so-- Danny and I have
    some disagreement although I'm
  • 76:41 - 76:44
    getting persuaded
    more and more by his--
  • 76:44 - 76:48
    by the question of use
    in the commercial side
  • 76:48 - 76:50
    of this big data world.
  • 76:50 - 76:53
    But, I'm not willing to go
    there in terms of governments.
  • 76:53 - 76:55
    And I think it's a really
    dangerous thing to do.
  • 76:55 - 77:01
    >> So I-- just to
    clarify, my observation
  • 77:01 - 77:08
    that data will flow is not
    meant as a moral conclusion.
  • 77:08 - 77:11
    And I think the extent to
    which we choose to put limits
  • 77:11 - 77:16
    on how much data government
    can acquire from those
  • 77:16 - 77:17
    who have already
    collected it, right.
  • 77:17 - 77:22
    Because that's what
    we're talking about here,
  • 77:22 - 77:27
    it's very important and I think
    that historically we have relied
  • 77:27 - 77:32
    on technical barriers to large
    scale information collection
  • 77:32 - 77:35
    as a way to limit how much
    power the government has
  • 77:35 - 77:36
    and we don't have those anymore.
  • 77:36 - 77:39
    So, we're going to have
    to get very explicit
  • 77:39 - 77:41
    about what limits we think
    government should have
  • 77:41 - 77:43
    on both the collection
    and the use front.
  • 77:43 - 77:47
    My only observation kind of
    back to your point Mike is
  • 77:47 - 77:50
    that we are much
    better technically
  • 77:50 - 77:53
    at managing collection
    limitation
  • 77:53 - 77:55
    than we are managing
    use limitation.
  • 77:55 - 77:57
    Just as a pure matter
    of the kinds
  • 77:57 - 78:01
    of computer science techniques
    that we have available
  • 78:01 - 78:05
    and I would submit
    that's because, you know,
  • 78:05 - 78:08
    the computer security community
    and cryptographers have taught us a
  • 78:08 - 78:12
    huge amount about how
    to keep data secret
  • 78:12 - 78:14
    and control access to data.
  • 78:14 - 78:17
    There's a whole other set
    of disciplines developing
  • 78:17 - 78:22
    in computer science that try to
    characterize information usage,
  • 78:22 - 78:26
    track information usage,
    but it is a different--
  • 78:26 - 78:31
    but there is less progress
    on that because I think
  • 78:31 - 78:33
    that that's not a
    view of privacy
  • 78:33 - 78:37
    that computer scientists
    have previously focused on.
  • 78:37 - 78:39
    So, I think you're
    exactly right to point
  • 78:39 - 78:41
    out that we need
    more work there.
  • 78:41 - 78:42
    >> I'd actually like to hear
    from the techies.
  • 78:42 - 78:44
    Because I wasn't thinking
    so much about transparency
  • 78:44 - 78:46
    at the institution
    level at layer eight.
  • 78:46 - 78:49
    I was thinking more
    the lower levels.
  • 78:49 - 78:50
    Well, like--
  • 78:50 - 78:51
    >> Oh, I'll give the--
    let me give the--
  • 78:51 - 78:53
    >> -- like route tracing,
    you know, when we send an e-mail
  • 78:53 - 78:55
    to each other you can find out
    where it bounced along the way.
  • 78:55 - 78:57
    >> Let me give the techies
    about two minutes before we move
  • 78:57 - 78:59
    on to another question
    because we will have a chance
  • 78:59 - 79:02
    to circle back in
    the round table again
  • 79:02 - 79:04
    so we can take another
    bite at the apple here.
  • 79:04 - 79:04
    >> Yeah.
  • 79:04 - 79:05
    >> Go ahead.
  • 79:05 - 79:07
    >> I just want to--
    just very quickly.
  • 79:07 - 79:08
    Stealing is stealing.
  • 79:08 - 79:10
    It doesn't matter whether
    it's in the physical world
  • 79:10 - 79:11
    or in the cyber world.
  • 79:11 - 79:16
    So, whatever the techniques you
    have to collect data on a theft
  • 79:16 - 79:19
    in the physical world you
    can apply the same techniques
  • 79:19 - 79:20
    in the cyber world.
  • 79:20 - 79:22
    The difference is speed.
  • 79:22 - 79:25
    I mean it takes me
    physically a long time to go in
  • 79:25 - 79:28
    and steal a laptop from
    the gentleman's desk
  • 79:28 - 79:30
    but I can steal all the
    information on his laptop
  • 79:30 - 79:33
    in a second, probably as long
    as it took me to do it as long
  • 79:33 - 79:35
    as I can connect
    to it over the net.
  • 79:35 - 79:38
    So, the comment about
    the government, you know,
  • 79:38 - 79:41
    and the search warrant in
    the constitution, you should,
  • 79:41 - 79:42
    you know, that's why it's there.
  • 79:42 - 79:44
    That's why we have
    the constitution is
  • 79:44 - 79:45
    to follow those laws.
  • 79:45 - 79:48
    And you can still achieve
    the same goals and stay
  • 79:48 - 79:50
    within the constitutional
    limits.
  • 79:50 - 79:51
    It's been done.
  • 79:51 - 79:54
    It's just that what's happened
    is we got a bunch of people
  • 79:54 - 79:58
    at the policy level that
    didn't understand what the
  • 79:58 - 80:00
    implications are.
  • 80:00 - 80:01
    And that's where-- everything
  • 80:01 - 80:03
    that they've done
    has been legal.
  • 80:03 - 80:04
    It-- the law is the problem.
  • 80:04 - 80:07
    Not whether they're
    doing it or not.
  • 80:07 - 80:09
    >> I would question that.
  • 80:09 - 80:10
    >> But--
  • 80:10 - 80:13
    >> OK. I'm going to move
    this along because I want
  • 80:13 - 80:16
    to give Joanne a chance and
    I'm sure Steve is getting a lot
  • 80:16 - 80:18
    of fodder for the round table
    to come back to later on.
  • 80:18 - 80:19
    >> So, I do want
    to pick up on one thing
  • 80:19 - 80:24
    which is the internet has
    a tradition or convention,
  • 80:24 - 80:28
    I wouldn't say tradition,
    convention of protocols
  • 80:28 - 80:31
    which behave in an open manner.
  • 80:31 - 80:35
    And so, this is for
    example, your--
  • 80:35 - 80:37
    when you're looking at how
    packets flow, there are commands
  • 80:37 - 80:40
    like traceroute that let you see
    how they go through the internet
  • 80:40 - 80:44
    and you can map them through an
    exchange point most of the time.
  • 80:44 - 80:46
    When you look at mail headers,
    you can look at e-mail headers
  • 80:46 - 80:49
    and you can actually see,
    wow, that's how my e-mail
  • 80:49 - 80:52
    got from point A to
    point B and it's visible
  • 80:52 - 80:54
    and you can see those
    most of the time.
  • 80:54 - 80:57
    When you look at a packet that
    you've received you can look
  • 80:57 - 81:00
    at the source address and
    say, "Oh, where is that from?"
  • 81:00 - 81:04
    And you can look it up and
    find it most of the time.
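(A small illustration of the path-tracing the panelist describes: the relays an e-mail passes through are recorded in its "Received" headers, which anyone can inspect. The message below is invented, and, as noted just after this point in the discussion, nothing obliges these headers to be accurate.)

```python
# Sketch: reading the "Received" headers of an e-mail to see the relays
# it passed through. The message is made up for illustration; each relay
# self-reports its header, so accuracy is conventional, not guaranteed.
from email import message_from_string

raw = """\
Received: from mail-out.example.org (mail-out.example.org [203.0.113.7])
\tby mx.example.net with ESMTP; Mon, 17 Jun 2013 10:02:11 -0400
Received: from client.example.org (client.example.org [192.0.2.55])
\tby mail-out.example.org with ESMTPSA; Mon, 17 Jun 2013 10:02:09 -0400
From: alice@example.org
To: bob@example.net
Subject: hello

Hi Bob.
"""

msg = message_from_string(raw)

# Each relay prepends its own header, so the list is newest-first;
# reverse it to follow the path the message actually traveled.
for hop, header in enumerate(reversed(msg.get_all("Received")), start=1):
    origin = header.split(";")[0].replace("\n", " ").replace("\t", " ")
    print(f"hop {hop}: {origin.strip()}")
```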
  • 81:04 - 81:07
    Now, there are no
    obligations that any
  • 81:07 - 81:08
    of this information is accurate.
  • 81:08 - 81:10
    It's all sufficiently accurate
  • 81:10 - 81:13
    to keep the internet
    running or at least so far.
  • 81:13 - 81:15
    And we hope it will
    just keep it running,
  • 81:15 - 81:19
    but there's no actual
    obligations and you need
  • 81:19 - 81:23
    to be careful because while
    there would be some benefit
  • 81:23 - 81:27
    to having an attribute that says
    there's an actual obligation
  • 81:27 - 81:31
    to make this accurate, then
    I see people emerging saying,
  • 81:31 - 81:34
    "Well, wait a second, now I'm
    worried about my anonymity
  • 81:34 - 81:38
    because now you can trace my
    IP, or you can trace my e-mail."
  • 81:38 - 81:44
    So, we have just a
    convention of transparency
  • 81:44 - 81:46
    which has been enough to
    keep the internet running.
  • 81:46 - 81:48
    There is a question on
    whether or not that's going
  • 81:48 - 81:51
    to actually work long term.
  • 81:51 - 81:55
    The point that was made by
    Randy which is that the rate
  • 81:55 - 81:58
    at which you can attack
    something digitally is a lot
  • 81:58 - 82:00
    faster than physically.
  • 82:00 - 82:03
    And this means that the ability
    to have an accountable internet
  • 82:03 - 82:08
    where we can actually figure
    out who sent the bomb threat
  • 82:08 - 82:11
    or figure out who sent the
    e-mail which caused the problem
  • 82:11 - 82:16
    at the school is
    potentially a very large difficulty
  • 82:16 - 82:20
    and it may require us making
    a tradeoff between anonymity
  • 82:20 - 82:24
    and curated anonymity in
    order to have accountability.
  • 82:24 - 82:26
    Right now it's not
    clear that one
  • 82:26 - 82:28
    or the other is the
    right answer.
  • 82:28 - 82:31
    >> So, we may-- we'll circle
    back to this in the round table.
  • 82:31 - 82:35
    Let me move on to the
    next question over here.
  • 82:35 - 82:39
    >> All right, I'm [inaudible]
    cofounder of Codex for Africa.
  • 82:39 - 82:41
    This is more to
    the techie side too
  • 82:41 - 82:43
    but at the same time
    from a global view.
  • 82:43 - 82:48
    We represent thousands
    of developers in Africa,
  • 82:48 - 82:50
    in the upcoming years it's going
  • 82:50 - 82:52
    to be more software
    developers who are going
  • 82:52 - 82:56
    to be creating thousands of
    applications on mobile and web.
  • 82:56 - 83:00
    And one of the things is what
    are the strategies there,
  • 83:00 - 83:03
    when you have emerging countries
    or emerging economies jumping
  • 83:03 - 83:06
    on the bandwagon of using
    the internet to do transactions
  • 83:06 - 83:09
    with the US or the
    western world.
  • 83:09 - 83:12
    What do you think what would
    happen because I could be
  • 83:12 - 83:17
    in a country like Senegal,
    you know, do some hacking,
  • 83:17 - 83:21
    I can do anything I can because
    you don't have access to the--
  • 83:21 - 83:23
    maybe the continent
    level network.
  • 83:23 - 83:28
    But as well as the
    western world,
  • 83:28 - 83:30
    what are your strategies
    and all that?
  • 83:30 - 83:34
    >> Well, this is where
    the law comes into play.
  • 83:34 - 83:36
    What we hear on the
    techie side for instance is
  • 83:36 - 83:38
    of course there are laws
    on hacking in the US,
  • 83:38 - 83:41
    but there are no laws
    on hacking in China,
  • 83:41 - 83:42
    as we would understand it.
  • 83:42 - 83:46
    So, each country gets to define,
    you know, what hacking is.
  • 83:46 - 83:49
    Some may take it as an
    assault against the government
  • 83:49 - 83:51
    if you are hacking, you know,
  • 83:51 - 83:55
    some might just say it's
    a plain and simple theft.
  • 83:55 - 83:57
    So, you're going to have to-- I
    would think you'd have to look
  • 83:57 - 84:00
    at each individual
    country's definition
  • 84:00 - 84:04
    of what they considered to
    be hacking in that case.
  • 84:04 - 84:08
    Now, on the other side, again,
    if you're writing an application
  • 84:08 - 84:10
    for someone like that then
    you're going to do logging,
  • 84:10 - 84:12
    you're going to do all of this
    type of stuff because you want
  • 84:12 - 84:16
    to provide an auditor
    if you will
  • 84:16 - 84:18
    if it's a financial
    application with some record
  • 84:18 - 84:21
    of whatever transactions
    your software handles.
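(A minimal sketch of the transaction logging described here, so an auditor has a record to review. The app, field names, and tamper-evidence digest are illustrative assumptions, not a prescription.)

```python
# Sketch (hypothetical financial app): build one audit-log entry per
# transaction the software handles. Field names are illustrative only.
import hashlib
import json
import time

def audit_record(user, action, amount):
    """Build an append-only log entry describing one transaction."""
    entry = {
        "ts": time.time(),   # when it happened
        "user": user,        # who initiated it
        "action": action,    # what was done
        "amount": amount,
    }
    # A digest over the serialized entry makes later tampering detectable.
    serialized = json.dumps(entry, sort_keys=True).encode()
    entry["digest"] = hashlib.sha256(serialized).hexdigest()
    return entry

log = [audit_record("alice", "transfer", 125.00)]
print(json.dumps(log[0], indent=2))
```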
  • 84:21 - 84:24
    So, there's always going to be
    a record of something that you--
  • 84:24 - 84:26
    either your app did or
    where your app went.
  • 84:26 - 84:30
    Whether it's accurate or
    not that's another question
  • 84:30 - 84:32
    but there's always
    going to be a record.
  • 84:32 - 84:35
    And again, metadata
    analysis I can use that
  • 84:35 - 84:36
    and make some inferences
  • 84:36 - 84:39
    as to what you did even
    though I can't see what you--
  • 84:39 - 84:44
    what's in your individual
    packets if you encrypted them.
  • 84:44 - 84:45
    >> John?
  • 84:45 - 84:49
    >> So, that's an amazing
    wake up call here.
  • 84:49 - 84:52
    So, we have this internet
    that's remarkable.
  • 84:52 - 84:56
    It allows people to interact,
    people in different countries
  • 84:56 - 84:59
    with different expectations
    to interact.
  • 84:59 - 85:01
    And yet, the conventions
  • 85:01 - 85:07
    by which governments work
    haven't evolved fast enough.
  • 85:07 - 85:10
    The idea that a citizen in country
    A is interacting with a citizen
  • 85:10 - 85:15
    in country B and it might be
    illegal in one country and not
  • 85:15 - 85:17
    in another is a whole
    new concept
  • 85:17 - 85:20
    that governments are going
    to take some time to try
  • 85:20 - 85:21
    to figure out how to deal with.
  • 85:21 - 85:23
    So, we have a problem.
  • 85:23 - 85:25
    We literally have an
    internet that has capabilities
  • 85:25 - 85:29
    that governments haven't
    come to grips with how
  • 85:29 - 85:33
    to handle their duties
    and responsibilities.
  • 85:33 - 85:35
    I know governments
    that feel very strongly
  • 85:35 - 85:38
    about protecting their
    citizens against pornography
  • 85:38 - 85:42
    or against certain
    types of content.
  • 85:42 - 85:43
    I know governments
    in other countries
  • 85:43 - 85:47
    that feel very strongly that
    that's up to each citizen.
  • 85:47 - 85:50
    But the reality is that that
    intersection has now happened
  • 85:50 - 85:51
    because of the internet.
  • 85:51 - 85:54
    Two governments can have
    very different views
  • 85:54 - 85:57
    on what their responsibilities
    to their citizens are.
  • 85:57 - 86:00
    So, to answer your
    question in a general case,
  • 86:00 - 86:02
    the internet moves
    faster than governments.
  • 86:02 - 86:04
    Governments literally do
    not know how to interact
  • 86:04 - 86:06
    with the situation you describe.
  • 86:06 - 86:09
    On a practical matter, if
    you end up doing something
  • 86:09 - 86:13
    of significance, something
    that causes a lot of harm
  • 86:13 - 86:17
    or actually there's a very
    nice list of types of attacks,
  • 86:17 - 86:22
    destruction of property, theft
    of intellectual property.
  • 86:22 - 86:25
    There's different types
    of extortion or DDoS,
  • 86:25 - 86:29
    if you actually do something of
    a major magnitude you'll find
  • 86:29 - 86:32
    out that law enforcement does
    cooperate between countries
  • 86:32 - 86:34
    and it works very well.
  • 86:34 - 86:36
    It's just not set up for
    the scale of the internet.
  • 86:36 - 86:40
    And the state
    of the cooperation
  • 86:40 - 86:41
    between law enforcement
  • 86:41 - 86:45
    and various computer
    response teams is one step
  • 86:45 - 86:48
    above fax machines ringing
    and going back and forth.
  • 86:48 - 86:52
    We have the mechanism to
    handle the really bad events
  • 86:52 - 86:53
    fairly slowly.
  • 86:53 - 86:55
    We don't have anything
    to handle the scale
  • 86:55 - 86:59
    of automatic attacks
    happening 24 hours a day
  • 86:59 - 87:00
    around the entire
    globe which don't--
  • 87:00 - 87:02
    >> Let me move to Danny.
  • 87:02 - 87:04
    He wanted to say something
    and then I'm going to move
  • 87:04 - 87:05
    on to the next question.
  • 87:05 - 87:07
    By the way, I can't
    really talk to anybody
  • 87:07 - 87:09
    in New York once
    I ask a question.
  • 87:09 - 87:11
    So, what they do I guess
    they'll stand up or something
  • 87:11 - 87:12
    or else go through Paul.
  • 87:12 - 87:14
    Danny, go ahead.
  • 87:14 - 87:18
    >> I just want to make one
    observation about the challenge
  • 87:18 - 87:24
    of international law
    enforcement cooperation.
  • 87:24 - 87:26
    Most people in this room
    are probably familiar
  • 87:26 - 87:31
    with the SOPA bill that was
    proposed in the United States.
  • 87:31 - 87:34
    In a certain sense we only
    have that debate with all
  • 87:34 - 87:40
    of its cataclysm because of the
    failure of current mechanisms
  • 87:40 - 87:43
    in international law
    enforcement cooperation.
  • 87:43 - 87:46
    We have that debate, the
    Congress was considering
  • 87:46 - 87:51
    that law blocking access
  • 87:51 - 87:53
    to websites outside
    the United States
  • 87:53 - 87:58
    that might have infringing
    content because people
  • 87:58 - 87:59
    in Congress were concerned
  • 87:59 - 88:03
    that US Law Enforcement didn't
    have an effective way of working
  • 88:03 - 88:08
    with law enforcement
    authorities from the countries
  • 88:08 - 88:10
    where the infringement
    was actually happening.
  • 88:10 - 88:16
    I think that we have to get a
    lot better as John is suggesting
  • 88:16 - 88:19
    at law enforcement cooperation.
  • 88:19 - 88:22
    There are realms where that
    works reasonably well but it's--
  • 88:22 - 88:26
    the problem is it's mostly
    cooperation in the form of how
  • 88:26 - 88:30
    to make a criminal conviction
    against someone stick.
  • 88:30 - 88:33
    Law enforcement cooperation
    mechanisms are good
  • 88:33 - 88:35
    at exchanging evidence,
    making sure
  • 88:35 - 88:38
    that you have the information
    you need to, you know,
  • 88:38 - 88:41
    bring someone to
    trial but not good
  • 88:41 - 88:45
    at actually stopping
    the behavior
  • 88:45 - 88:47
    that may be harmful
    as it's happening.
  • 88:47 - 88:49
    So, I think it's a
    very big challenge.
  • 88:49 - 88:52
    >> All right, go to Paul for a--
  • 88:52 - 88:54
    >> I actually did see that
    David Salmon [assumed spelling]
  • 88:54 - 88:57
    of the New York Society's
    chapter president would
  • 88:57 - 88:58
    like to have a question.
  • 88:58 - 89:00
    So, why don't we go
    to him if he will--
  • 89:00 - 89:02
    I think they have a
    little bit of a delay.
  • 89:02 - 89:06
    So, he might just be
    hearing this in a second.
  • 89:06 - 89:07
    >> OK David.
  • 89:07 - 89:15
    [ Inaudible Remark ]
  • 89:15 - 89:23
    >> OK, that's your cue David.
  • 89:23 - 89:25
    >> OK. I'll speak to the
    camera then or-- yeah.
  • 89:25 - 89:26
    OK, so.
  • 89:26 - 89:34
    [ Inaudible Remark ]
  • 89:34 - 89:37
    OK, we have some delay here.
  • 89:37 - 89:39
    My question is what sort
  • 89:39 - 89:42
    of scenarios do our
    panelists envision
  • 89:42 - 89:46
    and what would be some
    alternative solutions
  • 89:46 - 89:50
    if our government specifically
    Congress is unwilling or unable
  • 89:50 - 89:55
    to rein in agencies such as the
    NSA in terms of surveillance?
  • 89:55 - 89:57
    If they can't do
    it or choose not
  • 89:57 - 89:59
    to what would be
    the consequences?
  • 89:59 - 90:02
    Are there technical solutions
    or are there possible solutions
  • 90:02 - 90:05
    that could be implemented
    without US government?
  • 90:05 - 90:10
    >> OK. The panelists are being--
    the panelists are being asked,
  • 90:10 - 90:12
    is there a work around congress?
  • 90:12 - 90:15
    >> I would just
    speak to it
  • 90:15 - 90:17
    as an individual not
    as a panel member.
  • 90:17 - 90:18
    But if congress can't rein
  • 90:18 - 90:22
    in the government agency we
    have a lot more serious problems
  • 90:22 - 90:24
    than what's going
    on in the internet.
  • 90:24 - 90:28
    >> When was the last time
    we passed a federal budget?
  • 90:28 - 90:30
    >> I mean, they are
    a government agency.
  • 90:30 - 90:31
    Congress has to do that.
  • 90:31 - 90:35
    Now, again, there is
    classified information
  • 90:35 - 90:38
    and all this other stuff that
    can influence how law is passed.
  • 90:38 - 90:41
    But quite frankly if congress
    didn't have that power
  • 90:41 - 90:44
    to do it then we wouldn't see
    General Alexander making a run
  • 90:44 - 90:47
    up to talk to members, I
    think was it today or--
  • 90:47 - 90:48
    >> No, it's tomorrow.
  • 90:48 - 90:49
    >> Yeah, the vote is tomorrow.
  • 90:49 - 90:51
    So, you know, doing that,
  • 90:51 - 90:54
    so certainly I would think
    congress has the ability to do
  • 90:54 - 90:56
    and should have the ability or
    else we don't have a democracy.
  • 90:56 - 91:00
    >> OK, any other
    panelists weighing in on that?
  • 91:00 - 91:00
    Lynn?
  • 91:00 - 91:02
    >> Well--
  • 91:02 - 91:03
    >> You can just say quicker
    than I'm-- oh, I'm sorry.
  • 91:03 - 91:04
    >> No please.
  • 91:04 - 91:07
    >> That, you know, there
    are ways for people
  • 91:07 - 91:08
    to manage their traffic.
  • 91:08 - 91:11
    I'm certain
    there would be even more ways
  • 91:11 - 91:15
    for people to manage their
    traffic and choose the routing.
  • 91:15 - 91:17
    We actually are quite
    concerned about that
  • 91:17 - 91:21
    because it will make the global
    internet much less resilient.
  • 91:21 - 91:25
    It will-- putting something
    in a box doesn't mean it's--
  • 91:25 - 91:29
    it may be protecting it from
    one country actually examining it
  • 91:29 - 91:31
    but it doesn't protect
    the other country.
  • 91:31 - 91:36
    And so, I'm not quite sure
    what David's question was but,
  • 91:36 - 91:40
    you know, if it's about
    routing and the ability to route
  • 91:40 - 91:43
    around what you see as a
    problem, I think we need
  • 91:43 - 91:44
    to be very, very
    careful about what some
  • 91:44 - 91:45
    of those potential solutions are
  • 91:45 - 91:48
    and because I don't think
    it will address the question
  • 91:48 - 91:50
    you're-- the problem you're
    trying to route around.
  • 91:50 - 91:54
    And in fact it will over time
    make the global internet much
  • 91:54 - 91:57
    less resilient.
  • 91:57 - 91:58
    >> Paul?
  • 91:58 - 92:00
    >> OK, I have a question
    from live stream.
  • 92:00 - 92:02
    It's from Garth Gram
    [assumed spelling].
  • 92:02 - 92:04
    He asks, is the real
    issue autonomy
  • 92:04 - 92:07
    and self determined
    choice rather than privacy?
  • 92:07 - 92:09
    And if so, what is
    the role of identity
  • 92:09 - 92:13
    in addressing the
    issue of trust?
  • 92:13 - 92:21
    >> Well, I'll take
    one step at that.
  • 92:21 - 92:25
    I think that in the
    discussions of privacy
  • 92:25 - 92:32
    over the last 10 years or
    so, maybe longer I think
  • 92:32 - 92:36
    that we've gotten a little
    bit distracted by the promise
  • 92:36 - 92:42
    of individual choice as
    somehow the key to privacy.
  • 92:42 - 92:45
    I think that a lot of
    the privacy values--
  • 92:45 - 92:48
    and what I mean by that
    is the dialogue boxes
  • 92:48 - 92:52
    that everyone sees on, you
    know, one website or another
  • 92:52 - 92:55
    where you have to click here
    to accept the privacy policy
  • 92:55 - 92:58
    or swat away a dialogue
    box to proceed.
  • 92:58 - 93:03
    And I think certainly in-- it's
    actually a remarkable point
  • 93:03 - 93:06
    of convergence between the
    United States and Europe
  • 93:06 - 93:10
    in the last couple years
    both the White House
  • 93:10 - 93:14
    and the Federal Trade Commission
    issued major privacy policy
  • 93:14 - 93:17
    statements that noted
    the limitations of this
  • 93:17 - 93:20
    so-called notice and choice or
    individual determination model.
  • 93:20 - 93:24
    The Europeans certainly have
    pointed that out as well.
  • 93:24 - 93:26
    A lot of the things
    that we value associated
  • 93:26 - 93:28
    with privacy are
    collective values.
  • 93:28 - 93:31
    We want to make sure
    people can associate freely,
  • 93:31 - 93:34
    can engage in politics,
    can engage in commerce,
  • 93:34 - 93:36
    can seek medical
    care, et cetera.
  • 93:36 - 93:40
    And I think that giving
    people the choices to opt
  • 93:40 - 93:43
    out of those things or
    somehow control their identity
  • 93:43 - 93:46
    when they are trying to
    speak to their doctor or make
  • 93:46 - 93:48
    a public political
    statement really seems
  • 93:48 - 93:52
    to be exactly the opposite of
    some of our core privacy values.
  • 93:52 - 93:55
    So, I think that
    there are situations
  • 93:55 - 93:58
    in which individual
    autonomy is quite important
  • 93:58 - 94:01
    but a little bit also to the
    last question about the NSA.
  • 94:01 - 94:04
    I don't think we get ourselves
    out of these privacy problems
  • 94:04 - 94:13
    by just giving people, you
    know, 20K long encryption keys
  • 94:13 - 94:16
    to wield against everyone else.
  • 94:16 - 94:17
    >> Laura?
  • 94:17 - 94:21
    >> That's a really
    interesting question and I want
  • 94:21 - 94:23
    to tie it back to something
  • 94:23 - 94:26
    that I think Danny said
    before about data will flow.
  • 94:26 - 94:28
    And, you know, while
    I agree with that,
  • 94:28 - 94:31
    that's also not necessarily
    the case.
  • 94:31 - 94:34
    We have interconnection disputes
    that have resulted in outages,
  • 94:34 - 94:36
    we've had countries
    that have cut off access
  • 94:36 - 94:37
    for their citizens.
  • 94:37 - 94:40
    We have areas of the world
    that have infrastructures
  • 94:40 - 94:41
    of complete censorship.
  • 94:41 - 94:44
    We have digital divide
    issues, we have trends away
  • 94:44 - 94:47
    from interoperability where
    we're going in the cloud
  • 94:47 - 94:50
    to more proprietary
    protocols for example.
  • 94:50 - 94:53
    So, it's-- so data will
    not necessarily flow.
  • 94:53 - 94:55
    But one of the things
    that is required for it
  • 94:55 - 94:59
    to flow is this issue of trust
    that the questioner brought up.
  • 94:59 - 95:04
    So, I think I can mention just
    a couple of areas or maybe three
  • 95:04 - 95:05
    that are very important.
  • 95:05 - 95:09
    So, trust has always existed
    between network providers
  • 95:09 - 95:13
    to exchange information about
    IP addresses that are either
  • 95:13 - 95:16
    in their control or that they
    can reach on the internet.
  • 95:16 - 95:19
    That's done through
    the Border
  • 95:19 - 95:21
    Gateway Protocol.
  • 95:21 - 95:22
    There have been examples
  • 95:22 - 95:25
    where false routes have been
    advertised whether intentionally
  • 95:25 - 95:29
    or not and outages
    have occurred.
  • 95:29 - 95:32
    So, there are efforts
    underway now to secure
  • 95:32 - 95:33
    that, which are very necessary.
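The failure mode described here, a false route announcement diverting traffic, can be sketched in a few lines of Python. This is a toy model of longest-prefix route selection, not a real BGP implementation; the prefixes and AS names are hypothetical. It shows why an unvalidated, more specific announcement silently wins:

```python
import ipaddress

# Toy route table (not a real BGP implementation).
# Prefixes and AS names below are hypothetical.
routes = {
    ipaddress.ip_network("203.0.113.0/24"): "AS-LEGITIMATE",
}

def best_route(dest, table):
    """Longest-prefix match: the most specific announcement wins."""
    matches = [net for net in table if dest in net]
    return table[max(matches, key=lambda n: n.prefixlen)] if matches else None

victim = ipaddress.ip_address("203.0.113.10")
assert best_route(victim, routes) == "AS-LEGITIMATE"

# A false, more specific /25 announcement -- intentional or not --
# now covers the same address and wins selection.
routes[ipaddress.ip_network("203.0.113.0/25")] = "AS-HIJACKER"
assert best_route(victim, routes) == "AS-HIJACKER"
```

The securing efforts mentioned here, such as RPKI origin validation, address exactly this gap by letting routers check whether an AS is authorized to announce a given prefix.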
  • 95:33 - 95:35
    So, we have to build
    trust into the network.
  • 95:35 - 95:37
    It's not something that
    we can just assume.
  • 95:37 - 95:40
    It has to be designed in,
    it has to be built in.
  • 95:40 - 95:43
    The same thing with how the
    domain name system works.
  • 95:43 - 95:46
    We have servers located around
    the world that resolve queries
  • 95:46 - 95:50
    of domain names like maybe I'm
    up here looking at cnn.com,
  • 95:50 - 95:54
    I'm not but the domain
    name server would resolve
  • 95:54 - 95:57
    that into its IP address
    and route the information.
  • 95:57 - 96:00
    Well, that can be gamed also
  • 96:00 - 96:03
    in that there can be a
    false return of a query.
  • 96:03 - 96:07
    So, things like domain
    name system security extensions
  • 96:07 - 96:09
    have to continue
    to be implemented.
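The idea behind signed DNS answers can be illustrated with a small sketch. This toy uses an HMAC with a shared key purely as a stand-in for DNSSEC's real public-key record signatures; the key and the addresses are hypothetical (the cnn.com name is just the speaker's example):

```python
import hmac, hashlib

# Toy illustration of the DNSSEC idea: an answer is only trusted if it
# verifies against a signature made with the zone's key. HMAC with a
# shared key stands in here for DNSSEC's real public-key signatures.
ZONE_KEY = b"example-zone-key"  # hypothetical key, for the sketch only

def sign_record(name, ip):
    return hmac.new(ZONE_KEY, f"{name}={ip}".encode(), hashlib.sha256).hexdigest()

def verify_record(name, ip, sig):
    return hmac.compare_digest(sign_record(name, ip), sig)

# The authoritative side publishes a signed answer.
good = ("cnn.com", "198.51.100.7", sign_record("cnn.com", "198.51.100.7"))

# An attacker races the resolver with a forged answer pointing elsewhere;
# the forged record cannot carry a valid signature.
forged = ("cnn.com", "192.0.2.66", "0" * 64)

assert verify_record(*good)
assert not verify_record(*forged)
```

A validating resolver applies the same principle: answers that fail signature verification are discarded rather than cached, which is what defeats the false-return attack described above.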
  • 96:09 - 96:11
    You know, these are just
    a few of the examples,
  • 96:11 - 96:15
    in answer to the question
    that it has to be designed in,
  • 96:15 - 96:18
    same thing with website
    authentication.
  • 96:18 - 96:22
    If I'm saying that correctly the
    role of certificate authorities
  • 96:22 - 96:24
    and how they verify
    through digital signatures
  • 96:24 - 96:27
    that a website is who
    the website says it is.
  • 96:27 - 96:30
    So, again, this is an
    example of the politics
  • 96:30 - 96:32
    of the architecture, it's
    not just about agreements
  • 96:32 - 96:35
    between people but about
    designing this trust
  • 96:35 - 96:37
    and identity into
    the infrastructure.
  • 96:37 - 96:40
    >> I'm going to move on
    to the next question just
  • 96:40 - 96:44
    because I want to get the
    audience as much opportunity
  • 96:44 - 96:47
    as I can because the
    panelists are going to be able
  • 96:47 - 96:51
    to circle back on this
    later on over here.
  • 96:51 - 96:53
    >> So, I'm Luke Wadman [assumed
    spelling], I'm a student working
  • 96:53 - 96:55
    as a policy
  • 96:55 - 96:58
    analysis intern for
    IEEE this summer.
  • 96:58 - 97:01
    I'm working on internet
    governance issues.
  • 97:01 - 97:04
    And does the panel
    think there is any--
  • 97:04 - 97:10
    would it be a good way to
    frame the debate on privacy
  • 97:10 - 97:13
    and internet governance and
    et cetera in economic terms
  • 97:13 - 97:16
    because if I've learned
    anything in my time in DC it's
  • 97:16 - 97:18
    that catching the ear
  • 97:18 - 97:20
    of our congressional
    representatives is easy
  • 97:20 - 97:23
    if you start talking about
    jobs and job creation
  • 97:23 - 97:27
    and we've already talked a
    little bit about the impact
  • 97:27 - 97:32
    of things like PRISM on US-based
    IT companies like Google
  • 97:32 - 97:34
    and Facebook, particularly
    in the EU
  • 97:34 - 97:37
    but also around the world.
  • 97:37 - 97:40
    One case is that Google isn't
    really competitive in China
  • 97:40 - 97:43
    and that's probably going to
    continue in that direction.
  • 97:43 - 97:47
    So, would that be a good way to
    frame this whole conversation
  • 97:47 - 97:48
    and actually encourage some sort
  • 97:48 - 97:52
    of positive congressional
    action?
  • 97:54 - 97:57
    >> So, I would never say
    that I have any idea how
  • 97:57 - 97:59
    to encourage positive
    congressional action.
  • 97:59 - 98:02
    I just want to put
    that on the table.
  • 98:02 - 98:04
    I think it is true
  • 98:04 - 98:08
    that everything that's
    happened this sort
  • 98:08 - 98:14
    of last six weeks inside this
    sort of NSA bubble has happened
  • 98:14 - 98:18
    without any reference
    as to what it might--
  • 98:18 - 98:23
    the impact that it might
    have on US industry.
  • 98:23 - 98:26
    And I think it is possible
    that the impact at least
  • 98:26 - 98:27
    in the short run maybe severe.
  • 98:27 - 98:30
    On the other hand a
    lot of people may carry
  • 98:30 - 98:34
    on about being unhappy
    about discovering
  • 98:34 - 98:38
    that the NSA is sucking
    up their data.
  • 98:38 - 98:43
    Historically, when
    the big comp--
  • 98:43 - 98:46
    if you look at the SOPA fight
  • 98:46 - 98:48
    which I think was
    probably the first time
  • 98:48 - 98:52
    that US internet industry sort
    of held hands with activists
  • 98:52 - 98:56
    and technologists, it does
    get Congress's attention.
  • 98:56 - 99:00
    I will say though that
    National Security is different.
  • 99:00 - 99:04
    It is just always
    different and the arc
  • 99:04 - 99:09
    of National Security has been
    more, more, more since 9/11
  • 99:09 - 99:14
    and the question is whether
    these revelations have sort
  • 99:14 - 99:18
    of pushed us beyond the
    more and more place.
  • 99:18 - 99:21
    >> OK. Let's move over here.
  • 99:21 - 99:24
    >> Hi, I'm Susan Aaronson
    with GW, I'm a professor here
  • 99:24 - 99:27
    and I work with the
    Worldwide Web Foundation
  • 99:27 - 99:28
    on measuring internet openness.
  • 99:28 - 99:32
    And I want to ask you a
    question that relates to trust,
  • 99:32 - 99:34
    the trust of policy makers.
  • 99:34 - 99:38
    So, in the last couple of
    days we've seen [inaudible]
  • 99:38 - 99:41
    and Angela Merkel
    make these delightful
  • 99:41 - 99:47
    comments about server locations
    and threats in terms of privacy.
  • 99:47 - 99:49
    And again, I wonder
    if there are--
  • 99:49 - 99:53
    so, they're basically saying
    if the server can't be located
  • 99:53 - 99:59
    where we can control, where our
    privacy rules dominate we might
  • 99:59 - 100:01
    not accept for example some
  • 100:01 - 100:02
    of the things the
    United States wants
  • 100:02 - 100:04
    in the trade agreement or--
  • 100:04 - 100:08
    and we see similar things with
    the Trans-Pacific Partnership
  • 100:08 - 100:10
    and I just wonder if you
    could talk a little bit
  • 100:10 - 100:12
    about this now.
  • 100:12 - 100:15
    You know, you can always, for
    national security reasons,
  • 100:15 - 100:19
    you can always say you have
    a particular policy in place
  • 100:19 - 100:21
    and it's not protectionist.
  • 100:21 - 100:23
    But this is opposite
    of that, right?
  • 100:23 - 100:26
    They're saying that
    for privacy reasons,
  • 100:26 - 100:31
    they want to essentially
    protect their citizens
  • 100:31 - 100:33
    from their information
    being traded
  • 100:33 - 100:37
    by having the server
    location in the United States.
  • 100:37 - 100:41
    If I may add one other
    thing which is Frank La Rue,
  • 100:41 - 100:45
    who works for the-- who is
    the UN Special Representative
  • 100:45 - 100:48
    on Freedom of Expression,
    he has said basically
  • 100:48 - 100:51
    that the US' failure to protect
    privacy is a violation
  • 100:51 - 100:53
    of its human rights' obligation
  • 100:53 - 100:55
    because that is a
    basic human right
  • 100:55 - 100:58
    under the Universal
    Declaration, blah, blah, blah.
  • 100:58 - 101:04
    So I want to hear your comments.
  • 101:04 - 101:05
    >> Sure. I'll do this.
  • 101:05 - 101:07
    Pull it out there.
  • 101:07 - 101:09
    I'll-- the Trans-Pacific
    Partnership
  • 101:09 - 101:18
    and the US Free Trade Agreement
    have always had
  • 101:21 - 101:25
    to address data
    privacy laws.
  • 101:25 - 101:27
    The Safe Harbor that
    we had in place
  • 101:27 - 101:32
    in 2001 will actually
    expire with the agreement.
  • 101:32 - 101:35
    And the United States, as you
    know, has no national umbrella
  • 101:35 - 101:37
    for data breach and
    data privacy.
  • 101:37 - 101:41
    We have 47 individual states
    with their individual programs
  • 101:41 - 101:43
    and no national umbrella.
  • 101:43 - 101:46
    And so, if we're calling
    for congressional action
  • 101:46 - 101:50
    that had an economic, you
    know, significant impact,
  • 101:50 - 101:54
    there are 52 pieces of
    legislation currently
  • 101:54 - 101:58
    in 113th Congress
    around cyber security,
  • 101:58 - 102:00
    about 10 of which
    around data breach.
  • 102:00 - 102:01
    It would be wonderful
    if we could get
  • 102:01 - 102:04
    to some bipartisan
    agreement on that
  • 102:04 - 102:08
    so we could enable the
    overall Free Trade Agreement
  • 102:08 - 102:11
    to move forward between
    the two continents.
  • 102:11 - 102:16
    More specifically though, noting
    the 47 different state laws
  • 102:16 - 102:20
    and noting the difference
    between the Europe
  • 102:20 - 102:24
    and the United States,
    cloud computing
  • 102:24 - 102:28
    and where the data is
    stored follows the geography
  • 102:28 - 102:30
    and will always follow
    the geography.
  • 102:30 - 102:33
    So if there's a data breach
    here in the state of Virginia,
  • 102:33 - 102:34
    it follows a different
    set of rules
  • 102:34 - 102:38
    than Massachusetts
    and in California.
  • 102:38 - 102:41
    And a company, whoever the
    company might be, actually has
  • 102:41 - 102:44
    to know all sets of laws
    for that particular state
  • 102:44 - 102:48
    in this case in order to follow
    the regulatory compliance,
  • 102:48 - 102:49
    et cetera.
  • 102:49 - 102:52
    That also is the same
    for if it's stored
  • 102:52 - 102:53
    in the United Kingdom,
  • 102:53 - 102:56
    it follows the United Kingdom's
    laws, and the Netherlands,
  • 102:56 - 102:58
    and Brazil, and you
    pick the place.
  • 102:58 - 103:05
    And so when the EU and
    the leaders of Germany
  • 103:05 - 103:07
    and elsewhere are talking
    about the data protection
  • 103:07 - 103:12
    and data privacy, and they are
    looking at the United States
  • 103:12 - 103:15
    and worried about how our data--
  • 103:15 - 103:18
    we're protecting
    data and the privacy,
  • 103:18 - 103:22
    then it would also be important
    to understand how the European
  • 103:22 - 103:27
    companies are mirroring data
    in other countries like Brazil
  • 103:27 - 103:31
    or South Africa or
    Egypt or China or India,
  • 103:31 - 103:34
    et cetera because the data
    always follows the law
  • 103:34 - 103:39
    of the geography
    that it sits in.
  • 103:39 - 103:42
    >> I'll just add to that
    that one of the things to--
  • 103:42 - 103:47
    I mean, obviously, the
    NSA revelations have sort
  • 103:47 - 103:50
    of strengthened the EU's
    hand in a discussion
  • 103:50 - 103:52
    that was going long before.
  • 103:52 - 103:57
    We have a particular view
    of what privacy means.
  • 103:57 - 104:02
    It doesn't match up, going back
    to this question earlier about,
  • 104:02 - 104:07
    you know, what economic kind of
    motivation might move Congress.
  • 104:07 - 104:10
    One would think the US
    companies would move forward
  • 104:10 - 104:13
    on a comprehensive
    data protection regime
  • 104:13 - 104:16
    in the United States that
    might be more flexible
  • 104:16 - 104:21
    and perhaps reflect the internet
    more than the European one,
  • 104:21 - 104:23
    but I haven't seen them
    step forward on that.
  • 104:23 - 104:27
    I think it would be interesting
    for somebody to ask the question
  • 104:27 - 104:31
    about the various EU countries
    in their surveillance regimes
  • 104:31 - 104:34
    because as much as I say
    some very unpleasant things
  • 104:34 - 104:37
    about ours, I think you
    would find that it--
  • 104:37 - 104:39
    with the exception
    of our capacity
  • 104:39 - 104:44
    for the just incredible
    scale of collection,
  • 104:44 - 104:48
    that the actual legal
    protections are no better
  • 104:48 - 104:51
    at best and probably
    a lot worse.
  • 104:51 - 104:56
    >> Just one comment on
    the trade discussions.
  • 104:56 - 104:58
    Suzanne, I think
    it's a very good,
  • 104:58 - 105:01
    it's a very important question.
  • 105:01 - 105:05
    I certainly think that,
    you know, as you well know,
  • 105:05 - 105:08
    better than probably anyone
    in this room, you know,
  • 105:08 - 105:11
    trade agreements have always
    made exceptions for things
  • 105:11 - 105:14
    like national security,
    public morals,
  • 105:14 - 105:17
    sometimes consumer
    protections, things like that.
  • 105:17 - 105:21
    I think what we see now is
    that simply pushing them off,
  • 105:21 - 105:24
    those issues off into the
    exception category is not going
  • 105:24 - 105:28
    to work so we need
    some kind of mechanism
  • 105:28 - 105:30
    that on the one hand
    respects the fact
  • 105:30 - 105:32
    that governments do
    have a legitimate
  • 105:32 - 105:35
    and important interest in
    protecting their citizens
  • 105:35 - 105:40
    against unsafe products,
    against human rights violations
  • 105:40 - 105:42
    if that's the way they do
    privacy, against, you know,
  • 105:42 - 105:45
    security breaches,
    what have you.
  • 105:45 - 105:51
    But tying that to the
    location of data is, I think,
  • 105:51 - 105:57
    just an overly simplistic way
    of accomplishing that purpose,
  • 105:57 - 106:00
    and I think, you know,
    you can make fancy--
  • 106:00 - 106:04
    well, you can make fancy trade
    arguments about why that's not,
  • 106:04 - 106:08
    you know, most favored
    nation treatment.
  • 106:08 - 106:13
    But I think the bottom line is,
    we used to have trade agreements
  • 106:13 - 106:16
    that were fundamentally about
    tariffs and we've mostly dealt
  • 106:16 - 106:18
    with those issues, and now we
    are going to trade agreements
  • 106:18 - 106:22
    that are fundamentally about
    the non-tariff barriers
  • 106:22 - 106:24
    that exist between economies.
  • 106:24 - 106:27
    So we're going to have
    to deal with that either way.
  • 106:27 - 106:31
    I think that in the, you
    know-- for some period of time,
  • 106:31 - 106:35
    I think the surveillance issues
    will cloud those discussions
  • 106:35 - 106:38
    but we'll come back
    to them at some point
  • 106:38 - 106:41
    and they will be
    the same issues.
  • 106:41 - 106:44
    So-- and I-- the only thing I--
  • 106:44 - 106:50
    the only final thing I
    would say, I think that one
  • 106:50 - 106:51
    of the big challenges
    we're going to have
  • 106:51 - 106:55
    in the trade context on
    internet issues is the challenge
  • 106:55 - 107:00
    that we found with ACTA,
    that the ACTA was making--
  • 107:00 - 107:02
    [Inaudible Remark] Sorry, oh,
    sorry, oh God, Steve is going
  • 107:02 - 107:04
    to throw that look at me.
  • 107:04 - 107:11
    So there was a trade agreement
    involving intellectual property
  • 107:11 - 107:17
    enforcement whose
    acronym is ACTA
  • 107:17 - 107:22
    and was very strenuously opposed
    by civil society groups all
  • 107:22 - 107:24
    over the world because they
    didn't know what it was,
  • 107:24 - 107:27
    they didn't know what was in
    the agreement and they argued,
  • 107:27 - 107:29
    I think-- I thought,
    quite legitimately
  • 107:29 - 107:31
    that if there are
    going to be rules made
  • 107:31 - 107:34
    about intellectual property
    rights that affect individuals,
  • 107:34 - 107:36
    there should be some
    public discussion
  • 107:36 - 107:38
    of what those rules are.
  • 107:38 - 107:42
    Trade people believe they
    somehow can't negotiate
  • 107:42 - 107:43
    in public.
  • 107:43 - 107:46
    And that's a sort of an article
    of faith in the trade world.
  • 107:46 - 107:50
    They're going to have to learn
    to be a little bit more public
  • 107:50 - 107:53
    if they're going to get anything
    done on these issues, in my view.
  • 107:53 - 107:55
    >> Lynn, you got the last
    word before the break.
  • 107:55 - 107:57
    >> Just quickly.
  • 107:57 - 107:58
    Not only-- it wasn't that they
    wouldn't negotiate in public,
  • 107:58 - 108:00
    they would not authorize
    a release
  • 108:00 - 108:02
    of the documents
    post-negotiation.
  • 108:02 - 108:04
    They were not available
    publicly.
  • 108:04 - 108:07
    >> So this sounds
    like a thread going--
  • 108:07 - 108:12
    circling back to NSA and FISA
    and all that, but I am sorry
  • 108:12 - 108:15
    that we have hit the point
    where we promised we were going
  • 108:15 - 108:18
    to take a very brief five-minute
    break that you're going
  • 108:18 - 108:21
    to have a chance to stretch
    your legs, then we're going
  • 108:21 - 108:27
    to come back and people on the
    panel will mix it up some more,
  • 108:27 - 108:31
    even better I suspect,
    moderated by Steve Roberts.
  • 108:31 - 108:33
    But let's take a quick
    five-minute break.
  • 108:33 - 108:36
    There will be one other
    opportunity for you all to meet
  • 108:36 - 108:39
    and greet at least
    most of the panelists
  • 108:39 - 108:43
    and there's a reception that
    starts at 5:15 afterwards.
  • 108:43 - 108:46
    But for now, five-minute break,
    we'll convene at 4 o'clock.
  • 108:46 - 108:47
    Thank you very much.
  • 108:47 - 108:54
    [ Inaudible Discussions ]
  • 108:54 - 108:55
    Well, thanks for
    sticking around.
  • 108:55 - 108:57
    I know it's been
    a long afternoon.
  • 108:57 - 109:00
    As Lance said, I'm
    Steve Roberts.
  • 109:00 - 109:02
    I'm a professor here at
    GW in the School of Media
  • 109:02 - 109:05
    and Public Affairs,
    right across the street.
  • 109:05 - 109:09
    And my job is to try to
    crystallize some of the issues
  • 109:09 - 109:12
    that we've been discussing
  • 109:12 - 109:14
    and [inaudible] some
    conversation among
  • 109:14 - 109:16
    the panelists.
  • 109:16 - 109:20
    And I want to start by quoting
    a couple of things I heard.
  • 109:20 - 109:22
    Lynn, for instance, said that
  • 109:22 - 109:25
    "unwanted surveillance
    is not acceptable."
  • 109:25 - 109:30
    Leslie talked a lot
    about human rights.
  • 109:30 - 109:32
    But there was a phrase
  • 109:32 - 109:36
    that I did not hear the entire
    first hour and a half except
  • 109:36 - 109:40
    in passing, and that word
    was "national security."
  • 109:40 - 109:45
    And so I want to pose this
    question to the panel.
  • 109:45 - 109:48
    Isn't national security
    a human right?
  • 109:48 - 109:50
    Isn't safety a human right?
  • 109:50 - 109:57
    Isn't one person's
  • 109:57 - 110:01
    unwanted
    surveillance another person's
  • 110:01 - 110:06
    protection from danger
    and from terrorism?
  • 110:06 - 110:11
    So, I want to ask everybody,
    what's the tradeoff here?
  • 110:11 - 110:17
    The whole idea in the first half
    of this panel was the importance
  • 110:17 - 110:20
    of the freedoms in the internet.
  • 110:20 - 110:21
    But what are the limits,
  • 110:21 - 110:24
    and what are the
    legitimate tradeoffs,
  • 110:24 - 110:27
    and how do we balance
    legitimate human rights
  • 110:27 - 110:30
    against legitimate
    rights to be safe
  • 110:30 - 110:32
    from terrorism and
    other threats?
  • 110:32 - 110:34
    Who wants to start?
  • 110:34 - 110:37
    Go ahead, John.
  • 110:37 - 110:42
    >> So I'm going to almost answer
    the question, but not quite.
  • 110:42 - 110:51
    It is true that the
    governments have certain roles
  • 110:51 - 110:52
    and responsibilities.
  • 110:52 - 110:57
    And one of those roles is
    there's a certain protection,
  • 110:57 - 110:59
    a certain defensive role
  • 110:59 - 111:02
    that a government feels
    it has to provide.
  • 111:02 - 111:07
    And the question is how do
    governments do what they see
  • 111:07 - 111:09
    as their obligations,
  • 111:09 - 111:13
    their actual responsibilities
    given the internet.
  • 111:13 - 111:16
    The internet has not been
    very good at this, OK?
  • 111:16 - 111:17
    So, --
  • 111:17 - 111:18
    >> Would you excuse me.
  • 111:18 - 111:21
    It's not just an obligation,
    it's not just something
  • 111:21 - 111:25
    that they do in passing, this
    is the first obligation
  • 111:25 - 111:27
    of government, is to protect--
  • 111:27 - 111:27
    >> Right.
  • 111:27 - 111:28
    >> -- people.
  • 111:28 - 111:29
    Isn't it the first obligation?
  • 111:29 - 111:31
    >> Indeed. One of the things that
    [inaudible] does, of course,
  • 111:31 - 111:34
    is we're responsible for
    maintaining a registry
  • 111:34 - 111:36
    of IP addresses and that
    registry is often used
  • 111:36 - 111:40
    because someone does
    something in cyberspace
  • 111:40 - 111:41
    and the first thing
    law enforcement would
  • 111:41 - 111:44
    like to know is, where is
    that in the real world?
  • 111:44 - 111:47
    Because real world has the
    people in organizations
  • 111:47 - 111:49
    and the cyberspace has to do
  • 111:49 - 111:50
    with domain names
    and IP addresses.
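The registry lookup described here, mapping an address seen in cyberspace back to a real-world registrant, can be sketched as a simple containment check over allocated blocks. The allocations below are hypothetical, and real WHOIS data is far richer, but the mechanism is the same:

```python
import ipaddress

# Hypothetical slice of an IP address registry: allocated blocks
# mapped to the organizations that registered them.
registry = {
    ipaddress.ip_network("198.51.100.0/24"): "Example ISP, Reston, VA",
    ipaddress.ip_network("203.0.113.0/24"): "Example Hosting Co, Toronto",
}

def lookup(addr):
    """Map an address 'in cyberspace' back to its registrant, if any."""
    ip = ipaddress.ip_address(addr)
    for net, org in registry.items():
        if ip in net:
            return org
    return None  # unallocated in this toy registry

assert lookup("198.51.100.42") == "Example ISP, Reston, VA"
assert lookup("192.0.2.1") is None
```

This is typically the first step law enforcement takes: a registry query narrows an address down to an organization, after which finding the actual person is an offline, legal process rather than a technical one.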
  • 111:50 - 111:53
    And so, governments
    feel they have
  • 111:53 - 111:55
    to enforce laws for example.
  • 111:55 - 111:58
    And yet the internet wasn't
    built with an interface
  • 111:58 - 112:01
    for government saying, "If
    you're trying to do your duty,
  • 112:01 - 112:04
    here's how you go about
    finding that person.
  • 112:04 - 112:08
    Here's how you go about doing
    what you see as an obligation."
  • 112:08 - 112:11
    So, I know a lot of
    people look at this
  • 112:11 - 112:15
    and they go this entire area of
    governments and what they want
  • 112:15 - 112:18
    to do with the internet and
    they want to take control
  • 112:18 - 112:20
    and they want to
    do surveillance.
  • 112:20 - 112:23
    If you turn it around
    and look at it,
  • 112:23 - 112:25
    remember that from the
    government's perspective,
  • 112:25 - 112:28
    in many cases, these governments
    feel they have an obligation
  • 112:28 - 112:32
    and the internet is
    actively preventing them
  • 112:32 - 112:34
    from doing something
    they're required
  • 112:34 - 112:35
    by their citizens to do.
  • 112:35 - 112:38
    So, we need to not omit the fact
  • 112:38 - 112:43
    that the internet doesn't
    provide a friendly interface
  • 112:43 - 112:44
    to government.
  • 112:44 - 112:47
    So, some people would see that
    as a feature but it's a fact
  • 112:47 - 112:49
    that shouldn't be
    overlooked in the discussion.
  • 112:49 - 112:50
    >> Tradeoff.
  • 112:50 - 112:51
    What's the tradeoff?
  • 112:51 - 112:52
    Leslie?
  • 112:52 - 112:54
    >> So, I want to back this up.
  • 112:54 - 112:57
    This is not a new question
    and it doesn't have
  • 112:57 - 112:58
    to do with the internet.
  • 112:58 - 113:00
    I mean, we have a--
  • 113:00 - 113:04
    an international
    human rights framework
  • 113:04 - 113:09
    that explicitly makes national
    security an exception--
  • 113:09 - 113:13
    an exception-- in
    human rights treaties.
  • 113:13 - 113:17
    It is an obligation of
    the countries themselves
  • 113:17 - 113:19
    to protect our national
    security.
  • 113:19 - 113:21
    But we also have an entire
    developed jurisprudence
  • 113:21 - 113:26
    about publicly enacted
    transparent law,
  • 113:26 - 113:30
    proportionate law, fair
    process, remedy and oversight.
  • 113:30 - 113:34
    And, you know, I think
    we just throw this
  • 113:34 - 113:37
    into the internet context, we're
    just missing, there's a bit--
  • 113:37 - 113:41
    you know, there's a basic--
    there's bodies of law and norms.
  • 113:41 - 113:45
    And the question here is not
    whether there's a tradeoff,
  • 113:45 - 113:47
    it's whether you
    reach a balance.
  • 113:47 - 113:52
    And I think when you have
    most of this process secret,
  • 113:52 - 113:56
    the judicial process
    secret, the oversight secret,
  • 113:56 - 113:59
    the interpretation
    of the law secret,
  • 113:59 - 114:01
    then you cannot achieve
    that balance.
  • 114:01 - 114:03
    The question at the
    end of the day is balance.
  • 114:03 - 114:05
    I certainly would take
    issue with the idea
  • 114:05 - 114:09
    that the internet has been
    unfriendly to law enforcement
  • 114:09 - 114:12
    and it's always this, you
    know, we're going dark
  • 114:12 - 114:13
    and we can't see anything.
  • 114:13 - 114:17
    I think if we learned anything
    over the last couple of months,
  • 114:17 - 114:20
    it's that law enforcement
    really has access
  • 114:20 - 114:22
    to much more information
  • 114:22 - 114:25
    and that there is
    therefore a temptation
  • 114:25 - 114:28
    to use what technology
    has created
  • 114:28 - 114:33
    to go beyond what a balanced
    human rights frame would allow.
  • 114:33 - 114:34
    And I think that's
    what our problem is,
  • 114:34 - 114:36
    not that they're not
    supposed to be [inaudible].
  • 114:36 - 114:38
    It is a first order of
    business for government.
  • 114:38 - 114:41
    They are supposed to
    protect their citizens.
  • 114:41 - 114:45
    But Steve, if you look at this
    framework that's been created,
  • 114:45 - 114:49
    it was a framework that was
    created on a battlefield in Iraq
  • 114:49 - 114:55
    to make sure that you had every
    last possible bit of information
  • 114:55 - 114:59
    to make sure you would
    know about the IED.
  • 114:59 - 115:03
    And this whole haystack
    and needle analogy assumes
  • 115:03 - 115:07
    that collecting a
    haystack is proportionate,
  • 115:07 - 115:08
    and I don't think it is.
  • 115:08 - 115:11
    >> But the President
    has said this program
  • 115:11 - 115:13
    of surveillance is
    essential, the Director
  • 115:13 - 115:17
    of national security said that
    it's essential, the Chairman
  • 115:17 - 115:19
    of the Senate Intelligence
    Committee,
  • 115:19 - 115:21
    a liberal former mayor
  • 115:21 - 115:24
    of San Francisco had
    said it's essential,
  • 115:24 - 115:26
    what's your quarrel with this?
  • 115:26 - 115:29
    And how do you rebut the
    argument that everybody
  • 115:29 - 115:33
    who has had access to the
    secret says this is justifiable?
  • 115:33 - 115:38
    I'd like other people
    to deal with this.
  • 115:38 - 115:39
    >> Please.
  • 115:39 - 115:40
    >> Yeah. [Multiple Speakers]
  • 115:40 - 115:45
    >> OK. So, it's-- there's
  • 115:45 - 115:48
    so many portrayals
    of this as a binary.
  • 115:48 - 115:51
    And, you know, I think
    the issue is more one
  • 115:51 - 115:53
    of degrees [phonetic] and
    a more granular issue.
  • 115:53 - 115:55
    But there are some
    binaries here, right?
  • 115:55 - 115:58
    We either have a
    constitution or we don't.
  • 115:58 - 116:00
    We either have a fourth
    amendment or we don't.
  • 116:00 - 116:02
    But if you start looking
    at the actual practices
  • 116:02 - 116:05
    and I don't think we have a full
    picture of exactly what's going
  • 116:05 - 116:08
    on but based on some of the
    things that we've heard,
  • 116:08 - 116:12
    there are ways to enact the
    necessary national security
  • 116:12 - 116:17
    without crossing the lines
    that would be unacceptable
  • 116:17 - 116:18
    to the majority of the people.
  • 116:18 - 116:22
    So, having low level-- so
    layers of control and layers
  • 116:22 - 116:25
    of accountability in the
    processes of surveillance,
  • 116:25 - 116:27
    right, so not just having any
    low level analyst being able
  • 116:27 - 116:30
    to take a fire hose of
    information and download
  • 116:30 - 116:33
    that onto their computer just
    to exaggerate the point, right?
  • 116:33 - 116:35
    So, there's the granularity,
  • 116:35 - 116:37
    there's getting the
    information that's necessary,
  • 116:37 - 116:39
    there's the process
    of accountability,
  • 116:39 - 116:40
    there's the issue
    of judicial review.
  • 116:40 - 116:42
    So, I think that there are ways
  • 116:42 - 116:45
    to enact the necessary
    national security now
  • 116:45 - 116:48
    that our public
    sphere is online and now
  • 116:48 - 116:51
    that we have the privatization
    of that public sphere
  • 116:51 - 116:53
    without going to the extremes
  • 116:53 - 116:56
    of basically having
    the fire hose analogy
  • 116:56 - 117:00
    and just downloading
    whatever data about anybody.
  • 117:00 - 117:02
    >> Anybody else on the panel
    who wants to pick up on this?
  • 117:02 - 117:04
    Please, go ahead.
  • 117:04 - 117:07
    >> So, you want us to talk about
    tradeoffs but I actually want
  • 117:07 - 117:12
    to suggest that I think a more
    accountable clear system is
  • 117:12 - 117:13
    better for national security.
  • 117:13 - 117:16
    I think what's happening
    now where you have very,
  • 117:16 - 117:21
    very broad authorities that
    are unclear is almost the worst
  • 117:21 - 117:24
    of all possible worlds
    for both civil liberties
  • 117:24 - 117:27
    and national security
    because you have--
  • 117:27 - 117:32
    you have policy officials,
    other governments, activists,
  • 117:32 - 117:34
    et cetera, poking
    around in the business
  • 117:34 - 117:37
    of the intelligence community,
    and that's not a very good thing
  • 117:37 - 117:40
    for them frankly, they want
    to be able to do what they do
  • 117:40 - 117:46
    quietly and-- but in
    order for us to do that,
  • 117:46 - 117:50
    in order for that to happen,
    there has to be a clear sense
  • 117:50 - 117:51
    of what the rules are.
  • 117:51 - 117:53
    One sense in which-- I
    largely agree with Leslie
  • 117:53 - 117:55
    but the one sense in which
    I think this is an internet
  • 117:55 - 117:59
    problem is it's really
    kind of a 9/11 problem,
  • 117:59 - 118:00
    and then I think a lot of
    what's going on is going
  • 118:00 - 118:06
    on under the kind of exceptional
    basis that we've handled a lot
  • 118:06 - 118:09
    of national security
    issues post 9/11.
  • 118:09 - 118:13
    And to quote the President
    again, ironically enough,
  • 118:13 - 118:17
    just a few weeks before this
    whole surveillance story broke,
  • 118:17 - 118:19
    the President went to the
    National Defense University,
  • 118:19 - 118:24
    you know, and gave a
    speech saying, number one,
  • 118:24 - 118:28
    it is bad for a country
    to be perpetually at war,
  • 118:28 - 118:30
    and number two, that
    the threat level
  • 118:30 - 118:35
    from Al Qaeda was basically
    below the level that it was
  • 118:35 - 118:38
    at 9/11, and that we should--
  • 118:38 - 118:41
    and that we should start
    treating that threat,
  • 118:41 - 118:45
    that national security threat
    as part of the norm not
  • 118:45 - 118:47
    as an exception that
    we have to respond
  • 118:47 - 118:50
    to with these kinds
    of exceptions.
  • 118:50 - 118:53
    This whole surveillance
    program was created
  • 118:53 - 118:55
    as the President's
    surveillance programs, you know,
  • 118:55 - 118:57
    as an exception, and
    that's its problem.
  • 118:57 - 118:58
    If the problem is not--
  • 118:58 - 119:00
    >> Although continued by
    a democratic president--
  • 119:00 - 119:05
    >> Yes, that will-- that's
    right but I think that it needs
  • 119:05 - 119:10
    to be continued on a more
    clear accountable basis.
  • 119:10 - 119:15
    And so I don't tend to accept
    that the issue gets solved
  • 119:15 - 119:17
    by saying more surveillance,
    less surveillance.
  • 119:17 - 119:20
    I think that the issue
    is surveillance according
  • 119:20 - 119:22
    to what rules and with
    what accountability.
  • 119:22 - 119:25
    >> Lynn I quoted [phonetic]
    you-- please, your turn.
  • 119:25 - 119:26
    >> Yeah. I mean, I'll come
    to my point [inaudible]
  • 119:26 - 119:30
    but just [inaudible] said
    unwarranted surveillance.
  • 119:30 - 119:32
    I recognize the difficulty--
    "unwanted" is from
  • 119:32 - 119:35
    the perspective
    of the individual.
  • 119:35 - 119:37
    So, it's very much
    unwarranted.
  • 119:37 - 119:38
    >> Thank you.
  • 119:38 - 119:39
    >> But-- And I just
    really wanted
  • 119:39 - 119:40
    to echo the three comments
  • 119:40 - 119:43
    that have been made here
    very, very, very strongly.
  • 119:43 - 119:45
    When-- In a system
    like the internet
  • 119:45 - 119:48
    which is breaking all barriers
    where we're so interconnected
  • 119:48 - 119:51
    and so interdependent, we
    have to change our paradigm
  • 119:51 - 119:54
    of looking at this and move
    it to one of managing risk.
  • 119:54 - 119:57
    We can talk about in the context
    of tradeoffs if you like.
  • 119:57 - 120:00
    But we really do
    have to look at--
  • 120:00 - 120:02
    >> Wasn't that the only context?
  • 120:02 - 120:06
    Or isn't tradeoff the only
    legitimate way to talk about it?
  • 120:06 - 120:09
    Because we-- isn't that
    what we do everyday?
  • 120:09 - 120:11
    >> I think tradeoff
    is a legitimate way
  • 120:11 - 120:13
    but I'm not sure we're
    making the right tradeoffs
  • 120:13 - 120:16
    and I think we continue to
    focus on national security
  • 120:16 - 120:20
    and quite often run
    by flagrant abuses
  • 120:20 - 120:24
    and we're conflating the issue
    or we're not pulling it apart.
  • 120:24 - 120:26
    It's not about national security
    and what is the best way
  • 120:26 - 120:29
    to protect national security.
  • 120:29 - 120:30
    From my perspective, you know,
  • 120:30 - 120:32
    what our members
    are actually upset
  • 120:32 - 120:35
    about is the flagrant
    abuses and the lack
  • 120:35 - 120:37
    of transparency and the secrecy.
  • 120:37 - 120:38
    >> Yes, please.
  • 120:38 - 120:42
    >> And I think part of
    the problem that I have
  • 120:42 - 120:44
    with hearing what they've said
  • 120:44 - 120:46
    for giving the reasons is
    we haven't heard a reason
  • 120:46 - 120:47
    that makes sense.
  • 120:47 - 120:50
    It's-- why are you
    collecting the data,
  • 120:50 - 120:52
    it's like the equivalent
    of your mom
  • 120:52 - 120:55
    and dad saying well,
    because I said so.
  • 120:55 - 120:58
    You have to give us something
    behind that, you know, to--
  • 120:58 - 121:02
    is there a credible threat
    at whatever level there is.
  • 121:02 - 121:03
    I think Danny is right.
  • 121:03 - 121:07
    I think the threat level from
    what we've read in previous,
  • 121:07 - 121:09
    you know, speeches
    by the President is
  • 121:09 - 121:11
    that the threat level is a
    lot less than it was at 9/11.
  • 121:11 - 121:13
    Well, if it's a lot less
    than it was at 9/11,
  • 121:13 - 121:15
    why do we need to expand this?
  • 121:15 - 121:18
    Give us something to look at.
  • 121:18 - 121:21
    But I think when I hear
    them talk about these things
  • 121:21 - 121:25
    and give these reasons as
    somebody who does this type
  • 121:25 - 121:28
    of metadata analysis
    and, you know,
  • 121:28 - 121:30
    just as for my own
    infrastructure-- certainly
  • 121:30 - 121:33
    in my own infrastructure I
    can do a lot of that analysis.
  • 121:33 - 121:35
    But there's an uplink to me.
  • 121:35 - 121:38
    And those people that
    control that infrastructure,
  • 121:38 - 121:39
    my infrastructure,
    your infrastructure,
  • 121:39 - 121:41
    they can do the same
    type of analysis
  • 121:41 - 121:43
    and it trees all the way up.
  • 121:43 - 121:46
    It's-- So, why?
  • 121:46 - 121:48
    You have to give us
    a credible reason.
  • 121:48 - 121:50
    >> But what's running through a lot
  • 121:50 - 121:54
    of your comments is a basic
    mistrust of government.
  • 121:54 - 121:55
    I mean you've been told--
  • 121:55 - 122:00
    public has been told by
    the Intelligence committees
  • 122:00 - 122:03
    which have been briefed that this
    is a very valuable tool, PRISM
  • 122:03 - 122:06
    and other surveillance are
    very valuable to the President.
  • 122:06 - 122:09
    Duly elected democratic liberal
    president had said it's a very
  • 122:09 - 122:11
    valuable tool, the
    liberal chairman
  • 122:11 - 122:13
    of the Senate Intelligence
    Committee has said it's a very
  • 122:13 - 122:16
    valuable tool and everyone
    of you is doubting it.
  • 122:16 - 122:21
    So, what is the source of your
    suspicions and your mistrust
  • 122:21 - 122:22
    of what you're being told?
  • 122:22 - 122:25
    >> So, I actually
    don't, you know, when--
  • 122:25 - 122:28
    there's no way for us to
    know if it's a valuable tool
  • 122:28 - 122:29
    or not a valuable tool.
  • 122:29 - 122:33
    So, you could take them at face
    value that it's a valuable tool.
  • 122:33 - 122:35
    Collection of data
    is a valuable thing.
  • 122:35 - 122:39
    It's just not the only analysis
    in a democratic society.
  • 122:39 - 122:41
    And I think-- so, I mean, I--
  • 122:41 - 122:44
    and that's I think
    the problem here,
  • 122:44 - 122:48
    we live in a democratic
    society where we're supposed
  • 122:48 - 122:52
    to have proportional laws
    that more than a small number
  • 122:52 - 122:56
    of people are able to know
    about to assess that balance
  • 122:56 - 122:58
    between liberty and security.
  • 122:58 - 123:00
    We really don't have that.
  • 123:00 - 123:03
    I mean, if this was a discussion
  • 123:03 - 123:07
    that they were tasking
    these companies
  • 123:07 - 123:12
    with specific requests
    based on articulable facts
  • 123:12 - 123:13
    about specific individuals
  • 123:13 - 123:16
    or even several hundred
    individuals,
  • 123:16 - 123:19
    we might be having a different
    discussion about the balance
  • 123:19 - 123:21
    between liberty and security.
  • 123:21 - 123:24
    But we're having a
    discussion about the value
  • 123:24 - 123:26
    of basically collecting
    the data potentially
  • 123:26 - 123:28
    on everybody in the world.
  • 123:28 - 123:30
    I mean, we don't know the set.
  • 123:30 - 123:33
    So, it's not about trust
    of government or not.
  • 123:33 - 123:36
    You know, I think you could
    probably come, you know,
  • 123:36 - 123:39
    devise a program where you
    know everything about everybody
  • 123:39 - 123:42
    in the world and claim
    it makes you safer,
  • 123:42 - 123:44
    maybe it makes you safer.
  • 123:44 - 123:48
    But we no longer are
    following either the values,
  • 123:48 - 123:50
    the constitution
    or the norms of it.
  • 123:50 - 123:52
    >> Well, you say we're not
    following the constitution.
  • 123:52 - 123:56
    These laws were passed by
    democratically-elected Congress.
  • 123:56 - 123:58
    >> Well, having been there.
  • 123:58 - 123:58
    Yeah.
  • 123:58 - 123:59
    >> Yeah, right.
  • 123:59 - 124:00
    >> Having been there,
  • 124:00 - 124:01
    they didn't know what
    they were passing
  • 124:01 - 124:03
    and they didn't know
    what [inaudible].
  • 124:03 - 124:04
    >> So, [inaudible] to that, you
    just said the issue wasn't trust
  • 124:04 - 124:06
    and now you just say, well,
  • 124:06 - 124:07
    they didn't know what
    they were passing.
  • 124:07 - 124:08
    >> They didn't know--
    they didn't know--
  • 124:08 - 124:11
    >> If you believe well-- well,
    let me ask you the question.
  • 124:11 - 124:14
    If you believe in a democratic
    system, now you're saying,
  • 124:14 - 124:15
    I don't believe
  • 124:15 - 124:17
    in the democratic process
    that's passed these laws.
  • 124:17 - 124:18
    >> No.
  • 124:18 - 124:19
    >> That they were
    misinformed and we know better?
  • 124:19 - 124:23
    >> No. The laws have been
    after the fact, stretched
  • 124:23 - 124:26
    and manipulated in ways
    the Congress didn't intend.
  • 124:26 - 124:32
    And that's where-- and that's
    where we've gone off the rails.
  • 124:32 - 124:36
    Nobody who was in Congress
    at that time, Section 215,
  • 124:36 - 124:37
    the metadata law
    that we're talking
  • 124:37 - 124:40
    about, which is aimed
    specifically at US citizens.
  • 124:40 - 124:45
    In making the changes they
    were making, nobody believed
  • 124:45 - 124:49
    that relevant data meant
    everybody in the United States.
  • 124:49 - 124:50
    >> Let me ask you--
  • 124:50 - 124:52
    >> So, you know, this is--
    so this is ultimately--
  • 124:52 - 124:54
    yes, it's a question
    of trust in so far
  • 124:54 - 124:58
    as the people implementing the
    law have made it as elastic
  • 124:58 - 125:00
    as possible without it
    completely exploding
  • 125:00 - 125:04
    and it might not be exploding.
  • 125:04 - 125:05
    >> John.
  • 125:05 - 125:08
    >> I've expressed no view
    on whether it's desirable
  • 125:08 - 125:12
    or undesirable or whether
    it's illegal or legal.
  • 125:12 - 125:16
    Neither of those questions
    are really interesting to me.
  • 125:16 - 125:19
    When it comes to
    the internet though,
  • 125:19 - 125:22
    the question is there's
    a set of events going
  • 125:22 - 125:25
    on regarding surveillance,
  • 125:25 - 125:28
    and the internet is
    global in nature.
  • 125:28 - 125:32
    We have allies and
    trading partners
  • 125:32 - 125:39
    and organizations globally,
    other governments who want
  • 125:39 - 125:42
    to understand what's going
  • 125:42 - 125:46
    on because they may
    have the same desire.
  • 125:46 - 125:51
    There may be another government
    that wishes for its reasons
  • 125:51 - 125:55
    to engage in surveillance of
    communication in its country.
  • 125:55 - 125:59
    And it wants to understand
    what is the framework
  • 125:59 - 126:03
    by which this is occurring
    and how does it happen
  • 126:03 - 126:07
    in the internet, how is it
    supported, how does it go
  • 126:07 - 126:09
    on because they have their
    own national interests
  • 126:09 - 126:12
    and they may pass laws in their
    region that are perfectly fine
  • 126:12 - 126:16
    and acceptable according
    to their processes.
  • 126:16 - 126:18
    So, we now have this
    framework that says,
  • 126:18 - 126:20
    there are circumstances during
  • 126:20 - 126:23
    which surveillance is
    apparently an accepted part
  • 126:23 - 126:24
    of the architecture.
  • 126:24 - 126:26
    What we don't have
    is a transparency
  • 126:26 - 126:30
    about where that's occurring and
    how that occurs and what happens
  • 126:30 - 126:33
    if another 130 countries
    also do it.
  • 126:33 - 126:37
    I believe that if we're to have
    equitable internet governance,
  • 126:37 - 126:40
    it's necessary to have
    a fully articulated
  • 126:40 - 126:42
    and transparent framework.
  • 126:42 - 126:44
    I do not know whether
    or not everyone is going
  • 126:44 - 126:47
    to like what their country
    chooses for what it does
  • 126:47 - 126:48
    with surveillance or not.
  • 126:48 - 126:50
    That's a different question.
  • 126:50 - 126:52
    That's a question of laws
    and governance structure.
  • 126:52 - 126:54
    But if this is going to go
    on, we need to understand
  • 126:54 - 126:57
    where it occurs and how
    it occurs and the fact
  • 126:57 - 126:59
    that it could be occurring
    in a lot of places.
  • 126:59 - 127:02
    And that should be understood
    and documented as part
  • 127:02 - 127:06
    of a clearly transparent
    recognition
  • 127:06 - 127:08
    that this is part
    of the internet.
  • 127:08 - 127:11
    >> Let me ask you
    all this question.
  • 127:11 - 127:13
    You said you're not
    interested in legality.
  • 127:13 - 127:18
    But one of the key debates
    here is the legal framework
  • 127:18 - 127:20
    for PRISM and the FISA law.
  • 127:20 - 127:26
    And there's not much
    debate about whether
  • 127:26 - 127:30
    or not what's been taking place is
    legal under the law,
  • 127:30 - 127:33
    but there's a big debate about
    whether the law is the right law
  • 127:33 - 127:34
    and whether the framework
    should be altered.
  • 127:34 - 127:39
    If you all were to testify
    before the intelligence
  • 127:39 - 127:42
    committees and others
    on the Hill and asked,
  • 127:42 - 127:44
    how should this law be changed
  • 127:44 - 127:46
    to advance the values
    you're talking about?
  • 127:46 - 127:48
    What would be some
    of your suggestions?
  • 127:48 - 127:52
    Go ahead.
  • 127:52 - 127:55
    >> I think we have to start
    by walking back the changes
  • 127:55 - 127:59
    that we made in the
    records law because we took
  • 127:59 - 128:03
    out all the words that
    make it proportionate
  • 128:03 - 128:07
    and make it targeted
    and transformed it
  • 128:07 - 128:09
    into a broad collection statute.
  • 128:09 - 128:12
    And I, you know, I think
    the ways to walk it back
  • 128:12 - 128:18
    that allows substantial
    collection on targets
  • 128:18 - 128:22
    and information related to
    those targets and backs it
  • 128:22 - 128:27
    out of this broad relevancy
    standard that allows it--
  • 128:27 - 128:31
    if one to be collected
    on anybody but also not--
  • 128:31 - 128:35
    it just takes out the whole
    national security purpose.
  • 128:35 - 128:38
    Basically, you no longer
    have to be collecting data
  • 128:38 - 128:40
    because of a particular
    terrorist
  • 128:40 - 128:43
    or particular intelligence
    activity.
  • 128:43 - 128:46
    You only have to be
    collecting in order
  • 128:46 - 128:48
    to be protecting
    the United States.
  • 128:48 - 128:53
    And that is simply too broad
    and too likely to abuse.
  • 128:53 - 128:55
    It is harder for me to
    know what to change in 702
  • 128:55 - 128:58
    because we still don't
    understand what they're doing.
  • 128:58 - 128:59
    >> And [inaudible] going
    to thought about these--
  • 128:59 - 129:02
    >> Except for the transparence--
    except for greater transparency.
  • 129:02 - 129:05
    >> 'Cause this is-- this
    is going to be debated
  • 129:05 - 129:06
    in the weeks and months ahead.
  • 129:06 - 129:08
    It's a very important question.
  • 129:08 - 129:09
    >> But the traditional method
  • 129:09 - 129:11
    of before the haystack
    law [phonetic],
  • 129:11 - 129:13
    if you allow me to call it that.
  • 129:13 - 129:15
    Before that was placed,
    the traditional methods
  • 129:15 - 129:17
    of law enforcement
    worked, and they would work
  • 129:17 - 129:19
    with whatever the
    internet had to provide.
  • 129:19 - 129:21
    You needed to identify somebody.
  • 129:21 - 129:24
    You usually identify them
    through a non-technical aspect.
  • 129:24 - 129:27
    You had an informant
    somebody telling you, "Hey,
  • 129:27 - 129:30
    I think that you were doing
    something illegal," and that--
  • 129:30 - 129:32
    that puts the focus on you.
  • 129:32 - 129:34
    From there, you would use
    your traditional tools whether
  • 129:34 - 129:36
    they're on the net or
    not to find, you know,
  • 129:36 - 129:38
    to find out what's going on.
  • 129:38 - 129:45
    Collecting data when there's
    no evidence of a crime is kind
  • 129:45 - 129:48
    of counterintuitive and
    kind of counterproductive in a way
  • 129:48 - 129:50
    because it's sitting there.
  • 129:50 - 129:53
    So, if it were me and I was an
    adversary of the United States,
  • 129:53 - 129:57
    I would find where those--
    that database was being stored
  • 129:57 - 130:00
    and I would go after it to
    collect all that information
  • 130:00 - 130:02
    because I can do the same
    type of metadata analysis
  • 130:02 - 130:03
    that could be done that way.
  • 130:03 - 130:07
    The third thing is--
    the reason why we--
  • 130:07 - 130:10
    you've mentioned earlier that
    maybe we've mistrusted is
  • 130:10 - 130:14
    because we know what can be
    done with that data and we know
  • 130:14 - 130:15
    when somebody-- we know
  • 130:15 - 130:18
    when somebody has given
    us a [inaudible] answer.
  • 130:18 - 130:21
    Because I know that if I
    control my infrastructure,
  • 130:21 - 130:24
    I know the power that I
    have in my office,
  • 130:24 - 130:27
    I have to act responsibly
    and within, you know,
  • 130:27 - 130:28
    the laws and all that.
  • 130:28 - 130:30
    But I know what can be done
  • 130:30 - 130:33
    if there's no oversight
    on my position.
  • 130:33 - 130:35
    And that's why I
    think you hear--
  • 130:35 - 130:36
    >> But there is oversight.
  • 130:36 - 130:39
    It is, you know, I mean they--
  • 130:39 - 130:42
    >> It would be if you were-- if
    you were the oversight committee
  • 130:42 - 130:46
    for what I would do technically,
    do you have the technical skills
  • 130:46 - 130:48
    to know what I'm doing?
  • 130:48 - 130:49
    [Inaudible Remark]
  • 130:49 - 130:53
    >> So I'm going to
    leave the legal--
  • 130:53 - 130:56
    so the legal policy
    questions, I largely agree
  • 130:56 - 130:57
    with what has been said.
  • 130:57 - 130:59
    I don't think that we
    have adequate oversight.
  • 130:59 - 131:03
    And if I were-- I was talking
    to Congress about what to do
  • 131:03 - 131:05
    to increase the trust
    in this environment,
  • 131:05 - 131:07
    I would say that Congress
  • 131:07 - 131:10
    and the FISA court
    need a more effective
  • 131:10 - 131:11
    accountability mechanism.
  • 131:11 - 131:14
    There is no way that the FISA
    court judges or the members
  • 131:14 - 131:16
    of the Intelligence
    community are looking
  • 131:16 - 131:18
    at every single query--
  • 131:18 - 131:19
    >> Right.
  • 131:19 - 131:21
    >> -- that NSA analysts
    are performing
  • 131:21 - 131:22
    on these enormous
    amounts of data.
  • 131:22 - 131:24
    It is technically possible to do
  • 131:24 - 131:26
    that to Mike Nelson's [assumed
    spelling] earlier point
  • 131:26 - 131:31
    to actually evaluate whether
    when Bob Litt [assumed spelling], the general counsel
  • 131:31 - 131:34
    of ODNI says, "Yeah, we have
    the whole haystack
  • 131:34 - 131:38
    but we only ask questions
    based on certain predicates."
  • 131:38 - 131:40
    I know Bob and I more
    or less trust him
  • 131:40 - 131:44
    but I think it's outrageous
    for the government to say
  • 131:44 - 131:46
    to its citizens,
    "Don't worry, trust us."
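The per-query accountability being argued for here is technically simple to build. A hedged sketch with invented field names: each analyst query is logged with a digest before it runs, so an overseer can later audit exactly what was asked rather than rely on a summary assurance:

```python
import datetime
import hashlib
import json

AUDIT_LOG = []  # in practice: an append-only store readable by the overseer

def run_query(analyst: str, selector: str, predicate: str) -> str:
    """Record a query before it executes, so oversight can replay
    exactly what was asked of the data store."""
    entry = {
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "analyst": analyst,
        "selector": selector,
        "predicate": predicate,
    }
    # The digest makes each logged entry tamper-evident if the log is mirrored.
    entry["digest"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    AUDIT_LOG.append(entry)
    # ... the actual search over the data store would happen here ...
    return entry["digest"]

digest = run_query("analyst7", "+1-202-555-0100", "RAS approval #1234")
```

An oversight body could then sample or replay `AUDIT_LOG` entries against the stated predicates, instead of asking a responsible official whether the rules were followed.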
  • 131:46 - 131:49
    We give the government
    considerable authority
  • 131:49 - 131:50
    but we need mechanisms
    to make sure it's being
  • 131:50 - 131:52
    used responsibly.
  • 131:52 - 131:54
    And none of the oversight
    mechanisms that exist
  • 131:54 - 131:58
    that you've mentioned are
    able to provide that kind
  • 131:58 - 132:00
    of accountability
    other than to call
  • 132:00 - 132:03
    up a responsible
    person like, you know,
  • 132:03 - 132:04
    Mr. Litt [assumed
    spelling] say,
  • 132:04 - 132:05
    "Are you following the rules?"
  • 132:05 - 132:07
    And he says, "Yes, we're
    following the rules."
  • 132:07 - 132:09
    He doesn't even know what
    every single analyst does
  • 132:09 - 132:11
    in the agencies that's
    responsible for it.
  • 132:11 - 132:15
    >> Let me ask you about one of
    the proposals that has been made
  • 132:15 - 132:17
    and widely debated
    in Washington.
  • 132:17 - 132:21
    And so, a lot of people
    pointed out one of the flaws,
  • 132:21 - 132:23
    critics had pointed
    out one of the flaws
  • 132:23 - 132:27
    in the FISA process is there's
    no adversarial mechanism
  • 132:27 - 132:32
    within the FISA system that the
    government comes in and asks
  • 132:32 - 132:36
    for permission and there
    is no counter voice.
  • 132:36 - 132:37
    There is no adversarial
    proceeding.
  • 132:37 - 132:42
    And in most legal systems, there
    is a mechanism for challenging.
  • 132:42 - 132:46
    So, a number of legislative
    proposals have been advanced
  • 132:46 - 132:47
    in one form or another.
  • 132:47 - 132:50
    Is this one of the mechanisms
    that Congress should be looking
  • 132:50 - 132:54
    at to try to reorder
    this balance that many
  • 132:54 - 132:55
    of you think is out of whack?
  • 132:55 - 132:59
    >> Yeah. I actually think
    this question of some kind
  • 132:59 - 133:02
    of a public advocate and
    I think it's probably--
  • 133:02 - 133:06
    we figured out how to give
    private lawyers the ability
  • 133:06 - 133:07
    to look at classified
    information
  • 133:07 - 133:09
    in criminal proceedings.
  • 133:09 - 133:12
    We ought to be able to come up
    with a stable of those people
  • 133:12 - 133:15
    who litigate the
    national security state,
  • 133:15 - 133:18
    who can stand in for the public.
  • 133:18 - 133:25
    And, you know, I think it's one
    of many potential points of sort
  • 133:25 - 133:28
    of strengthening oversight here.
  • 133:28 - 133:31
    I think it's-- I think
    it's critically important.
  • 133:31 - 133:32
    I think different levels of sort
  • 133:32 - 133:35
    of reporting back
    to the FISA court.
  • 133:35 - 133:37
    I think part of the
    problem and the reason
  • 133:37 - 133:39
    that people are responding
    is, I mean,
  • 133:39 - 133:43
    it's interesting FISA was
    a civil liberties measure
  • 133:43 - 133:46
    when it was adopted in 1978.
  • 133:46 - 133:49
    And, you know, it's basically
    my mentors were the civil
  • 133:49 - 133:52
    libertarians who
    proposed this crazy idea.
  • 133:52 - 133:59
    And the world is really
    different from the one in 1978.
  • 133:59 - 134:02
    And being in contact with
    a foreign person, I mean,
  • 134:02 - 134:04
    you know, there was
    an iron curtain.
  • 134:04 - 134:05
    If you were-- if they
    could actually figure
  • 134:05 - 134:08
    out you were communicating
    with a foreign person,
  • 134:08 - 134:11
    it probably had some
    significance.
  • 134:11 - 134:13
    We now live in this
    world where 20 percent
  • 134:13 - 134:16
    of us are first generation,
  • 134:16 - 134:19
    where our corporations
    are global,
  • 134:19 - 134:22
    where we travel all the
    time, and where we--
  • 134:22 - 134:25
    people don't identify
    themselves in the same kind
  • 134:25 - 134:28
    of we-them way and
    yet we have a law
  • 134:28 - 134:31
    and a designation
    of foreign persons.
  • 134:31 - 134:35
    That really is a Cold
    War-era notion and almost has
  • 134:35 - 134:37
    to be reconsidered
    all by itself.
  • 134:37 - 134:40
    And the law can't
    be working very well
  • 134:40 - 134:42
    because it's not
    supposed to sweep
  • 134:42 - 134:44
    up Americans' communications.
  • 134:44 - 134:46
    And all you have to do is read
    the minimization guidelines
  • 134:46 - 134:49
    to understand how much of our
    communications [inaudible].
  • 134:49 - 134:53
    >> Hal Berghel wrote
    an interesting article
  • 134:53 - 134:56
    that just came out in
    IEEE Computer magazine.
  • 134:56 - 134:58
    It's called "Through
    the PRISM Darkly".
  • 134:58 - 135:00
    And there's a quote
    in there that says,
  • 135:00 - 135:02
    "The Foreign Intelligence
    Surveillance Court has an
  • 135:02 - 135:06
    approval rate of 99.93 percent
    of all surveillance requests.
  • 135:06 - 135:09
    While this might not meet
    the strict definition
  • 135:09 - 135:11
    of a kangaroo court,
    it seems to fall
  • 135:11 - 135:13
    within the marsupial family."
  • 135:13 - 135:14
    So--
  • 135:14 - 135:15
    [ Laughter ]
  • 135:15 - 135:19
    The point of it is that there
    are, you know, there are two
  • 135:19 - 135:22
    and he calls them cyber urban
    myths that deal with this.
  • 135:22 - 135:25
    One is that you do need
    all of this, you know,
  • 135:25 - 135:29
    information to do it,
    and the other is that
  • 135:29 - 135:31
    there's sufficient oversight.
  • 135:31 - 135:32
    And I think we all agree,
  • 135:32 - 135:35
    there's not sufficient
    oversight.
  • 135:35 - 135:40
    You wouldn't ask me to be on
    a medical oversight committee
  • 135:40 - 135:42
    to review surgical procedures
  • 135:42 - 135:44
    because I'm not a
    doctor, I'm not a surgeon.
  • 135:44 - 135:46
    I don't have that
    expertise in that area.
  • 135:46 - 135:47
    And so, I think that's the--
  • 135:47 - 135:49
    I think that's the thing
  • 135:49 - 135:53
    that bothers the technology
    people is we know it is a small
  • 135:53 - 135:56
    community of the actual people
    that know the nuts and bolts
  • 135:56 - 135:59
    about technology security
    from that standpoint.
  • 135:59 - 136:01
    And if we don't know
    each other by--
  • 136:01 - 136:04
    personally, we know each other
    by reputation and whatever.
  • 136:04 - 136:08
    And so when you look and
    you query your other peers
  • 136:08 - 136:10
    and they have the same
    misgivings that you do,
  • 136:10 - 136:12
    then it makes you
    wonder what's going on.
  • 136:12 - 136:16
    >> I just want to make one
    other point about the people
  • 136:16 - 136:19
    who are being swept up in
    this-- in this surveillance.
  • 136:19 - 136:21
    I think one of the reasons
  • 136:21 - 136:23
    that more fine-grained
    accountability is really
  • 136:23 - 136:26
    critical is we don't know
  • 136:26 - 136:29
    but I think it's a safe
    assumption that, you know,
  • 136:29 - 136:33
    somewhere north of 90 percent
    of the people who are on any
  • 136:33 - 136:36
    of these terrorism watch list
    who were the targets of any
  • 136:36 - 136:39
    of this surveillance
    are Arab-Americans
  • 136:39 - 136:40
    who have some connection
    to the Middle East,
  • 136:40 - 136:43
    many of whom are
    probably Muslims.
  • 136:43 - 136:46
    In this country, we don't
    have a great history
  • 136:46 - 136:50
    of treating minorities
    perfectly fairly.
  • 136:50 - 136:54
    And part of the reason we
    have oversight mechanisms,
  • 136:54 - 136:57
    part of the reason we have open
    courts, part of the reason we try
  • 136:57 - 137:03
    to have due process with public
    visibility is to make sure
  • 137:03 - 137:06
    that we actually do
    our job and really--
  • 137:06 - 137:08
    are really treating people
    fairly and not discriminating.
  • 137:08 - 137:12
    And again, I don't--
    I don't think--
  • 137:12 - 137:14
    I think it's really interesting,
    you hear this panel,
  • 137:14 - 137:15
    you hear the public discussion.
  • 137:15 - 137:17
    I don't hear
    that many people saying,
  • 137:17 - 137:19
    "Stop the Surveillance."
  • 137:19 - 137:20
    People are not saying that. I was
  • 137:20 - 137:22
    in Boston during the
    marathon bombings.
  • 137:22 - 137:25
    People are clamoring for the
    installation of more cameras.
  • 137:25 - 137:28
    You know, the question
    I really believe is not
  • 137:28 - 137:31
    about whether law enforcement
    should have these tools.
  • 137:31 - 137:32
    It's about whether
    we can feel confident
  • 137:32 - 137:35
    that they're using
    them responsibly.
  • 137:35 - 137:38
    >> One of you made
    the point that--
  • 137:38 - 137:39
    actually, this is
    not a new issue--
  • 137:39 - 137:43
    that just a few years ago,
    the Washington Post ran an expose,
  • 137:43 - 137:46
    a very lengthy one, by Dana
    Priest and others.
  • 137:46 - 137:50
    And there was not anything
    like the same kind of reaction
  • 137:50 - 137:52
    that there's been lately.
  • 137:52 - 137:55
    And I'm wondering why you
    think what's the difference,
  • 137:55 - 137:59
    why has there been such--
    why are we having this panel
  • 137:59 - 138:00
    and not three years ago.
  • 138:00 - 138:02
    Why is there so much
    conversation now
  • 138:02 - 138:05
    about this issue
    when, in fact, when--
  • 138:05 - 138:08
    at least some of this
    was made known through--
  • 138:08 - 138:10
    the Post and other mechanisms
    some years ago?
  • 138:10 - 138:12
    >> Smartphones.
  • 138:12 - 138:13
    >> We didn't have the numbers,
    we didn't have the numbers.
  • 138:13 - 138:16
    I mean, from the Verizon
    order, we now know.
  • 138:16 - 138:17
    It's-- I don't remember
    how many millions
  • 138:17 - 138:19
    of people's records were--
  • 138:19 - 138:21
    tens of millions of people's
    records were collected.
  • 138:21 - 138:23
    We had-- we have
    general outlines
  • 138:23 - 138:27
    from the Post reporting,
    from, you know, from all kinds
  • 138:27 - 138:30
    of things but we did-- it was
    not dramatized with numbers.
  • 138:30 - 138:34
    And I think and now we know
    it's basically everyone's data.
  • 138:34 - 138:36
    We never actually knew
    that was happening.
  • 138:36 - 138:37
    >> Right. I think that's right.
  • 138:37 - 138:39
    I also think there was an uproar
  • 138:39 - 138:43
    when the warrantless
    wiretapping was first revealed.
  • 138:43 - 138:46
    We had a very big
    battle in Congress.
  • 138:46 - 138:48
    People were really
    worried about it.
  • 138:48 - 138:49
    People who were not
    following it closely
  • 138:49 - 138:52
    when they passed an amendment
    might have actually thought
  • 138:52 - 138:54
    that what we wound
    up with was some kind
  • 138:54 - 138:56
    of additional procedural
    protections.
  • 138:56 - 139:00
    Instead, we wound up-- what we
    wound up with was a ratification
  • 139:00 - 139:02
    of that program, immunity
  • 139:02 - 139:05
    for the providers,
    and not much more.
  • 139:05 - 139:09
    And so I, you know, I
    think a lot of us knew some
  • 139:09 - 139:12
    of this was going on but
    I think we didn't know.
  • 139:12 - 139:15
    One of my colleagues, Jim
    Dempsey who's on the Privacy
  • 139:15 - 139:20
    and Civil Liberties Oversight
    Board wrote an article saying,
  • 139:20 - 139:23
    "Did they just approve
    a vacuum cleaner?"
  • 139:23 - 139:25
    And lots of people then
    responded with, "Oh,
  • 139:25 - 139:28
    you know, that's ridiculous."
  • 139:28 - 139:32
    Well, no, in fact, they
    approved a vacuum cleaner
  • 139:32 - 139:33
    and we didn't know it.
  • 139:33 - 139:34
    >> Laura? Laura.
  • 139:34 - 139:36
    >> I agree with those points.
  • 139:36 - 139:38
    I think it is a matter
    of scale in this case.
  • 139:38 - 139:40
    But I think there are
    few other points, too,
  • 139:40 - 139:41
    that are a little
    bit more subtle.
  • 139:41 - 139:43
    So, one issue is that this came
  • 139:43 - 139:47
    after Hillary Clinton's
    internet freedom speech
  • 139:47 - 139:51
    and this entire meme
    that has been created
  • 139:51 - 139:54
    and I think rightly
    replicated around the world
  • 139:54 - 139:56
    about internet freedom.
  • 139:56 - 140:01
    So, this is a very important
    issue and what this does is it--
  • 140:01 - 140:04
    it's providing an opportunity
    for people to challenge
  • 140:04 - 140:07
    that notion of internet
    freedom and to point
  • 140:07 - 140:10
    out what they're
    calling hypocrisy.
  • 140:10 - 140:12
    So, that to me is a problem.
  • 140:12 - 140:16
    It can also be an opportunity
    if it's addressed appropriately.
  • 140:16 - 140:22
    Another issue is that individual
    citizens are living their life
  • 140:22 - 140:23
    online to a much greater extent
  • 140:23 - 140:25
    than they did five
    years ago even.
  • 140:25 - 140:27
    It's really amazing
    if you think.
  • 140:27 - 140:29
    I mean, I'll just give you
    an example I was talking
  • 140:29 - 140:30
    about today.
  • 140:30 - 140:32
    So, five years ago, I was--
  • 140:32 - 140:34
    I would read the paper
    in the morning and now,
  • 140:34 - 140:37
    I'm completely connected to
    my smartphone all the time
  • 140:37 - 140:40
    and I am posting information,
    I'm communicating in it.
  • 140:40 - 140:43
    So, the public sphere
    is what's at stake here.
  • 140:43 - 140:44
    The public sphere is online.
  • 140:44 - 140:46
    And I think that
    that is much greater
  • 140:46 - 140:49
    than during these last
    instances a few years ago
  • 140:49 - 140:50
    that were mentioned.
  • 140:50 - 140:53
    So, scale, the issue of
    people being more cognizant
  • 140:53 - 140:56
    about how their internal
    private life is lived
  • 140:56 - 140:59
    out in the digital
    realm and also the meme
  • 140:59 - 141:02
    that has been fairly successful
  • 141:02 - 141:04
    about internet freedom
    has ensued
  • 141:04 - 141:07
    in these intervening years.
  • 141:07 - 141:09
    >> I think those were
    brilliant points.
  • 141:09 - 141:10
    >> Yeah. [Multiple Speakers]
  • 141:10 - 141:12
    >> I was just going
    to say quickly
  • 141:12 - 141:16
    that technology makes lots of
    things possible and it was clear
  • 141:16 - 141:17
    at the time of those articles.
  • 141:17 - 141:19
    It's clear today.
  • 141:19 - 141:21
    It doesn't mean that
    it's imminently minable,
  • 141:21 - 141:24
    but that it's mined outside
    of due process.
  • 141:24 - 141:25
    And I think those are the things
  • 141:25 - 141:27
    that are getting
    people excitable.
  • 141:27 - 141:30
    It's not technology and it's
    not what's technically possible
  • 141:30 - 141:32
    or even what's technically
    feasible.
  • 141:32 - 141:37
    It's actually ultimately
    the full mining
  • 141:37 - 141:40
    and then the lack
    of public scrutiny.
  • 141:40 - 141:42
    >> What about the argument
    that you hear from some people
  • 141:42 - 141:46
    that not only are we three
    years farther away from 9/11.
  • 141:46 - 141:48
    Several of you made
    the point that a lot
  • 141:48 - 141:52
    of these original
    processes were passed almost
  • 141:52 - 141:56
    in a battlefield mentality,
    one of you used that phrase.
  • 141:56 - 142:01
    And that there were extreme
    circumstances with the value
  • 142:01 - 142:02
    of national security
  • 142:02 - 142:05
    in protecting the
    homeland very much
  • 142:05 - 142:07
    in the forefront
    of a public debate.
  • 142:07 - 142:11
    So you got three years later
    but some people would argue--
  • 142:11 - 142:12
    there's almost a
    contradiction here.
  • 142:12 - 142:15
    Certainly the intelligence
    community would say, "Well,
  • 142:15 - 142:18
    actually people are more
    relaxed now and less concerned
  • 142:18 - 142:22
    about the threat because
    our systems have worked
  • 142:22 - 142:24
    and that you shouldn't
    be dismantling systems
  • 142:24 - 142:27
    which have actually made
    people more secure."
  • 142:27 - 142:28
    How would you answer that?
  • 142:28 - 142:29
    >> Let's concentrate on what
    the President has said though.
  • 142:29 - 142:31
    I mean, what the
    President has told us--
  • 142:31 - 142:33
    >> Although he has said
    he's in favor of the system.
  • 142:33 - 142:37
    >> Yes. And he said that
    the capability is necessary.
  • 142:37 - 142:40
    He's in no way precluded
    additional oversight.
  • 142:40 - 142:42
    I think he's actually
    welcomed the discussion of--
  • 142:42 - 142:43
    >> Sure.
  • 142:43 - 142:45
    >> -- of what kind
    of oversight we need.
  • 142:45 - 142:49
    But he said we're living
    in a lower threat level
  • 142:49 - 142:52
    and that we need to make
    changes as a society
  • 142:52 - 142:54
    and as government
    in response to that.
  • 142:54 - 142:58
    So, not the other way around.
  • 142:58 - 143:04
    >> Edward Snowden, to some
    people he's a traitor.
  • 143:04 - 143:05
    To some people, he's a hero.
  • 143:05 - 143:07
    What do you think?
  • 143:07 - 143:12
    >> Everybody, on your clearance
    you signed a document that says,
  • 143:12 - 143:16
    "If you disclose secrets,
    there's a penalty."
  • 143:16 - 143:19
    So, from that standpoint,
    he clearly violated
  • 143:19 - 143:24
    that document and, you know,
    I don't have a problem with--
  • 143:24 - 143:27
    with him being prosecuted
    because of that.
  • 143:27 - 143:28
    On the oversight--
  • 143:28 - 143:30
    >> Because it is contractual--
  • 143:30 - 143:31
    >> Because it is contractual.
  • 143:31 - 143:32
    I mean, it states, "Anybody
    who has a clearance."
  • 143:32 - 143:35
    You have that-- it's in
    your-- in the agreement,
  • 143:35 - 143:38
    and it says clearly that you,
    you know, 25 years in prison
  • 143:38 - 143:39
    or whatever it is that--
  • 143:39 - 143:40
    >> You actually read that?
  • 143:40 - 143:42
    >> No, it's-- yeah.
  • 143:42 - 143:45
    It happens when you come from
    a family of lawyers, you know.
  • 143:45 - 143:46
    >> It's privacy policy.
  • 143:46 - 143:48
    >> No. I mean, you look-- yeah,
    I mean, if I'm going to go
  • 143:48 - 143:50
    to jail, I want to know
    what it has to say, right?
  • 143:50 - 143:53
    And, so I think-- I think
    from that standpoint, yes.
  • 143:53 - 143:55
    I think the rest of the
    community is still trying
  • 143:55 - 143:58
    to decide whether
    or not the information
  • 143:58 - 144:01
    that is disclosed is a problem.
  • 144:01 - 144:03
    And some people go back kind
  • 144:03 - 144:06
    of like what I was saying
    is there have been articles
  • 144:06 - 144:08
    about this for, you
    know, a long time.
  • 144:08 - 144:10
    You know, you can go
    back into the '70s
  • 144:10 - 144:14
    when NSA had the ECHELON
    program, in the mid-'90s
  • 144:14 - 144:17
    with Omnivore and
    Carnivore, you know,
  • 144:17 - 144:18
    all of those type of things.
  • 144:18 - 144:19
    You've had this type
  • 144:19 - 144:21
    of electronic surveillance
    for quite some time.
  • 144:21 - 144:24
    So, we're not quite sure what
    it is and not having seen all
  • 144:24 - 144:27
    that is disclosed but we're
    not quite sure of what it is
  • 144:27 - 144:32
    that he's telling the world
    that we don't know already.
  • 144:32 - 144:35
    And so from that standpoint,
    I think part of my community,
  • 144:35 - 144:38
    we're not sure about
    the content.
  • 144:38 - 144:41
    But certainly he violated an
    agreement and he, you know,
  • 144:41 - 144:43
    that's what they're going
    to go after him for.
  • 144:43 - 144:46
    >> So I have mixed views
    about Snowden because I think
  • 144:46 - 144:49
    that whistleblowing in a
    free society and a right
  • 144:49 - 144:51
    to know is critically important.
  • 144:51 - 144:55
    And if you go back
    to the Pentagon Papers,
  • 144:55 - 144:58
    terrible things did not
    happen with the publishing
  • 144:58 - 145:00
    of the Pentagon Papers.
  • 145:00 - 145:02
    [ Pause ]
  • 145:02 - 145:04
    So-- so, part of--
  • 145:04 - 145:06
    >> I covered the
    Pentagon Papers story and--
  • 145:06 - 145:07
    >> I'm sure--
  • 145:07 - 145:08
    >> -- covered Daniel
    Ellsberg's trial.
  • 145:08 - 145:08
    So, I'm--
  • 145:08 - 145:09
    >> And Ellsberg's trial.
  • 145:09 - 145:10
    >> Yes.
  • 145:10 - 145:11
    >> And Ellsberg is very,
  • 145:11 - 145:14
    very out there right
    now about Snowden.
  • 145:14 - 145:16
    I was kind of right with him
  • 145:16 - 145:21
    until he started giving away
    the US cyber security strategy
  • 145:21 - 145:24
    with respect to China
    while he was in Hong Kong.
  • 145:24 - 145:29
    And then I started to wonder
    exactly what his motivations
  • 145:29 - 145:31
    were, you know.
  • 145:31 - 145:36
    And so, I do feel like
    there's a level of recklessness
  • 145:36 - 145:42
    that worries me and although
    it's hard for me to really know
  • 145:42 - 145:46
    if there's damage to national
    security because I've set
  • 145:46 - 145:49
    in some of these meetings
    with national security people
  • 145:49 - 145:52
    and I've held up the
    minimization guidelines
  • 145:52 - 145:53
    and said, "Is there anything
  • 145:53 - 145:55
    in here that's really
    secret or should be?"
  • 145:55 - 145:57
    And they said, "No."
  • 145:57 - 146:00
    The existence of
    these programs sort
  • 146:00 - 146:03
    of have been broad-brushed
    some people knew.
  • 146:03 - 146:05
    So, if at the end of the day,
  • 146:05 - 146:06
    it's that the public
    finally knows
  • 146:06 - 146:08
    that it's happening
    'cause I'm not sure
  • 146:08 - 146:11
    if anything's been revealed
    that will allow people
  • 146:11 - 146:15
    to somehow evade all of this,
    that, I think, you know,
  • 146:15 - 146:18
    society has to make
    room for whistleblowers.
  • 146:18 - 146:20
    I just wish we had
    whistleblowers
  • 146:20 - 146:23
    who were a little less reckless.
  • 146:23 - 146:25
    >> Well, you know,
    as a whistleblower,
  • 146:25 - 146:27
    he appears to have
    delusions of grandeur.
  • 146:27 - 146:28
    >> Well, right.
  • 146:28 - 146:29
    >> I mean, the first
    couple of days, he had--
  • 146:29 - 146:31
    he'd made statements
    such as saying--
  • 146:31 - 146:33
    such as that he could order
    a wiretap of the President
  • 146:33 - 146:36
    and that he, you know,
    had all the locations
  • 146:36 - 146:37
    of all the NSA stations.
  • 146:37 - 146:43
    Pretty unlikely, that said,
    I guess the, you know,
  • 146:43 - 146:46
    I think that his disclosures
    have done a public service.
  • 146:46 - 146:48
    I think he's a whistleblower
    in that sense.
  • 146:48 - 146:49
    I actually agree.
  • 146:49 - 146:50
    It seems like he
    just broke the law
  • 146:50 - 146:52
    and it doesn't seem
    very complicated.
  • 146:52 - 146:55
    What I find confusing and--
  • 146:55 - 147:00
    is that somehow the
    traditional notion
  • 147:00 - 147:03
    of civil disobedience seems
    to have been lost track
  • 147:03 - 147:05
    of in what he's doing.
  • 147:05 - 147:06
    I mean, I wish-- I mean, look,
  • 147:06 - 147:09
    he obviously has put his
    life at total risk and--
  • 147:09 - 147:12
    >> Although traditional
    idea is you pay the penalty
  • 147:12 - 147:13
    for having broken the law.
  • 147:13 - 147:14
    >> You pay the penalty and
    you [inaudible] having a trial
  • 147:14 - 147:17
    and you expose-- you expose the
    things that you're concerned
  • 147:17 - 147:19
    about and you expose
    them for the world to look at.
  • 147:19 - 147:21
    He's a civilian.
  • 147:21 - 147:24
    He wouldn't be tried in a
    court martial the way Bradley
  • 147:24 - 147:24
    Manning is.
  • 147:24 - 147:25
    >> Right.
  • 147:25 - 147:27
    >> He could've staged
    a public trial
  • 147:27 - 147:31
    and a big public discussion,
    but he chose not to, and I, you know,
  • 147:31 - 147:32
    again, it's not for me
  • 147:32 - 147:36
    to say how people should put
    their lives at risk as he has
  • 147:36 - 147:39
    but I think it's a-- I think
    it's an incomplete act of
  • 147:39 - 147:40
    civil disobedience.
  • 147:40 - 147:43
    >> Anybody else want
    in, got a view on that?
  • 147:43 - 147:45
    Let me ask you another question.
  • 147:45 - 147:49
    One of the things that
    Dan said-- it was
  • 147:49 - 147:51
    a very interesting
    analysis-- was when he talked
  • 147:51 - 147:54
    about the internet not being
    a creature of the state,
  • 147:54 - 147:56
    how earlier forms
  • 147:56 - 148:00
    of communication, whether
    it's telephones and telegraph
  • 148:00 - 148:02
    and so on, grew up.
  • 148:02 - 148:05
    Television spectrums,
    I mean, they were sold
  • 148:05 - 148:07
    by the government,
    right, and still are.
  • 148:07 - 148:12
    So, there was an inherent
    regulatory legal organic
  • 148:12 - 148:14
    connection between earlier
    forms of communication
  • 148:14 - 148:18
    and government regulation which,
    as you point out is missing
  • 148:18 - 148:22
    in a large part in this
    particular system we've all been
  • 148:22 - 148:24
    talking about.
  • 148:24 - 148:26
    >> But it's more than a legal
    question, it seems to me,
  • 148:26 - 148:31
    it's also a cultural question,
    that a lot of you being soaked
  • 148:31 - 148:33
    in this culture and committed
  • 148:33 - 148:38
    to this culture have evinced
    skepticism of government
  • 148:38 - 148:42
    and of regulation and yet at
    the same time, you're grappling
  • 148:42 - 148:46
    with the notion of how do you
    bring some order to a system
  • 148:46 - 148:51
    that continues to cry out
    for some kind of regulation.
  • 148:51 - 148:54
    I'm interested for you
    to muse a little bit
  • 148:54 - 148:58
    about the contradiction between
    sort of the historic origins
  • 148:58 - 149:02
    and the culture of this
    system and the need
  • 149:02 - 149:05
    to bring some order
    and regulation to it.
  • 149:05 - 149:08
    It seems to me there's an
    inherent conflict that's worth
  • 149:08 - 149:10
    talking about.
  • 149:10 - 149:11
    Start.
  • 149:11 - 149:12
    >> I'll kick off the
    discussion about that.
  • 149:12 - 149:16
    I think sometimes when
    you go to a workshop
  • 149:16 - 149:18
    that discusses internet
    governance,
  • 149:18 - 149:24
    it descends into a question
    such as who should grab the keys
  • 149:24 - 149:25
    of internet governance, right?
  • 149:25 - 149:26
    So, the question of--
  • 149:26 - 149:28
    so if question were asked
  • 149:28 - 149:31
    such as should the United
    Nations control the internet
  • 149:31 - 149:33
    or should the United
    States control it
  • 149:33 - 149:34
    or should Google control it?
  • 149:34 - 149:36
    Like those kinds of
    questions don't make any sense
  • 149:36 - 149:38
    in the first instance.
  • 149:38 - 149:41
    And part of that is because
    there is no monolithic system
  • 149:41 - 149:42
    of internet governance.
  • 149:42 - 149:45
    It's highly granular,
    it's multilayered.
  • 149:45 - 149:48
    There are-- there are
    so many different levels
  • 149:48 - 149:50
    of internet governance.
  • 149:50 - 149:51
    I mean, I-- my book is very long
  • 149:51 - 149:53
    and I only get into
    some of them.
  • 149:53 - 149:56
    But the issue here is
    multistakeholder governance
  • 149:56 - 149:58
    and what that means.
  • 149:58 - 150:01
    So, internet governance is very
    interesting because it is not
  • 150:01 - 150:04
    about governments necessarily
    but that doesn't mean
  • 150:04 - 150:05
    that there's not a
    role for government
  • 150:05 - 150:07
    in many different areas.
  • 150:07 - 150:10
    So, if I have identity theft,
    I'd like the government
  • 150:10 - 150:11
    to step in and help me.
  • 150:11 - 150:13
    I expect antitrust enforcement.
  • 150:13 - 150:16
    I expect laws about--
    some laws about privacy.
  • 150:16 - 150:17
    I expect some laws
  • 150:17 - 150:19
    about intellectual
    property rights enforcement.
  • 150:19 - 150:22
    I expect certain national
    statutes on any variety
  • 150:22 - 150:26
    of things related to
    trade, child protection.
  • 150:26 - 150:29
    So, there are many places
    for the government to be
  • 150:29 - 150:31
    in internet regulation
    and governance.
  • 150:31 - 150:34
    But most functions
    that are related
  • 150:34 - 150:37
    to the operational
    stability and security
  • 150:37 - 150:39
    of the internet have
    not been the purview
  • 150:39 - 150:40
    of traditional governments
  • 150:40 - 150:43
    but they have been enacted
    by private industry.
  • 150:43 - 150:45
    And I think that that's
    a very important point.
  • 150:45 - 150:47
    Now, tying this back
    to the PRISM issue,
  • 150:47 - 150:52
    one concern that I have is
    that the one separate area
  • 150:52 - 150:54
    of how the internet is
    being used and the data
  • 150:54 - 150:57
    that can be collected
    and gathered will bleed
  • 150:57 - 150:59
    into this other area of
    the operational stability
  • 150:59 - 151:01
    and security of the internet.
  • 151:01 - 151:05
    So, we can expect to see
    this issue used as a proxy
  • 151:05 - 151:08
    for governments that are
    interested in gaining control
  • 151:08 - 151:10
    in certain operational
    areas of the internet.
  • 151:10 - 151:12
    I guarantee that
    that will happen.
  • 151:12 - 151:15
    So, we'll see that in some--
  • 151:15 - 151:17
    in some calls for
    international treaties,
  • 151:17 - 151:19
    again, that leave
    out private industry,
  • 151:19 - 151:21
    that leave out civil society.
  • 151:21 - 151:24
    We'll see it in additional
    calls for bringing
  • 151:24 - 151:26
    in a more multilateral control
  • 151:26 - 151:28
    of some critical
    internet resources.
  • 151:28 - 151:31
    So, I think one bleeds
    into another.
  • 151:31 - 151:34
    But it's really not
    about private industry
  • 151:34 - 151:37
    versus governments
    or civil society
  • 151:37 - 151:40
    versus private industry,
    for example.
  • 151:40 - 151:42
    It's about the more
    contextual question
  • 151:42 - 151:45
    of what governance is
    necessary for what--
  • 151:45 - 151:47
    which particular area.
  • 151:47 - 151:50
    So, in certain cases, it's
    completely appropriate
  • 151:50 - 151:53
    for only the private
    sector to be involved.
  • 151:53 - 151:55
    In other cases, it's the
    purview of government.
  • 151:55 - 151:58
    So, it has to be asked
    in a contextual way,
  • 151:58 - 152:00
    and that's what multistakeholder
    governance is.
  • 152:00 - 152:04
    It's not everybody in charge
    of every granular area.
  • 152:04 - 152:05
    That's not it.
  • 152:05 - 152:07
    It's about what's
    the appropriate form
  • 152:07 - 152:09
    of governance in
    a particular area.
  • 152:09 - 152:11
    >> John?
  • 152:11 - 152:12
    >> So the one thing
    we do know is
  • 152:12 - 152:16
    that the internet
    evolves very quickly.
  • 152:16 - 152:23
    I was involved for 25 years of
    it and it changes rapidly.
  • 152:23 - 152:29
    What was dial-up
    modems and just Telnet
  • 152:29 - 152:36
    and FTP quickly became
    circuits and high-speed fiber.
  • 152:36 - 152:38
    Yeah, OK, sorry.
  • 152:38 - 152:39
    [Inaudible] And the web--
  • 152:39 - 152:41
    >> My work here is done, yeah.
  • 152:41 - 152:48
    >> The fact is that that
    evolution doesn't work well
  • 152:48 - 152:50
    with government regulation.
  • 152:50 - 152:54
    Having been involved in
    the telecommunication side
  • 152:54 - 152:57
    of the industry as opposed to the
    internet side, I also dealt
  • 152:57 - 152:59
    with a lot of [inaudible]
    regulations.
  • 152:59 - 153:02
    And generally, we're dealing
    with regulations that were three
  • 153:02 - 153:06
    to ten years behind the
    times, trying to drag them
  • 153:06 - 153:11
    into current circumstance,
    and it was inevitable.
  • 153:11 - 153:15
    The process of regulation
    doesn't work well
  • 153:15 - 153:18
    with a forward-looking
    rapidly evolving internet.
  • 153:18 - 153:23
    So, when we start
    thinking about--
  • 153:23 - 153:28
    about the non-governmental
    nature of the internet,
  • 153:28 - 153:32
    as the internet grows up, we
    find ourselves in a conundrum
  • 153:32 - 153:35
    because the internet
    is no longer optional.
  • 153:35 - 153:38
    The internet is entwined
    in everyone's life.
  • 153:38 - 153:41
    Governments are looking at
    their economies and going,
  • 153:41 - 153:45
    "Some percentage of my
    economy is tied to this?"
  • 153:45 - 153:48
    You can say I have a choice
    as to whether or not I'm going
  • 153:48 - 153:49
    to participate, but I don't.
  • 153:49 - 153:52
    I have to be involved.
  • 153:52 - 153:55
    So, governments are now
    realizing they have no choice,
  • 153:55 - 153:56
    they need to be involved.
  • 153:56 - 153:59
    They look at their duties,
    their responsibilities.
  • 153:59 - 154:02
    And they go, "How do I
    effect what I'm required
  • 154:02 - 154:04
    to do in this new media?"
  • 154:04 - 154:09
    And it's a big challenge,
    folks, because the protocols
  • 154:09 - 154:11
    and the operational conventions
  • 154:11 - 154:14
    that hold the internet
    together have to be global.
  • 154:14 - 154:17
    They have to interoperate
    on a global basis.
  • 154:17 - 154:18
    But governments are
    used to acting
  • 154:18 - 154:22
    on a national basis
    occasionally regional,
  • 154:22 - 154:26
    occasionally bilateral,
    multilateral, but generally,
  • 154:26 - 154:28
    they act on a national basis
  • 154:28 - 154:32
    and you can easily break
    the internet by attempting
  • 154:32 - 154:35
    to regulate it without the
    idea of a global context
  • 154:35 - 154:37
    in which it operates.
  • 154:37 - 154:38
    This is really what's
    been opened
  • 154:38 - 154:39
    up in the last five years.
  • 154:39 - 154:41
    In the last five years,
  • 154:41 - 154:43
    governments are realizing
    they do need to get involved,
  • 154:43 - 154:44
    they don't understand
    the framework
  • 154:44 - 154:46
    by which they can do so safely.
  • 154:46 - 154:51
    In general, this has resulted
    in not much regulation
  • 154:51 - 154:54
    but that's not necessarily going
    to be the case going forward.
  • 154:54 - 154:57
    And it's-- the internet
    community and I say
  • 154:57 - 155:00
    that being the operators,
    the various entities
  • 155:00 - 155:01
    that operate pieces
  • 155:01 - 155:05
    of the infrastructure
    aren't exactly forthcoming
  • 155:05 - 155:07
    to governments, saying, "Here,
  • 155:07 - 155:09
    this is how to safely
    regulate the internet.
  • 155:09 - 155:11
    That guide book has
    not been written."
  • 155:11 - 155:15
    And so, without a meeting
    of the minds on this topic,
  • 155:15 - 155:19
    we run the risk that governments
    are going to make regulations
  • 155:19 - 155:22
    that make no sense or break
    portions of the internet.
  • 155:22 - 155:24
    Until there's an understood
    common framework
  • 155:24 - 155:27
    for this, we're all at risk.
  • 155:27 - 155:29
    >> Lynn, do you want to talk?
  • 155:29 - 155:32
    >> Yeah, very good comments
    both from Laura and John,
  • 155:32 - 155:35
    and I just want to make sure
    that we're not left with a myth here
  • 155:35 - 155:37
    that much of the internet
    community believes
  • 155:37 - 155:39
    that there's no role
    for governments.
  • 155:39 - 155:40
    We've been engaging
    with governments
  • 155:40 - 155:44
    and certainly we could have
    done it better over the years.
  • 155:44 - 155:45
    But, you know, for many,
  • 155:45 - 155:49
    many years when ISOC had a very
    small staff of seven people,
  • 155:49 - 155:52
    I was going to [inaudible]
    and spending weeks
  • 155:52 - 155:55
    because you don't go to a United
    Nations meeting for two days,
  • 155:55 - 155:57
    you'd go for two
    or three weeks.
  • 155:57 - 155:59
    That's a significant investment,
  • 155:59 - 156:00
    and we went because
    we wanted people
  • 156:00 - 156:04
    to understand how the
    internet actually worked.
  • 156:04 - 156:06
    And I-- it's like a slight
    disagreement with John
  • 156:06 - 156:10
    and nobody has the answer
    for how we regulate a lot
  • 156:10 - 156:11
    of these new environments.
  • 156:11 - 156:14
    It's not because we think
    there should be no regulations;
  • 156:14 - 156:15
    it's because we can't
    figure out what sort
  • 156:15 - 156:17
    of regulation there should be.
  • 156:17 - 156:20
    And to John's other point,
    it is because of the pace
  • 156:20 - 156:22
    with which the internet changes.
  • 156:22 - 156:24
    And it is due to the fact
    that it has broken
  • 156:24 - 156:25
    so many global boundaries.
  • 156:25 - 156:28
    You can't do it within
    any single context.
  • 156:28 - 156:30
    So, it is a very,
    very complex world.
  • 156:30 - 156:35
    And we continue to advocate for
    dialogue, discussion, thoughtfulness;
  • 156:35 - 156:37
    let it take the time it needs.
  • 156:37 - 156:40
    We need to pull apart lots
    of different complexities
  • 156:40 - 156:42
    and let the right thing emerge.
  • 156:42 - 156:44
    That is not the same
    as saying no role
  • 156:44 - 156:47
    for governments but
    it often gets--
  • 156:47 - 156:50
    >> And another dimension of
    this is not all governments are
  • 156:50 - 156:51
    the same.
  • 156:51 - 156:57
    Because you can talk about China
    being a pretty malign force
  • 156:57 - 157:00
    in terms of internet freedom
    to say nothing of Iran
  • 157:00 - 157:04
    which uses the internet to roll
    up all networks of dissent,
  • 157:04 - 157:06
    whereas in Tahrir Square,
  • 157:06 - 157:10
    the internet is a
    mechanism for expression.
  • 157:10 - 157:13
    So, part of it, you can--
  • 157:13 - 157:15
    part of the problem
    is government even
  • 157:15 - 157:18
    as a concept is subject to
    many different definitions.
  • 157:18 - 157:19
    Yeah.
  • 157:19 - 157:21
    >> It is, Steve, but that's
    probably why, for Western
  • 157:21 - 157:26
    governments that operate
    in democracies with legal
  • 157:26 - 157:29
    and constitutional frameworks,
  • 157:29 - 157:31
    it's particularly
    important.
  • 157:31 - 157:34
    And one of the reasons people
    are particularly unhappy
  • 157:34 - 157:37
    about this particular
    surveillance is if we're trying
  • 157:37 - 157:39
    to come up with this
    nuanced role
  • 157:39 - 157:42
    about where government belongs,
  • 157:42 - 157:44
    we haven't exactly
    sent the right message
  • 157:44 - 157:45
    to the rest of the world.
  • 157:45 - 157:47
    And I think that's part
    of the problem here.
  • 157:47 - 157:50
    >> I just want to offer a
    kind of a friendly amendment,
  • 157:50 - 157:53
    I hope, to John's view, you know.
  • 157:53 - 157:57
    I think John correctly
    points out that attempting
  • 157:57 - 158:00
    to regulate the internet
    infrastructure whether it's the
  • 158:00 - 158:02
    institutions that
    operate and design it
  • 158:02 - 158:06
    or the actual operators
    is very [inaudible]
  • 158:06 - 158:08
    because it changes
    very quickly and it has
  • 158:08 - 158:10
    to work on a global basis.
  • 158:10 - 158:12
    That was the experience of the--
  • 158:12 - 158:15
    of SOPA, the Stop
    Online Privacy Act,
  • 158:15 - 158:19
    was that policymakers
    thought they could reach in--
  • 158:19 - 158:19
    >> Piracy.
  • 158:19 - 158:21
    >> -- sorry, Piracy Act.
  • 158:21 - 158:23
    >> [Inaudible] is the stop--
  • 158:23 - 158:24
    >> That's right.
  • 158:24 - 158:27
    Thank you, thank you, thank you.
  • 158:27 - 158:30
    Right, you know, governments
    thought they could reach
  • 158:30 - 158:33
    into the internet
    infrastructure
  • 158:33 - 158:34
    to solve a problem
    that is really
  • 158:34 - 158:37
    about human behavior
    and violating laws.
  • 158:37 - 158:41
    However, at the same time
    there are plenty of legal rules
  • 158:41 - 158:45
    that actually work incredibly
    well on the internet.
  • 158:45 - 158:48
    You know, the FTC is still using
    the Fair Credit Reporting Act
  • 158:48 - 158:51
    from 1970, which was one
    of our first privacy laws
  • 158:51 - 158:55
    in the United States to stop very
    advanced sophisticated online
  • 158:55 - 158:56
    services from harming people.
  • 158:56 - 158:58
    That's not causing
    John any problems.
  • 158:58 - 159:01
    He's not-- he [inaudible]
    even think about it.
  • 159:01 - 159:08
    And so I think there's-- I think
    that it's when governments try
  • 159:08 - 159:10
    to do the sorts of
    shortcuts that they were used
  • 159:10 - 159:13
    to doing, as you were suggesting
    with the broadcasters, you know,
  • 159:13 - 159:15
    say, "You guys have to behave
    this way in order for us
  • 159:15 - 159:17
    to achieve a broader
    policy goal."
  • 159:17 - 159:19
    That's where we really
    have problems
  • 159:19 - 159:21
    in the internet environment.
  • 159:21 - 159:26
    >> You know, a number of you have
    used the word privacy as a value
  • 159:26 - 159:29
    and as a very curious
    [phonetic] value.
  • 159:29 - 159:32
    But I want you to
    play with the idea
  • 159:32 - 159:34
    that perhaps we're
    actually really ambivalent
  • 159:34 - 159:38
    about the question of privacy
    because in fact in many ways,
  • 159:38 - 159:44
    the wide availability of data is
    actually an enormous convenience.
  • 159:44 - 159:47
    Every time you go to
    a website and you put
  • 159:47 - 159:50
    in your e-mail address
    and up comes your password
  • 159:50 - 159:52
    and all your credit
    card information
  • 159:52 - 159:55
    and it's one-stop shopping
    at Amazon, right?
  • 159:55 - 159:59
    We all love it but isn't that--
  • 159:59 - 160:01
    >> But we've made a decision
    to share that with them, so.
  • 160:01 - 160:02
    >> Well--
  • 160:02 - 160:03
    >> Wait just a minute.
  • 160:03 - 160:05
    I want a-- I want a--
    I understand that.
  • 160:05 - 160:09
    But I want to pose a question
    about whether at some point,
  • 160:09 - 160:12
    aren't we really ambivalent
    about this question
  • 160:12 - 160:17
    that we actually like people
    to have information about us
  • 160:17 - 160:21
    when it serves our purposes of
    convenience and accessibility.
  • 160:21 - 160:23
    And that there was a certain--
  • 160:23 - 160:25
    I understand your
    point about voluntary.
  • 160:25 - 160:28
    But this will mark [phonetic]
    a point here as well.
  • 160:28 - 160:30
    >> I think part of
    the problem is--
  • 160:30 - 160:32
    is the public is
    realizing that the people
  • 160:32 - 160:34
    that are storing the
    information that we give
  • 160:34 - 160:38
    to them aren't doing a good
    job of protecting it.
  • 160:38 - 160:40
    I could go to my local bank--
    I grew up in Falls Church.
  • 160:40 - 160:42
    I used to go to the local
    First Virginia Bank right there
  • 160:42 - 160:43
    in Falls Church.
  • 160:43 - 160:45
    I gave them all my
    information when I opened
  • 160:45 - 160:47
    up my first account
    when I was a teenager.
  • 160:47 - 160:50
    And I had a reasonable
    expectation
  • 160:50 - 160:51
    that the only way they
    were going to get that--
  • 160:51 - 160:53
    anybody else was going to
    get that information was
  • 160:53 - 160:57
    if they actually went in and
    robbed the bank and stole the file,
  • 160:57 - 160:59
    you know, the file
    cabinet or whatever.
  • 160:59 - 161:01
    >> Or if anyone in the
    government wanted it?
  • 161:01 - 161:02
    >> Or if anyone in the
    government wanted it?
  • 161:02 - 161:05
    I mean that-- I mean
    that, you know, that--
  • 161:05 - 161:06
    that's a reasonable thing.
  • 161:06 - 161:07
    But nowadays when we find
  • 161:07 - 161:10
    out that you have all these
    massive data leaks, you know,
  • 161:10 - 161:13
    and when you examine
    the root cause of what--
  • 161:13 - 161:16
    what the leak was from
    a tactical standpoint,
  • 161:16 - 161:18
    it should have been
    fixed 15 years ago.
  • 161:18 - 161:22
    You know, some of these attacks,
    I do a talk now if they ask me,
  • 161:22 - 161:25
    you know, I guess it's because
    I have gray hair and I've been
  • 161:25 - 161:27
    in the security business
    for 20 years.
  • 161:27 - 161:29
    I think I noticed something
  • 161:29 - 161:33
    but I do [inaudible]
    that the contract--
  • 161:33 - 161:34
    >> It's because of the ponytail.
  • 161:34 - 161:35
    >> I think it is.
  • 161:35 - 161:37
    You know, it's the-- it's the
    same attack methods that we saw
  • 161:37 - 161:41
    in the early '90s are
    still effective in 2013.
  • 161:41 - 161:44
    And so my question to the
    technical community was,
  • 161:44 - 161:48
    what have we been doing
    this last 20 years?
  • 161:48 - 161:53
    I mean, you know, at the SANS
    Institute that I was a part of,
  • 161:53 - 161:57
    we drew up a top
    ten threats list in 2001.
  • 161:57 - 162:01
    I pulled it out and today
    in 2013, every single one
  • 162:01 - 162:05
    of those top ten threats from
    2001 is still effective today.
  • 162:05 - 162:08
    So, what have we been
    doing to protect ourselves?
  • 162:08 - 162:09
    And I think it's that type
  • 162:09 - 162:13
    of thing that's causing this
    rumble about, you know, well,
  • 162:13 - 162:15
    private industry isn't
    doing anything about it,
  • 162:15 - 162:17
    individuals aren't
    doing anything about it.
  • 162:17 - 162:20
    So, that only leaves
    a government
  • 162:20 - 162:22
    to do something about
    it, you know.
  • 162:22 - 162:25
    And so I think that's why
    we're seeing that push in
  • 162:25 - 162:28
    that direction, 'cause
    historically when you look
  • 162:28 - 162:31
    at it, the same things
    are still affecting us,
  • 162:31 - 162:33
    it's just that what's
    happening now is back in 2001,
  • 162:33 - 162:37
    the scope of a data breach
    was much less than the scope
  • 162:37 - 162:38
    of the data breach now.
  • 162:38 - 162:39
    >> Sure, sure.
  • 162:39 - 162:40
    >> And just one last
    aside [phonetic].
  • 162:40 - 162:41
    The iPhone, I think,
    I looked it up,
  • 162:41 - 162:44
    is five years old
    today-- this year.
  • 162:44 - 162:48
    So, to answer your question
    about why the difference from--
  • 162:48 - 162:49
    >> Hold on just a second.
  • 162:49 - 162:50
    John and Leslie, you want a-- is
    there a point you wanted to make
  • 162:50 - 162:53
    about the voluntary
    submission of data?
  • 162:53 - 162:55
    >> I don't think so.
  • 162:55 - 162:58
    I think every point has been
    made by everybody quite well.
  • 162:58 - 163:00
    >> OK. John?
  • 163:00 - 163:03
    >> You thought-- say that maybe
    people don't value privacy
  • 163:03 - 163:07
    over convenience, and that
    we give up privacy
  • 163:07 - 163:10
    and there's no outrage
    about that.
  • 163:10 - 163:13
    Why is there an outrage
    in some cases?
  • 163:13 - 163:15
    And I guess the question
    that comes up is,
  • 163:15 - 163:18
    a lot of people voluntarily
    give up privacy
  • 163:18 - 163:20
    because we want the convenience.
  • 163:20 - 163:22
    We want to click the button
    and get the order done.
  • 163:22 - 163:24
    We want the website
    to know who we are.
  • 163:24 - 163:26
    We want to fill out
    all the information
  • 163:26 - 163:30
    on the airline profile because
    it just makes traveling a little
  • 163:30 - 163:32
    easier and anything
    that makes it easier--
  • 163:32 - 163:34
    >> And who can remember all
    those damn passwords anyway,
  • 163:34 - 163:35
    right?
  • 163:35 - 163:37
    >> So there are times
    we do that.
  • 163:37 - 163:40
    But the reality is that there's
    also times when you consent
  • 163:40 - 163:44
    to losing your privacy
    in situations that aren't
  • 163:44 - 163:47
    for your convenience but
    you're going to anyway.
  • 163:47 - 163:48
    Next time you're
    going to your doctor,
  • 163:48 - 163:51
    pick up that HIPAA form
    and look at it, OK?
  • 163:51 - 163:54
    It says, "We're going to
    share your data with the CDC
  • 163:54 - 163:56
    because if there's an
    outbreak of something,
  • 163:56 - 163:58
    we're going to do that, OK?"
  • 163:58 - 164:00
    It's not for my convenience,
    hopefully, I'm not relevant,
  • 164:00 - 164:03
    but the fact is they're
    going to do what--
  • 164:03 - 164:06
    and we consent to it because,
    well, we knew about it.
  • 164:06 - 164:07
    We understand--
  • 164:07 - 164:09
    >> And there's a public
    interest involved because--
  • 164:09 - 164:12
    >> There's a public
    interest but also as annoying
  • 164:12 - 164:15
    as that form is, they
    took the time to tell us.
  • 164:15 - 164:17
    I do think that's a
    different question
  • 164:17 - 164:19
    when there's surveillance
    is going on
  • 164:19 - 164:21
    and you don't know
    about it at all.
  • 164:21 - 164:25
    And I think that might be the
    angst behind a lot of this.
  • 164:25 - 164:26
    >> Leslie?
  • 164:26 - 164:27
    >> Well, I also think
    that there's a difference
  • 164:27 - 164:30
    between surveillance
    by the government
  • 164:30 - 164:34
    and surveillance by companies.
  • 164:34 - 164:36
    I mean I think they've
    become more joined together
  • 164:36 - 164:39
    because they are now the
    source of all the information
  • 164:39 - 164:40
    that the government
    is collecting.
  • 164:40 - 164:43
    But I think at least in
    the constitutional system,
  • 164:43 - 164:46
    if not in Europe, we're
    making a distinction.
  • 164:46 - 164:52
    Having the government, which has
    the capacity to impose penalties
  • 164:52 - 164:59
    on people and make choices about
    who is going to be prosecuted,
  • 164:59 - 165:04
    et cetera, is an entirely different
    matter than whether I like
  • 165:04 - 165:05
    or dislike it, you know,
  • 165:05 - 165:09
    Google Profiles keep
    sending me the same crib
  • 165:09 - 165:12
    for my pregnant daughter
    over and over again.
  • 165:12 - 165:14
    I'm annoyed, I'm
    extremely annoyed.
  • 165:14 - 165:16
    But I also think--
  • 165:16 - 165:17
    >> They knew she was
    pregnant before she did, yes?
  • 165:17 - 165:18
    >> Yes.
  • 165:18 - 165:19
    >> Well, they probably
    did, but they knew
  • 165:19 - 165:21
    that I did this thing once
    and it's following the--
  • 165:21 - 165:24
    it's literally following
    me around the world.
  • 165:24 - 165:25
    I mean it's become
    kind of a joke.
  • 165:25 - 165:28
    But I find that annoying and
    I'm somebody who knows how
  • 165:28 - 165:30
    to set my privacy settings.
  • 165:30 - 165:34
    So, you know, in one-- in
    one circumstance it's a loss
  • 165:34 - 165:37
    of some measure of control,
    we ought to have more control,
  • 165:37 - 165:40
    it's one of the reasons we
    ought to have a baseline build
  • 165:40 - 165:42
    to give us some fair
    information practices.
  • 165:42 - 165:47
    In the other, you know, the
    government has the capacity
  • 165:47 - 165:49
    to make very important
    decisions about us.
  • 165:49 - 165:53
    And we have a constitution that
    says they have to follow rules.
  • 165:53 - 166:00
    And so we do react differently
    to the crib and to the NSA.
  • 166:00 - 166:01
    >> Yeah.
  • 166:01 - 166:05
    >> The one thing I would
    say about our tension
  • 166:05 - 166:09
    between convenience and
    privacy is-- I just think that--
  • 166:09 - 166:14
    I think privacy has always been
    a kind of a contested concept.
  • 166:14 - 166:16
    I mean, it's not an absolute.
  • 166:16 - 166:18
    I don't think there's
    anyone hardly in the world
  • 166:18 - 166:22
    who thinks it's-- it's
    an absolute condition.
  • 166:22 - 166:24
    But we know, as Leslie
    is saying,
  • 166:24 - 166:27
    we have different
    privacy expectations.
  • 166:27 - 166:30
    We want different privacy
    results at different times.
  • 166:30 - 166:34
    And some of the-- some
    of the privacy promise
  • 166:34 - 166:35
    that we have [inaudible]
  • 166:35 - 166:38
    like seeing the same
    crib too many times.
  • 166:38 - 166:41
    Others have real consequence
  • 166:41 - 166:42
    like if your credit
    report is wrong
  • 166:42 - 166:44
    and you can't get a mortgage,
    you're really, you know,
  • 166:44 - 166:47
    it really-- or you don't
    get a job, you know,
  • 166:47 - 166:48
    it really makes a difference.
  • 166:48 - 166:53
    And I think that-- I think
    in a way the phrase--
  • 166:53 - 166:55
    I think categorizing privacy
  • 166:55 - 166:58
    as a single right is what
    makes the conversation a little
  • 166:58 - 167:00
    confused because it-- the
    reality is it's a number
  • 167:00 - 167:03
    of concerns wrapped up into one.
  • 167:03 - 167:07
    >> But in this day and age,
    isn't it a little naive to say,
  • 167:07 - 167:09
    I'm going to give my
    information to Amazon?
  • 167:09 - 167:11
    Or I'm going to give my
    information to Google
  • 167:11 - 167:15
    for my own purposes because I
    choose to, and somehow I'm going
  • 167:15 - 167:17
    to be able to control--
  • 167:17 - 167:18
    >> Oh, yes.
  • 167:18 - 167:20
    >> -- all the use
    of that information.
  • 167:20 - 167:22
    >> Yeah, that's not--
    that's not doable,
  • 167:22 - 167:24
    and I don't think any
    privacy law we would pass
  • 167:24 - 167:28
    in this country would
    change that.
  • 167:28 - 167:29
    >> It is naive but I think--
  • 167:29 - 167:34
    I think the problem is I don't
    mind giving you any information
  • 167:34 - 167:37
    you ask about me if you tell me
    ahead of time what you're going
  • 167:37 - 167:38
    to do with that information.
  • 167:38 - 167:41
    So, what John was talking
    about in the HIPAA thing
  • 167:41 - 167:44
    when they say, "We're going
    to give it out to the CDC,"
  • 167:44 - 167:47
    and you tell me ahead of
    time before I give you
  • 167:47 - 167:49
    that information, that's much--
  • 167:49 - 167:52
    that's a much different
    environment than giving it here
  • 167:52 - 167:55
    and then buried in [inaudible]
    fine print is a little click
  • 167:55 - 167:56
    thing that says,
    "Oh, by the way,
  • 167:56 - 167:58
    we're going to just give
    this to anybody we want."
  • 167:58 - 167:59
    >> We only have a few
    minutes, and I want to--
  • 167:59 - 168:03
    you've mentioned, a number
    of you, the fact that we're--
  • 168:03 - 168:06
    we're dealing with a very
    complex system as Laura said
  • 168:06 - 168:08
    with a lot of stakeholders here.
  • 168:08 - 168:13
    And one of the stakeholders is
    these providers whether it's
  • 168:13 - 168:18
    Google, whether it's Yahoo,
    who take that information.
  • 168:18 - 168:20
    They don't warn you in some
    ways that they're going
  • 168:20 - 168:23
    to give it away because they're
    not voluntarily giving it away.
  • 168:23 - 168:24
    They're being subpoenaed,
  • 168:24 - 168:26
    they're being ordered
    to by the government.
  • 168:26 - 168:27
    And they are caught.
  • 168:27 - 168:30
    We have-- this institution,
  • 168:30 - 168:34
    these huge institutions
    are private institutions,
  • 168:34 - 168:35
    but as we've learned
    through PRISM
  • 168:35 - 168:40
    and all the revelations, they're
    subject to government orders
  • 168:40 - 168:42
    and government warrants.
  • 168:42 - 168:46
    And many of them are fighting
    this, at least fighting
  • 168:46 - 168:50
    to be able to be more
    transparent because they have--
  • 168:50 - 168:52
    they're stakeholders, too.
  • 168:52 - 168:55
    And-- and they are worried
    that their brands are going
  • 168:55 - 168:57
    to be tarnished by
    being swept up in this.
  • 168:57 - 169:00
    And to talk a little bit
    about the special role
  • 169:00 - 169:02
    of these institutions, these
    intermediary institutions,
  • 169:02 - 169:05
    the Apples, the Microsofts,
    the Googles, who
  • 169:05 - 169:07
    on one hand are getting
    pulled by the government
  • 169:07 - 169:08
    to release information.
  • 169:08 - 169:11
    On the other hand, they're being
    pressured to keep it private.
  • 169:11 - 169:12
    >> Yeah.
  • 169:12 - 169:15
    >> Yeah, that's a really
    important question.
  • 169:15 - 169:19
    And if you look at-- Let me
    just back it up to before PRISM.
  • 169:19 - 169:22
    If you look at even just the
    Google transparency reports
  • 169:22 - 169:24
    which probably don't have
    everything and I think
  • 169:24 - 169:28
    and even Google says that
    national security isn't--
  • 169:28 - 169:30
    not everything is reflected
    in the transparency reports.
  • 169:30 - 169:32
    But when you do look
    at something like that,
  • 169:32 - 169:34
    you can see that there
    is a very big disconnect
  • 169:34 - 169:37
    between what governments
    are asking information
  • 169:37 - 169:40
    intermediaries to do and
    what they actually do.
  • 169:40 - 169:44
    So, you can see that--
    just to give an example.
  • 169:44 - 169:49
    Maybe they disclosed data about
    an individual in 37 percent,
  • 169:49 - 169:54
    37 percent of the time in--
    to the Brazilian government
  • 169:54 - 169:57
    or 47 percent of the time
    to another government.
  • 169:57 - 169:58
    Do you see what I'm saying?
  • 169:58 - 170:01
    There is a disconnect in that
    differential between the request
  • 170:01 - 170:04
    to turn over user data
    and the actual instances
  • 170:04 - 170:05
    of turning over the data.
  • 170:05 - 170:09
    That's where they have this
    special governance role.
  • 170:09 - 170:11
    And that's an important
    point to make.
  • 170:11 - 170:16
    But they also bear a burden
    in carrying out a request
  • 170:16 - 170:18
    such as the more
    recent revelations
  • 170:18 - 170:21
    because there is a
    public relations hit.
  • 170:21 - 170:26
    There is the cost of hiring
    numerous attorneys to deal
  • 170:26 - 170:27
    with the fallout from this.
  • 170:27 - 170:29
    And there's the possible
    economic impact
  • 170:29 - 170:33
    of having trouble doing business
    in other parts of the world
  • 170:33 - 170:36
    that might be suspicious
    towards those companies.
  • 170:36 - 170:39
    >> Yeah, you might be thanking--
    they might be thanking Snowden
  • 170:39 - 170:42
    for the revelations
    because, of course,
  • 170:42 - 170:45
    by the national security
    letters,
  • 170:45 - 170:46
    they're not allowed to--
  • 170:46 - 170:48
    to let anyone know that
    they collected the data.
  • 170:48 - 170:50
    So, now on the other
    hand, you have the--
  • 170:50 - 170:52
    the Snowden revelations
    which were saying, "Hey,
  • 170:52 - 170:54
    these guys had to
    give out the data".
  • 170:54 - 170:57
    So, you know, and
    now that's come out.
  • 170:57 - 170:59
    What they should have
    said is instead of trying
  • 170:59 - 171:00
    to back [inaudible]
    been saying, "You know,
  • 171:00 - 171:02
    trying to save their
    reputation."
  • 171:02 - 171:03
    We said, "Look, you
    got subpoenaed,
  • 171:03 - 171:04
    we know you got subpoenaed,
  • 171:04 - 171:07
    you gave up the data
    because it's a law."
  • 171:07 - 171:12
    But I think that that
    piece of it is the secrecy,
  • 171:12 - 171:16
    not being able to say why
    something was collected,
  • 171:16 - 171:19
    you know, and not being able
    to even say anything about it.
  • 171:19 - 171:20
    Leslie?
  • 171:20 - 171:24
    >> Well, the question raises
    this bigger question
  • 171:24 - 171:28
    when we talk about players
    in the internet ecosystem.
  • 171:28 - 171:31
    The intermediaries
    are a critical part.
  • 171:31 - 171:36
    And we have-- under our law, we
    sort of provide them with a lot
  • 171:36 - 171:39
    of special protections
    because they can't be liable
  • 171:39 - 171:43
    for everything that's going
    on in their platforms.
  • 171:43 - 171:45
    And yet they're under
    enormous pressure,
  • 171:45 - 171:50
    this national security
    revelation is one small piece
  • 171:50 - 171:54
    of this very complex
    environment where when you ask
  • 171:54 - 171:57
    about governments and government
    is trying to regulate,
  • 171:57 - 172:02
    the first place that governments
    go in trying to solve any
  • 172:02 - 172:05
    social problem is to
    try to figure out how
  • 172:05 - 172:08
    to get the intermediaries to
    take on that responsibility.
  • 172:08 - 172:12
    So, this has been sort of
    a tension that's existed
  • 172:12 - 172:16
    since we actually had any
    kind of powerful intermediary.
  • 172:16 - 172:19
    So, that's one side, and the
    other side is the amounts
  • 172:19 - 172:21
    of power they have as
    a governance entity
  • 172:21 - 172:23
    to set the rules of
    [inaudible] for the internet.
  • 172:23 - 172:24
    And so--
  • 172:24 - 172:26
    >> And one of the
    interesting variables here is,
  • 172:26 - 172:29
    I noticed just recently that
    several of the big players,
  • 172:29 - 172:31
    Microsoft, Google, Yahoo,
  • 172:31 - 172:34
    I think all three have
    filed suit--
  • 172:34 - 172:35
    >> Right.
  • 172:35 - 172:38
    >> -- in an attempt to be
    able to be more transparent
  • 172:38 - 172:41
    about the role and the-- and
    the requirements they're under.
  • 172:41 - 172:43
    I think in part because of
    the point Laura was raising
  • 172:43 - 172:47
    that they themselves, given the
    culture that they're part of
  • 172:47 - 172:49
    and the stakeholders they have,
  • 172:49 - 172:52
    they are taking a big public
    relations and branding hit
  • 172:52 - 172:53
    for this kind of cooperation.
  • 172:53 - 172:55
    >> And they also have
    real leverage here
  • 172:55 - 172:59
    which they have exercised
    on occasion, you know.
  • 172:59 - 173:03
    Many of the big intermediaries
    Google, Twitter,
  • 173:03 - 173:07
    others have at times gone to
    court to challenge subpoenas
  • 173:07 - 173:09
    and other kinds of court
    orders they receive saying
  • 173:09 - 173:10
    that they're too broad.
  • 173:10 - 173:12
    I mean, Google had been--
  • 173:12 - 173:16
    had been under order to
    turn over large volumes
  • 173:16 - 173:18
    of search log data, this
    was just a criminal case,
  • 173:18 - 173:18
    it's not [inaudible] cases.
  • 173:18 - 173:20
    >> Right.
  • 173:20 - 173:21
    >> And they've challenged
    those and they, you know,
  • 173:21 - 173:24
    so the intermediaries have
    a really important role
  • 173:24 - 173:27
    because at least until
    there's a change in the law,
  • 173:27 - 173:30
    they are often the only ones
    who can actually bring any kind
  • 173:30 - 173:31
    of challenge to the
    scope of the--
  • 173:31 - 173:34
    of the authority that the
    government is claiming.
  • 173:34 - 173:37
    >> This is nothing new
    because in World War II,
  • 173:37 - 173:41
    the federal government used to
    look at the telegraphs at RCA
  • 173:41 - 173:44
    and look at the phone system
    for records and stuff like that.
  • 173:44 - 173:46
    I mean what's changed
    is the number of players
  • 173:46 - 173:49
    because back then it was
    just those two, I suppose,
  • 173:49 - 173:50
    plus maybe some minor players.
  • 173:50 - 173:51
    Now you have you know multi--
  • 173:51 - 173:53
    multinational groups
    all over the place.
  • 173:53 - 173:57
    But the technique, I mean,
    the tactic of government going
  • 173:57 - 174:02
    to a provider and getting the
    information they need, it's--
  • 174:02 - 174:03
    it's [inaudible] been there.
  • 174:03 - 174:05
    >> I would assume the
    Egyptian government did
  • 174:05 - 174:07
    that when the communications
    were chiseled, you know,
  • 174:07 - 174:09
    letters chiseled
    on stone tablets.
  • 174:09 - 174:11
    >> I'm going to-- We
    have five minutes left.
  • 174:11 - 174:14
    I'm going to give each of
    you one minute to--
  • 174:14 - 174:15
    we've had a long day,
  • 174:15 - 174:18
    very attentive audience,
    lot of issues.
  • 174:18 - 174:23
    But I want to give each of you
    a chance to make a final point
  • 174:23 - 174:25
    after this conversation.
  • 174:25 - 174:27
    You started with
    your initial remarks.
  • 174:27 - 174:30
    But what do you want the
    audience to be left with?
  • 174:30 - 174:34
    What-- in each of your mind
    is something significant we've
  • 174:34 - 174:36
    talked about today, an issue
    that you want to reinforce?
  • 174:36 - 174:38
    I want to give each of you
    a chance to do that,
  • 174:38 - 174:40
    and since we've started
    with you earlier,
  • 174:40 - 174:44
    I'll start with you this
    way and we'll go down.
  • 174:44 - 174:47
    >> So as contentious as
    this process is,
  • 174:47 - 174:50
    I think it's incredibly
    important,
  • 174:50 - 174:53
    the dialogue that's happening
    in the United States.
  • 174:53 - 174:57
    Hopefully, we can do more than
    just, you know, have battles
  • 174:57 - 175:00
    in the newspapers and on panels
    like this and really look
  • 175:00 - 175:04
    at our legal system and make
    sure that it has the kind
  • 175:04 - 175:07
    of accountability
    that we need.
  • 175:07 - 175:09
    Laura said, you know,
    the problem here is scale
  • 175:09 - 175:11
    and I really agree with that.
  • 175:11 - 175:12
    Courts have been good
  • 175:12 - 175:15
    at supervising electronic
    surveillance [inaudible] one
  • 175:15 - 175:18
    wiretap on one phone number
    or a handful of phone numbers.
  • 175:18 - 175:23
    The scale of intrusion here is
    such that we can only manage it
  • 175:23 - 175:28
    and give it proper oversight
    by using exactly the same kind
  • 175:28 - 175:30
    of computational tools
    and analytic tools
  • 175:30 - 175:32
    that are being used by the
    intelligence agencies to try
  • 175:32 - 175:34
    to mine this data to begin with.
  • 175:34 - 175:38
    And the one other thing I
    would just say is, this is--
  • 175:38 - 175:41
    as several people said,
    this is a global discussion.
  • 175:41 - 175:48
    The US is obviously an important
    part of the internet environment
  • 175:48 - 175:51
    but we are not the only
    country on the internet
  • 175:51 - 175:56
    and we do not control the whole
    internet and I think that just
  • 175:56 - 175:59
    as we are examining the behavior
    of our intelligence agencies,
  • 175:59 - 176:01
    I think it's very important
    that other countries
  • 176:01 - 176:05
    that have democratic values
    that believe in transparency
  • 176:05 - 176:08
    and due process really look
    very hard at their intelligence
  • 176:08 - 176:10
    and surveillance
    practices as well.
  • 176:10 - 176:11
    >> Thanks, Dan.
  • 176:11 - 176:12
    Lynn.
  • 176:12 - 176:16
    >> I guess I'd start with saying
  • 176:16 - 176:18
    that everybody's voice is
    extraordinarily important,
  • 176:18 - 176:20
    not only here in the US
  • 176:20 - 176:21
    but every individual
    around the world.
  • 176:21 - 176:24
    And we need attention
    and we need voice
  • 176:24 - 176:28
    to every issue whether it's
    this particular set of issues
  • 176:28 - 176:32
    or it's the United Nations'
    expressed interest
  • 176:32 - 176:35
    in some portions of
    the internet ecosystem.
  • 176:35 - 176:38
    And I think specific
    to this issue,
  • 176:38 - 176:40
    the only thing I would say is--
  • 176:40 - 176:42
    [inaudible] we're all
    very interconnected
  • 176:42 - 176:44
    and very interdependent.
  • 176:44 - 176:46
    And when we think about
    things such as security
  • 176:46 - 176:50
    or managing risk or trying
    to manage those tradeoffs,
  • 176:50 - 176:52
    we actually need
    to do that in a--
  • 176:52 - 176:54
    with both sides of the dialogue.
  • 176:54 - 176:57
    One actually looking at what
    we can actually do to protect
  • 176:57 - 177:01
    and ensure things that
    support economic prosperity
  • 177:01 - 177:04
    and social development
    while trading off
  • 177:04 - 177:07
    against preventing
    any perceived harm.
  • 177:07 - 177:11
    I think we're far too much to
    [inaudible] over on the side
  • 177:11 - 177:13
    of preventing perceived harm,
  • 177:13 - 177:15
    and that we don't quite
    have the balance right yet.
  • 177:15 - 177:17
    >> Laura.
  • 177:17 - 177:23
    >> I think it's important not to
    take the stability and success
  • 177:23 - 177:26
    and the security of the
    internet for granted.
  • 177:26 - 177:29
    So, we're used to it working for
    the most part because of efforts
  • 177:29 - 177:32
    like John's organization and
    others and it is working,
  • 177:32 - 177:36
    but we also have examples
    of problems with denial
  • 177:36 - 177:38
    of service attacks, examples
  • 177:38 - 177:40
    of governments
    cutting off access,
  • 177:40 - 177:42
    examples of interconnection
    problems.
  • 177:42 - 177:45
    We have trends away
    from interoperability
  • 177:45 - 177:46
    which I think is a
    really big problem.
  • 177:46 - 177:48
    We have trends away
    from anonymity
  • 177:48 - 177:51
    which changes the nature of
    the technical infrastructure.
  • 177:51 - 177:54
    So, point one is that we
    cannot take the stability
  • 177:54 - 177:55
    and security for granted.
  • 177:55 - 177:58
    It has required a
    tremendous amount of effort
  • 177:58 - 178:00
    on the parts of many,
    many people.
  • 178:00 - 178:04
    And it's something that concerns
    me every day when I see some
  • 178:04 - 178:08
    of the trends away from
    interoperability and security
  • 178:08 - 178:12
    and interconnection
    security, basically.
  • 178:12 - 178:13
    So, that's point number one.
  • 178:13 - 178:16
    The second point
    is if we believe
  • 178:16 - 178:20
    that we have the public sphere
    in the online environment,
  • 178:20 - 178:22
    if we believe that there's
    a technical mediation
  • 178:22 - 178:24
    of the public sphere,
    then we have to think
  • 178:24 - 178:25
    about what that means.
  • 178:25 - 178:27
    What does democracy look
  • 178:27 - 178:31
    like when the public
    sphere is now digital?
  • 178:31 - 178:35
    So, we know historically that
    possibilities for anonymity
  • 178:35 - 178:38
    or at least traceable anonymity
    have been closely linked
  • 178:38 - 178:39
    to democracy.
  • 178:39 - 178:44
    So, I think it's important
    to just ask the question
  • 178:44 - 178:45
    of what does democracy look
  • 178:45 - 178:49
    like when we have this technical
    mediation of the public sphere
  • 178:49 - 178:51
    and the privatization of
    conditions of civil liberties
  • 178:51 - 178:56
    where private companies are
    getting requests delegated
  • 178:56 - 178:59
    from governments or carrying
    out their own governance
  • 178:59 - 179:00
    of these infrastructures.
  • 179:00 - 179:02
    So, don't take the
    internet security
  • 179:02 - 179:04
    and stability for granted.
  • 179:04 - 179:06
    Remember that we have
    the technical mediation
  • 179:06 - 179:08
    of the public sphere
    and the privatization
  • 179:08 - 179:09
    of conditions of
    civil liberties.
  • 179:09 - 179:14
    And the final point is that
    internet governance is something
  • 179:14 - 179:19
    that is-- that the
    public can be engaged in.
  • 179:19 - 179:22
    We've seen examples of
    civil society action
  • 179:22 - 179:27
    of private action, civil
    liberties advocates involved
  • 179:27 - 179:29
    in decisions about
    internet governance.
  • 179:29 - 179:31
    There are many avenues
    to do that.
  • 179:31 - 179:33
    That the-- I do have an
    engineering background
  • 179:33 - 179:36
    but the technology
    is not that hard
  • 179:36 - 179:39
    that people can learn the
    technology, learn the issues
  • 179:39 - 179:41
    and get engaged in these
    debates which are very important
  • 179:41 - 179:43
    because as goes internet
    governance,
  • 179:43 - 179:45
    so goes internet freedom.
  • 179:45 - 179:47
    >> Randy?
  • 179:47 - 179:51
    >> All of you here that are
    attending this panel discussion,
  • 179:51 - 179:54
    you all have a stake in what
    we've been talking about.
  • 179:54 - 179:58
    And so I would challenge you
    guys to not only make sure
  • 179:58 - 179:59
    that your peers understand
    what some of the pressures are
  • 179:59 - 180:01
    and what some of the
    pitfalls and some
  • 180:01 - 180:02
    of the disadvantages are of
    working with the internet.
  • 180:02 - 180:07
    But also, you guys are
    probably going to be on a track
  • 180:07 - 180:11
    where you're going to be
    working with legislators
  • 180:11 - 180:14
    and policy makers
    and things like that.
  • 180:14 - 180:17
    And make sure that they
    understand the implications
  • 180:17 - 180:20
    of whatever it is that they're
    trying to write up or implement,
  • 180:20 - 180:23
    both on the legal side
    and on the policy side.
  • 180:23 - 180:27
    That's, to me, what I see as your
    contribution to the whole thing.
  • 180:27 - 180:29
    It's going to be
    very incremental.
  • 180:29 - 180:31
    It's going to take a long time.
  • 180:31 - 180:33
    You know if some people talk
    about the fact that it's
  • 180:33 - 180:35
    like when the automobile was
    first introduced at the turn
  • 180:35 - 180:39
    of the century and it took us
    25 years to come up with the set
  • 180:39 - 180:41
    of laws and procedures
    to make things work
  • 180:41 - 180:42
    and we haven't done that yet.
  • 180:42 - 180:45
    But you guys are the ones that
    are going to be talking to
  • 180:45 - 180:49
    and down the road
    influencing policy and laws.
  • 180:49 - 180:51
    And so, you know,
    take the challenge,
  • 180:51 - 180:55
    do it and influence
    what you can.
  • 180:55 - 180:56
    >> Leslie.
  • 180:56 - 180:59
    >> So I'm just going
    to make one point.
  • 180:59 - 181:03
    And that's that those of us
    who are Americans or people
  • 181:03 - 181:07
    in the United States have
    a voice in this debate
  • 181:07 - 181:09
    that we need to exercise.
  • 181:09 - 181:11
    But we need to exercise
    it with more
  • 181:11 - 181:13
    than our own rights in mind.
  • 181:13 - 181:17
    If-- the interesting thing
    about the internet and the lack
  • 181:17 - 181:19
    of a government control is the--
  • 181:19 - 181:23
    it's also very unclear
    about who's responsible
  • 181:23 - 181:30
    for the human rights of people
    in the online environment.
  • 181:30 - 181:34
    And we-- I think the NSA example
    shows that we have data flowing all
  • 181:34 - 181:37
    over the world, states
    have obligations to people
  • 181:37 - 181:44
    within their borders, and
    a few other kinds of places
  • 181:44 - 181:46
    that the law has developed.
  • 181:46 - 181:49
    We're not sure who has the
    obligation for the rights
  • 181:49 - 181:51
    of internet users in
    this kind of environment.
  • 181:51 - 181:57
    And it's a conversation
    that really needs to happen
  • 181:57 - 181:59
    because the-- our lives,
    some significant part
  • 181:59 - 182:04
    of who we are is now online
    and flowing through these wires
  • 182:04 - 182:07
    and who ultimately is
    responsible to make sure
  • 182:07 - 182:10
    that basic privacy and free
    expression rights are honored
  • 182:10 - 182:13
    has become increasingly complex.
  • 182:13 - 182:16
    So, people need to
    take that on as well.
  • 182:16 - 182:16
    >> Last word, John.
  • 182:16 - 182:21
    >> It's a global internet.
  • 182:21 - 182:24
    There needs to be an
    equally global discussion
  • 182:24 - 182:28
    about the principles
    by which we operate it.
  • 182:28 - 182:31
    And part of what's occurred
  • 182:31 - 182:35
    over the last few months will
    help encourage one aspect
  • 182:35 - 182:36
    of that discussion.
  • 182:36 - 182:38
    >> So, it's probably
    a good thing.
  • 182:38 - 182:39
    >> I want to thank all
    of you for being here.
  • 182:39 - 182:41
    I want to thank the
    Internet Society.
  • 182:41 - 182:43
    I want to thank the GW
    Engineering Department
  • 182:43 - 182:46
    and their [inaudible]
    Lance Hoffman's Institute.
  • 182:46 - 182:48
    Long afternoon, thanks
    for your patience.
  • 182:48 - 182:49
    I hope you learned something.
  • 182:49 - 182:51
    Come back again.
  • 182:51 - 182:52
    Thanks a lot.
  • 182:52 - 182:54
    [Applause]
Title:
INET Washington DC: Surveillance, Cybersecurity and the Future of the Internet
Video Language:
English
Duration:
03:02:55