0:00:00.080,0:00:02.136 Cyndi Stivers: So, future of storytelling. 0:00:02.160,0:00:03.976 Before we do the future, 0:00:04.000,0:00:07.896 let's talk about what is never[br]going to change about storytelling. 0:00:07.920,0:00:10.016 Shonda Rhimes:[br]What's never going to change. 0:00:10.040,0:00:12.776 Obviously, I think good stories[br]are never going to change, 0:00:12.800,0:00:16.576 the need for people to gather together[br]and exchange their stories 0:00:16.600,0:00:19.816 and to talk about the things[br]that feel universal, 0:00:19.840,0:00:22.776 the idea that we all feel[br]a compelling need to watch stories, 0:00:22.800,0:00:24.760 to tell stories, to share stories -- 0:00:26.000,0:00:27.976 sort of the gathering around the campfire 0:00:28.000,0:00:30.296 to discuss the things[br]that tell each one of us 0:00:30.320,0:00:32.000 that we are not alone in the world. 0:00:32.880,0:00:35.016 Those things to me[br]are never going to change. 0:00:35.040,0:00:38.200 That essence of storytelling[br]is never going to change. 0:00:38.880,0:00:41.736 CS: OK. In preparation[br]for this conversation, 0:00:41.760,0:00:43.656 I checked in with Susan Lyne, 0:00:43.680,0:00:45.776 who was running ABC Entertainment 0:00:45.800,0:00:48.576 when you were working[br]on "Grey's Anatomy" -- 0:00:48.600,0:00:49.816 SR: Yes. 0:00:49.840,0:00:52.736 CS: And she said that there was[br]this indelible memory she had 0:00:52.760,0:00:54.736 of your casting process, 0:00:54.760,0:00:57.416 where without discussing it[br]with any of the executives, 0:00:57.440,0:00:59.816 you got people coming in[br]to read for your scripts, 0:00:59.840,0:01:03.936 and every one of them[br]was the full range of humanity, 0:01:03.960,0:01:07.216 you did not type anyone in any way, 0:01:07.240,0:01:10.056 and that it was completely surprising. 
0:01:10.080,0:01:14.736 So she said, in addition[br]to retraining the studio executives, 0:01:14.760,0:01:16.856 you also, she feels, 0:01:16.880,0:01:19.416 and I think this is -- I agree, 0:01:19.440,0:01:23.840 retrained the expectations[br]of the American TV audience. 0:01:24.360,0:01:29.520 So what else does the audience[br]not yet realize that it needs? 0:01:30.480,0:01:32.336 SR: What else does it not yet realize? 0:01:32.360,0:01:35.496 Well, I mean, I don't think[br]we're anywhere near there yet. 0:01:35.520,0:01:37.256 I mean, we're still in a place 0:01:37.280,0:01:44.216 in which we're far, far behind what looks[br]like the real world in actuality. 0:01:44.240,0:01:47.936 I wasn't bringing in[br]a bunch of actors 0:01:47.960,0:01:50.576 who looked very different from one another 0:01:50.600,0:01:52.696 simply because I was[br]trying to make a point, 0:01:52.720,0:01:55.056 and I wasn't trying[br]to do anything special. 0:01:55.080,0:01:59.056 It never occurred to me[br]that that was new, different or weird. 0:01:59.080,0:02:02.136 I just brought in actors[br]because I thought they were interesting 0:02:02.160,0:02:05.976 and to me, the idea that it[br]was completely surprising to everybody -- 0:02:06.000,0:02:08.175 I didn't know that for a while. 0:02:08.199,0:02:11.416 I just thought: these are the actors[br]I want to see play these parts. 0:02:11.440,0:02:13.656 I want to see what[br]they look like if they read. 0:02:13.680,0:02:14.936 We'll see what happens. 0:02:14.960,0:02:18.336 So I think the interesting thing[br]that happens is 0:02:18.360,0:02:20.896 that when you look at the world[br]through another lens, 0:02:20.920,0:02:25.176 when you're not the person[br]normally in charge of things, 0:02:25.200,0:02:26.840 it just comes out a different way. 
0:02:28.400,0:02:32.816 CS: So you now have[br]this big machine that you run, 0:02:32.840,0:02:35.856 as a titan -- as you know,[br]last year when she gave her talk -- 0:02:35.880,0:02:37.816 she's a titan. 0:02:37.840,0:02:41.216 So what do you think[br]is going to happen as we go on? 0:02:41.240,0:02:46.136 There's a huge amount of money[br]involved in producing these shows. 0:02:46.160,0:02:52.376 While the tools of making stories[br]have gone and gotten greatly democratized, 0:02:52.400,0:02:54.936 there's still this large distribution: 0:02:54.960,0:03:00.056 people who rent networks,[br]who rent the audience to advertisers 0:03:00.080,0:03:01.776 and make it all pay. 0:03:01.800,0:03:07.016 How do you see the business model changing[br]now that anyone can be a storyteller? 0:03:07.040,0:03:08.776 SR: I think it's changing every day. 0:03:08.800,0:03:11.736 I mean, the rapid, rapid change[br]that's happening is amazing. 0:03:11.760,0:03:15.096 And I feel -- the panic is palpable, 0:03:15.120,0:03:16.936 and I don't mean that in a bad way. 0:03:16.960,0:03:18.616 I think it's kind of exciting. 0:03:18.640,0:03:23.296 The idea that there's[br]sort of an equalizer happening, 0:03:23.320,0:03:27.096 that sort of means that anybody[br]can make something, is wonderful. 0:03:27.120,0:03:32.856 I think there's some scary in the idea[br]that you can't find the good work now. 0:03:32.880,0:03:34.376 There's so much work out there. 0:03:34.400,0:03:37.496 I think there's something like[br]417 dramas on television right now 0:03:37.520,0:03:40.056 at any given time in any given place, 0:03:40.080,0:03:41.336 but you can't find them. 0:03:41.360,0:03:42.776 You can't find the good ones. 0:03:42.800,0:03:46.456 So there's a lot of bad stuff out there[br]because everybody can make something. 0:03:46.480,0:03:48.496 It's like if everybody painted a painting. 0:03:48.520,0:03:51.016 You know, there's not[br]that many good painters. 
0:03:51.040,0:03:54.056 But finding the good stories,[br]the good shows, 0:03:54.080,0:03:55.656 is harder and harder and harder. 0:03:55.680,0:03:58.296 Because if you have[br]one tiny show over here on AMC 0:03:58.320,0:04:00.216 and one tiny show over here over there, 0:04:00.240,0:04:02.576 finding where they are[br]becomes much harder. 0:04:02.600,0:04:04.456 So I think that ferreting out the gems 0:04:04.480,0:04:07.576 and finding out who made[br]the great webisode and who made this, 0:04:07.600,0:04:09.696 it's -- I mean, think[br]about the poor critics 0:04:09.720,0:04:11.416 who now are spending 24 hours a day 0:04:11.440,0:04:13.496 trapped in their homes[br]watching everything. 0:04:13.520,0:04:15.616 It's not an easy job right now. 0:04:15.640,0:04:19.216 So the distribution engines[br]are getting more and more vast, 0:04:19.240,0:04:22.176 but finding the good programming[br]for everybody in the audience 0:04:22.200,0:04:23.416 is getting harder. 0:04:23.440,0:04:25.216 And unlike the news, 0:04:25.240,0:04:28.776 where everything's getting[br]winnowed down to just who you are, 0:04:28.800,0:04:30.416 television seems to be getting -- 0:04:30.440,0:04:34.376 and by television I mean anything[br]you can watch, television shows on -- 0:04:34.400,0:04:36.616 seems to be getting[br]wider and wider and wider. 0:04:36.640,0:04:39.296 And so anybody's making stories, 0:04:39.320,0:04:41.616 and the geniuses are sometimes hidden. 0:04:41.640,0:04:44.416 But it's going to be harder to find, 0:04:44.440,0:04:46.456 and at some point that will collapse. 0:04:46.480,0:04:48.216 People keep talking about peak TV. 0:04:48.240,0:04:50.216 I don't know when that's going to happen. 0:04:50.240,0:04:52.696 I think at some point[br]it'll collapse a little bit 0:04:52.720,0:04:54.616 and we'll, sort of, come back together. 0:04:54.640,0:04:56.816 I don't know if it[br]will be network television. 0:04:56.840,0:04:58.880 I don't know if that model is sustainable. 
0:04:59.440,0:05:00.976 CS: What about the model 0:05:01.000,0:05:06.440 that Amazon and Netflix are throwing[br]a lot of money around right now? 0:05:07.520,0:05:09.816 SR: That is true. 0:05:09.840,0:05:11.496 I think it's an interesting model. 0:05:11.520,0:05:13.616 I think there's[br]something exciting about it. 0:05:13.640,0:05:16.776 For content creators, I think[br]there's something exciting about it. 0:05:16.800,0:05:19.536 For the world, I think[br]there's something exciting about it. 0:05:19.560,0:05:21.296 The idea that there are programs now 0:05:21.320,0:05:24.776 that can be in multiple languages[br]with characters from all over the world 0:05:24.800,0:05:27.776 that are appealing and come out[br]for everybody at the same time 0:05:27.800,0:05:29.216 is exciting. 0:05:29.240,0:05:33.736 I mean, I think the international sense[br]that television can now take on 0:05:33.760,0:05:34.976 makes sense to me, 0:05:35.000,0:05:36.616 that programming can now take on. 0:05:36.640,0:05:39.896 Television so much is made for, like --[br]here's our American audience. 0:05:39.920,0:05:41.176 We make these shows, 0:05:41.200,0:05:43.216 and then they shove them[br]out into the world 0:05:43.240,0:05:44.536 and hope for the best, 0:05:44.560,0:05:48.496 as opposed to really thinking[br]about the fact that America is not it. 0:05:48.520,0:05:51.256 I mean, we love ourselves[br]and everything, but it's not it. 0:05:51.280,0:05:54.456 And we should be[br]taking into account the fact 0:05:54.480,0:05:56.976 that there are all[br]of these other places in the world 0:05:57.000,0:06:00.016 that we should be interested in[br]while we're telling stories. 0:06:00.040,0:06:02.120 It makes the world smaller. 0:06:03.520,0:06:04.736 I don't know. 0:06:04.760,0:06:09.736 I think it pushes forward the idea[br]that the world is a universal place, 0:06:09.760,0:06:11.696 and our stories become universal things. 0:06:11.720,0:06:12.960 We stop being other.
0:06:13.640,0:06:17.456 CS: You've pioneered, as far as I can see, 0:06:17.480,0:06:20.096 interesting ways to launch new shows, too. 0:06:20.120,0:06:23.136 I mean, when you[br]launched "Scandal" in 2012, 0:06:23.160,0:06:26.616 there was this amazing groundswell[br]of support on Twitter 0:06:26.640,0:06:29.776 the likes of which nobody had seen before. 0:06:29.800,0:06:32.216 Do you have any other[br]tricks up your sleeve 0:06:32.240,0:06:34.376 when you launch your next one? 0:06:34.400,0:06:36.536 What do you think[br]will happen in that regard? 0:06:36.560,0:06:39.016 SR: We do have some interesting ideas. 0:06:39.040,0:06:42.176 We have a show called "Still Star-Crossed"[br]coming out this summer. 0:06:42.200,0:06:44.136 We have some interesting ideas for that. 0:06:44.160,0:06:46.896 I'm not sure if we're going[br]to be able to do them in time. 0:06:46.920,0:06:48.136 I thought they were fun. 0:06:48.160,0:06:50.336 But the idea[br]that we would live-tweet our show 0:06:50.360,0:06:52.536 was really just us thinking[br]that would be fun. 0:06:52.560,0:06:56.096 We didn't realize that the critics[br]would start to live-tweet along with us. 0:06:56.120,0:06:58.496 But the fans -- getting people[br]to be a part of it, 0:06:58.520,0:07:00.056 making it more of a campfire -- 0:07:00.080,0:07:02.216 you know, when you're all[br]on Twitter together 0:07:02.240,0:07:03.816 and you're all talking together, 0:07:03.840,0:07:05.496 it is more of a shared experience, 0:07:05.520,0:07:07.656 and finding other ways[br]to make that possible 0:07:07.680,0:07:10.016 and finding other ways[br]to make people feel engaged 0:07:10.040,0:07:11.240 is important. 0:07:12.320,0:07:15.960 CS: So when you have[br]all those different people making stories 0:07:16.960,0:07:19.256 and only some of them[br]are going to break through 0:07:19.280,0:07:21.056 and get that audience somehow, 0:07:21.080,0:07:24.056 how do you think[br]storytellers will get paid? 
0:07:24.080,0:07:27.016 SR: I actually have been struggling[br]with this concept as well. 0:07:27.040,0:07:29.136 Is it going to be a subscriber model? 0:07:29.160,0:07:33.816 Are people going to say, like, I'm going[br]to watch this particular person's shows, 0:07:33.840,0:07:35.576 and that's how we're going to do it? 0:07:35.600,0:07:38.336 CS: I think we should buy[br]a passport to Shondaland. Right? 0:07:38.360,0:07:41.616 SR: I don't know about that, but yeah.[br]That's a lot more work for me. 0:07:41.640,0:07:44.456 I do think that there are[br]going to be different ways, 0:07:44.480,0:07:45.976 but I don't know necessarily. 0:07:46.000,0:07:48.656 I mean, I'll be honest and say[br]a lot of content creators 0:07:48.680,0:07:52.096 are not necessarily interested[br]in being distributors, 0:07:52.120,0:07:55.336 mainly because what I dream of doing 0:07:55.360,0:07:56.776 is creating content. 0:07:56.800,0:07:58.656 I really love to create content. 0:07:58.680,0:07:59.896 I want to get paid for it 0:07:59.920,0:08:03.096 and I want to get paid the money[br]that I deserve to get paid for it, 0:08:03.120,0:08:05.056 and there's a hard part in finding that. 0:08:05.080,0:08:07.616 But I also want it to be made possible 0:08:07.640,0:08:10.616 for, you know,[br]the people who work with me, 0:08:10.640,0:08:11.976 the people who work for me, 0:08:12.000,0:08:15.376 everybody to sort of get paid in a way,[br]and they're all making a living. 0:08:15.400,0:08:18.600 How it gets distributed[br]is getting harder and harder. 0:08:20.000,0:08:22.616 CS: How about the many new tools, 0:08:22.640,0:08:25.856 you know, VR, AR ... 0:08:25.880,0:08:29.896 I find it fascinating[br]that you can't really binge-watch, 0:08:29.920,0:08:33.176 you can't fast-forward in those things. 0:08:33.200,0:08:36.336 What do you see as the future[br]of those for storytelling? 
0:08:36.360,0:08:39.096 SR: I spent a lot of time in the past year 0:08:39.120,0:08:40.696 just exploring those, 0:08:40.720,0:08:43.376 getting lots of demonstrations[br]and paying attention. 0:08:43.400,0:08:45.216 I find them fascinating, 0:08:45.240,0:08:47.216 mainly because I think that -- 0:08:47.240,0:08:49.536 I think most people[br]think of them for gaming, 0:08:49.560,0:08:52.256 I think most people think of them[br]for things like action, 0:08:52.280,0:08:54.936 and I think that there is[br]a sense of intimacy 0:08:54.960,0:08:58.856 that is very present in those things, 0:08:58.880,0:09:01.376 the idea that -- picture this, 0:09:01.400,0:09:04.896 you can sit there[br]and have a conversation with Fitz, 0:09:04.920,0:09:07.096 or at least sit there[br]while Fitz talks to you, 0:09:07.120,0:09:09.016 President Fitzgerald Grant III, 0:09:09.040,0:09:10.336 while he talks to you 0:09:10.360,0:09:12.496 about why he's making[br]a choice that he makes, 0:09:12.520,0:09:14.136 and it's a very heartfelt moment. 0:09:14.160,0:09:16.736 And instead of you watching[br]a television screen, 0:09:16.760,0:09:19.976 you're sitting there next to him,[br]and he's having this conversation. 0:09:20.000,0:09:21.656 Now, you fall in love with the man 0:09:21.680,0:09:23.816 while he's doing it[br]from a television screen. 0:09:23.840,0:09:25.216 Imagine sitting next to him, 0:09:25.240,0:09:29.296 or being with a character like Huck[br]who's about to execute somebody. 0:09:29.320,0:09:30.816 And instead of having a scene 0:09:30.840,0:09:34.656 where, you know, he's talking[br]to another character very rapidly, 0:09:34.680,0:09:38.056 he goes into a closet and turns to you[br]and tells you, you know, 0:09:38.080,0:09:40.696 what's going to happen[br]and why he's afraid and nervous. 
0:09:40.720,0:09:43.736 It's a little more like theater,[br]and I'm not sure it would work, 0:09:43.760,0:09:46.456 but I'm fascinated by the concept[br]of something like that 0:09:46.480,0:09:48.456 and what that would mean for an audience. 0:09:48.480,0:09:51.176 And to get to play with those ideas[br]would be interesting, 0:09:51.200,0:09:55.616 and I think, you know, for my audience,[br]the people who watch my shows, 0:09:55.640,0:09:57.936 which is, you know, women 12 to 75, 0:09:57.960,0:10:00.640 there's something interesting[br]in there for them. 0:10:02.720,0:10:05.496 CS: And how about[br]the input of the audience? 0:10:05.520,0:10:07.336 How interested are you in the things 0:10:07.360,0:10:10.496 where the audience[br]can actually go up to a certain point 0:10:10.520,0:10:14.176 and then decide, oh wait,[br]I'm going to choose my own adventure. 0:10:14.200,0:10:17.136 I'm going to run off with Fitz[br]or I'm going to run off with -- 0:10:17.160,0:10:19.376 SR: Oh, the choose-[br]your-own-adventure stories. 0:10:19.400,0:10:20.856 I have a hard time with those, 0:10:20.880,0:10:24.016 and not necessarily because[br]I want to be in control of everything, 0:10:24.040,0:10:27.416 but because when I'm watching television[br]or I'm watching a movie, 0:10:27.440,0:10:32.256 I know for a fact[br]that a story is not as good 0:10:32.280,0:10:35.056 when I have control[br]over exactly what's going to happen 0:10:35.080,0:10:36.816 to somebody else's character. 0:10:36.840,0:10:40.616 You know, if I could tell you exactly[br]what I wanted to happen to Walter White, 0:10:40.640,0:10:44.216 that's great, but the story[br]is not the same, and it's not as powerful. 0:10:44.240,0:10:46.816 You know, if I'm in charge[br]of how "The Sopranos" ends, 0:10:46.840,0:10:50.016 then that's lovely and I have an ending[br]that's nice and satisfying, 0:10:50.040,0:10:53.216 but it's not the same story[br]and it's not the same emotional impact.
0:10:53.240,0:10:56.616 CS: I can't stop imagining[br]what that might be. 0:10:56.640,0:10:58.456 Sorry, you're losing me for a minute. 0:10:58.480,0:11:01.056 SR: But what's wonderful is[br]I don't get to imagine it, 0:11:01.080,0:11:03.216 because Vince has his own ending, 0:11:03.240,0:11:06.776 and it makes it really powerful[br]to know that somebody else has told it. 0:11:06.800,0:11:09.056 You know, if you could[br]decide that, you know, 0:11:09.080,0:11:11.296 in "Jaws," the shark wins or something, 0:11:11.320,0:11:14.416 it doesn't do what it needs to do for you. 0:11:14.440,0:11:16.155 The story is the story that is told, 0:11:16.179,0:11:18.896 and you can walk away angry[br]and you can walk away debating 0:11:18.920,0:11:20.376 and you can walk away arguing, 0:11:20.400,0:11:21.640 but that's why it works. 0:11:22.280,0:11:23.536 That is why it's art. 0:11:23.560,0:11:25.376 Otherwise, it's just a game, 0:11:25.400,0:11:28.416 and games can be art,[br]but in a very different way. 0:11:28.440,0:11:32.016 CS: Gamers who actually[br]sell the right to sit there 0:11:32.040,0:11:34.176 and comment on what's happening, 0:11:34.200,0:11:37.176 to me that's more community[br]than storytelling. 0:11:37.200,0:11:39.176 SR: And that is its own form of campfire. 0:11:39.200,0:11:42.336 I don't discount that[br]as a form of storytelling, 0:11:42.360,0:11:44.760 but it is a group form, I suppose. 0:11:46.200,0:11:49.656 CS: All right,[br]what about the super-super -- 0:11:49.680,0:11:52.896 the fact that everything's[br]getting shorter, shorter, shorter. 0:11:52.920,0:11:56.176 And, you know, Snapchat[br]now has something it calls shows 0:11:56.200,0:11:57.800 that are one minute long. 0:11:59.080,0:12:00.320 SR: It's interesting. 0:12:02.880,0:12:05.480 Part of me thinks[br]it sounds like commercials. 0:12:06.320,0:12:09.096 I mean, it does -- like, sponsored by. 0:12:09.120,0:12:11.736 But part of me also gets it completely.
0:12:11.760,0:12:13.856 There's something[br]really wonderful about it. 0:12:13.880,0:12:15.176 If you think about a world 0:12:15.200,0:12:18.096 in which most people[br]are watching television on their phones, 0:12:18.120,0:12:19.976 if you think about a place like India, 0:12:20.000,0:12:21.736 where most of the input is coming in 0:12:21.760,0:12:24.136 and that's where[br]most of the product is coming in, 0:12:24.160,0:12:25.416 shorter makes sense. 0:12:25.440,0:12:28.976 If you can charge people more[br]for shorter periods of content, 0:12:29.000,0:12:32.256 some distributor has figured out[br]a way to make a lot more money. 0:12:32.280,0:12:34.296 If you're making content, 0:12:34.320,0:12:37.216 it costs less money[br]to make it and put it out there. 0:12:37.240,0:12:38.456 And, by the way, 0:12:38.480,0:12:42.976 if you're 14 and have[br]a short attention span, like my daughter, 0:12:43.000,0:12:45.736 that's what you want to see,[br]that's what you want to make, 0:12:45.760,0:12:46.976 that's how it works. 0:12:47.000,0:12:51.216 And if you do it right[br]and it actually feels like narrative, 0:12:51.240,0:12:53.560 people will hang on for it[br]no matter what you do. 0:12:54.560,0:12:56.456 CS: I'm glad you raised your daughters, 0:12:56.480,0:13:01.296 because I am wondering how are they[br]going to consume entertainment, 0:13:01.320,0:13:03.576 and also not just entertainment, 0:13:03.600,0:13:04.800 but news, too. 0:13:05.960,0:13:08.936 When they're not -- I mean,[br]the algorithmic robot overlords 0:13:08.960,0:13:12.056 are going to feed them[br]what they've already done. 0:13:12.080,0:13:16.680 How do you think we will correct for that[br]and make people well-rounded citizens? 0:13:17.760,0:13:19.936 SR: Well, me and how I correct for it 0:13:19.960,0:13:22.736 is completely different[br]than how somebody else might do it. 0:13:22.760,0:13:24.856 CS: Feel free to speculate. 
0:13:24.880,0:13:27.856 SR: I really don't know[br]how we're going to do it in the future. 0:13:27.880,0:13:31.296 I mean, my poor children have been[br]the subject of all of my experiments. 0:13:31.320,0:13:33.696 We're still doing[br]what I call "Amish summers" 0:13:33.720,0:13:35.456 where I turn off all electronics 0:13:35.480,0:13:37.536 and pack away[br]all their computers and stuff 0:13:37.560,0:13:40.536 and watch them scream for a while[br]until they settle down 0:13:40.560,0:13:43.320 into, like, an electronic-free summer. 0:13:44.280,0:13:46.976 But honestly, it's a very hard world 0:13:47.000,0:13:49.016 in which now, as grown-ups, 0:13:49.040,0:13:52.176 we're so interested[br]in watching our own thing, 0:13:52.200,0:13:55.376 and we don't even know[br]that we're being fed, sometimes, 0:13:55.400,0:13:57.136 just our own opinions. 0:13:57.160,0:13:58.856 You know, the way it's working now, 0:13:58.880,0:14:00.136 you're watching a feed, 0:14:00.160,0:14:01.776 and the feeds are being corrected 0:14:01.800,0:14:03.936 so that you're only getting[br]your own opinions 0:14:03.960,0:14:06.496 and you're feeling[br]more and more right about yourself. 0:14:06.520,0:14:08.376 So how do you really start to discern? 0:14:08.400,0:14:10.216 It's getting a little bit disturbing. 0:14:10.240,0:14:13.056 So maybe it'll overcorrect,[br]maybe it'll all explode, 0:14:13.080,0:14:14.720 or maybe we'll all just become -- 0:14:16.080,0:14:17.616 I hate to be negative about it, 0:14:17.640,0:14:21.576 but maybe we'll all[br]just become more idiotic. 0:14:21.600,0:14:23.336 (Cyndi laughs) 0:14:23.360,0:14:26.776 CS: Yeah, can you picture[br]any corrective that you could do 0:14:26.800,0:14:29.696 with scripted, fictional work? 
0:14:29.720,0:14:33.376 SR: I think a lot about the fact[br]that television has the power 0:14:33.400,0:14:35.136 to educate people in a powerful way, 0:14:35.160,0:14:37.016 and when you're watching television -- 0:14:37.040,0:14:40.296 for instance, they do studies[br]about medical shows. 0:14:40.320,0:14:42.456 I think it's 87 percent,[br]87 percent of people 0:14:42.480,0:14:45.856 get most of their knowledge[br]about medicine and medical facts 0:14:45.880,0:14:47.336 from medical shows, 0:14:47.360,0:14:49.496 much more so than[br]they do from their doctors, 0:14:49.520,0:14:50.896 than from articles. 0:14:50.920,0:14:54.336 So we work really hard to be accurate,[br]and every time we make a mistake, 0:14:54.360,0:14:57.176 I feel really guilty,[br]like we're going to do something bad, 0:14:57.200,0:14:59.856 but we also give a lot[br]of good medical information. 0:14:59.880,0:15:02.936 There are so many other ways[br]to give information on those shows. 0:15:02.960,0:15:04.336 People are being entertained 0:15:04.360,0:15:06.416 and maybe they don't want[br]to read the news, 0:15:06.440,0:15:09.856 but there are a lot of ways to give[br]fair information out on those shows, 0:15:09.880,0:15:14.496 not in some creepy, like,[br]we're going to control people's minds way, 0:15:14.520,0:15:17.376 but in a way that's sort of[br]very interesting and intelligent 0:15:17.400,0:15:20.856 and not about pushing[br]one side's version or the other, 0:15:20.880,0:15:22.216 like, giving out the truth. 0:15:22.240,0:15:24.176 It would be strange, though, 0:15:24.200,0:15:27.936 if television drama[br]was how we were giving the news. 0:15:27.960,0:15:29.216 CS: It would be strange, 0:15:29.240,0:15:32.456 but I gather a lot of what[br]you've written as fiction 0:15:32.480,0:15:34.880 has become prediction this season? 0:15:35.560,0:15:38.816 SR: You know, "Scandal" has been[br]very disturbing for that reason. 
0:15:38.840,0:15:41.816 We have this show[br]that's about politics gone mad, 0:15:41.840,0:15:44.776 and basically the way[br]we've always told the show -- 0:15:44.800,0:15:47.176 you know, everybody[br]pays attention to the papers. 0:15:47.200,0:15:49.336 We read everything.[br]We talk about everything. 0:15:49.360,0:15:51.296 We have lots of friends in Washington. 0:15:51.320,0:15:54.176 And we'd always sort of[br]done our show as a speculation. 0:15:54.200,0:15:55.696 We'd sit in the room and think, 0:15:55.720,0:15:57.976 what would happen[br]if the wheels came off the bus 0:15:58.000,0:15:59.296 and everything went crazy? 0:15:59.320,0:16:00.856 And that was always great, 0:16:00.880,0:16:03.816 except now it felt like[br]the wheels were coming off the bus 0:16:03.840,0:16:05.656 and things were actually going crazy, 0:16:05.680,0:16:08.656 so the things that we were speculating[br]were really coming true. 0:16:08.680,0:16:10.056 I mean, our season this year 0:16:10.080,0:16:13.856 was going to end with the Russians[br]controlling the American election, 0:16:13.880,0:16:16.536 and we'd written it, we'd planned for it, 0:16:16.560,0:16:17.776 it was all there, 0:16:17.800,0:16:21.616 and then the Russians were suspected[br]of being involved in the American election 0:16:21.640,0:16:24.976 and we suddenly had to change[br]what we were going to do for our season. 0:16:25.000,0:16:26.336 I walked in and I was like, 0:16:26.360,0:16:29.216 "That scene where our mystery woman[br]starts speaking Russian? 0:16:29.240,0:16:32.096 We have to fix that[br]and figure out what we're going to do." 0:16:32.120,0:16:33.776 That just comes from extrapolating 0:16:33.800,0:16:36.016 out from what we thought[br]was going to happen, 0:16:36.040,0:16:37.520 or what we thought was crazy. 0:16:38.520,0:16:39.816 CS: That's great. 0:16:39.840,0:16:44.736 So where else in the US or elsewhere[br]in the world do you look?
0:16:44.760,0:16:47.216 Who is doing interesting[br]storytelling right now? 0:16:47.240,0:16:50.256 SR: I don't know, there's a lot[br]of interesting stuff out there. 0:16:50.280,0:16:53.216 Obviously British television[br]is always amazing 0:16:53.240,0:16:55.856 and always does interesting things. 0:16:55.880,0:16:58.176 I don't get to watch a lot of TV, 0:16:58.200,0:17:00.616 mainly because I'm busy working. 0:17:00.640,0:17:04.256 And I pretty much try not to watch[br]very much television at all, 0:17:04.280,0:17:07.175 even American television,[br]until I'm done with a season, 0:17:07.200,0:17:09.695 because things start[br]to creep into my head otherwise. 0:17:09.720,0:17:11.656 I start to wonder, like, 0:17:11.680,0:17:15.096 why can't our characters wear crowns[br]and talk about being on a throne? 0:17:15.119,0:17:16.776 It gets crazy. 0:17:16.800,0:17:20.536 So I try not to watch much[br]until the seasons are over. 0:17:20.560,0:17:24.296 But I do think that there's a lot of[br]interesting European television out there. 0:17:24.319,0:17:26.215 I was at the International Emmys 0:17:26.240,0:17:29.216 and looking around and seeing[br]the stuff that they were showing, 0:17:29.240,0:17:30.656 and I was kind of fascinated. 0:17:30.680,0:17:33.416 There's some stuff[br]I want to watch and check out. 0:17:33.440,0:17:34.696 CS: Can you imagine -- 0:17:34.720,0:17:38.256 I know that you don't spend a lot of time[br]thinking about tech stuff, 0:17:38.280,0:17:41.296 but you know how a few years ago[br]we had someone here at TED 0:17:41.320,0:17:43.576 talking about seeing, 0:17:43.600,0:17:49.736 wearing Google Glass and seeing[br]your TV shows essentially in your eye? 0:17:49.760,0:17:52.056 Do you ever fantasize when, you know -- 0:17:52.080,0:17:54.536 the little girl[br]who sat on the pantry floor 0:17:54.560,0:17:56.296 in your parents' house, 0:17:56.320,0:17:58.920 did you ever imagine any other medium? 0:18:00.200,0:18:01.656 Or would you now? 
0:18:01.680,0:18:03.016 SR: Any other medium. 0:18:03.040,0:18:04.736 For storytelling, other than books? 0:18:04.760,0:18:07.576 I mean, I grew up wanting[br]to be Toni Morrison, so no. 0:18:07.600,0:18:09.576 I mean, I didn't even imagine television. 0:18:09.600,0:18:13.456 So the idea that there could be[br]some bigger world, 0:18:13.480,0:18:15.656 some more magical way of making things -- 0:18:15.680,0:18:17.936 I'm always excited[br]when new technology comes out 0:18:17.960,0:18:20.976 and I'm always the first one[br]to want to try it. 0:18:21.000,0:18:24.176 The possibilities feel endless[br]and exciting right now, 0:18:24.200,0:18:25.640 which is what excites me. 0:18:27.040,0:18:30.096 We're in this sort of Wild West period,[br]to me, it feels like, 0:18:30.120,0:18:32.536 because nobody knows[br]what we're going to settle on. 0:18:32.560,0:18:34.776 You can put stories anywhere right now 0:18:34.800,0:18:36.256 and that's cool to me, 0:18:36.280,0:18:40.416 and it feels like once we figure out[br]how to get the technology 0:18:40.440,0:18:43.896 and the creativity[br]of storytelling to meet, 0:18:43.920,0:18:45.349 the possibilities are endless. 0:18:46.280,0:18:51.016 CS: And also the technology has enabled[br]the thing I briefly flew by earlier, 0:18:51.040,0:18:54.016 binge-viewing,[br]which is a recent phenomenon, 0:18:54.040,0:18:56.216 since you've been doing shows, right? 0:18:56.240,0:19:00.936 And how do you think does that change[br]the storytelling process at all? 0:19:00.960,0:19:04.776 You always had a bible[br]for the whole season beforehand, right? 0:19:04.800,0:19:08.136 SR: No, I just always knew[br]where we were going to end.
0:19:08.160,0:19:10.496 So for me, 0:19:10.520,0:19:12.496 the only way I can really comment on that 0:19:12.520,0:19:17.216 is that I have a show[br]that's been going on for 14 seasons 0:19:17.240,0:19:20.536 and so there are the people[br]who have been watching it for 14 seasons, 0:19:20.560,0:19:24.096 and then there are the 12-year-old girls[br]I'd encounter in the grocery store 0:19:24.120,0:19:28.096 who had watched[br]297 episodes in three weeks. 0:19:28.120,0:19:30.936 Seriously, and that's a very different[br]experience for them, 0:19:30.960,0:19:32.896 because they've been inside of something 0:19:32.920,0:19:36.336 really intensely for[br]a very short period of time 0:19:36.360,0:19:37.856 in a very intense way, 0:19:37.880,0:19:40.816 and to them the story[br]has a completely different arc 0:19:40.840,0:19:42.496 and a completely different meaning 0:19:42.520,0:19:44.096 because it never had any breaks. 0:19:44.120,0:19:47.456 CS: It's like visiting a country[br]and then leaving it. It's a strange -- 0:19:47.480,0:19:50.536 SR: It's like reading an amazing novel[br]and then putting it down. 0:19:50.560,0:19:53.616 I think that is the beauty[br]of the experience. 0:19:53.640,0:19:56.576 You don't necessarily have to watch[br]something for 14 seasons. 0:19:56.600,0:19:59.280 It's not necessarily[br]the way everything's supposed to be. 0:20:00.320,0:20:03.920 CS: Is there any topic[br]that you don't think we should touch? 0:20:04.600,0:20:06.776 SR: I don't think[br]I think of story that way. 0:20:06.800,0:20:09.976 I think of story in terms of character[br]and what characters would do 0:20:10.000,0:20:13.216 and what characters need to do[br]in order to make them move forward, 0:20:13.240,0:20:16.416 so I'm never really thinking of story[br]in terms of just plot, 0:20:16.440,0:20:19.376 and when writers come[br]into my writer's room and pitch me plot, 0:20:19.400,0:20:21.496 I say, "You're not speaking English." 
0:20:21.520,0:20:22.936 Like, that's the thing I say. 0:20:22.960,0:20:25.576 We're not speaking English.[br]I need to hear what's real. 0:20:25.600,0:20:27.336 And so I don't think of it that way. 0:20:27.360,0:20:30.696 I don't know if there's a way[br]to think there's something I wouldn't do 0:20:30.720,0:20:34.336 because that feels like I'm plucking[br]pieces of plot off a wall or something. 0:20:34.360,0:20:37.496 CS: That's great. To what extent[br]do you think you will use -- 0:20:37.520,0:20:40.456 You know, you recently went[br]on the board of Planned Parenthood 0:20:40.480,0:20:43.176 and got involved[br]in the Hillary Clinton campaign. 0:20:43.200,0:20:46.816 To what extent do you think[br]you will use your storytelling 0:20:46.840,0:20:48.576 in the real world 0:20:48.600,0:20:50.040 to effect change? 0:20:52.000,0:20:53.960 SR: Well, you know, there's -- 0:20:55.320,0:20:56.896 That's an intense subject to me, 0:20:56.920,0:20:59.696 because I feel like the lack of narrative 0:20:59.720,0:21:05.696 that a lot of people have is difficult. 0:21:05.720,0:21:07.896 You know, like,[br]there's a lot of organizations 0:21:07.920,0:21:11.936 that don't have a positive narrative[br]that they've created for themselves 0:21:11.960,0:21:13.280 that would help them. 0:21:14.000,0:21:15.816 There's a lot of campaigns 0:21:15.840,0:21:19.456 that could be helped[br]with a better narrative. 0:21:19.480,0:21:21.976 The Democrats could do a lot 0:21:22.000,0:21:24.096 with a very strong[br]narrative for themselves. 0:21:24.120,0:21:26.536 There's a lot of different things[br]that could happen 0:21:26.560,0:21:28.376 in terms of using storytelling voice, 0:21:28.400,0:21:30.296 and I don't mean that in a fiction way, 0:21:30.320,0:21:34.456 I mean that in a same way[br]that any speechwriter would mean it. 0:21:34.480,0:21:35.696 And I see that, 0:21:35.720,0:21:39.896 but I don't necessarily know[br]that that's, like, my job to do that. 
0:21:39.920,0:21:41.136 CS: All right. 0:21:41.160,0:21:43.776 Please help me thank Shonda.[br]SR: Thank you. 0:21:43.800,0:21:45.280 (Applause)