Almost one year ago - it was February, I believe, 2019 - I was browsing the Internet, and I saw this.

Women built the tech industry. Then they were pushed out.

Pretty powerful headline, don't you think? Actually, this is true. Women were at the forefront of the tech industry back when technologist jobs were considered menial, akin to typists. In 1946, the US military selected a team of people to work on the world's first computer. Guess what: more than 50 percent were women.

The lady in the picture is Margaret Hamilton. She was the head of the coding team that charted Apollo 11's path to the moon. Another renowned woman of those times was Grace Hopper. Grace Hopper was a US Navy admiral who invented the first compiler. It was actually her programming and coding skills that enabled the United States to model the impact of the atomic bomb.

Let's look at some numbers. Thirty-seven percent: that is the US federal government's estimate of the percentage of women taking computer studies between the '70s and the '90s. This was actually the peak. Thirty-seven percent, in the early '90s, was the peak share of women enrolled in computer science fields. At that time, the conditions of working in the tech industry were not so glamorous, but women were known for their meticulous work ethic and their attention to detail. One of the great anecdotal stories of that time is how Grace Hopper identified the first-ever computer bug: she traced a glitch in the system to a moth trapped in a relay. Today in the US, there is an annual conference named after her to celebrate women in computing.

What happened afterwards is that the computer industry became more lucrative. Tech jobs gained status and were better paid. So the tech companies were looking for the best profile to fit an engineering job - but they didn't know what that looked like, because coding was a brand-new skill. So what they did was develop a personality test, a personality test that would help them identify what a good computer engineer would look like. That personality test actually favored people who were not so social, who were not so extroverted. It was the birth of the geeky profile.
(Laughter)

A computer engineer could only be nerdy and antisocial. That was the industry demographic from then on, okay? So what we know today as a stereotype - the nerdy, antisocial male programmer - was actually part of a vicious circle that started at that time. As the personality test favored men, the industry got more male representation, and that in turn fueled the public perception that computer engineering is only suitable for men.

So where are we today? In 2018 - it was March, I think, March 8th, International Women's Day - the European Commission announced a study. The study was named "Women in the Digital Age," and it actually has a lot of concerning facts about the representation of women in the European tech industry. Today, there are four times more men taking up computer-related or tech-related fields. Based on the same study, out of every 1,000 female university graduates in Europe, 24 graduate in tech-related fields, but only six end up working in a digital job. For every 1,000 male graduates, 49 end up working in a digital job.

What's even more concerning is that women are leaving digital jobs more often than men. In 2015, it was measured that nine percent of women left the industry; the same year, the figure for men was 1.2 percent. And this happens between the ages of 30 and 44, which is, of course, prime working age. It is also the age at which most Europeans have their first child. So the gender gap is increasing.

If we talk about digital entrepreneurship, the European Startup Monitor puts the share of startups with female founders at 14.8 percent. Startups with only female founders received just five percent of global venture capital investment. And the average investment in female-founded startups has dropped every year for the last five years.

I graduated in computer engineering in the early '90s. It was a time when the gender gap was not so visible in the numbers, but, of course, the unconscious - and sometimes conscious - bias was there. I had to sign up for volunteer work at the university computer center and, of course, miss a lot of parties outside of school to be taken seriously by my professors and my classmates.
You see, I was not the geeky profile, and femininity at that time was seen as adversely related to brilliance.

When I got into professional life, things seemed easier and better. I was learning every day, I was growing every day, and I had the respect of my colleagues. And then I decided that it was about time to move up, to move on, to develop my career.

"You can't - you are a mother. What would your husband say? Who would take care of your kids? How will you manage the long hours?"

Surprisingly, nobody had asked me before how I managed the long hours when I had to deliver a complex project with quality and on time.

(Applause)

My husband is in this room today, and I guess he's smiling.

(Applause)

Today, I am thankful for this painful moment of truth because it made me stronger, because it fueled my determination. So I worked really hard to make it happen. I worked really hard to make my dream come true. And, of course, there were advancements, and, of course, there were setbacks, and, of course, I was faced with unconscious bias many times - and not only gender bias, I have to tell you. In the years of the Greek crisis, I had to overcome the unconscious bias that the Greeks are incompetent, okay? Very incompetent and lazy and corrupt, okay?

(Laughter)

But I made it. And today, today I feel I need to share this story because I want to empower young women to follow their dreams.

So this evolution that I have lived through - with technology growing exponentially over the past years - was not an inclusive one. More than 50 percent of the population has not been part of it. But was it only gender that was left out? It was not only gender, because diversity doesn't have to do only with whether you are male or female. It has to do with age. It has to do with ethnicity. It has to do with color. It has to do with religion or culture.

Take, for example, older people. Today, we are living in a time when, for the first time, we have five generations in the workplace at the same time. In the next five years, analysts expect that 25 percent of the workforce will be over the age of 55. How do we make sure these people are not excluded? How do we drive cross-generational collaboration?
How do we give them opportunities to thrive and to leverage the benefits of digitization?

Why is this discussion happening right now? It is happening, first of all, because there is a business case around it. According to McKinsey, companies that have women in top executive positions and good diversity in their executive boards are 35 percent more likely to be more profitable. The European Union has estimated the annual productivity cost of that nine percent of women leaving the workplace at 16 billion euros.

But it's not only about country or company financials. Diversity is an action. Inclusion is a culture. Belonging is a feeling. And the need for all three is deeply human. When we empower people, when we give them the opportunity to bring their strengths and their best talents, then we all win.

And today, it's even more than that. With the advance of technology such as AI, a lot of activities - like, for example, recruiting - have been outsourced to machines. In 2018, a Reuters article revealed that at one of the major e-commerce companies globally, a machine-learning-fueled recruiting bot was disqualifying women in favor of men. The machine-learning algorithm used CVs that the company had gathered over a period of more than 10 years to learn and identify the patterns of what a good candidate would look like. Of course, those CVs were coming mostly from men. So when the bot found words like "women" in a CV - participation in a women's soccer club, for example - it would disqualify that CV.

Biases that exist in the data - in a world that is becoming more and more data-driven, a world where decisions are taken based on that data - can be reproduced, can be reinforced, can become a self-fulfilling prophecy.

I am Greek; I was raised in Greece. I was educated with the principles coming from the ancient Greek philosophers. And as old-fashioned as this might look today, these are actually ethical values, timeless values, that we all need to go back to. We need a new ethical code in technology today that would earn trust, that would help technology be trustworthy, that would make us believe that it will be used in a fair way.
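Coming back to the recruiting bot for a moment: here is a minimal, hypothetical Python sketch - not the actual system from the Reuters story, and using invented, synthetic data - of how a naive bag-of-words scorer trained on historically skewed hiring records can pick up a negative weight for a token like "women's" purely from that history.

import math
from collections import Counter

# Synthetic "historical hiring" records: (CV tokens, was the candidate hired?)
# The skew is built in: past hires are mostly CVs without the token "women's".
history = [
    (["python", "java", "chess", "club"], True),
    (["java", "leadership", "football", "club"], True),
    (["python", "databases", "chess", "club"], True),
    (["python", "java", "women's", "soccer", "club"], False),
    (["java", "leadership", "women's", "chess", "club"], False),
    (["python", "databases", "women's", "soccer", "club"], False),
]

hired = Counter()
rejected = Counter()
for tokens, was_hired in history:
    (hired if was_hired else rejected).update(set(tokens))

def token_weight(token):
    # Smoothed log-odds of "hired" given that the token appears in the CV.
    return math.log((hired[token] + 1) / (rejected[token] + 1))

def score_cv(tokens):
    # Sum the learned per-token weights; higher means "better candidate".
    return sum(token_weight(t) for t in set(tokens))

print(token_weight("women's"))  # negative, learned only from the skewed history
print(score_cv(["python", "java", "women's", "soccer", "club"]))
print(score_cv(["python", "java", "soccer", "club"]))  # same skills, higher score

The point of the sketch is that nothing in the code mentions gender as a criterion; the penalty emerges entirely from the historical data, which is exactly how such biases get reproduced and reinforced.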
So when we talk about new technologies like artificial intelligence, we want them to be fair, to not have biases. We want them to be transparent, so that we understand how data are used and how decisions are made. We want them to work in a safe and reliable way. We want them to respect our privacy. We want them to make sure they include and empower everyone.

So we know we need inclusive technology. To get there, I would refer to Simon Sinek's talk "Start with Why."

"Why" matters. Why are we building technology? Technology is here to enhance our capabilities, to help us make smarter decisions faster, to help us make fewer mistakes. "Why" matters.

Then "who." Who is involved in building the technology? We need more diverse teams building the technology. "Who" matters.

And then, finally, "how": how transparent are we about the way we are building technology? How are we using the data? How are the machines learning? "How" matters.

In an evolving digital world, as we welcome robots, holograms, artificial intelligence and more and more automation, people are getting more and more scared about what the future will be. What will be the future of jobs? What will be the future of us as humans?

The future is as good as we make it. Technology has always been here to enhance our capabilities. If you ask me what our future as humans will be, I would tell you: we need to get back to our core identity, to the core of who we are, and that's empathy.

Empathy is not a "nice to have." Empathy is absolutely essential to us as humans going into the future. Empathy is about overcoming biases. It's about respecting each other. It's about understanding, or trying to understand. It's about listening. It's about showing compassion. It's about offering support.

The future will only be good if it includes and embraces every talent on earth. The future will only be good if it relies on empathy. In a digital world that respects every individual's talents, that empowers everybody to reach high, higher, highest, these headlines will not be there anymore.

Today, I have one ask of you. Let's make this decision collectively. Inclusion starts with "I."
It starts with each and every one of us. Let's act, let's listen, let's understand, let's respect each other. Let's empathize. Let's make this evolving digital world a more inclusive place.

Thank you.

(Applause)