WEBVTT 00:00:17.491 --> 00:00:22.457 Almost one year ago - it was February, I believe, 2019 - 00:00:22.457 --> 00:00:26.391 I was browsing the Internet, and I saw this. 00:00:27.989 --> 00:00:30.137 Women built the tech industry. 00:00:30.357 --> 00:00:32.305 Then they were pushed out. 00:00:33.775 --> 00:00:36.323 Pretty powerful headline, don't you think? 00:00:37.283 --> 00:00:38.776 Actually, this is true. 00:00:39.326 --> 00:00:42.137 Women were at the forefront of the tech industry 00:00:42.137 --> 00:00:47.021 back when technologist jobs were considered menial, akin to typists. 00:00:47.391 --> 00:00:51.964 In 1946, the US military selected a team of people 00:00:51.964 --> 00:00:54.260 to work on the world's first computer. 00:00:54.440 --> 00:00:55.819 Guess what. 00:00:55.819 --> 00:00:58.069 More than 50 percent were women. 00:00:58.789 --> 00:01:01.732 The lady in the picture is Margaret Hamilton. 00:01:02.132 --> 00:01:07.232 She was the head of the coding team that charted Apollo 11's path to the moon. 00:01:07.851 --> 00:01:12.841 Another renowned woman of those times was Grace Hopper. 00:01:13.171 --> 00:01:17.819 Grace Hopper was a US Navy rear admiral who invented the first compiler. 00:01:18.389 --> 00:01:21.395 It was actually her programming and coding skills 00:01:21.395 --> 00:01:26.667 that enabled the United States to model the impact of the atomic bomb. 00:01:27.837 --> 00:01:29.797 Let's look at some numbers. 00:01:30.717 --> 00:01:35.023 Thirty-seven percent is the US federal government's estimate 00:01:35.023 --> 00:01:37.135 of the percentage of women 00:01:37.135 --> 00:01:42.318 who were taking computer studies between the '70s and the '90s. 00:01:43.088 --> 00:01:45.325 This was actually the peak. 00:01:46.125 --> 00:01:48.620 Thirty-seven percent, in the early '90s, 00:01:48.620 --> 00:01:53.902 was the peak percentage of women entering computer science fields.
00:01:54.642 --> 00:01:57.776 At that time, the conditions of working in the tech industry 00:01:57.776 --> 00:01:59.750 were not so glamorous, 00:01:59.750 --> 00:02:03.189 but women were known for their meticulous work ethic 00:02:03.189 --> 00:02:04.965 and their attention to detail. 00:02:05.511 --> 00:02:08.682 One of the great anecdotal stories of that time 00:02:08.682 --> 00:02:14.530 was how Grace Hopper identified the first-ever computer bug. 00:02:15.120 --> 00:02:21.514 She traced a glitch in the system to a moth trapped in a relay. 00:02:22.124 --> 00:02:27.127 Today in the US, there's an annual conference named after her 00:02:27.127 --> 00:02:29.234 to celebrate women in computing. 00:02:30.004 --> 00:02:31.590 What happened afterwards 00:02:31.590 --> 00:02:34.307 is that the computer industry became more lucrative. 00:02:34.407 --> 00:02:37.163 Tech jobs gained better status 00:02:37.163 --> 00:02:38.601 and higher pay. 00:02:38.771 --> 00:02:42.314 So the tech companies were looking for the profile 00:02:42.314 --> 00:02:44.529 that would best fit an engineering job. 00:02:44.529 --> 00:02:47.955 But they didn't know what that was, because coding was a brand-new skill. 00:02:48.025 --> 00:02:52.100 So what they did was develop a personality test, 00:02:52.340 --> 00:02:54.750 a personality test that would help them identify 00:02:54.750 --> 00:02:57.760 what a good computer engineer would look like. 00:02:57.990 --> 00:03:01.260 That personality test actually favored people 00:03:01.260 --> 00:03:05.449 who were not so social, not so extroverted. 00:03:05.679 --> 00:03:08.842 It was the birth of the geeky profile. 00:03:08.842 --> 00:03:10.194 (Laughter) 00:03:10.194 --> 00:03:15.404 A computer engineer could only be nerdy and antisocial. 00:03:15.404 --> 00:03:19.824 That was the industry demographic from then on, okay?
00:03:19.824 --> 00:03:23.587 So what we know today as a stereotype - 00:03:23.587 --> 00:03:27.834 the programmer as a nerdy, antisocial male - 00:03:27.834 --> 00:03:33.251 was actually part of a vicious circle that started at that time. 00:03:33.811 --> 00:03:36.788 As the personality test favored more men, 00:03:36.788 --> 00:03:40.284 the industry got more representation of men, 00:03:40.284 --> 00:03:42.731 and that fueled the public perception 00:03:42.731 --> 00:03:45.865 that computer engineering is only suitable for men. 00:03:46.095 --> 00:03:48.119 So where are we today? 00:03:49.039 --> 00:03:53.957 In 2018 - it was March, I think, March 8th, International Women's Day - 00:03:53.957 --> 00:03:57.013 the European Commission announced a study. 00:03:57.283 --> 00:04:00.023 The study was named "Women in the digital age," 00:04:00.023 --> 00:04:02.735 and it contains a lot of concerning facts 00:04:02.735 --> 00:04:06.834 about the representation of women in the European tech industry. 00:04:06.834 --> 00:04:10.279 Today, there are four times more men than women 00:04:10.279 --> 00:04:14.861 taking up computer-related or tech-related fields. 00:04:15.931 --> 00:04:17.772 Based on the same study, 00:04:17.772 --> 00:04:23.024 out of every 1,000 female university graduates in Europe, 00:04:23.024 --> 00:04:26.684 24 have studied in tech-related fields, 00:04:26.684 --> 00:04:30.809 but only six end up working in a digital job. 00:04:31.479 --> 00:04:34.224 For every 1,000 male graduates, 00:04:34.224 --> 00:04:39.230 49 end up working in a digital job. 00:04:40.000 --> 00:04:41.734 What's even more concerning 00:04:41.734 --> 00:04:46.835 is that women are leaving digital jobs more often than men. 00:04:47.435 --> 00:04:50.133 In 2015, it was measured 00:04:50.133 --> 00:04:54.769 that nine percent of women left the industry. 00:04:55.159 --> 00:04:59.847 The same year, the percentage for men was 1.2 percent.
00:05:00.477 --> 00:05:04.899 And this happens between the ages of 30 and 44, 00:05:04.899 --> 00:05:07.304 which is, of course, the primary working age. 00:05:07.473 --> 00:05:12.947 It is also the age at which most Europeans have their first child. 00:05:13.357 --> 00:05:15.905 So the gender gap is increasing. 00:05:16.075 --> 00:05:18.894 If we talk about digital entrepreneurship, 00:05:19.124 --> 00:05:21.470 the European Startup Monitor 00:05:21.940 --> 00:05:26.907 puts the share of startups with female founders at 14.8 percent. 00:05:27.695 --> 00:05:30.317 Startups with only female founders 00:05:30.317 --> 00:05:34.776 received only five percent of global venture capital investment. 00:05:34.776 --> 00:05:38.108 And the average investment in female-founded startups 00:05:38.108 --> 00:05:40.828 has dropped every year for the last five years. 00:05:43.078 --> 00:05:46.666 I graduated in computer engineering in the early '90s. 00:05:46.896 --> 00:05:50.788 It was a time when the gender gap was not so visible in the numbers, 00:05:50.788 --> 00:05:56.323 but, of course, the unconscious and sometimes conscious bias was there. 00:05:57.063 --> 00:06:02.148 I had to sign up for volunteer work at the university computer center 00:06:02.148 --> 00:06:06.672 and, of course, miss a lot of parties outside of school to be taken seriously 00:06:06.672 --> 00:06:09.990 by my professors and my classmates. 00:06:10.711 --> 00:06:13.994 You see, I was not the geeky profile, 00:06:13.994 --> 00:06:20.513 and femininity at that time was seen as inversely related to brilliance. 00:06:21.783 --> 00:06:27.547 When I got into professional life, things seemed easier and better. 00:06:27.856 --> 00:06:30.408 I was learning every day, I was growing every day, 00:06:30.408 --> 00:06:32.634 and I had the respect of my colleagues.
00:06:33.050 --> 00:06:37.203 And then I decided that it was about time to move up, 00:06:37.203 --> 00:06:39.932 to move on, to develop my career. 00:06:39.932 --> 00:06:42.792 "You can't - you are a mother. 00:06:43.432 --> 00:06:45.723 What would your husband say? 00:06:46.083 --> 00:06:48.015 Who would take care of your kids? 00:06:48.395 --> 00:06:50.978 How will you manage the long hours?" 00:06:51.918 --> 00:06:57.513 Surprisingly, nobody had asked me before how I managed the long hours 00:06:57.513 --> 00:07:01.388 when I had to deliver a complex project with quality and on time. 00:07:01.868 --> 00:07:04.454 (Applause) 00:07:07.334 --> 00:07:12.273 My husband is in this room today, and I guess he's smiling. 00:07:12.273 --> 00:07:14.466 (Applause) 00:07:18.850 --> 00:07:23.066 Today, I am thankful for that painful moment of truth 00:07:23.066 --> 00:07:25.386 because it made me stronger, 00:07:25.386 --> 00:07:27.707 because it fueled my determination. 00:07:27.807 --> 00:07:29.919 So I worked really hard to make it happen. 00:07:29.919 --> 00:07:32.524 I worked really hard to make my dream come true. 00:07:33.198 --> 00:07:35.025 And, of course, there were advancements, 00:07:35.025 --> 00:07:36.752 and, of course, there were setbacks, 00:07:36.752 --> 00:07:40.142 and, of course, I was faced with unconscious bias many times - 00:07:40.142 --> 00:07:42.657 and not only gender bias, I have to tell you. 00:07:42.897 --> 00:07:45.188 In the years of the Greek crisis, 00:07:45.188 --> 00:07:51.669 I had to overcome the unconscious bias that Greeks are incompetent, okay? 00:07:51.849 --> 00:07:56.038 Very incompetent and lazy and corrupt, okay? 00:07:56.038 --> 00:07:57.840 (Laughter) 00:07:57.840 --> 00:07:59.151 But I made it. 00:07:59.151 --> 00:08:04.497 And today, today I feel I need to share this story 00:08:04.621 --> 00:08:11.358 because I want to empower young women to follow their dreams.
00:08:12.016 --> 00:08:14.561 So this evolution that I have lived through - 00:08:14.561 --> 00:08:19.480 with technology growing exponentially over the past years - 00:08:19.480 --> 00:08:21.536 was not an inclusive one. 00:08:21.536 --> 00:08:26.313 More than 50 percent of the population has not been part of it. 00:08:26.803 --> 00:08:30.824 But was it only gender that was left out? 00:08:31.064 --> 00:08:32.649 It was not only gender, 00:08:32.649 --> 00:08:37.598 because diversity is not only about whether you are male or female. 00:08:37.848 --> 00:08:39.944 It has to do with age. 00:08:40.034 --> 00:08:41.880 It has to do with ethnicity. 00:08:41.950 --> 00:08:43.648 It has to do with color. 00:08:43.698 --> 00:08:46.651 It has to do with religion or culture. 00:08:48.501 --> 00:08:51.681 Take, for example, older people today. 00:08:51.781 --> 00:08:53.923 We are living in a time 00:08:53.923 --> 00:08:56.426 when, for the first time, 00:08:56.426 --> 00:08:59.830 we have five generations in the workplace at the same time. 00:09:00.710 --> 00:09:04.535 In the next five years, analysts expect 00:09:04.535 --> 00:09:09.216 that 25 percent of the workforce will be over the age of 55. 00:09:10.126 --> 00:09:13.607 How do we make sure these people are not excluded? 00:09:13.877 --> 00:09:17.053 How do we drive cross-generational collaboration? 00:09:17.313 --> 00:09:20.687 How do we give them opportunities to thrive 00:09:20.687 --> 00:09:23.917 and to leverage the benefits of digitization? 00:09:24.267 --> 00:09:26.811 Why is this discussion happening right now? 00:09:26.811 --> 00:09:30.941 It is happening, first of all, because there is a business case around it.
00:09:31.321 --> 00:09:33.051 According to McKinsey, 00:09:33.051 --> 00:09:37.621 companies that have women in top executive roles 00:09:37.621 --> 00:09:40.637 and good diversity in their executive boards 00:09:40.637 --> 00:09:45.337 are 35 percent more likely to be more profitable. 00:09:45.977 --> 00:09:49.293 The European Union has estimated the annual productivity cost 00:09:49.293 --> 00:09:52.069 of that nine percent of women leaving digital jobs 00:09:52.069 --> 00:09:54.845 to be 16 billion euros. 00:09:55.895 --> 00:09:59.464 But it's not only about country or company financials. 00:10:00.914 --> 00:10:02.951 Diversity is an action. 00:10:03.361 --> 00:10:05.418 Inclusion is a culture. 00:10:05.628 --> 00:10:07.416 Belonging is a feeling. 00:10:08.126 --> 00:10:11.639 And the need for all three is deeply human. 00:10:11.639 --> 00:10:13.142 When we empower people, 00:10:13.142 --> 00:10:15.206 when we give them the opportunity 00:10:15.206 --> 00:10:18.656 to bring their strengths and their best talents, 00:10:19.136 --> 00:10:20.968 then we all win. 00:10:21.861 --> 00:10:24.681 And today, it's even more than that. 00:10:26.778 --> 00:10:29.817 With the advance of technology such as AI, 00:10:29.817 --> 00:10:34.778 a lot of activities - recruiting, for example - 00:10:34.778 --> 00:10:36.800 have been handed over to machines. 00:10:37.080 --> 00:10:40.112 In 2018, there was a Reuters article 00:10:40.112 --> 00:10:45.485 that revealed that at one of the major e-commerce companies globally, 00:10:46.165 --> 00:10:50.410 a machine-learning-fueled recruiting bot 00:10:51.580 --> 00:10:55.996 was disqualifying women in favor of men. 00:10:57.562 --> 00:10:59.003 The machine-learning algorithm 00:10:59.003 --> 00:11:04.327 used CVs that the company had gathered over a period of more than 10 years 00:11:04.327 --> 00:11:07.119 to learn and identify patterns 00:11:07.119 --> 00:11:10.621 that would determine what a good candidate looks like.
00:11:10.947 --> 00:11:14.796 Of course, those CVs came mostly from men. 00:11:15.906 --> 00:11:20.136 So when the bot found words like "women" in a CV - 00:11:20.136 --> 00:11:23.282 like participation in a women's soccer club - 00:11:23.282 --> 00:11:25.693 it would disqualify that CV. 00:11:28.723 --> 00:11:31.788 Biases that exist in the data - 00:11:32.278 --> 00:11:35.934 in a world that is becoming more and more data-driven, 00:11:35.934 --> 00:11:39.533 in a world where decisions are taken based on this data - 00:11:39.873 --> 00:11:44.015 can be reproduced, can be reinforced, 00:11:44.015 --> 00:11:47.366 can become a self-fulfilling prophecy. 00:11:47.816 --> 00:11:50.792 I am Greek; I was raised in Greece. 00:11:50.792 --> 00:11:56.045 I was educated with principles coming from the ancient Greek philosophers. 00:11:56.915 --> 00:12:01.114 And as old-fashioned as this might look today, 00:12:01.114 --> 00:12:05.060 those are ethical values, timeless values, 00:12:05.060 --> 00:12:07.103 that we all need to go back to. 00:12:07.103 --> 00:12:09.660 We need a new ethical code in technology today 00:12:09.660 --> 00:12:11.322 that would earn trust, 00:12:11.322 --> 00:12:13.964 that would help technology be trustworthy, 00:12:13.964 --> 00:12:19.009 that would make us believe it will be used in a fair way. 00:12:20.839 --> 00:12:24.998 So when we talk about new technologies, like artificial intelligence, 00:12:24.998 --> 00:12:29.868 we want them to be fair, to not have biases. 00:12:29.868 --> 00:12:31.601 We want them to be transparent, 00:12:31.601 --> 00:12:35.650 so we can understand how data are used and how decisions are made. 00:12:36.020 --> 00:12:40.517 We want them to work in a safe and reliable way. 00:12:40.517 --> 00:12:43.502 We want them to respect our privacy. 00:12:43.962 --> 00:12:47.627 And we want them to include and empower everyone.
00:12:48.455 --> 00:12:51.660 So we know we need inclusive technology. 00:12:51.660 --> 00:12:56.445 To get there, I would refer to Simon Sinek's talk "Start with why." 00:12:57.095 --> 00:12:58.969 "Why" matters. 00:12:59.519 --> 00:13:01.670 Why are we building technology? 00:13:02.030 --> 00:13:05.600 Technology is here to enhance our capabilities, 00:13:05.600 --> 00:13:08.388 to help us make smarter decisions faster, 00:13:08.448 --> 00:13:11.297 to help us make fewer mistakes. 00:13:11.527 --> 00:13:13.038 "Why" matters. 00:13:13.498 --> 00:13:14.769 Then "who." 00:13:14.769 --> 00:13:17.459 Who is involved in building the technology? 00:13:17.659 --> 00:13:21.970 We need more diverse teams building the technology. 00:13:22.185 --> 00:13:23.862 "Who" matters. 00:13:25.132 --> 00:13:27.419 And then, finally, "how": 00:13:28.529 --> 00:13:33.574 how transparent we are about the way we are building technology. 00:13:34.154 --> 00:13:36.390 How are we using the data? 00:13:36.390 --> 00:13:38.279 How are the machines learning? 00:13:38.599 --> 00:13:40.149 "How" matters. 00:13:40.709 --> 00:13:42.613 In an evolving digital world, 00:13:42.613 --> 00:13:47.112 as we welcome robots, holograms, artificial intelligence 00:13:47.112 --> 00:13:49.521 and more and more automation, 00:13:49.747 --> 00:13:53.291 people are getting more and more scared about what the future will be. 00:13:53.851 --> 00:13:56.320 What will be the future of jobs? 00:13:56.320 --> 00:13:58.789 What will be our future as humans? 00:13:59.059 --> 00:14:01.286 The future is as good as we make it. 00:14:01.376 --> 00:14:04.866 Technology has always been here to enhance our capabilities. 00:14:04.866 --> 00:14:07.554 If you ask me what our future as humans will be, 00:14:07.554 --> 00:14:09.387 I will tell you: 00:14:09.387 --> 00:14:12.860 we need to get back to our core identity, 00:14:13.840 --> 00:14:16.201 to the core of who we are, 00:14:16.201 --> 00:14:17.722 and that's empathy.
00:14:19.272 --> 00:14:21.568 Empathy is not a "nice to have." 00:14:22.144 --> 00:14:28.697 Empathy is absolutely essential to our future as humans. 00:14:29.371 --> 00:14:32.306 Empathy is about overcoming biases. 00:14:33.056 --> 00:14:35.052 It's about respecting each other. 00:14:35.142 --> 00:14:37.697 It's about understanding, or trying to understand. 00:14:37.797 --> 00:14:39.835 It's about listening. 00:14:39.885 --> 00:14:41.974 It's about showing compassion. 00:14:43.344 --> 00:14:45.697 It's about offering support. 00:14:47.297 --> 00:14:51.049 The future will be good 00:14:51.049 --> 00:14:57.472 only if it includes and embraces every talent on earth. 00:14:58.452 --> 00:15:03.030 The future will be good only if it relies on empathy. 00:15:05.000 --> 00:15:09.702 In a digital world that respects every individual's talents, 00:15:09.702 --> 00:15:15.676 that empowers everybody to reach high, higher, highest, 00:15:15.676 --> 00:15:18.732 these headlines will not be there anymore. 00:15:19.562 --> 00:15:22.151 Today, I have one ask of you. 00:15:22.891 --> 00:15:26.060 Let's make this decision collectively. 00:15:27.020 --> 00:15:29.051 Inclusion starts with "I." 00:15:29.851 --> 00:15:32.421 It starts with each and every one of us. 00:15:32.861 --> 00:15:37.653 Let's act, let's listen, let's understand, let's respect each other. 00:15:37.653 --> 00:15:39.474 Let's empathize. 00:15:39.774 --> 00:15:44.098 Let's make this evolving digital world a more inclusive place. 00:15:44.168 --> 00:15:45.664 Thank you. 00:15:45.717 --> 00:15:48.264 (Applause)