Welcome back, the next talk will be Jan Kiszka on "Getting more Debian into our civil infrastructure".

Thank you, Michael. So my name is Jan Kiszka. You may not know me; I'm not a Debian Developer, not a Debian Maintainer, I'm just an upstream hacker. I work for Siemens and have been part of the Linux team there for ten years now, actually more than ten years. We have been supporting our business units in getting Linux into their products successfully for that long, even longer actually. Today I'm representing a collaborative project that has some relationship with Debian, and will have more soon.

First of all, maybe as a surprise to some of you, our civilization heavily runs on Linux. You may now think about the kind of devices with some kind of Linux inside, or you may think of the cloud servers running Linux. But actually, this is about devices closer to us. In all our infrastructure there are control systems and management systems, and many, many of them run Linux inside. If you travelled with Deutsche Bahn to this event these days, there was some Linux system on the train as well, as there was on the ???, so on the control side. Energy generation: power plants are also run with Linux, in very interesting ways, in positive ways. Industrial automation: the factories have control systems inside, and quite a few are running Linux. And also other systems like health care, diagnostic systems. These big balls up there are magnetic resonance imaging systems; they have been running on Linux for over a decade now. Building automation, not at home but in the professional building area.

Actually, as I said, the train systems are going to be more on Debian soon. We have had Debian for quite a while in power generation, "we" in this case being Siemens. The box underneath, on the third row, the industrial switch there, is running Debian. And the health care device is still on Ubuntu, but will soon be on Debian as well. Just to give some examples. These are the areas where we, as a group, and we, as Siemens, are active.

But there are some problems with this. Just take an example from a railway system. Usually, these kinds of devices and installations have a lifetime of 25 or 30 years. It used to be quite simple with the old devices, simple in the sense that they were mechanical and pretty robust. I was once told that one of these locking systems was basically left in a box out there for 50 years and no one entered the ??? No one touched the whole thing for 50 years. These times are a bit over. Nowadays we have more electronics in these systems, and of course they contain software.

What does that mean? Just to give you an idea of how development looks in this domain: ??? development takes quite a long time until the product is ready, 3 to 5 years. Then, in the railway domain, it's mostly about customizing the systems for specific installations of the railway systems, and not only in Europe; they are kind of messy regarding the differences. So you have specific requirements from the customers, the railway operators, to adjust these systems to their needs. And you see that by then, after 5 years already, a Debian version would be out of maintenance, and if you add another year, you could start over again. So during development you may still change the system, but later on it gets hard to change the system ???
because that is when the interesting part starts in this domain, and not only in this domain: safety and security assessment and approval for these systems. And that also takes time. For example, in Germany, you go to the Eisenbahn ??? and ask for permission to run that train on the track, and if they say "Mmh, not happy with it", you do it over again, and that takes time. And if you change something in the system, it becomes interesting, because some of these certification aspects become invalid and you have to redo them. And then, of course, these trains and installations have a long life, as I mentioned before. So how do you deal with this in an electronic device, in software-driven devices, over this long phase? That's our challenge, and that is just one example; there are more in this area.

At the same time, what we see now is these fancy buzzwords from the cloud business entering our conservative, slowly moving domain. We talk about IoT, industrial IoT, so connected devices. We talk about edge computing, which means bringing the power of the cloud to the device in the field, closer to where the real things happen. So networking becomes a topic. In the past, you basically built a system, you locked it up physically and you never touched it again, unless the customer complained that there was some bug inside. These days, the customer asks us for frequent updates. And actually the customers ??? ask for this. So you have to have a security maintenance concept for this, which means regular updates, regular fixes, and that is of course ??? for this way of working, with its slow-running and long-running support cycles.

To summarize: we have to maintain our devices in the field for a very long time, and so far this was mostly done individually. Each company, and quite frequently also each product group or development ??? inside a company, did it individually. Everyone had their own kernel, everyone had their own base system; it was easy to build up, so it should be easy to maintain. Of course it's not. This was one thing, one important thing. And then, of course, we are not always completely happy with what the free software gives us. There are some needs to make things more robust, more secure, more reliable. So we have to work with these components and improve them, mostly upstream, and that, of course, is another challenge we have to address in this area. And we have to catch up with the trends coming in from the server space, the cloud space.

So with this challenge, that was the point where we, in this case a number of big users of industrial open source systems, came together and created a new collaborative project. That's what you do in the open source area. This project is called the Civil Infrastructure Platform. It's under the umbrella of the Linux Foundation. There are many Linux Foundation projects you may have seen, but most of them are more in the area of cloud computing, or in the area of media or automotive computing. This one is actually even more conservative than the other ones, and it's also comparably small. Our goal is to build an open source base layer for these application scenarios, based on free software, based on Linux. We started two years ago.

That's basically our structure, to give you an idea. Member companies: the three on the top are the founding platinum companies, Hitachi, Toshiba and Siemens. We have Codethink and Plat'Home on board; we have had them on board from the start as well. Renesas joined us, and just recently also Moxa.
So if you compare this with other collaborative projects, it's a pretty small one, a comparatively small one, so our budget is also limited. It's still decent enough, and, well, we are growing. Based on this budget, we have some developers being paid; Ben is paid this way, and you will see later on why. We also have people from the member companies working in the communities, and we are ramping up on working with the communities to improve the base layers for our needs. Everything is open source; we have a GitLab repo as well, and you can look up there what's going on.

So, the main areas of activity we are working on right now: four areas.

Kernel maintenance: we started by declaring one kernel the CIP kernel, to have an extended support phase of 10 years for this kernel. That is what we are aiming for. It is already feasible for some enterprise distros in a specific area, but here we are talking about an industrial area, an embedded area, so there is some challenge. I'm saying 10 years; sometimes 15 years is written, and we will see after 10 years whether we follow up on this. Along with this, of course, comes the need for real-time support. Currently it's a separate branch, but it's eventually going to be integrated, to have the PREEMPT_RT branch ??? doing this. As I mentioned before, Ben is currently our 4.4 CIP kernel maintainer. This is the core, basically, where we started our activities.

We continued by extending this to test infrastructure, so we invested a bit in improving the ??? infrastructure, and we are now ramping up an internal ???, just to enable the kernel testing, of course.

And then, and that's actually what I'd like to talk about a bit more today, there is CIP Core. The kernel alone doesn't make a system; you need a user space, a userland, and that's basically what we are now focusing on and ramping up. Our activity is to define this CIP Core, meaning a user-space base system which we want to maintain as long as the kernel, so another 10-year thing. Our group had a couple of members who were already familiar with Debian, so it was pretty easy for that group to decide on Debian as the source base for our CIP Core packages.

So, why was Debian chosen? Well, it has an outstanding maturity and a focus on stability, so we are pretty much aligned regarding how conservatively we see certain things, which is a positive thing for us. It has very professional security properties, which we also rely on heavily. Another interesting aspect for us is the license hygiene that you are after, ensuring that there is only free software in these packages and that it is properly documented. When we are using and redistributing software, in contrast to, for example, the server space, where you usually don't redistribute things, we are redistributing devices, so we are redistributing software. We have to take care of the licenses of what we are redistributing and make sure we are compliant with all the licenses included. So it's very important for us that we get a consistent picture from the packages, that someone has looked at this already. We are still looking at it ourselves, but that's a very important thing. With these characteristics, we chose Debian as the base system.

So, what does that mean right now? We are currently in the process of selecting the core packages from the Debian packages. There is still a little bit of ??? obviously.
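Coming back to the kernel maintenance mentioned above, here is a small sketch of how one might fetch the CIP kernel tree for a look. The repository location and branch names are given as I recall them from kernel.org, not as stated in the talk, so verify them before relying on them; the RT branch name in particular is an assumption.

    # Sketch: fetch the CIP kernel tree (locations as recalled from
    # kernel.org; verify before use).
    git clone https://git.kernel.org/pub/scm/linux/kernel/git/cip/linux-cip.git
    cd linux-cip
    git checkout linux-4.4.y-cip       # the 4.4-based CIP branch
    git checkout linux-4.4.y-cip-rt    # separate PREEMPT_RT branch (assumed name)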
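And on the CIP Core side, just as a rough illustration of what starting from Debian can look like in practice, here is a minimal sketch that assembles a small Debian base system with debootstrap and lists what ended up in it. The suite name "stretch", the mirror and the target path are illustrative assumptions, not anything prescribed by CIP.

    # Minimal sketch: assemble a small Debian base system as a starting
    # point for a device root filesystem. Suite, mirror and target path
    # are illustrative assumptions only.
    sudo debootstrap --variant=minbase stretch ./cip-core-rootfs \
        http://deb.debian.org/debian
    # Inspect which packages ended up in the base, as input for the
    # core-package selection.
    sudo chroot ./cip-core-rootfs dpkg-query -W -f='${Package}\n'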
So we are already working with Debian on certain long-term support aspects. Just to mention two activities: we were already sponsoring the staging repo for security-master. Actually, I'm ??? aware of the current state of that project, but we got the feedback that it is apparently a valuable thing for the LTS activity. We just joined LTS platinum sponsorship, and we are now involved in discussions about the extended LTS activity, so anything beyond 5 years, because in the end that is what we committed to our users. We want to ensure that the 10 years is reached for the base system. Ideally, of course, in the community, not only based on our own activities, but in the end we have to fill the gap, and that is basically our commitment on this.
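Just to make that long-term maintenance point concrete, a device tracking Debian LTS would typically keep pulling security fixes from the regular Debian security archive well past the normal stable cycle. A minimal sketch follows, with the release name "jessie" used purely as an example of an LTS-era suite; it is not something specified by CIP.

    # Hypothetical update step on a fielded device following Debian LTS.
    # The release name "jessie" is only an example; the lines follow the
    # usual Debian archive layout of its time.
    echo 'deb http://deb.debian.org/debian jessie main' > /etc/apt/sources.list
    echo 'deb http://security.debian.org/ jessie/updates main' >> /etc/apt/sources.list
    apt-get update
    apt-get -y upgrade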