0:00:00.000,0:00:18.420
36C3 preroll music
0:00:18.420,0:00:24.530
Herald-Angel: This Talk will be about… I[br]have to read this. Mathematical diseases
0:00:24.530,0:00:28.600
in climate models and how to cure them.[br]And I don't have the slightest idea what
0:00:28.600,0:00:33.850
these two guys are talking about now. And[br]when I asked them, they said, just tell
0:00:33.850,0:00:40.039
the people it's about next generation[br]climate models and how to build them.
0:00:40.039,0:00:45.789
Which is cool. Throw that on Twitter.[br]Please welcome Ali Ramadhan and Valentin
0:00:45.789,0:00:47.229
Churavy.
0:00:47.229,0:00:55.239
applause
0:00:55.239,0:00:58.509
Ali Ramadhan: Can you guys hear us? Is[br]this OK?
0:00:58.509,0:01:00.210
Valentin Churavy: I'll stand back...[br]Ramadhan: I'll stand back a little bit.
0:01:00.210,0:01:05.030
OK, cool. Thank you. So if you guys saw[br]the last talk by karlab... karlabyrinth or
0:01:05.030,0:01:08.530
something. So we're kind of expanding on[br]her talk a little bit. So she talked a lot
0:01:08.530,0:01:11.209
about kind of uncertainties…[br]audio feedback from microphone
0:01:11.209,0:01:15.670
uncertainties in climate models. And one[br]point that she did make was that most of
0:01:15.670,0:01:18.410
the uncertainty actually comes from[br]humans. But there's a really huge
0:01:18.410,0:01:22.070
uncertainty that also comes from… comes[br]from the models. So we're talking more
0:01:22.070,0:01:26.170
about the model uncertainties, which is[br]kind of uncertainties because of unknown
0:01:26.170,0:01:31.590
or missing physics, kind of how to cure[br]them. So it'll be kind of a weird talk. So
0:01:31.590,0:01:35.119
I'll talk a little bit more about the[br]climate modeling part and then kind of how
0:01:35.119,0:01:39.840
to cure them involves using new programming[br]languages. And that's where Valentin will
0:01:39.840,0:01:44.810
talk about Julia. So we'll kind of just[br]start with maybe just giving kind of an
0:01:44.810,0:01:50.080
idea of why it's so hard to model the[br]climate. So if you've… maybe you've seen
0:01:50.080,0:01:54.550
images like this a lot where it's like a…[br]it's a satellite image basically of
0:01:54.550,0:01:58.530
clouds. It's used for like weather[br]forecasting. But you can immediately see
0:01:58.530,0:02:02.320
there's lots of, you know, lots of really[br]small clouds. So basically, if you want to
0:02:02.320,0:02:06.250
build a climate model, you've got to be[br]able to resolve all the physics in these
0:02:06.250,0:02:10.770
clouds. So you can actually zoom in a lot.[br]And see the clouds look pretty big over
0:02:10.770,0:02:15.140
here. But if you zoom in on kind of[br]Central America, then you see even smaller
0:02:15.140,0:02:21.509
clouds. And if you zoom in even more so,[br]so you zoom in on the Yucatan Peninsula,
0:02:21.509,0:02:24.780
then you can see the clouds are really,[br]really small. So you're… there are maybe
0:02:24.780,0:02:28.980
five smaller clouds, some of the[br]clouds are, you know, a hundred meters or
0:02:28.980,0:02:33.709
something. And as the last talk kind of[br]suggested, most climate models…
0:02:33.709,0:02:38.990
they resolve things down to, you know, 50[br]kilometers. So anything smaller than 50
0:02:38.990,0:02:43.670
kilometers, the climate model can't really[br]see. So you have to kind of take that… it
0:02:43.670,0:02:46.290
kind of has to account for that because[br]clouds are important, and if you have more
0:02:46.290,0:02:51.779
clouds, then that reflects some of the[br]heat out. So maybe you cool. But it also
0:02:51.779,0:02:55.569
traps more of the heat in so maybe you[br]warm. And if you have more clouds, maybe
0:02:55.569,0:03:00.130
you warm more. But if you have less[br]clouds, maybe you warm even more. So it's
0:03:00.130,0:03:04.549
kind of unsure. We actually don't know if[br]clouds will make the climate warmer or if
0:03:04.549,0:03:07.760
they'll make the climate cooler. So it's[br]important for your climate models to kind
0:03:07.760,0:03:13.200
of resolve or see these little clouds. So[br]kind of where the mathematical disease
0:03:13.200,0:03:16.940
comes in, is that you don't… we don't know[br]what equation to solve. We don't know
0:03:16.940,0:03:21.399
exactly what physics to solve, to see, to[br]kind of resolve the effect of these little
0:03:21.399,0:03:25.120
clouds. So it's kind of the[br]mathematical disease: we don't know how to
0:03:25.120,0:03:29.939
do it. So you instead use a… well, it's[br]called a parametrization, which is the
0:03:29.939,0:03:33.120
mathematical disease. So in the[br]atmosphere, the big mathematical disease
0:03:33.120,0:03:40.240
is clouds. But if you look at the ocean,[br]you kind of get a similar… You have also
0:03:40.240,0:03:43.719
similar mathematical diseases. So for[br]example, this is model output. We
0:03:43.719,0:03:48.830
don't have good satellite imagery of the[br]oceans. So if you look at, for
0:03:48.830,0:03:53.400
example, model output from an ocean model,[br]high resolution ocean model here, it's
0:03:53.400,0:03:58.159
kind of centered on the Pacific. So you[br]can kind of see Japan and China and the
0:03:58.159,0:04:03.060
white kind of lines. Those are streamlines[br]or that the lines tell you where the water
0:04:03.060,0:04:07.019
is going. So you can see a lot of kind[br]of straight lines. You see this curved
0:04:07.019,0:04:13.669
current off of Japan, but you see lots of[br]circles. So the circles are these eddies
0:04:13.669,0:04:17.910
and they're kind of the turbulence of the[br]ocean. They move, they kind of stir and
0:04:17.910,0:04:23.740
mix and transport a lot of salt or heat or[br]carbon or nutrients or… you know, marine
0:04:23.740,0:04:28.270
life or anything. It's the main way the[br]ocean kind of moves heat from the equator
0:04:28.270,0:04:32.970
to the pole. It kind of stirs things[br]around. So they're really important for
0:04:32.970,0:04:38.480
kind of how carbon moves in the ocean, for[br]how the ocean heats up. And here they look
0:04:38.480,0:04:42.259
pretty big. But again, you can zoom in and[br]you'll see lots of small scale structures.
0:04:42.259,0:04:46.500
So we're going to switch to a different[br]model output and different colors. So here
0:04:46.500,0:04:51.260
is kind of the same area. So you see[br]Japan in the top left. But what's being
0:04:51.260,0:04:55.720
plotted is vorticity. You don't have to know[br]what that is; it's kind of a measure of
0:04:55.720,0:05:00.030
how much the fluid or the water is[br]spinning. But the point is that you have
0:05:00.030,0:05:05.340
lots of structure. So there's lots of, you[br]know, big circles, but there also lots of
0:05:05.340,0:05:10.070
really little circles. And again, your[br]climate model can only see something like
0:05:10.070,0:05:14.919
50 kilometers or 100 kilometers. But as[br]you can see here, there's lots of stuff
0:05:14.919,0:05:18.860
that's much smaller than a hundred[br]kilometers. So if you superimpose kind of
0:05:18.860,0:05:23.880
this grid, maybe that's your climate[br]model grid. And, you know, basically for
0:05:23.880,0:05:27.840
the climate model, every one of these[br]boxes is like one number. So you can't
0:05:27.840,0:05:31.880
really see anything smaller than that. But[br]there's important dynamics and physics
0:05:31.880,0:05:35.380
that happens in like 10 kilometers, which[br]is a lot smaller than what the climate
0:05:35.380,0:05:39.470
model can see. And there's even important[br]physics that happens at like 100 meters
0:05:39.470,0:05:44.290
or 200 meters. So if you want to know,[br]you know, what the climate will look
0:05:44.290,0:05:50.909
like, we need to… we need to know about[br]the physics that happens at 200 meters. So
0:05:50.909,0:05:55.020
to give an example of some of the physics[br]that happens at 10 kilometers, here's kind
0:05:55.020,0:06:01.040
of a little animation where this kind of[br]explains why you get all these eddies or
0:06:01.040,0:06:05.530
all the circles in the ocean. So a lot of[br]times you have, say, hot water, say, in
0:06:05.530,0:06:10.520
the north. So the hot water here is all in[br]orange or yellow and you have a lot of
0:06:10.520,0:06:15.340
cold water. So the cold water is in the[br]south and it's purple. And then once this…
0:06:15.340,0:06:20.099
once you add rotation, you end up with[br]these eddies because what the hot water
0:06:20.099,0:06:24.350
wants to do, the hot water is lighter,[br]it's less dense. So it actually wants to
0:06:24.350,0:06:28.810
go on top of the cold water. So usually[br]have cold at the bottom, hot at the top.
0:06:28.810,0:06:34.120
So you have heavy at the bottom and light[br]at the top. So when you add… without
0:06:34.120,0:06:38.050
rotation, the hot water will just go on[br]top of the cold water. But when you have
0:06:38.050,0:06:42.540
rotation, you end up… it kind of wants to[br]tip over. But it's also rotating. So you
0:06:42.540,0:06:47.479
kind of get this beautiful swirling[br]patterns and these are kind of the same
0:06:47.479,0:06:52.380
circular eddies that you see in the real[br]ocean. But this model here is like two
0:06:52.380,0:06:55.139
hundred and fifty kilometers by five[br]hundred kilometers and it's like one
0:06:55.139,0:06:59.710
kilometer deep. So you need a lot of[br]resolution to be able to resolve this
0:06:59.710,0:07:04.750
stuff, but not… your climate model doesn't[br]have that much resolution. So some of the
0:07:04.750,0:07:08.449
features here, like the sharp fronts[br]between the cold and the hot water, your
0:07:08.449,0:07:12.879
climate model might not see that. So maybe[br]if you don't resolve this properly,
0:07:12.879,0:07:17.270
you get the mixing rate wrong or maybe[br]that the ocean is the wrong temperature or
0:07:17.270,0:07:22.039
something. So it's kind of important to[br]resolve this stuff. Another one, the color
0:07:22.039,0:07:26.170
scheme here is really bad. laughs I'm[br]sorry, but another one, for example, is
0:07:26.170,0:07:32.389
here. Everything here is under 100 meters;[br]it's a cube of 100 meters on each side and
0:07:32.389,0:07:37.340
you're starting with 20 degrees Celsius[br]water at the top. You have 19 degrees
0:07:37.340,0:07:41.370
Celsius water at the bottom initially. So[br]it's kind of you're… as you go deeper in
0:07:41.370,0:07:46.270
the ocean, the water gets colder. And then[br]if you can imagine, the ocean kind of at
0:07:46.270,0:07:50.669
night, it's kind of cold. So the top is[br]being cooled and you end up with cold
0:07:50.669,0:07:54.639
water on the top. The cold water wants to be[br]at the bottom. So it ends up sinking and you
0:07:54.639,0:07:59.030
get all this convection going on. So this[br]is happening at a lot of places in the
0:07:59.030,0:08:03.650
ocean. You get a lot of mixing at the top.[br]You get this kind of layer at the top of
0:08:03.650,0:08:08.090
the ocean. It's kind of constant color,[br]constant temperature. So this mixed layer is
0:08:08.090,0:08:12.240
important for the ocean. So knowing how[br]deep that mixed layer is and knowing how
0:08:12.240,0:08:16.000
much of the water is being mixed is also[br]important for for climate. But as you can
0:08:16.000,0:08:19.860
imagine, you know, this happens on very[br]small scales. So your climate model has
0:08:19.860,0:08:24.870
to know something about what's happening[br]at this scale. So I guess the
0:08:24.870,0:08:28.921
mathematical disease in the ocean is that the[br]climate model cannot see this, so it has
0:08:28.921,0:08:32.970
to do something else that's maybe[br]unphysical to resolve this stuff. And
0:08:32.970,0:08:38.420
that's a mathematical disease, I guess.[br]Aside from the ocean and the atmosphere,
0:08:38.420,0:08:41.820
you also have the same problem with sea[br]ice. So this is kind of just a satellite
0:08:41.820,0:08:47.010
picture of where sea ice is forming off[br]the coast of Antarctica. So you get winds
0:08:47.010,0:08:48.680
that kind of come off the [br]continent and they're
0:08:48.680,0:08:51.080
kind of blowing all the[br]ice that's being formed
0:08:51.080,0:08:53.900
away. So you get all these[br]little lines and streaks and they kind of
0:08:53.900,0:08:58.390
merge into sea ice. But this whole[br]picture is like 20 kilometers across. So the
0:08:58.390,0:09:01.950
climate model doesn't see this, but[br]somehow it has to represent all the
0:09:01.950,0:09:08.561
physics. And you have kind of similar[br]things happening with soil moisture, land
0:09:08.561,0:09:16.240
and dynamic vegetation, aerosols. So, you[br]know, these are kind of three places with
0:09:16.240,0:09:21.270
pretty pictures. But see, if you look at[br]the atmosphere, so it's not just clouds.
0:09:21.270,0:09:27.350
You also have aerosols, which are like[br]little particles, or sulfates that are
0:09:27.350,0:09:31.680
important for kind of cloud formation and[br]maybe atmospheric chemistry. But again, we
0:09:31.680,0:09:34.700
don't fully understand the physics of[br]these aerosols. So again, you have to kind
0:09:34.700,0:09:40.270
of parametrize them. Same thing with kind[br]of convection. Maybe your climate
0:09:40.270,0:09:44.160
model doesn't resolve all the very deep[br]convection in the atmosphere, so it also
0:09:44.160,0:09:48.190
has to parametrize it. So I guess[br]you have many kinds of mathematical
0:09:48.190,0:09:51.291
diseases in the atmosphere. So I'm not[br]expecting you to understand everything in
0:09:51.291,0:09:55.110
in this picture. But the idea is: the[br]atmosphere is complicated. There's no way
0:09:55.110,0:09:59.740
a climate model is going to kind of, you[br]know, figure all this out by itself. And
0:09:59.740,0:10:04.600
again, you could do something[br]similar for the ocean. So we can just show
0:10:04.600,0:10:07.370
an image for like two little parts of[br]these. But the point is, you know, the
0:10:07.370,0:10:11.490
ocean is not kind of just a bucket of[br]water standing there. So there's lots of
0:10:11.490,0:10:15.180
stuff happening deep inside the ocean. And[br]some of it, we think is important for
0:10:15.180,0:10:19.180
climate. Some of it we don't know. Some[br]might not be important. But again, a lot
0:10:19.180,0:10:25.430
of this happens on very small spatial[br]scales. So we don't know or the climate
0:10:25.430,0:10:30.441
model can't always resolve all this stuff.[br]And again, same thing with kind of sea
0:10:30.441,0:10:34.370
ice. Lots of small scale stuff is[br]important for sea ice. And I think one
0:10:34.370,0:10:37.770
person asked about kind of tipping points[br]and there's kind of two with like sea ice
0:10:37.770,0:10:43.880
that are pretty important. One of them is[br]the sea ice albedo feedback. So if you have sea
0:10:43.880,0:10:48.630
ice that melts. Now you have more ocean[br]and the ocean can absorb more heat. But
0:10:48.630,0:10:51.980
now the earth is warmer, so it melts more[br]sea ice. So as soon as you kind of start
0:10:51.980,0:10:55.670
melting sea ice, maybe you melt even more[br]sea ice and eventually you reach an earth
0:10:55.670,0:11:00.020
with no sea ice. So there's kind of[br]research into that stuff going on, but
0:11:00.020,0:11:04.410
it's a possible tipping point. Another one[br]is this kind of marine ice sheet
0:11:04.410,0:11:08.970
instability at the bottom of[br]the ice shelf. So if you start melting
0:11:08.970,0:11:13.500
water, if you start melting ice from the[br]bottom of the ice shelf, then we create
0:11:13.500,0:11:17.530
kind of a larger area for more ice to[br]melt. So maybe once you start melting and
0:11:17.530,0:11:20.940
increasing sea level, you just keep[br]melting more and more and increasing sea
0:11:20.940,0:11:27.300
level even more. But again, it's kind of[br]hard to quantify these things on like 50
0:11:27.300,0:11:34.910
or 100 year timescales because it all[br]happens on very small scales. So yeah, the
0:11:34.910,0:11:39.760
point is there's lots of these kind of[br]parametrizations or mathematical diseases.
0:11:39.760,0:11:43.800
And once you start adding them all up, you[br]end up with lots and lots of kind of
0:11:43.800,0:11:48.180
parameters. So this is a really boring[br]table. But the point is, so this is like
0:11:48.180,0:11:53.110
one parametrization for like vertical[br]mixing in the ocean. It's basically the
0:11:53.110,0:11:57.140
process that I showed the rainbow color[br]movie of. A climate model
0:11:57.140,0:12:02.060
trying to kind of parametrize[br]that physics might have like 20
0:12:02.060,0:12:05.860
parameters. And, you know, some of them[br]sound crazy, like a surface layer fraction
0:12:05.860,0:12:10.120
of like zero point one or something. And[br]usually they keep the same constants for
0:12:10.120,0:12:14.620
all these values. Usually it's like[br]someone in like 1994 came up with these 20
0:12:14.620,0:12:18.730
numbers and now we all use the same 20[br]numbers. But you know, maybe they're
0:12:18.730,0:12:21.850
different in, like, the Pacific or the[br]Atlantic, or maybe they're different
0:12:21.850,0:12:25.740
when it's summer and winter and the[br]problem is, there's many of these
0:12:25.740,0:12:28.920
parametrizations. So you know here's like[br]20 parameters, but then you have a lot
0:12:28.920,0:12:32.430
more for clouds, a lot more for sea[br]ice. Add them all up and suddenly you have
0:12:32.430,0:12:38.050
like 100, maybe up to a thousand kind of[br]tunable parameters.
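As a rough sketch of what such a set of tunable constants looks like in code, here is a hypothetical Julia example; the field names and all values except the 0.1 surface layer fraction mentioned above are made up for illustration and are not any real scheme's parameter set:

    # Hypothetical vertical-mixing parametrization parameters (illustrative only).
    Base.@kwdef struct MixingParameters
        surface_layer_fraction::Float64 = 0.1    # fraction of the boundary layer treated as the surface layer
        background_diffusivity::Float64 = 1e-5   # mixing when nothing else is happening (made-up value)
        critical_richardson::Float64    = 0.3    # threshold for turning mixing on (made-up value)
    end

    defaults = MixingParameters()                                    # the constants everyone reuses
    pacific_summer = MixingParameters(critical_richardson = 0.25)    # maybe they should differ by region or season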
0:12:38.050,0:12:42.700
Going back to the plot that was shown in the[br]last talk, you can see kind of how all the models
0:12:42.700,0:12:47.880
kind of agree really well from like 1850[br]to 2000, because they all
0:12:47.880,0:12:50.950
have different kinds of[br]parameters, but they all get kind of tuned
0:12:50.950,0:12:54.890
or optimized so that they get the 20th[br]century correct, so they get the black
0:12:54.890,0:12:59.980
line correct. But then when you run them[br]forward, you run them to like 2300, and they
0:12:59.980,0:13:02.860
all are slightly different. So they all[br]start producing different physics and
0:13:02.860,0:13:07.550
suddenly you get a huge like red band. So[br]that's saying you have lots of model
0:13:07.550,0:13:12.870
uncertainty. So some[br]people might say, like, oh, this tuning
0:13:12.870,0:13:17.510
process is like optimization, it's like[br]not very scientific, right?
0:13:17.510,0:13:22.440
In the past it's kind[br]of been the best we've had. But I
0:13:22.440,0:13:26.420
think, you know, we should be able to do a[br]little bit better than that. So
0:13:26.420,0:13:28.700
just to give you the idea, you know, some[br]people would say, you know, why don't you
0:13:28.700,0:13:31.650
just, you know, resolve all the[br]physics? But see, if you want to
0:13:31.650,0:13:36.640
do like a direct numerical simulation,
0:13:36.640,0:13:39.410
it's basically saying you want to resolve[br]all the motions in the ocean, in the
0:13:39.410,0:13:43.310
atmosphere. You basically need to resolve[br]things down to like one millimeter. So if
0:13:43.310,0:13:46.850
you have like a grid spacing of one[br]millimeter and you consider the volume of
0:13:46.850,0:13:50.760
the ocean and the atmosphere, you[br]basically say you need like 10 to the 28
0:13:50.760,0:13:55.470
grid points. You know, that's like imagine[br]putting cubes of like one millimeter
0:13:55.470,0:13:56.850
everywhere in the [br]ocean and atmosphere.
0:13:56.850,0:13:58.630
That's how many grid[br]points you would need.
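A back-of-the-envelope version of that estimate, in Julia; the only outside fact used is the ocean's approximate volume of about 1.3 billion cubic kilometers:

    # Rough count of one-millimeter grid cells for the ocean alone.
    ocean_volume_km3 = 1.3e9
    mm3_per_km3      = (1e6)^3                           # 1 km = 1e6 mm, so 1 km^3 = 1e18 mm^3
    ocean_cells      = ocean_volume_km3 * mm3_per_km3    # ≈ 1.3e27 cells
    # Adding the (much larger) volume of the atmosphere brings this to roughly 10^28.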
0:13:58.630,0:14:01.830
So, I mean, you could do that,[br]but there's not enough computing power or
0:14:01.830,0:14:05.800
storage space in the world to do that. So[br]you're kind of stuck doing something a bit
0:14:05.800,0:14:11.120
coarser. Usually most climate models use[br]like 10 to the 8 grid points, so that's
0:14:11.120,0:14:16.390
10 to the 20 too few. Also, you don't[br]want to just run a big climate model once;
0:14:16.390,0:14:19.870
you know you need to run them for very[br]long times, usually like you run them for
0:14:19.870,0:14:23.290
a thousand years or ten thousand years,[br]and you want to run many of them because
0:14:23.290,0:14:27.170
you want to collect statistics. So[br]generally you don't run at the highest
0:14:27.170,0:14:31.460
resolution possible. You run kind of at a[br]lower resolution so you can run many, many
0:14:31.460,0:14:36.900
models. So because you can only use so[br]much resolution, it seems that these
0:14:36.900,0:14:39.790
parametrizations, these kinds of mathematical[br]diseases, you have to live with them, you've
0:14:39.790,0:14:45.310
got to use them. But at least one idea is,[br]you know, instead of using numbers that
0:14:45.310,0:14:49.030
somebody came up with in 1994, you[br]might as well try to figure out, you know,
0:14:49.030,0:14:50.980
better numbers or maybe [br]you know if the numbers
0:14:50.980,0:14:52.210
are kind of different [br]in different places,
0:14:52.210,0:14:55.910
you should find that out. So one[br]thing you could do, one thing we are
0:14:55.910,0:15:01.340
trying to do, is get the parametrizations[br]to kind of agree with like basic physics
0:15:01.340,0:15:05.210
or agree with observations. So we have[br]lots of observations, and we can run
0:15:05.210,0:15:07.300
kind of high resolution [br]simulations to resolve
0:15:07.300,0:15:09.100
a lot of the physics [br]and then make sure
0:15:09.100,0:15:12.100
when you put the parametrization in[br]the climate model, it actually gives you
0:15:12.100,0:15:16.840
the right numbers according to basic[br]physics or observations. But sometimes
0:15:16.840,0:15:20.810
that might mean, you know, different[br]numbers in the Atlantic, in the Pacific or
0:15:20.810,0:15:24.480
different numbers for the winter and the[br]summer. And you have to run many high
0:15:24.480,0:15:28.660
resolution simulations to get enough data[br]to do this. But indeed, you know, these
0:15:28.660,0:15:33.890
days I think we have enough computing[br]power to do that. So to do all
0:15:33.890,0:15:37.070
these high resolution simulations, we[br]ended up building a new kind of ocean
0:15:37.070,0:15:42.370
model that we run on GPUs, because these[br]are a lot faster for giving us these
0:15:42.370,0:15:46.520
results. Usually most[br]climate modeling is done in Fortran; we
0:15:46.520,0:15:53.310
decided to go with Julia for a number[br]of reasons, which I'll talk about. But the
0:15:53.310,0:15:58.220
left figure is kind of that mixed layer or[br]boundary layer turbulence kind of movie.
0:15:58.220,0:16:01.630
But instead of the rainbow color map, now[br]it's using a more reasonable color map.
0:16:01.630,0:16:07.050
It looks like the ocean. The right is that[br]eddy movie. So we're generating tons and
0:16:07.050,0:16:11.670
tons of data from using simulations like[br]this and then hopefully we can get enough
0:16:11.670,0:16:15.440
data and like figure out a way to explain[br]the parametrizations. But it's kind of a
0:16:15.440,0:16:19.810
work in progress. So a different idea that[br]might be more popular here, I don't know.
0:16:19.810,0:16:25.770
is: instead of kind of using the existing[br]parametrizations, you could say, OK,
0:16:25.770,0:16:29.960
well, now you have tons and tons of data.[br]Maybe you just throw in like a neural
0:16:29.960,0:16:34.620
network into the differential equations.[br]Basically, you put in the physics, you
0:16:34.620,0:16:37.760
know, and then the neural network is[br]responsible for the physics you don't
0:16:37.760,0:16:43.780
know. So, for example, you know, most[br]people here might not. I also don't want
0:16:43.780,0:16:46.230
to talk about differential equations[br]because it would take a long time. So just
0:16:46.230,0:16:48.280
imagine that the equation[br]in the middle is kind of
0:16:48.280,0:16:49.960
what a climate model[br]needs to solve.
0:16:49.960,0:16:53.240
And the question marks are kind of[br]physics we don't know. So we don't know
0:16:53.240,0:16:58.720
what to put there. But maybe you could put[br]in a neural network there. So number one is
0:16:58.720,0:17:03.320
kind of a possible parametrization, or a[br]possible way you could try to parametrize
0:17:03.320,0:17:05.750
the missing physics, where the neural[br]network is kind of responsible for
0:17:05.750,0:17:09.940
everything. We find that doesn't work as[br]well. So instead, maybe you tell it some
0:17:09.940,0:17:13.990
of the physics, maybe tell it about Q,[br]which is like the heating or cooling at
0:17:13.990,0:17:17.540
the surface. And then it's kind of[br]responsible for resolving the other stuff.
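A minimal sketch of that structure in plain Julia, not the group's actual code: a toy equation, a tiny hand-rolled neural network standing in for the unknown part, and forward Euler time stepping; a real setup would also train the network weights against data.

    # du/dt = known_physics(u) + nn(u): the network stands in for the question marks.
    W1, b1 = randn(8, 1), zeros(8)           # tiny 1 -> 8 -> 1 network with untrained weights
    W2, b2 = randn(1, 8), zeros(1)
    nn(u) = (W2 * tanh.(W1 * [u] .+ b1) .+ b2)[1]

    known_physics(u) = -0.5u                 # the physics we do know (a toy damping term)
    rhs(u) = known_physics(u) + nn(u)        # the unknown physics handled by the neural network

    function integrate(u0; dt = 0.01, nsteps = 1000)
        u = u0
        for _ in 1:nsteps
            u += dt * rhs(u)                 # forward Euler time stepping
        end
        return u
    end

    integrate(1.0)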
0:17:17.540,0:17:22.300
But it's still a work in progress, because[br]the blue is kind of supposed to be your data, the
0:17:22.300,0:17:25.690
orange is supposed to be the neural network, and[br]they don't agree. So it's still a work in
0:17:25.690,0:17:28.960
progress, but hopefully we'll be able to[br]do that better. So this is kind of stuff
0:17:28.960,0:17:34.400
that's like a week or two old. But to kind of[br]reach a conclusion, at least for my half
0:17:34.400,0:17:39.750
of the talk. So the reason I personally[br]like Julia as a climate modeler is we were
0:17:39.750,0:17:45.000
able to kind of build an ocean model from[br]scratch basically in less than a year. And
0:17:45.000,0:17:47.471
one of the nice things[br]is that the user interface
0:17:47.471,0:17:49.561
or the scripting and the model[br]backend is all in one language,
0:17:49.561,0:17:52.030
whereas in the past[br]you used to usually write the high
0:17:52.030,0:17:57.850
level in like Python and maybe the back[br]end in like Fortran or C. And we find, you
0:17:57.850,0:18:01.400
know, with Julia, it's just as fast as[br]our legacy model, which was written in
0:18:01.400,0:18:07.230
Fortran. And one of the nicest things was[br]that we're basically able to write code once,
0:18:07.230,0:18:10.510
using Julia's native GPU compiler. So[br]basically you write one single
0:18:10.510,0:18:15.180
code base and it goes to CPUs and GPUs,[br]so you don't have to write two different code
0:18:15.180,0:18:20.610
bases. And yeah, we find generally, because[br]it's a high level language, we're all kind
0:18:20.610,0:18:24.860
of more productive. We can give a more[br]powerful user API and Julia kind of has a
0:18:24.860,0:18:30.240
nice multiple dispatch system, which we[br]find makes it easy for the users to
0:18:30.240,0:18:35.330
kind of extend the model or hack the[br]model. And some people would say
0:18:35.330,0:18:38.480
the Julia community is pretty small. But[br]we find there's a pretty big Julia
0:18:38.480,0:18:41.470
community interested in scientific[br]computing. So we find kind of all the
0:18:41.470,0:18:46.540
packages we need are pretty much[br]available. So I'll kind of conclude my
0:18:46.540,0:18:50.240
half by saying most of the[br]uncertainty in climate modeling basically
0:18:50.240,0:18:53.850
comes from humans, because we don't know[br]what humans will do. But there's a huge
0:18:53.850,0:18:57.550
model uncertainty basically because of[br]physics we don't understand or physics,
0:18:57.550,0:19:01.610
that the model kind of cannot see. You can't[br]resolve every cloud and, you know, every
0:19:01.610,0:19:05.390
wave in the ocean, so you've got[br]to figure out a way to account for them.
0:19:05.390,0:19:09.940
So that's what a parametrization does.[br]And we're trying to kind of use a lot of
0:19:09.940,0:19:14.680
computing power to kind of make sure we[br]train or come up with good parametrizations
0:19:14.680,0:19:19.600
instead of kind of tuning the model at the[br]end. And we're hoping this will lead to
0:19:19.600,0:19:23.750
better climate predictions. Maybe it[br]will, maybe it won't. But at least, you
0:19:23.750,0:19:28.990
know, even if it doesn't, hopefully we can[br]say we got rid of the model tuning problem,
0:19:28.990,0:19:33.110
and we find[br]that software development for climate
0:19:33.110,0:19:37.930
modeling is easier than if we did it in[br]Fortran. I will say, this is kind of an
0:19:37.930,0:19:41.760
advertisement, but I'm looking to bike around[br]Germany for a week and apparently you can't
0:19:41.760,0:19:46.390
take a nextbike out of Leipzig. So if[br]anyone is looking to sell their bicycle or
0:19:46.390,0:19:51.230
wants to make some cash, I'm looking to[br]rent a bicycle. So yeah, if you have one,
0:19:51.230,0:19:55.070
come talk to me, please. Thank you. Danke.
0:19:55.070,0:20:02.190
applause
0:20:02.190,0:20:09.490
Churavy: So one big question for me always[br]is: how can we as technologists help? I
0:20:09.490,0:20:14.580
think most of us in this room are fairly[br]decent with computers; the internet is not
0:20:14.580,0:20:19.410
necessarily new territory for us. But how[br]do we use that knowledge to actually
0:20:19.410,0:20:25.060
impact real change? And if you haven't seen it,[br]there's this fantastic article:
0:20:25.060,0:20:30.030
worrydream.com/ClimateChange, which lists[br]all the possible, or not all the possible,
0:20:30.030,0:20:36.610
but a lot of good ideas to think about and[br]go like, okay, do my skills apply in that
0:20:36.610,0:20:43.490
area? Well, I'm a computer scientist. I do[br]programming language research. So how do my
0:20:43.490,0:20:50.430
skills really apply to climate change? How[br]can I help? And one of the things that
0:20:50.430,0:20:54.880
struck me in this article, and one of the[br]realizations behind why I do my work,
0:20:54.880,0:20:59.790
is that the tools that we have built for[br]scientists and engineers, they are rather
0:20:59.790,0:21:05.550
poor. Computer scientists like myself have[br]focused a lot on making programming easier,
0:21:05.550,0:21:11.580
more accessible, but we haven't necessarily[br]kept the scientific community as a
0:21:11.580,0:21:18.250
target audience. And then you get into[br]this position where models are written in
0:21:18.250,0:21:23.150
a language, Fortran 77, and it is a[br]nice language, but it's still not one that
0:21:23.150,0:21:28.320
is easily picked up and where you find[br]enthusiasm in younger students for using
0:21:28.320,0:21:36.450
it. So I work on Julia, and my goal is[br]basically to make scientific computing
0:21:36.450,0:21:41.410
easier, more accessible and make it easier[br]to access the huge computing power we have
0:21:41.410,0:21:49.960
available to do climate modeling. If[br]you are interested in this space,
0:21:49.960,0:21:53.590
you don't need to work on Julia[br]necessarily, but you can think about maybe
0:21:53.590,0:21:59.420
looking at modeling for physical[br]systems. Like, one of the
0:21:59.420,0:22:04.250
questions is: can we model air conditioning[br]units more precisely, make them more
0:22:04.250,0:22:09.211
efficient? Or any other technical system.[br]How do we get that efficiency? But we need
0:22:09.211,0:22:15.340
better tools to do that. So the language[br]down here, as an example, is Modelica.
0:22:15.340,0:22:19.560
There is a project right now around Modelica[br]trying to see how we can push the
0:22:19.560,0:22:24.770
boundary there. The language up here is Fortran.[br]You might have seen a little bit of that
0:22:24.770,0:22:32.310
in the talk beforehand and it's most often[br]used to do climate science. So why
0:22:32.310,0:22:37.100
programming languages? Why do I think that[br]my time is best spent to actually work on
0:22:37.100,0:22:44.250
programming languages and do that in order[br]to help people? Well, Wittgenstein says:
0:22:44.250,0:22:48.620
"The limits of my language are the limits[br]of my world." What I can express is what I
0:22:48.620,0:22:52.190
think about. And I think people who are[br]multilingual know that sometimes
0:22:52.190,0:22:56.050
it's easier for them to think about certain[br]things in one language than in the
0:22:56.050,0:23:00.660
other one. But language is about[br]communication. It's about communication
0:23:00.660,0:23:05.670
with scientists, but it's also about[br]communication with the computer. And too
0:23:05.670,0:23:09.950
often programming languages fall into that[br]trap where it's about, oh, I want to
0:23:09.950,0:23:15.720
express my one particular problem or I[br]wanna express my problem very well for the
0:23:15.720,0:23:20.420
compiler, for the computer; I want to talk[br]to the machine. But I found that
0:23:20.420,0:23:24.640
programming languages are very good to[br]talk to other scientists, to talk in a
0:23:24.640,0:23:29.220
community and to actually collaborate. And[br]so the project that Ali and I are both
0:23:29.220,0:23:35.420
part of has, I think, 30-ish people, I don't[br]know the exact numbers. We have a big
0:23:35.420,0:23:40.580
group of climate scientists and modelers, and[br]we have a couple of numerical scientists,
0:23:40.580,0:23:44.650
computer scientists and engineers, and we are[br]all working in the same language, being able
0:23:44.650,0:23:49.260
to collaborate and actually work on the[br]same code instead of me working on some
0:23:49.260,0:23:56.610
low level implementation and Ali telling[br]me what to write. That wouldn't be really
0:23:56.610,0:24:05.050
efficient. So, yes, my goal is to make[br]this research easier. Do we really need yet
0:24:05.050,0:24:09.110
another high level language? That is a[br]question I often get. It's like why Julia?
0:24:09.110,0:24:14.660
Or, why are you not spending your time[br]and effort doing this for Python? Well, so
0:24:14.660,0:24:21.580
this is a small example, this is Julia[br]code. It looks rather readable, I find. It
0:24:21.580,0:24:28.930
doesn't use semantic whitespace; you may[br]like that or not. It has all the typical
0:24:28.930,0:24:33.960
features that you would expect from a high[br]level dynamic language. It is using the
0:24:33.960,0:24:37.190
MIT license, it has a built-in package[br]manager, it's very good for interactive
0:24:37.190,0:24:45.040
development. But it has a couple of[br]unusual properties, and those matter:
0:24:45.040,0:24:49.830
if you want to simulate a climate model,[br]you need to get top performance on a
0:24:49.830,0:24:56.470
supercomputer. Otherwise you won't get an[br]answer in the time that it matters. Julia
0:24:56.470,0:25:02.830
uses just-in-time, or really just-ahead-of-time,[br]compilation. The other great feature is
0:25:02.830,0:25:07.670
that Julia is actually written in Julia. So I can[br]just look at implementations. I can dive
0:25:07.670,0:25:12.610
and dive and dive deeper into somebody's[br]code and don't have a comprehension
0:25:12.610,0:25:20.210
barrier. If you have ever spent some[br]time and tried to figure out how Python
0:25:20.210,0:25:26.360
sums numbers under the hood to make it[br]reasonably fast. Good luck. It's hard.
0:25:26.360,0:25:30.060
It's written in C, and there are a lot[br]of barriers in order to understand what's
0:25:30.060,0:25:35.380
actually going on. Then, reflection and[br]metaprogramming: you can do a lot of fun
0:25:35.380,0:25:41.140
stuff, which we're going to talk about. And[br]then the big point for me is that you have
0:25:41.140,0:25:46.150
native GPU code generation support, so[br]you can actually take Julia code and run
0:25:46.150,0:25:49.920
it on the GPU. You're not[br]relying on libraries, because libraries
0:25:49.920,0:25:59.450
can only express the things that were[br]written in there. So early on, last
0:25:59.450,0:26:03.550
December, I think we met up for the[br]climate science project and after deciding
0:26:03.550,0:26:08.450
on using Julia for the entire project,[br]they were like: we're happy with the
0:26:08.450,0:26:12.710
performance, but we have a problem. We[br]have to duplicate our code for GPUs
0:26:12.710,0:26:20.530
and CPUs. What really? It can't be! I mean, I[br]designed the damn thing, it should be working.
0:26:20.530,0:26:25.559
Well, what they had at that point was[br]basically always two copies of each function,
0:26:25.559,0:26:31.170
where one side of it was implementing the CPU[br]code and the other side was implementing the
0:26:31.170,0:26:36.710
GPU code. And really, there were only a[br]couple of GPU specific parts in there. And
0:26:36.710,0:26:43.210
if anybody has ever written GPU code, it's[br]this pesky "which index am I" calculation,
0:26:43.210,0:26:49.710
whereas the for loop on the CPU just[br]looks quite natural. And I was like, what?
0:26:49.710,0:26:55.610
Come on. What we can do is we can[br]just write a kernel: we take the body of
0:26:55.610,0:27:00.490
the for loop, extract it into a new[br]function, add a little bit of sugar and
0:27:00.490,0:27:06.460
magic to call GPU kernels and CPU[br]functions, and then we're done. Problem
0:27:06.460,0:27:12.730
solved. What the code roughly looks[br]like is actually this. You can
0:27:12.730,0:27:19.670
copy and paste this and it should work.[br]And so you have two functions. One of them
0:27:19.670,0:27:23.679
is where you extracted your kernel.[br]Then you write a function that takes
0:27:23.679,0:27:29.580
another function and runs that function in a[br]for loop, or it launches that function on
0:27:29.580,0:27:34.600
the GPU. And then you have this little GPU[br]snippet, which is the only bit that's actually
0:27:34.600,0:27:39.250
GPU-specific: it calculates the index and[br]then calls the function f with an index argument.
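Roughly, the pattern just described looks like the following sketch, with made-up function names rather than the project's real code; the GPU half is left as a comment and assumes a CUDA-style Julia package providing @cuda, threadIdx, blockIdx and blockDim:

    kernel!(out, a, b, i) = (out[i] = a[i] + b[i])    # the extracted loop body

    function launch_cpu!(kernel!, out, a, b)
        for i in eachindex(out)                       # CPU: a plain for loop over all indices
            kernel!(out, a, b, i)
        end
        return out
    end

    # GPU side (sketch): compute "which index am I" from the thread/block indices
    # and call the same kernel once per thread.
    # function gpu_wrapper!(kernel!, out, a, b)
    #     i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
    #     i <= length(out) && kernel!(out, a, b, i)
    #     return nothing
    # end
    # @cuda threads=256 blocks=cld(length(out), 256) gpu_wrapper!(kernel!, out, a, b)

    a, b = rand(10), rand(10)
    launch_cpu!(kernel!, similar(a), a, b)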
0:27:39.250,0:27:45.780
So, I was done here; my[br]contribution to this project was done.
0:27:45.780,0:27:49.650
Well, they came back to me and were like,[br]no, it's not good enough. And I was like,
0:27:49.650,0:27:54.550
why? Well, the issue is they needed kernel[br]fusion. So that's the process of taking
0:27:54.550,0:28:00.110
two functions and merging them together.[br]I'm like, okay, fine. Why do they need
0:28:00.110,0:28:04.790
that? Because if you want to write[br]reasonably efficient GPU code, you need to be
0:28:04.790,0:28:09.980
really concerned about the number of[br]global memory loads and stores. If you
0:28:09.980,0:28:13.720
have too many of them or if they are[br]irregular, you lose a lot of performance
0:28:13.720,0:28:20.470
and you need good performance, otherwise[br]we can't simulate at the resolution we want. They
0:28:20.470,0:28:24.600
also actually wanted to use GPU[br]functionality and low level control.
0:28:24.600,0:28:29.610
They wanted to look at their kernels and[br]use shared memory constructs. They wanted
0:28:29.610,0:28:36.190
to do precise register usage, minimizing the[br]number of registers used, and they really cared
0:28:36.190,0:28:40.751
about low level performance. They were[br]like, well, we can't do this with the
0:28:40.751,0:28:47.790
abstraction you gave us because it builds[br]up too many barriers. And I could have
0:28:47.790,0:28:53.970
given them the more typical computer[br]science answer, which would have been: OK,
0:28:53.970,0:28:59.429
Give me two years and I'll come back to[br]you and there is a perfect solution which
0:28:59.429,0:29:03.350
is like a castle in the sky. And I[br]write you a bespoke language that does
0:29:03.350,0:29:06.850
exactly what you need to do. And at the[br]end, we have a domain specific language
0:29:06.850,0:29:10.590
for climate simulation that will do finite[br]volume and discontinuous Galerkin and
0:29:10.590,0:29:17.410
everything you want. And I will have a[br]PhD. Great, fantastic. Well, we don't have
0:29:17.410,0:29:21.880
the time. The whole climate science[br]project that we are on has an accelerated
0:29:21.880,0:29:27.040
timeline, because the philanthropists that[br]are funding the research are like: well, if
0:29:27.040,0:29:35.090
you can't give us a better answer anytime[br]soon, it won't matter anymore. So I sat
0:29:35.090,0:29:40.580
down and was like: okay, I need a hack, I[br]need something that is minimal effort,
0:29:40.580,0:29:45.150
Quick delivery. I need to be able to fix[br]it. If I do get it wrong the first time
0:29:45.150,0:29:49.600
around, and I did. It needs to be hackable:[br]my collaborators need to understand it and
0:29:49.600,0:29:55.040
actually be able to change it. And it[br]needs to have happened yesterday. Well,
0:29:55.040,0:29:59.370
Julia is good at these kinds of hacks. And[br]as I've learned, you can actually let people
0:29:59.370,0:30:06.259
go and do bespoke solutions and build better[br]abstractions after the fact, so that
0:30:06.259,0:30:10.220
you can then actually do the fancy[br]computer science that I really wanted to
0:30:10.220,0:30:14.860
do. The project is called GPUifyLoops[br]because I couldn't come up with a worse
0:30:14.860,0:30:23.520
name, and nobody else could. So we stick with[br]it. It's macro based. And so, in Julia, you
0:30:23.520,0:30:30.730
can write syntax macros that transform[br]the written statements into
0:30:30.730,0:30:37.330
similar statements, so you can insert code[br]or remove code if you want to. It right
0:30:37.330,0:30:41.330
now targets CPUs and GPUs and we are[br]talking about how do we get multithreading
0:30:41.330,0:30:46.480
into the story, how do we target more and[br]different GPUs. There are other projects
0:30:46.480,0:30:51.470
that are very similar. So there's OCCA,[br]which is where a lot of these ideas are
0:30:51.470,0:30:58.610
coming from, and OpenACC in C++ does[br]something really similar. But basically
0:30:58.610,0:31:02.290
you write a for loop, you write an @loop[br]in front of it, which is the magic
0:31:02.290,0:31:09.480
macro that does the transformation, and you[br]have your two index statements, and now you
0:31:09.480,0:31:15.090
just say I want to launch it on the GPU[br]and it magically does the job. Great,
0:31:15.090,0:31:22.150
fantastic. So let's look at the entire[br]implementation of the @loop macro,
0:31:22.150,0:31:29.270
without the error checking, which didn't fit[br]on the screen, just a couple of lines. So
0:31:29.270,0:31:38.140
everything is here and basically I'm just[br]manipulating the for loop so that on the
0:31:38.140,0:31:46.350
GPU it only iterates one iteration per[br]index and on CPU it iterates all of the
0:31:46.350,0:31:50.890
indices, because the CPU is single threaded[br]and a GPU is many, many times
0:31:50.890,0:31:56.810
multithreaded. Of course there's a little[br]bit of magic hidden in the device function
0:31:56.810,0:32:00.880
because how do I know where I'm running?[br]If you're curious how to do that,
0:32:00.880,0:32:06.780
we can talk afterwards. But[br]otherwise, it's a very simple,
0:32:06.780,0:32:11.440
straightforward transformation. It's[br]written in Julia. It's a Julia function.
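For flavor, here is a deliberately simplified sketch of that kind of macro; it is not the real GPUifyLoops implementation, just the same idea of routing the loop range through a device-dependent function:

    device_range(r) = r    # CPU method: iterate the whole range; a GPU method would
                           # instead return only this thread's index (via threadIdx, etc.)

    macro loop(forloop)
        forloop.head === :for || error("@loop expects a for loop")
        iterspec = forloop.args[1]                   # e.g. :(i = 1:N)
        body     = forloop.args[2]
        var, rng = iterspec.args[1], iterspec.args[2]
        return quote
            for $(esc(var)) in device_range($(esc(rng)))
                $(esc(body))
            end
        end
    end

    out = zeros(10)
    @loop for i in 1:10
        out[i] = i^2
    end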
0:32:11.440,0:32:17.600
And yeah, you don't need to understand[br]the code here. I just want to show how quick it
0:32:17.600,0:32:24.760
can be to write something like this. If[br]you know anything about GPU programming at
0:32:24.760,0:32:28.620
all, there should be a little voice in the[br]back of your head that is like,
0:32:28.620,0:32:34.780
wait a second, how can you run a dynamic[br]programming language on a GPU? That shouldn't be
0:32:34.780,0:32:43.100
possible. Well, Julia can run on the GPU[br]because it has a lot of metaprogramming
0:32:43.100,0:32:47.770
facilities and support for staged[br]programming. So I can generate code based
0:32:47.770,0:32:51.679
on a specific call signature. It has[br]introspection, reflection mechanisms that
0:32:51.679,0:32:56.760
allow me to do some interesting stuff in[br]the background. It is built upon LLVM,
0:32:56.760,0:33:02.470
which is a common compiler infrastructure.[br]And so I can actually write staged
0:33:02.470,0:33:08.440
functions that generate LLVM-specific[br]code for one particular function, and do
0:33:08.440,0:33:16.820
that at compile time. And Julia is a[br]dynamic language that tries really hard to
0:33:16.820,0:33:20.480
avoid runtime uncertainties. And this is[br]one of the challenges if you're getting
0:33:20.480,0:33:26.900
into Julia is to understand that when[br]you're writing code that has a lot of
0:33:26.900,0:33:32.230
runtime uncertainties, you get relatively[br]slow performance, about as fast as Python.
0:33:32.230,0:33:36.780
But if you work with the compiler and you[br]avoid runtime uncertainties, you can get
0:33:36.780,0:33:40.490
very fast code and you can run your code[br]on the GPU. That's basically the
0:33:40.490,0:33:45.340
litmus test: if you can run your code on[br]the GPU, then you did your job well. And it
0:33:45.340,0:33:50.750
provides tools to understand the behavior[br]of your code and to warn you about runtime
0:33:50.750,0:33:55.130
uncertainty. It does that, and I don't have[br]the time to go too deep into the details.
0:33:55.130,0:33:59.240
There is actually a paper about this. It[br]has a type system that allows you to do
0:33:59.240,0:34:03.260
some sophisticated reasoning and type[br]inference to figure out what your code is
0:34:03.260,0:34:07.660
doing. Multiple dispatch is actually[br]helping us quite a lot in making it easier
0:34:07.660,0:34:11.409
to devirtualize code. And it does aggressive[br]specialization and just-in-time
0:34:11.409,0:34:18.250
compilation. And so just looking a little[br]bit closer at some of these topics, if you
0:34:18.250,0:34:25.060
want to look at the entire pipeline, the[br]flow of what happens when you call a
0:34:25.060,0:34:30.130
function, as it goes through[br]the Julia compiler: you have tools to
0:34:30.130,0:34:33.830
introspect all of these stages, on the right[br]hand side here, and then you have tools to
0:34:33.830,0:34:43.300
interact on the left hand side. You can[br]inject code back into the compiler. The
0:34:43.300,0:34:48.490
other thing is Julia has dynamic[br]semantics. So you can,
0:34:48.490,0:34:54.200
at runtime, redefine your function and[br]call the new function. And it uses
0:34:54.200,0:35:00.930
multiple dispatch. So if you look at the[br]absolute value call here, which of the 13
0:35:00.930,0:35:06.280
possible methods will it call? In C++ or[br]in other programming languages this is called
0:35:06.280,0:35:10.930
a virtual function call. So is everything in Julia[br]a virtual function call? No.
0:35:10.930,0:35:18.250
This is one of the important points:[br]when we call a function, let's say
0:35:18.250,0:35:24.230
sin of x, we look at the type of the[br]input arguments and then we first of all
0:35:24.230,0:35:33.590
look at which function is applicable to[br]our input argument. So in this case, it
0:35:33.590,0:35:41.920
would be the Real one down here, because[br]Float64 is a subtype of Real. So we choose the
0:35:41.920,0:35:50.740
right method using dispatch and then we[br]specialize that method for the signature.
0:35:50.740,0:35:54.320
So the rule in multiple dispatch to[br]remember is: we call the most specific
0:35:54.320,0:35:58.670
method, whatever specific might mean. So[br]we have this bit of an example, where we
0:35:58.670,0:36:06.220
have a function f, which has three[br]different methods: we have an integer
0:36:06.220,0:36:10.960
argument that can be matched on x or on y,[br]and then we have a floating point argument
0:36:10.960,0:36:16.690
on y. And if we call this with (1, "Hello"),[br]well, we will select the method that is
0:36:16.690,0:36:24.420
most specific for these arguments, which[br]would be method number 1 here. On the other
0:36:24.420,0:36:28.820
hand, if we have a Float64 in[br]the second position, then we will call the
0:36:28.820,0:36:34.730
second method. Now what happens if I pass[br]in an integer in the first position and a
0:36:34.730,0:36:38.859
floating point in the second position?[br]Well, you would get a runtime error,
0:36:38.859,0:36:44.180
because we can't make this decision: what[br]is the most specific method?
0:36:44.180,0:36:49.170
That's just something to keep in mind.
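The exact methods on the slide are not reproduced in this transcript, so the following is a guess at an equivalent example that shows the same behavior, including the ambiguity error:

    f(x::Int, y)     = "method 1: Int in the first position"
    f(x, y::Float64) = "method 2: Float64 in the second position"
    f(x, y)          = "method 3: generic fallback"

    f(1, "Hello")    # -> method 1, the most specific match for these arguments
    f("a", 2.0)      # -> method 2
    f(1, 2.0)        # MethodError: ambiguous, methods 1 and 2 are equally specific here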
0:36:49.170,0:36:55.350
Method specialization works really similarly.[br]When you call a method for the first time… this
0:36:55.350,0:37:01.380
method sin right now has no specializations,[br]and then I call it once, and Julia will insert a
0:37:01.380,0:37:06.710
specialization just for Float64. Before[br]that it could have been a Float32 or a
0:37:06.710,0:37:11.430
Float64 for this method. So[br]Julia specializes and compiles methods on
0:37:11.430,0:37:15.050
concrete call signatures instead of[br]keeping everything dynamic or everything
0:37:15.050,0:37:23.210
ambiguous. You can introspect this process,[br]and there are several macros, like
0:37:23.210,0:37:30.510
@code_lowered or @code_typed, that will help you[br]understand that process. I think I don't
0:37:30.510,0:37:35.160
have enough time to go into detail here,[br]but just as a note, if you have a look at
0:37:35.160,0:37:40.600
this, the percent 4 means it's an[br]assignment, so you can reference it later:
0:37:40.600,0:37:48.660
in line 5, we iterate on that %4[br]value. And then we can look at the type
0:37:48.660,0:37:52.730
information that Julia infers out of that[br]call. We're calling the function mandel
0:37:52.730,0:37:59.490
with a UInt32 and you can see how that[br]information propagates through the
0:37:59.490,0:38:05.109
function itself.
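The introspection macros named here are real Julia macros; the function below is just a small stand-in for the slide's mandel example:

    g(x) = x < one(x) ? zero(x) : x - one(x)    # small stand-in for the slide's code

    @code_lowered g(UInt32(3))   # lowered IR with %-numbered SSA assignments
    @code_typed   g(UInt32(3))   # the same IR after type inference and optimization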
0:38:05.109,0:38:09.750
And then we do aggressive inlining and[br]optimizations and devirtualization, and so in the end we
0:38:09.750,0:38:15.960
don't have calls anymore. We only have the[br]intrinsics that Julia provides on which
0:38:15.960,0:38:23.870
programs are actually implemented. So this[br]is an unsigned-less-than integer intrinsic.
0:38:23.870,0:38:27.830
So we are using type inference as an[br]optimization to find static or near-static
0:38:27.830,0:38:32.260
sub-programs. It allows us to do[br]aggressive devirtualization, inlining and
0:38:32.260,0:38:36.630
constant propagation. But it raises[br]problems of cache invalidation. So in
0:38:36.630,0:38:42.250
bygone days, this used to be the case: I[br]could define a new function,
0:38:42.250,0:38:48.140
f, after calling g once, and I would get the
0:38:48.140,0:38:53.250
old result back. That's bad. That's[br]counter-intuitive. That's not dynamic. So
0:38:53.250,0:38:59.560
in Julia 1.0, and I think in 0.5 and 0.6[br]already, we fixed that: we invalidate
0:38:59.560,0:39:06.400
the functions that have dependencies on[br]the function we just changed. But you can
0:39:06.400,0:39:09.710
see it in the latency of your program: if you change[br]a lot of functions and you call them again,
0:39:09.710,0:39:20.781
well, we need to do a lot of work every[br]time.
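A small example of the behavior being described; with invalidation working correctly, the second call picks up the new definition:

    f() = 1
    g() = f()
    g()          # -> 1
    f() = 2      # redefining f invalidates compiled code that depends on it
    g()          # -> 2, not a stale 1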
0:39:20.781,0:39:27.420
We also do constant propagation. So in this[br]very simple example, we try to exploit as much
0:39:27.420,0:39:32.210
information as possible. And so if you[br]have a function f and you call
0:39:32.210,0:39:36.360
the function sin with a constant value,[br]we actually end up just returning you the
0:39:36.360,0:39:40.520
constant, avoiding the calculation of the[br]sine entirely. And that can be very
0:39:40.520,0:39:49.930
important during hot calls in a loop.
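A tiny illustration; whether the compiler actually folds the call depends on the Julia version and its heuristics:

    h() = 2 * sin(0.0) + 1.0
    @code_typed h()    # ideally the body collapses to `return 1.0`, with no call to sin left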
0:39:49.930,0:39:53.930
This can sometimes go wrong, or rather, Julia[br]has heuristics to decide when or whether these optimizations are
0:39:53.930,0:40:00.680
valuable. And so when you introspect your[br]code, you might see results that are
0:40:00.680,0:40:05.670
not quite what you want. So here[br]we don't know what the return value
0:40:05.670,0:40:10.150
is; it's just a tuple. We know it's a[br]tuple, nothing else. The heuristic said: do not
0:40:10.150,0:40:14.060
specialize. But the nice thing about Julia,[br]and where we get performance from, is that we
0:40:14.060,0:40:17.970
can actually force specialization, and[br]hopefully at some point we make the
0:40:17.970,0:40:26.050
compiler smart enough that these edge[br]cases disappear. So I can use some tricks
0:40:26.050,0:40:33.280
to force specialization to happen, and[br]then I can actually infer the precise
0:40:33.280,0:40:40.270
return type of my function. Another thing[br]to know, when you're coming from a more
0:40:40.270,0:40:45.050
traditional object-oriented programming[br]language, is that concrete types are not
0:40:45.050,0:40:50.190
extendable. So you can't inherit from[br]something like Int64. You can only subtype
0:40:50.190,0:40:55.710
abstract types. We do that because[br]otherwise we couldn't do a lot of
0:40:55.710,0:41:01.110
optimizations. When we look at[br]programs, we can never assume that you
0:41:01.110,0:41:05.310
won't add code: in a dynamic programming[br]language, at any time in the runtime of your
0:41:05.310,0:41:11.760
program you can add code. And so we don't[br]have closed-world semantics, which doesn't
0:41:11.760,0:41:16.500
allow us to say: hey, by the way,[br]we know all possible subtypes here; you
0:41:16.500,0:41:20.800
might add a new type later on. By[br]saying concrete types are not extendable,
0:41:20.800,0:41:27.740
we get a lot of the performance back.
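In code, that restriction looks like this:

    abstract type MyNumber end
    struct MyInt <: MyNumber    # fine: subtyping an abstract type
        x::Int
    end

    # struct BadInt <: Int64 end   # ERROR: Int64 is concrete and cannot be subtyped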
0:41:27.740,0:41:32.510
So personally, for me, why do I like Julia,[br]or why do I work on Julia? It works like Python,
0:41:32.510,0:41:38.120
it talks like Lisp and it runs like Fortran;[br]that's my five-second sales pitch. It's very hackable and extendable.
0:41:38.120,0:41:46.500
I can poke at the internals[br]and I can bend them if I need to. It's
0:41:46.500,0:41:52.330
built upon LLVM, so in reality, for me as[br]a compiler writer, it's my favorite LLVM
0:41:52.330,0:41:59.190
front end: I can get the LLVM code[br]that I need to actually run. But for users,
0:41:59.190,0:42:04.870
that's hopefully not a concern, if we do[br]our job right. And it has users in
0:42:04.870,0:42:09.430
scientific computing, and in a prior[br]life I was doing a lot of scientific
0:42:09.430,0:42:15.040
computing in cognitive science, running[br]models. And I care about these users
0:42:15.040,0:42:21.190
because I've seen how hard it can be to[br]actually make progress when the tools you
0:42:21.190,0:42:28.780
have are bad. And my personal goal is to[br]enable scientists and engineers to
0:42:28.780,0:42:37.500
collaborate efficiently and actually make[br]change. Julia is a big project and Climate
0:42:37.500,0:42:44.450
is a big project, and there are many people to thank.[br]And with that, I would like to extend to you
0:42:44.450,0:42:50.180
an invitation, if you're interested: there[br]is JuliaCon every year, where we have a
0:42:50.180,0:42:57.830
developer meetup. Last year we were about[br]60 people, much smaller than CCC. But
0:42:57.830,0:43:02.240
next year it will be in Lisbon. So come[br]join us if you're interested and if you
0:43:02.240,0:43:05.970
want to meet scientists who have[br]interesting problems and are looking for
0:43:05.970,0:43:08.570
solutions. Thank you.
0:43:08.570,0:43:17.120
Applaus
0:43:17.120,0:43:23.860
Herald A: Time for questions and answers,[br]are there any questions?
0:43:23.860,0:43:29.050
Herald H: Yeah, we've got microphones over[br]there. So just jump to the microphone and
0:43:29.050,0:43:33.010
ask your questions so that[br]everybody could hear.
0:43:33.010,0:43:38.510
Question: What do you mean when you say[br]that Julia talks like Lisp, and how is
0:43:38.510,0:43:43.280
that a good thing? laughter[br]Churavy: Well, it talks like Lisp, but it
0:43:43.280,0:43:48.490
doesn't look like Lisp. I assume that's[br]what you mean. It doesn't have that many
0:43:48.490,0:43:53.960
braces. But no, Lisp has rather powerful meta[br]programming capabilities and macros. And
0:43:53.960,0:43:58.320
so we have a lot of that. If you read a[br]little bit about the history of Lisp. The
0:43:58.320,0:44:03.650
original intention was to write M-Lisp,[br]which would be Lisp with a nicer syntax. And
0:44:03.650,0:44:07.320
I think Julia, for me personally, is that M-Lisp:[br]it has all these nice features, but it
0:44:07.320,0:44:13.390
doesn't have the bracket syntax.[br]Herald A: OK. Thank you.
0:44:13.390,0:44:18.580
Question: Thanks for the talk. My question[br]is regarding the first part of the talk.
0:44:18.580,0:44:22.920
If I understand correctly, you are[br]simulating a deterministic system there.
0:44:22.920,0:44:26.180
So there's no additional noise[br]term or anything, right?
0:44:26.180,0:44:29.990
Ramadhan: Well, if you had infinite[br]precision, I think it would be
0:44:29.990,0:44:34.850
deterministic. But I think, kind of by design,[br]turbulence itself is not deterministic.
0:44:34.850,0:44:38.400
Well, it's a chaotic system.[br]Question: But the discretized version
0:44:38.400,0:44:41.400
itself is deterministic. You don't have[br]a Monte Carlo part where you have
0:44:41.400,0:44:44.470
some noise that you would add to [br]which might actually be justified
0:44:44.470,0:44:49.800
from the physics side, right?[br]Ramadhan: Well, I mean, if
0:44:49.800,0:44:53.650
you ran the same simulation again… well,[br]I think if you
0:44:53.650,0:44:55.940
ran on the exact same machine, [br]you would get the
0:44:55.940,0:44:58.240
same answer. So in that [br]sense, it is deterministic.
0:44:58.240,0:45:00.910
But if you ran on a slightly[br]different machine, the truncation
0:45:00.910,0:45:04.010
error, like in the 16th decimal place,[br]could give you a completely different
0:45:04.010,0:45:08.180
answer. Question: Sure. So the point I'm[br]trying to make… am I allowed to continue?
0:45:08.180,0:45:12.270
Herald H: Yes, of course. There's no one[br]else… well, there is one more person. So you
0:45:12.270,0:45:17.250
can continue a few minutes if you want to.[br]Thanks. Laughter
0:45:17.250,0:45:20.030
Question: So the point I was[br]trying to make is,
0:45:20.030,0:45:22.420
if you add noise in the [br]sense that it's a physical
0:45:22.420,0:45:24.790
system, you have noise in[br]there, it might actually allow you to
0:45:24.790,0:45:28.890
solve a PDE or discretize a PDE, but get a[br]stochastic simulation, which might
0:45:28.890,0:45:34.480
be interesting because it often can make[br]things easier. And also, you mentioned
0:45:34.480,0:45:39.010
neural differential equations, right? And[br]in particular, with physical systems, if
0:45:39.010,0:45:43.231
you have discontinuities, for example,[br]the time integration can actually be quite the
0:45:43.231,0:45:47.890
problem. And there is work, to just[br]plug my colleague's work, on controlled neural
0:45:47.890,0:45:50.860
differential equations, where you can[br]actually also build in these
0:45:50.860,0:45:53.580
discontinuities, which might also be[br]interesting for you guys.
0:45:53.580,0:45:56.470
Ali: That's why maybe we should talk[br]because I don't know much about that stuff
0:45:56.470,0:45:59.690
where we're kind of just starting out. I[br]think so far this has been
0:45:59.690,0:46:03.478
hopefully continuous, but maybe we'll hit[br]discontinuities. I don't know. We should
0:46:03.478,0:46:07.359
talk, though. Q: And also the math is[br]beautiful and has no sickness. It's the
0:46:07.359,0:46:10.359
physics that might be sick. I'm a[br]mathematician, I have to say that. Ali: I know
0:46:10.359,0:46:15.480
that the physics is ugly, trust me.[br]Churavy: Just as quickly, we do have
0:46:15.480,0:46:24.570
stickers and I have cookies, too; they are[br]in the cookie box. And I think
0:46:24.570,0:46:29.530
somebody from our community is giving a[br]Julia workshop, and we're trying to
0:46:29.530,0:46:34.290
set up an assembly space, and hopefully[br]that works out as well.
0:46:34.290,0:46:38.490
Herald H: Go on please.[br]Question: Also, one question for the first
0:46:38.490,0:46:44.840
part of the talk. I wanted to ask[br]if it's possible, or if you are using,
0:46:44.840,0:46:49.190
dynamic resolution in your climate models,[br]where you would maybe have a smaller grid
0:46:49.190,0:46:54.900
size near the (???) and larger[br]in the areas that are not that
0:46:54.900,0:46:58.140
interesting.[br]Ramadhan: Like adaptive grids? So I
0:46:58.140,0:47:02.520
think we mostly do that in the vertical.[br]So usually in the ocean, the
0:47:02.520,0:47:04.260
interesting things are,[br]you know,
0:47:04.260,0:47:06.070
close to the surface. [br]We have more resolution
0:47:06.070,0:47:08.780
there. But as you go deeper,[br]things get less interesting. So you put
0:47:08.780,0:47:14.380
less resolution there. Generally, I think…[br]people have asked
0:47:14.380,0:47:17.760
that before, you know, why do you always[br]use constant grids? Why don't you use
0:47:17.760,0:47:21.700
these adaptive grids on your global, you[br]know, models? And the answer I've
0:47:21.700,0:47:24.670
heard, I don't know if it's very[br]convincing, but I think generally there hasn't
0:47:24.670,0:47:28.950
been that much research, or people who do[br]research into adaptive grids for these kinds of
0:47:28.950,0:47:34.960
models, their funding gets cut. But the[br]answer I've heard that I like is: a lot of the
0:47:34.960,0:47:38.070
time, a lot of the atmosphere and ocean is[br]turbulent. So especially if you do
0:47:38.070,0:47:42.750
kind of adaptive refinement, then you just[br]kind of adapt everywhere because there's
0:47:42.750,0:47:48.400
kind of turbulence everywhere. But yeah,[br]I guess for our
0:47:48.400,0:47:53.320
simulations, some of[br]the numerical methods are only fast if you
0:47:53.320,0:47:57.070
run on a regular grid, so[br]that's the reason we don't use adaptive
0:47:57.070,0:48:01.310
grids for our simulations. But in general,[br]adaptive grids for climate models are
0:48:01.310,0:48:05.010
interesting, and it seems like[br]there needs to be more research in that
0:48:05.010,0:48:07.990
area. So I don't know if I answered your[br]question, but I kind of just ranted.
0:48:07.990,0:48:10.890
Question: You did, thanks.[br]Herald H: Go go ahead, please.
0:48:10.890,0:48:16.160
Question: Yeah, it's just a quick question,[br]I guess. I think I have wrapped quite a
0:48:16.160,0:48:22.170
bit of legacy Fortran code in Python. And[br]my question is, would there be a simple
0:48:22.170,0:48:28.740
path for converting Fortran code to Julia,[br]preferably automatically? Do you have any
0:48:28.740,0:48:33.060
ideas about this one?[br]Churavy: You can do it. Your Julia code
0:48:33.060,0:48:39.100
will look like Fortran code, so you[br]haven't won anything. But yes, as a
0:48:39.100,0:48:42.170
starting point, you can do that.[br]Absolutely. But you can also just call
0:48:42.170,0:48:46.080
Fortran from Julia and then totally move[br]over. I generally don't want people to
0:48:46.080,0:48:50.060
rework their code, except if there's a[br]good reason. Like starting from scratch
0:48:50.060,0:48:55.470
sometimes helps; it can be a good reason.[br]Or if you say: for the old solution, we don't have
0:48:55.470,0:49:00.880
the necessary experts to work with it[br]anymore. But generally, if
0:49:00.880,0:49:04.700
you have Fortran code, I would just say,[br]well, call Julia from Fortran or Fortran from
0:49:04.700,0:49:11.040
Julia, get it up to speed and then start[br]transitioning, piece by piece. Does that make
0:49:11.040,0:49:16.770
sense?[br]Herald H: So any more questions? No more
0:49:16.770,0:49:23.530
questions. Then we're done a bit early. Ali Ramadhan[br]and Valentin Churavy, thank you very much.
0:49:23.530,0:49:25.980
applause
0:49:25.980,0:49:29.370
36C3 postroll music
0:49:29.370,0:49:51.000
Subtitles created by c3subtitles.de[br]in the year 2020. Join, and help us!