
Welcome now to the coding portion of the regression lesson.

We start, as always, on Google and we look for what sklearn has to offer us.

As you can see, there's plenty going on here.

This linear regression is what we'll end up using eventually.

But I actually think this link is a better one to start at because it

gives us a little bit more of an introduction to generalized linear models.

So we have here a formula that, though it's a little bit hard to tell,

is just a rewriting of y equals mx plus b, but

now we can have more than one x available to us.

We'll get into this a little bit more later in the lesson.

Right now, at least this tells us that we're in the right place for

something that has a linear form to it.
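For reference, the formula on that documentation page has this general form (writing it out here, with w for the coefficients and x for the features):

$$\hat{y}(w, x) = w_0 + w_1 x_1 + \dots + w_p x_p$$

With a single feature, this collapses back to y = mx + b, where w_1 plays the role of the slope m and w_0 plays the role of the intercept b.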

Scrolling down a little bit, I see something called ordinary least squares.

The exact name for it in

the code is LinearRegression, and this looks like exactly what I want.

I have a variety of points.

I'm fitting them with a line.

And maybe even better, I have some example code that can get me started.

I hope you recognize that the code here really isn't all that different

from the kind of code that we were seeing for the supervised classifiers.

So that means that we start with an import statement,

from sklearn import linear_model.

Within linear_model, there's a class called LinearRegression, and

we use that to create our regression.

The next thing that we do is we fit our regression to the training points.

And then here,

they're not making predictions with it, although we'll be doing that as well.

What they're doing here is they're reading out the coefficients, or

what we called the slope.
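Putting those steps together, here's a minimal sketch of the flow I just described, using the small example dataset from the sklearn documentation (three points that happen to lie on a plane, so the fit is exact):

```python
from sklearn import linear_model

# Create the regression object.
reg = linear_model.LinearRegression()

# Fit it to the training points: each inner list is one point's
# features, and the second argument holds the target values.
reg.fit([[0, 0], [1, 1], [2, 2]], [0, 1, 2])

# Read out the coefficients -- what we called the slope.
# There is one coefficient per input feature.
print(reg.coef_)       # [0.5 0.5]
print(reg.intercept_)  # 0.0

# And, looking ahead, making a prediction works the same way
# as it did for the classifiers.
print(reg.predict([[3, 3]]))  # [3.]
```

Note that fitting and predicting follow exactly the same fit/predict pattern as the supervised classifiers we saw earlier, which is what makes sklearn's interface so easy to pick up.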

Scrolling down a little further, I see that there's a more detailed example here

that I could use to help me if I get stuck anywhere.

But as it happens, this is going to be enough to get us off the ground.