Summer of Math Exposition

Polynomial Regression as a Tool in Education

In this article, we look at a novel use case for polynomial regression models.


Analytics

5.3 Overall score*
31 Votes
13 Comments
Rank 24

Comments

Good work on putting words to ideas before going into the formal detail. The graphs are clear and appealing.

7.1

The paper was well written overall. It anticipated questions and addressed them. For instance, the discussion around learning rates in section 'Linear Regression'. Presenting an outline in the section 'The Plan' was similarly helpful. I think references should be links. In section 'The Plan', figure 4.1 is referenced, but clicking on the text does not take me to the figure. I would also have liked an explanation as to why a degree two polynomial was chosen for this model. Why not three? Or four? That said, reading this felt like being handheld through solving the problem, which was wonderful. I liked how the dataset and code were available to play around with.

7.6

Straightforward ML exercise. I think the middle chunk would be helped by some intermediate visuals, or simply fewer math weeds. Also wasn't really clear on why failure was -1 instead of 0. You throw a lot of math at the reader without much context. More clarity would make this much better.

5
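The point about coding failure as -1 rather than 0 can be illustrated with a short sketch (the data below is made up, not from the article): with least-squares regression onto ±1 labels, the decision boundary sits where the fitted polynomial crosses zero, symmetric between the classes, whereas 0/1 labels would put the threshold at 0.5.

```python
import numpy as np

# Hypothetical 1-D example: attendance rate vs. pass/fail.
# fail = -1, pass = +1, so sign(f(x)) gives the predicted class
# and the decision boundary is simply f(x) = 0.
rng = np.random.default_rng(0)
x_fail = rng.normal(0.3, 0.1, 50)   # low attendance -> fail
x_pass = rng.normal(0.8, 0.1, 50)   # high attendance -> pass
x = np.concatenate([x_fail, x_pass])
y = np.concatenate([-np.ones(50), np.ones(50)])

# Degree-2 least-squares polynomial fit, as in the article
coeffs = np.polyfit(x, y, deg=2)

def predict(t):
    """Classify by the sign of the fitted polynomial."""
    return np.sign(np.polyval(coeffs, t))

print(predict(0.25))  # deep in the "fail" cluster
print(predict(0.85))  # deep in the "pass" cluster
```

With 0/1 labels the same pipeline works, but the natural cutoff moves to 0.5; the ±1 convention keeps the boundary at zero, which is why sign-based classification reads cleanly.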

I was initially interested in the topic, since I am myself a Mathematics instructor. But I must say that I am not a fan of predicting which students "will fail" a class. Students either pass the class or they do not. The main idea of the submission would be an easier pill to swallow if it was framed in terms of identifying which students are in most need of intervention and not in terms of prematurely separating out the failing students.

3

It is good work! Very well written and explained. I can see the novelty in the application, but the method is very well known and can be found in any econometrics textbook.

6

Have you looked at polynomial regressions in Pattern Recognition and Machine Learning by Bishop? Love how clear your walkthrough is.

4.4

It doesn't feel that original to me, but maybe I haven't understood what makes this explanation different from others. It feels like a very good example to explain the principle of polynomial regression & classification.

5.5

It was not clear what the criterion for passing is. Is the pass grade based on attendance and exam scores? Homework? Other things? If the teacher themselves decides whether someone passes or not, then wouldn't they know the formula? I understand that this is just an example to motivate the technique of polynomial regression, but I was confused by it. Maybe use a different example altogether. Second, there was not much use of visual explanations, and this could have been helpful. For example, showing visually why the best linear separator is still bad would have been good. Third, some of the writing is confusing. I wasn't sure what was meant by "augmenting" in "Now, I will try augmenting, that rewarding the model doesn't do us any good."

2

There are a couple of typos and grammar mistakes, but otherwise I thought this was a very good explanation! As someone who is familiar with optimisation techniques but not experienced with machine learning I enjoyed learning about how the techniques of the former can be used in the latter. You picked a good example of a real-life application which provides a solid motivation for the method, rather than developing it in the abstract which can be detrimental for learners :)

5.9

This was for the most part clear and well-written (one small exception was the paragraph about "rewarding", which I had to reread to figure out what was going on). However, it felt pretty unambitious. The algorithm itself was quite simple and there wasn't an attempt to compare the performance with more common binary classification methods.

4.8

The text hits a sweet spot of covering the important points while staying short and interesting to read. The reader is led nicely through the example of a simple linear regression, at the end of which it is made clear why one needs to use a more difficult model. Only the section "Calculating the Partial Derivatives" could be a bit more fleshed out, i.e. comparisons to the simple case from before, and the derivations could be a bit more detailed. On the other hand, I can understand if that gets no particular focus, since it is not a main part. Adding the code is very nice. It is also kept very clear, and nothing special is needed to run it (at least from my point of view).

7.9

I was puzzled by how you can have an attendance rate higher than 1.0, as shown in the graph. You do require yk <= 1 in the main text, but the final equation generated and its curve go outside this range. Is that a harmless quirk, or is producing a curve that permits nonsensical results sometimes troublesome? How should a programmer (or a conscientious teacher) guard against this? If the model had been punished for accommodating nonsensical results, how much would that have affected the final curve and the time taken to get it? This was the main question I had remaining after reading the article. Perfectly ok article - slight positive score bump given for novelty.

5.4
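The guard this comment asks for can be sketched in a few lines: clamp the model's output to the valid range before using it downstream. The coefficients below are made up for illustration, not the article's fitted values.

```python
import numpy as np

# A fitted polynomial can produce values outside [0, 1] even when the
# quantity it models (an attendance-like rate) cannot. Clipping the
# output is one simple guard against nonsensical results.
coeffs = [-1.5, 2.5, 0.1]           # assumed degree-2 fit (illustrative)
xs = np.linspace(0.0, 1.0, 5)
raw = np.polyval(coeffs, xs)        # exceeds 1.0 near the top of the range
clipped = np.clip(raw, 0.0, 1.0)    # hard bound on the model's output

print(raw)
print(clipped)
```

An alternative to clipping after the fact is to penalize out-of-range predictions in the loss during training, which would answer the reviewer's question about how the final curve and the fitting time are affected; that trade-off is not explored in the article.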

I liked that instead of going with a traditional logistic regression for the model, we opted for a linear regression and built up to a different understanding of cutting the hyperplane to derive a boundary. Toward the end, some of the math got in the way of clarity about the actual problem, and I would have liked more discussion of how to assess the goodness of fit of the model for classification, because that is an important concept which could pose a problem when trying to generalize this method.

3
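The "goodness of fit for classification" this comment asks about can be sketched briefly: for a classifier, accuracy and a confusion matrix are more informative than a regression residual. The labels follow the article's ±1 convention; the predictions below are made up for illustration.

```python
import numpy as np

# Hypothetical true labels and model predictions (fail = -1, pass = +1)
y_true = np.array([1, 1, 1, -1, -1, 1, -1, -1])
y_pred = np.array([1, 1, -1, -1, -1, 1, 1, -1])

# Fraction of students classified correctly
accuracy = np.mean(y_true == y_pred)

# Confusion-matrix entries: how the errors are distributed matters here,
# since missing a failing student costs more than a false alarm
tp = np.sum((y_true == 1) & (y_pred == 1))    # correctly predicted pass
fn = np.sum((y_true == 1) & (y_pred == -1))   # pass predicted as fail
fp = np.sum((y_true == -1) & (y_pred == 1))   # fail predicted as pass
tn = np.sum((y_true == -1) & (y_pred == -1))  # correctly predicted fail

print(accuracy)
print([[tp, fn], [fp, tn]])
```

Reporting these alongside the fitted curve would also make the comparison with standard binary classifiers, which an earlier comment requests, straightforward.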