Isotonic Regression in Rust

Isotonic (monotonic) regression is the technique of fitting a piecewise-constant function to a sequence of observations such that the fitted values are monotonically non-decreasing.

Let $(x_1,y_1),\ldots ,(x_n,y_n)$ be a given set of observations, where the $y_i \in \mathbb{R}$.

The isotonic regression problem seeks a weighted least-squares fit ${\hat y}_i \approx y_i$ for all $i$, subject to the constraint that ${\hat y}_i \leq {\hat y}_j$ whenever $x_i \leq x_j$:

$$ \min \sum_{i=1}^{n} w_i ({\hat y_i}-y_i)^2 \text{ subject to } {\hat y_i} \leq {\hat y_j} {\text{ for all }} (i,j)\in E$$

where $E = \{(i,j) : x_i \leq x_j\}$ specifies the partial ordering of the observed inputs $x_i$.
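
For totally ordered inputs, this problem is solved exactly by the pool-adjacent-violators algorithm (PAVA): scan the observations in order and, whenever a new value violates monotonicity, pool it with the preceding block and replace both with their weighted mean. Below is a minimal Rust sketch, assuming the observations are already sorted by $x$; the function name and block representation are illustrative, not the post's implementation.

```rust
/// Weighted isotonic regression via pool-adjacent-violators (PAVA).
/// `y` and `w` must have equal length and be pre-sorted by x.
/// Returns the monotonically non-decreasing least-squares fit.
fn isotonic_regression(y: &[f64], w: &[f64]) -> Vec<f64> {
    // Each block stores (weighted mean, total weight, point count).
    let mut blocks: Vec<(f64, f64, usize)> = Vec::with_capacity(y.len());
    for (&yi, &wi) in y.iter().zip(w) {
        blocks.push((yi, wi, 1));
        // Pool backwards while the monotonicity constraint is violated.
        while blocks.len() > 1 && blocks[blocks.len() - 2].0 > blocks[blocks.len() - 1].0 {
            let (m2, w2, c2) = blocks.pop().unwrap();
            let (m1, w1, c1) = blocks.pop().unwrap();
            let w_sum = w1 + w2;
            blocks.push(((m1 * w1 + m2 * w2) / w_sum, w_sum, c1 + c2));
        }
    }
    // Expand each pooled block back into per-point fitted values.
    blocks
        .into_iter()
        .flat_map(|(mean, _, count)| std::iter::repeat(mean).take(count))
        .collect()
}

fn main() {
    let y = [1.0, 3.0, 2.0, 4.0, 3.5];
    let w = [1.0; 5];
    // The violations at indices 2 and 4 get pooled with their predecessors.
    println!("{:?}", isotonic_regression(&y, &w));
    // -> [1.0, 2.5, 2.5, 3.75, 3.75]
}
```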

Read More

Stochastic Gradient Descent in Data Science

Introduction

Stochastic gradient descent (SGD) is a popular stochastic optimization algorithm in the field of machine learning, especially for optimizing deep neural networks. At its core, this iterative algorithm combines two techniques: stochastic approximation and gradient descent.
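
In its standard form, each iteration samples a single observation $i$ and steps against the gradient of that term alone,

$$ \theta_{t+1} = \theta_t - \eta \, \nabla f_i(\theta_t), $$

where $\eta$ is the learning rate: the cheap per-sample gradient acts as a stochastic approximation of the full gradient.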

SGD is commonly used to optimize a wide range of models. Here we are interested in applying this optimization technique to standard data science tasks such as linear regression and clustering. In addition, we'll use differentiable programming techniques to simplify our SGD implementation and make it more versatile.
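
As a taste, here is a minimal Rust sketch of SGD for one-dimensional linear regression, with hand-derived gradients standing in for the differentiable-programming machinery; the function and parameter names are illustrative, not the post's implementation.

```rust
/// Fit y ≈ a*x + b by stochastic gradient descent: one parameter
/// update per observation, using that single point's squared-error
/// gradient as a stochastic approximation of the full gradient.
fn sgd_linear(data: &[(f64, f64)], lr: f64, epochs: usize) -> (f64, f64) {
    let (mut a, mut b) = (0.0, 0.0);
    for _ in 0..epochs {
        for &(x, y) in data {
            let err = a * x + b - y; // residual of the current model
            a -= lr * 2.0 * err * x; // d/da (err^2) = 2*err*x
            b -= lr * 2.0 * err;     // d/db (err^2) = 2*err
        }
    }
    (a, b)
}

fn main() {
    // Points lying near y = 2x + 1.
    let data = [(0.0, 1.0), (1.0, 3.1), (2.0, 4.9), (3.0, 7.2)];
    let (a, b) = sgd_linear(&data, 0.05, 500);
    println!("a ≈ {a:.2}, b ≈ {b:.2}"); // expect roughly a ≈ 2, b ≈ 1
}
```

In a differentiable-programming setting, the two hand-written gradient lines would be derived automatically from the loss expression, which is what makes the implementation easy to generalize beyond linear regression.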

Read More