Given a line segment from point \(\mathbf{A}\) to point \(\mathbf{B}\), what is the shortest distance to a point \(\mathbf{P}\)?
If the orthogonal projection of \(\mathbf{P}\) onto the line falls within the segment, the shortest distance is the length from \(\mathbf{P}\) to that projected point; everywhere else it is the distance from \(\mathbf{P}\) to the nearer endpoint \(\mathbf{A}\) or \(\mathbf{B}\). Now the scalar projection \(t\) of vector \(\mathbf{a} = \mathbf{AP}\) onto \(\mathbf{b} = \mathbf{AB}\) is
\[t = |\mathbf{a}|\cos\theta = |\mathbf{a}|\underbrace{\frac{\mathbf{a}\cdot\mathbf{b}}{|\mathbf{a}|\cdot|\mathbf{b}|}}_{\cos\theta}=\frac{\mathbf{a}\cdot\mathbf{b}}{|\mathbf{b}|}\]
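For example, with the illustrative points \(\mathbf{A} = (0, 0)\), \(\mathbf{B} = (4, 0)\) and \(\mathbf{P} = (1, 2)\), we get \(\mathbf{a} = (1, 2)\), \(\mathbf{b} = (4, 0)\) and
\[t = \frac{1\cdot 4 + 2\cdot 0}{4} = 1\]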
Dividing the projected length \(t\) by the segment length \(|\mathbf{b}|\) gives a normalized parameter \(\hat{t}\) along the line \(\mathbf{AB}\), which lies between 0 and 1 exactly when the projection falls inside the segment:
\[\hat{t} = \frac{t}{|\mathbf{b}|} = \frac{\mathbf{a}\cdot\mathbf{b}}{|\mathbf{b}|\cdot |\mathbf{b}|} =\frac{\mathbf{a}\cdot\mathbf{b}}{|\mathbf{b}|^2}=\frac{\mathbf{a}\cdot\mathbf{b}}{\mathbf{b}\cdot\mathbf{b}}\]
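Continuing the example, \(\mathbf{a}\cdot\mathbf{b} = 4\) and \(\mathbf{b}\cdot\mathbf{b} = 16\), so
\[\hat{t} = \frac{4}{16} = 0.25\]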
Now \(\mathbf{R} = \mathbf{A} + \hat{t}(\mathbf{B} - \mathbf{A}) = \mathbf{A} + \hat{t}\mathbf{b}\) is the closest point on the line; clamping \(\hat{t}\) to the interval \([0, 1]\) restricts \(\mathbf{R}\) to the segment, and the final result is \(|\mathbf{P} - \mathbf{R}|\).
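In the example, \(\hat{t} = 0.25\) already lies in \([0, 1]\), so
\[\mathbf{R} = (0, 0) + 0.25\cdot(4, 0) = (1, 0),\qquad |\mathbf{P} - \mathbf{R}| = |(0, 2)| = 2\]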
Implemented in JavaScript, this gives:
function minLineDist(A, B, P) {
  // Vector a = AP from the segment start to the query point
  const a = {
    x: P.x - A.x,
    y: P.y - A.y
  };
  // Vector b = AB along the segment
  const b = {
    x: B.x - A.x,
    y: B.y - A.y
  };
  // Squared segment length b·b
  const bb = b.x * b.x + b.y * b.y;
  // Degenerate segment: A and B coincide, so return |AP|
  if (0 === bb) {
    return Math.hypot(a.x, a.y);
  }
  // Dot product a·b
  const ab = a.x * b.x + a.y * b.y;
  // Normalized projection t̂, clamped to [0, 1]
  const t = Math.max(0, Math.min(1, ab / bb));
  // Closest point R = A + t̂·b on the segment
  const R = {
    x: A.x + t * b.x,
    y: A.y + t * b.y
  };
  return Math.hypot(P.x - R.x, P.y - R.y);
}
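A quick check with the worked example from above, plus a point past endpoint \(\mathbf{B}\) to exercise the clamping (both inputs are illustrative values):

console.log(minLineDist({x: 0, y: 0}, {x: 4, y: 0}, {x: 1, y: 2})); // 2, projection at R = (1, 0)
console.log(minLineDist({x: 0, y: 0}, {x: 4, y: 0}, {x: 6, y: 0})); // 2, t̂ clamped to 1, so R = B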