The distance of a point from a line is defined by

D = |ax1 + by1 + c|/√(a² + b²)

where the line is ax + by + c = 0 and the point is P(x1, y1).
Observe that the denominator contains a square root. Remember that every positive number has two square roots, one positive and one negative, because taking a root is the opposite of squaring, and a square is the same no matter what sign its base has.
So you need to consider that distance is a magnitude that is always positive: it is a scalar magnitude, which carries no sign.
Therefore, you determine the sign of the radical from the sign of the numerator: if the numerator is negative, you take the negative root in the denominator so that the quotient, the distance, comes out positive.
However, notice that the distance formula takes the absolute value of the numerator, which means the numerator is always positive, so you only ever need the positive square root. In short, we choose the sign of the denominator to match the sign of the numerator.
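As a quick sketch of the idea above (the function name point_line_distance is our own, not from the source), the absolute value in the numerator makes the sign choice automatic, so the principal (positive) square root always suffices:

```python
import math

def point_line_distance(a, b, c, x1, y1):
    """Distance from the point (x1, y1) to the line ax + by + c = 0."""
    # abs() in the numerator plays the role of the sign choice:
    # the result is always a non-negative scalar magnitude.
    return abs(a * x1 + b * y1 + c) / math.sqrt(a**2 + b**2)

# Example: distance from (2, 3) to the line 3x + 4y - 5 = 0
# |3*2 + 4*3 - 5| / √(9 + 16) = 13/5 = 2.6
print(point_line_distance(3, 4, -5, 2, 3))
```

Note that the denominator computed by math.sqrt is always the positive root, so no explicit sign selection is needed here.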
Suppose the line has an equation of the form ax + by + c = 0. If we know the coordinates (x1, y1) of the point, the distance from the point to the line is given by:
D = |ax1 + by1 + c|/√(a² + b²)
The radical in the denominator, √(a² + b²), can be taken with either a positive or a negative sign. This is because the squares of a value and its negative are the same: a² = (−a)².
Now, we choose the sign of the radical in the denominator from the sign of the numerator. Since a distance only makes sense as a positive quantity, if the numerator is negative we choose the negative sign for the denominator, and if the numerator is positive we choose the positive sign.
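To make the sign rule concrete, here is a minimal sketch (the name distance_with_sign_choice is ours) that applies it explicitly, choosing the root's sign to match the numerator instead of using an absolute value:

```python
import math

def distance_with_sign_choice(a, b, c, x1, y1):
    """Distance from (x1, y1) to ax + by + c = 0, choosing the sign
    of the radical in the denominator to match the numerator."""
    numerator = a * x1 + b * y1 + c
    root = math.sqrt(a**2 + b**2)  # principal (positive) root
    if numerator < 0:
        root = -root               # take the negative root instead
    return numerator / root        # quotient is always non-negative

# For the point (0, 0) and the line 3x + 4y - 5 = 0, the numerator
# is -5, so we take the negative root -5, giving distance 1.
print(distance_with_sign_choice(3, 4, -5, 0, 0))
```

Both conventions give the same answer; dividing by a root whose sign matches the numerator is just another way of writing the absolute value in the standard formula.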