machine_learning.logistic_regression¶
Implements logistic regression for a binary classification problem. Helpful resources: the Coursera Machine Learning course and https://medium.com/@martinpella/logistic-regression-from-scratch-in-python-124c5636b8ac
Attributes¶
- iris

Functions¶
- cost_function(h, y): Cost function quantifies the error between predicted and expected values.
- log_likelihood(x, y, weights)
- logistic_reg(alpha, x, y, max_iterations=70000)
- sigmoid_function(z): Also known as Logistic Function.
Module Contents¶
- machine_learning.logistic_regression.cost_function(h: numpy.ndarray, y: numpy.ndarray) → float¶
The cost function quantifies the error between predicted and expected values. The cost function used in logistic regression is called Log Loss or the Cross-Entropy function.
J(θ) = (1/m) * Σ [ -y * log(hθ(x)) - (1 - y) * log(1 - hθ(x)) ]
Where:
- J(θ) is the cost that we want to minimize during training
- m is the number of training examples
- Σ represents the summation over all training examples
- y is the actual binary label (0 or 1) for a given example
- hθ(x) is the predicted probability that x belongs to the positive class
@param h: the output of the sigmoid function. It is the estimated probability that the input example ‘x’ belongs to the positive class
@param y: the actual binary label associated with input example ‘x’
Examples:
>>> estimations = sigmoid_function(np.array([0.3, -4.3, 8.1]))
>>> cost_function(h=estimations, y=np.array([1, 0, 1]))
0.18937868932131605
>>> estimations = sigmoid_function(np.array([4, 3, 1]))
>>> cost_function(h=estimations, y=np.array([1, 0, 0]))
1.459999655669926
>>> estimations = sigmoid_function(np.array([4, -3, -1]))
>>> cost_function(h=estimations, y=np.array([1, 0, 0]))
0.1266663223365915
>>> estimations = sigmoid_function(0)
>>> cost_function(h=estimations, y=np.array([1]))
0.6931471805599453
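A minimal NumPy sketch consistent with the formula and doctests above (the module's actual source may differ in its details); it averages the per-example log loss:

    import numpy as np

    def cost_function(h: np.ndarray, y: np.ndarray) -> float:
        # Mean log loss over the m training examples:
        # J(θ) = (1/m) * Σ [ -y * log(h) - (1 - y) * log(1 - h) ]
        return float(np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h)))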
- machine_learning.logistic_regression.log_likelihood(x, y, weights)¶
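No docstring is given for log_likelihood; a common formulation matching this signature (an assumption, not necessarily this module's exact code) sums the Bernoulli log-likelihood of the labels under the linear scores x·weights:

    import numpy as np

    def log_likelihood(x, y, weights):
        # With scores s = x @ weights, the log-likelihood is
        # Σ [ y * s - log(1 + e^s) ]
        scores = np.dot(x, weights)
        return np.sum(y * scores - np.log(1 + np.exp(scores)))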
- machine_learning.logistic_regression.logistic_reg(alpha, x, y, max_iterations=70000)¶
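The alpha and max_iterations parameters suggest batch gradient descent with a fixed learning rate; a hedged sketch of such a training loop (the real implementation may differ, e.g. by adding a convergence check):

    import numpy as np

    def logistic_reg(alpha, x, y, max_iterations=70000):
        # Start from zero weights and take max_iterations gradient
        # steps on the mean log loss; alpha is the learning rate.
        theta = np.zeros(x.shape[1])
        for _ in range(max_iterations):
            h = 1 / (1 + np.exp(-np.dot(x, theta)))  # sigmoid_function(x·θ)
            gradient = np.dot(x.T, h - y) / y.size   # ∂J(θ)/∂θ of the log loss
            theta = theta - alpha * gradient
        return theta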
- machine_learning.logistic_regression.sigmoid_function(z: float | numpy.ndarray) → float | numpy.ndarray¶
Also known as the Logistic Function.
f(x) = 1 / (1 + e⁻ˣ)
The sigmoid function approaches 1 as its input ‘x’ becomes increasingly positive, and approaches 0 as it becomes increasingly negative.
Reference: https://en.wikipedia.org/wiki/Sigmoid_function
@param z: input to the function
@returns: value in the range 0 to 1
Examples:
>>> float(sigmoid_function(4))
0.9820137900379085
>>> sigmoid_function(np.array([-3, 3]))
array([0.04742587, 0.95257413])
>>> sigmoid_function(np.array([-3, 3, 1]))
array([0.04742587, 0.95257413, 0.73105858])
>>> sigmoid_function(np.array([-0.01, -2, -1.9]))
array([0.49750002, 0.11920292, 0.13010847])
>>> sigmoid_function(np.array([-1.3, 5.3, 12]))
array([0.21416502, 0.9950332 , 0.99999386])
>>> sigmoid_function(np.array([0.01, 0.02, 4.1]))
array([0.50249998, 0.50499983, 0.9836975 ])
>>> sigmoid_function(np.array([0.8]))
array([0.68997448])
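A direct NumPy translation of the formula above (a sketch only; NumPy may emit overflow warnings for large-magnitude negative inputs, which numerically stable variants avoid):

    import numpy as np

    def sigmoid_function(z: float | np.ndarray) -> float | np.ndarray:
        # f(z) = 1 / (1 + e^(-z)), mapping any real input into (0, 1)
        return 1 / (1 + np.exp(-z))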
- machine_learning.logistic_regression.iris¶