machine_learning.automatic_differentiation¶
Demonstration of automatic differentiation (reverse mode).
Reference: https://en.wikipedia.org/wiki/Automatic_differentiation
Author: Poojan Smart
Email: smrtpoojan@gmail.com
Classes¶
GradientTracker
    Class that contains methods to compute partial derivatives of Variable objects based on the computation graph.

OpType
    Class that enumerates the operations on Variable objects supported for gradient calculation.

Operation
    Class that represents an operation on one or two Variable objects.

Variable
    Class that represents an n-dimensional object wrapping a numpy array on which operations are performed and for which gradients are calculated.
Module Contents¶
- class machine_learning.automatic_differentiation.GradientTracker¶
Class that contains methods to compute partial derivatives of Variable objects based on the computation graph.
Examples:

>>> with GradientTracker() as tracker:
...     a = Variable([2.0, 5.0])
...     b = Variable([1.0, 2.0])
...     m = Variable([1.0, 2.0])
...     c = a + b
...     d = a * b
...     e = c / d
>>> tracker.gradient(e, a)
array([-0.25, -0.04])
>>> tracker.gradient(e, b)
array([-1.  , -0.25])
>>> tracker.gradient(e, m) is None
True

>>> with GradientTracker() as tracker:
...     a = Variable([[2.0, 5.0]])
...     b = Variable([[1.0], [2.0]])
...     c = a @ b
>>> tracker.gradient(c, a)
array([[1., 2.]])
>>> tracker.gradient(c, b)
array([[2.],
       [5.]])

>>> with GradientTracker() as tracker:
...     a = Variable([[2.0, 5.0]])
...     b = a ** 3
>>> tracker.gradient(b, a)
array([[12., 75.]])
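As a standalone script, the first example above looks like the following minimal sketch. The import path assumes the module is importable as machine_learning.automatic_differentiation; gradients are requested after the tracking context is left, exactly as in the doctests.

from machine_learning.automatic_differentiation import GradientTracker, Variable

with GradientTracker() as tracker:
    a = Variable([2.0, 5.0])
    b = Variable([1.0, 2.0])
    c = a + b      # addition is recorded in the computation graph
    d = a * b      # so is multiplication
    e = c / d      # and division

# The graph recorded while tracking was enabled is traversed on demand.
print(tracker.gradient(e, a))   # array([-0.25, -0.04])
print(tracker.gradient(e, b))   # array([-1.  , -0.25])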
- __enter__() → typing_extensions.Self¶
- __exit__(exc_type: type[BaseException] | None, exc: BaseException | None, traceback: types.TracebackType | None) → None¶
- append(op_type: OpType, params: list[Variable], output: Variable, other_params: dict | None = None) → None¶
Adds an Operation object to the related Variable objects, building the computational graph used for calculating gradients.
- Args:
    op_type: Operation type.
    params: Input parameters to the operation.
    output: Output variable of the operation.
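For illustration, the registration that append performs can also be done by hand. This is a hedged sketch, assuming that Variable's overloaded operators call append internally while tracking is enabled; the output value below is computed with plain numpy arithmetic and is not the module's actual code path.

from machine_learning.automatic_differentiation import GradientTracker, OpType, Variable

with GradientTracker() as tracker:
    a = Variable([2.0, 5.0])
    b = Variable([1.0, 2.0])
    # Compute the result outside the graph, then register an ADD node by hand,
    # mirroring what the overloaded operators are expected to do automatically.
    out = Variable(a.to_ndarray() + b.to_ndarray())
    tracker.append(OpType.ADD, params=[a, b], output=out)

print(tracker.gradient(out, a))   # expected: array([1., 1.])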
- derivative(param: Variable, operation: Operation) → numpy.ndarray¶
Computes the derivative for the given operation/function.
- Args:
    param: Variable to be differentiated.
    operation: Function performed on the input variable.
- Returns:
    Derivative of the operation's output with respect to the input variable.
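For a graph containing a single operation, the overall gradient reduces to that operation's local derivative, so the effect of derivative can be observed through the public gradient call alone. A small sketch, using only the API shown in the class examples above:

from machine_learning.automatic_differentiation import GradientTracker, Variable

with GradientTracker() as tracker:
    a = Variable([2.0, 5.0])
    b = Variable([1.0, 2.0])
    d = a * b                     # a single MUL node in the graph

# d(a*b)/da is b and d(a*b)/db is a, element-wise.
print(tracker.gradient(d, a))     # expected: array([1., 2.])
print(tracker.gradient(d, b))     # expected: array([2., 5.])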
- gradient(target: Variable, source: Variable) → numpy.ndarray | None¶
Reverse accumulation of partial derivatives to calculate the gradient of the target variable with respect to the source variable.
- Args:
    target: Target variable for which gradients are calculated.
    source: Source variable with respect to which the gradients are calculated.
- Returns:
    Gradient of the target variable with respect to the source variable, or None if the source is not part of the computation graph.
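Reverse accumulation multiplies the local derivatives along every path from the source to the target (the chain rule). A short worked sketch, again using only the public API: for y = (a * b) ** 2 we expect dy/da = 2 * (a * b) * b and dy/db = 2 * (a * b) * a.

from machine_learning.automatic_differentiation import GradientTracker, Variable

with GradientTracker() as tracker:
    a = Variable([2.0, 5.0])
    b = Variable([1.0, 2.0])
    y = (a * b) ** 2              # a MUL node followed by a POWER node

print(tracker.gradient(y, a))     # expected: array([ 4., 40.])
print(tracker.gradient(y, b))     # expected: array([  8., 100.])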
- enabled = False¶
- instance = None¶
- class machine_learning.automatic_differentiation.OpType(*args, **kwds)¶
Bases: enum.Enum
Class that enumerates the operations on Variable objects supported for gradient calculation.
- ADD = 0¶
- DIV = 3¶
- MATMUL = 4¶
- MUL = 2¶
- NOOP = 6¶
- POWER = 5¶
- SUB = 1¶
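The members simply tag Operation nodes with the kind of computation that produced them; the integer values match the listing above. A minimal sketch (the remark about NOOP is an assumption based on the name, not on the module's source):

from machine_learning.automatic_differentiation import OpType

for op in sorted(OpType, key=lambda member: member.value):
    print(op.name, "=", op.value)
# ADD = 0, SUB = 1, MUL = 2, DIV = 3, MATMUL = 4, POWER = 5, NOOP = 6
# NOOP presumably marks a Variable created directly rather than produced by an
# arithmetic operation (an assumption based on the name).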
- class machine_learning.automatic_differentiation.Operation(op_type: OpType, other_params: dict | None = None)¶
Class that represents an operation on one or two Variable objects. An Operation object contains the operation type, pointers to the input Variable objects, and a pointer to the Variable resulting from the operation.
- __eq__(value) → bool¶
- op_type¶
- other_params¶
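A hedged sketch of constructing an Operation node directly and reading back its attributes; in normal use GradientTracker.append creates these nodes, and the dictionary key used for other_params below is purely illustrative, not taken from the module.

from machine_learning.automatic_differentiation import Operation, OpType

op = Operation(OpType.POWER, other_params={"exponent": 3})   # key name is hypothetical
print(op.op_type)        # OpType.POWER
print(op.other_params)   # {'exponent': 3}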
- class machine_learning.automatic_differentiation.Variable(value: Any)¶
Class that represents an n-dimensional object wrapping a numpy array on which operations are performed and for which gradients are calculated.
Examples:

>>> Variable(5.0)
Variable(5.0)
>>> Variable([5.0, 2.9])
Variable([5.  2.9])
>>> Variable([5.0, 2.9]) + Variable([1.0, 5.5])
Variable([6.  8.4])
>>> Variable([[8.0, 10.0]])
Variable([[ 8. 10.]])
- __repr__() → str¶
- to_ndarray() → numpy.ndarray¶
- value¶
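A short sketch of wrapping data and getting the underlying array back. The shape check is ordinary numpy behaviour; the only assumption here is that to_ndarray returns the wrapped array.

import numpy as np
from machine_learning.automatic_differentiation import Variable

v = Variable([[8.0, 10.0]])          # wraps the nested list as an ndarray
arr = v.to_ndarray()                 # the wrapped numpy.ndarray
print(isinstance(arr, np.ndarray))   # True
print(arr.shape)                     # (1, 2)
print(repr(v))                       # Variable([[ 8. 10.]]), as in the examples above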