Loss functions
GradWeightedLoss(batch, unweighted_loss_function, optimizer, metric_weights, fixed_metric, alpha=0.9)
Loss function with gradient weighting for PINN training.
Implements a gradient-based learning rate annealing algorithm for balancing the loss terms during PINN training.
PARAMETER | DESCRIPTION |
---|---|
`batch` | Batch of data. |
`unweighted_loss_function` | Loss function applied before weighting. |
`optimizer` | A torch or nevergrad optimizer, for gradient-based or gradient-free optimization respectively. |
`metric_weights` | Initial metric weights. |
`fixed_metric` | Metric whose weight is not updated and whose gradient determines the weights of the other metrics. |
`alpha` | Scaling factor. Corresponds to the inertia of the weights to updates. Defaults to 0.9. |
Source code in perceptrain/loss/loss.py
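The weight update behind this scheme is easy to sketch outside the library. The snippet below is a minimal, standalone illustration of a gradient-based weight update of this kind; it is not perceptrain's implementation, and the metric names `"pde"` and `"bc"` are hypothetical. The gradient of the fixed metric sets the scale, and the remaining weights are updated with inertia `alpha`:

```python
import torch
from torch import nn


def update_metric_weights(losses, fixed_metric, weights, params, alpha=0.9):
    """Gradient-based weight update for a multi-term loss (illustrative sketch).

    `losses` maps metric names to scalar loss tensors that depend on `params`;
    `weights` maps the same names to floats. Every weight except that of
    `fixed_metric` is pulled towards max|grad(fixed)| / mean|grad(metric)|,
    damped by `alpha` (larger alpha means more inertia).
    """
    grads = {}
    for name, loss in losses.items():
        g = torch.autograd.grad(loss, params, retain_graph=True, allow_unused=True)
        grads[name] = torch.cat([gi.flatten() for gi in g if gi is not None])

    max_fixed = grads[fixed_metric].abs().max()
    new_weights = dict(weights)
    for name, w in weights.items():
        if name == fixed_metric:
            continue
        target = max_fixed / (grads[name].abs().mean() + 1e-12)
        new_weights[name] = alpha * w + (1.0 - alpha) * target.item()
    return new_weights


# Toy usage with hypothetical metric names "pde" and "bc".
model = nn.Linear(1, 1)
x = torch.linspace(0.0, 1.0, 16).unsqueeze(-1)
losses = {"pde": (model(x) ** 2).mean(), "bc": (model(x[:1]) - 1.0).pow(2).mean()}
weights = update_metric_weights(
    losses,
    fixed_metric="pde",
    weights={"pde": 1.0, "bc": 1.0},
    params=list(model.parameters()),
)
```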
cross_entropy_loss(batch, model)
Cross Entropy Loss.
PARAMETER | DESCRIPTION |
---|---|
`batch` | The input batch. |
`model` | The model to compute the loss for. |
RETURNS | DESCRIPTION |
---|---|
`tuple[Tensor, dict[str, Tensor]]` | A tuple `(loss, metrics)`, where `loss` is the computed loss value and `metrics` is an empty dictionary (not relevant for this loss function). |
Source code in perceptrain/loss/loss.py
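A minimal usage sketch, assuming the function is importable from `perceptrain.loss.loss` (per the source-code link above) and that `batch` is an `(inputs, targets)` pair of tensors:

```python
import torch
from torch import nn
from perceptrain.loss.loss import cross_entropy_loss  # assumed import path

model = nn.Linear(4, 3)                                  # toy 3-class classifier
batch = (torch.randn(8, 4), torch.randint(0, 3, (8,)))   # assumed (inputs, targets) layout

loss, metrics = cross_entropy_loss(batch, model)
loss.backward()   # metrics is an empty dict for this loss function
```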
get_loss(loss_fn)
Returns the appropriate loss function based on the input argument.
PARAMETER | DESCRIPTION |
---|---|
`loss_fn` | The loss function to use, given either as a string identifier or as a custom callable. |

RETURNS | DESCRIPTION |
---|---|
`Callable` | The corresponding loss function. |

RAISES | DESCRIPTION |
---|---|
`ValueError` | If `loss_fn` does not correspond to a supported loss function. |
Source code in perceptrain/loss/loss.py
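A short sketch of how this dispatch might be used, assuming the same import path and that string identifiers such as `"mse"` select the built-in losses (the exact identifiers are an assumption):

```python
from perceptrain.loss.loss import get_loss, mse_loss  # assumed import path

loss_fn = get_loss("mse")     # assumed string identifier for the built-in MSE loss
custom = get_loss(mse_loss)   # a callable is assumed to be returned unchanged
```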
mse_loss(batch, model)
Mean Squared Error Loss.
PARAMETER | DESCRIPTION |
---|---|
`batch` | The input batch. |
`model` | The model to compute the loss for. |
RETURNS | DESCRIPTION |
---|---|
`tuple[Tensor, dict[str, Tensor]]` | A tuple `(loss, metrics)`, where `loss` is the computed loss value and `metrics` is a dictionary of metrics (the individual loss components). |
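A usage sketch mirroring the one for `cross_entropy_loss`, with the same assumptions about the import path and batch layout:

```python
import torch
from torch import nn
from perceptrain.loss.loss import mse_loss  # assumed import path

model = nn.Linear(2, 1)                              # toy regressor
batch = (torch.randn(16, 2), torch.randn(16, 1))     # assumed (inputs, targets) layout

loss, metrics = mse_loss(batch, model)               # metrics holds the loss components
loss.backward()
```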