squlearn.qnn.get_lr_decay
- squlearn.qnn.get_lr_decay(lr_start: float, lr_end: float, iter_decay: float, iter_plateau: int = 0)
Creates a learning-rate schedule with a decay from a start to a final value.
The returned function can be passed as the learning rate of the Adam optimizer.
- Parameters:
lr_start (float) – start value of the learning rate
lr_end (float) – final value of the learning rate
iter_decay (float) – decay rate that controls how quickly the learning rate falls from lr_start to lr_end
iter_plateau (int) – length of the plateau of the start value (default: 0)
- Returns:
A function that takes the iteration number as input and returns the adjusted learning rate
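The exact decay formula used by sQUlearn is not stated above; the following is a minimal sketch of how such a schedule could behave, assuming an exponential decay from lr_start toward lr_end after an initial plateau. The function name and signature mirror the documented interface, but the decay form itself is an illustrative assumption.

```python
import math


def get_lr_decay(lr_start: float, lr_end: float,
                 iter_decay: float, iter_plateau: int = 0):
    """Return a schedule f(iteration) -> learning rate (illustrative sketch)."""

    def schedule(iteration: int) -> float:
        # Hold the start value during the initial plateau.
        if iteration < iter_plateau:
            return lr_start
        # Exponential decay toward lr_end (assumed functional form).
        t = iteration - iter_plateau
        return lr_end + (lr_start - lr_end) * math.exp(-t / iter_decay)

    return schedule


# Usage: pass the schedule as the learning rate of an iterative optimizer.
lr = get_lr_decay(lr_start=0.1, lr_end=0.01, iter_decay=50, iter_plateau=10)
print(lr(0))     # 0.1 (still on the plateau)
print(lr(1000))  # close to lr_end
```

A schedule of this shape keeps early updates large for fast initial progress and shrinks them later so the optimization settles near a minimum instead of oscillating around it.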