Use tf.keras.optimizers.schedules.CosineDecay or CosineDecayRestarts for learning rate scheduling in Keras.
Here is a code snippet you can refer to. It is a minimal, self-contained sketch; the hyperparameter values and the toy model are illustrative assumptions, not recommendations:
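```python
import tensorflow as tf

# Cosine decay: anneal the learning rate from 1e-3 down to
# alpha * 1e-3 over 10,000 steps. All values here are illustrative.
cosine_schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=1e-3,
    decay_steps=10_000,
    alpha=0.1,  # floor: final LR = alpha * initial_learning_rate
)

# Variant with warm restarts: the cosine cycle repeats, each cycle
# t_mul times longer and starting at m_mul times the previous peak.
restart_schedule = tf.keras.optimizers.schedules.CosineDecayRestarts(
    initial_learning_rate=1e-3,
    first_decay_steps=1_000,
    t_mul=2.0,
    m_mul=0.9,
    alpha=0.1,
)

# A schedule object is passed wherever an optimizer expects a learning rate.
optimizer = tf.keras.optimizers.Adam(learning_rate=cosine_schedule)

# Toy model just to show the schedule wired into training.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10),
])
model.compile(
    optimizer=optimizer,
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```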

In the above code, we rely on the following points:
- Cosine Decay: Smoothly reduces the learning rate along a cosine curve, from the initial value down to a floor (see the quick check after this list).
- Avoids Premature Convergence: The slow annealing toward the end of training keeps the optimizer from settling too early, which can also help generalization.
- Hyperparameter Flexibility: Supports setting the initial learning rate, the number of decay steps, and a minimum learning rate via alpha (the final LR as a fraction of the initial one).
- Compatible with Adam/SGD: A schedule object can be passed as the learning_rate argument of any built-in Keras optimizer.
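As a quick check that the curve behaves as described (reusing the same illustrative values as above), a Keras schedule object is callable on a step number and returns the learning rate at that step:

```python
import tensorflow as tf

schedule = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=1e-3, decay_steps=10_000, alpha=0.1
)

# A schedule is callable: schedule(step) returns the LR at that step.
for step in (0, 2_500, 5_000, 7_500, 10_000):
    print(f"step {step:>6}: lr = {float(schedule(step)):.6f}")
```

The printed values start at 1e-3 and settle at 1e-4 (alpha * initial_learning_rate), tracing half a cosine period in between.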
Hence, CosineDecay in Keras efficiently schedules the learning rate to improve convergence and generalization in deep learning models.