(The URL has a confusing name: the post is actually about automatic differentiation.)
I am not sure I understand the implications of this. I know that many users of optimization libraries rely on finite differences (probably without even knowing it). This is almost always a bad idea: you lose roughly half of the precision in the gradient, even though solvers typically assume exact gradients, and on top of that it requires many extra function evaluations, one per dimension for a forward difference.
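To make the "half the precision" claim concrete, here is a minimal sketch (my own illustration, not from the linked paper) of a forward finite difference on f(x) = exp(x), using the usual near-optimal step h ≈ sqrt(machine epsilon):

```python
import math

def f(x):
    return math.exp(x)

x = 1.0
exact = math.exp(x)  # f'(x) = exp(x), so the exact derivative is known

# Forward difference with step h ~ sqrt(eps), the textbook sweet spot
# balancing truncation error (~h) against rounding error (~eps/h):
eps = 2.220446049250313e-16  # double-precision machine epsilon
h = math.sqrt(eps)
fd = (f(x + h) - f(x)) / h

rel_err = abs(fd - exact) / exact
# rel_err lands around 1e-8: of the ~16 significant digits available
# in double precision, only about half survive in the gradient.
print(rel_err)
```

An exact gradient (analytic or via automatic differentiation) would instead be accurate to nearly full machine precision, and for an n-dimensional gradient the forward-difference scheme also costs n extra function evaluations per gradient.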
Original paper:
https://arxiv.org/pdf/2105.15183.pdf