Machine learning and other gibberish
See also: https://sharing.leima.is
Archives: https://datumorphism.leima.is/amneumarkt/
#machinelearning
A nice colloquium paper:
The unreasonable effectiveness of deep learning in artificial intelligence | PNAS
https://www.pnas.org/content/117/48/30033
https://github.com/volotat/DiffMorph
#machinelearning #opensource
Differentiable Morphing
> Image morphing without reference points by applying warp maps and optimizing over them.
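The underlying recipe is compact enough to sketch: parameterize a dense displacement (warp) field, warp the source image with differentiable bilinear interpolation, and run gradient descent on the field so the warped source matches the target. A minimal toy version in JAX follows; this is not the repo's code, and the smoothness weight and step size are arbitrary choices for illustration.

```python
# Toy differentiable morphing: optimize a dense warp field so that
# warping `src` reproduces `dst`. Not the DiffMorph code; just the idea.
import jax
import jax.numpy as jnp
from jax.scipy.ndimage import map_coordinates

def warp(image, flow):
    # image: (H, W); flow: (2, H, W) per-pixel displacements.
    H, W = image.shape
    ys, xs = jnp.meshgrid(jnp.arange(H), jnp.arange(W), indexing="ij")
    # Bilinear interpolation (order=1) keeps the warp differentiable in flow.
    return map_coordinates(image, [ys + flow[0], xs + flow[1]], order=1)

def loss(flow, src, dst, smooth=0.1):
    recon = jnp.mean((warp(src, flow) - dst) ** 2)
    # Penalize rough warp fields so neighboring pixels move together.
    reg = jnp.mean(jnp.diff(flow, axis=1) ** 2) + jnp.mean(jnp.diff(flow, axis=2) ** 2)
    return recon + smooth * reg

# Toy data: a bright square that should slide down and to the right.
src = jnp.zeros((32, 32)).at[8:16, 8:16].set(1.0)
dst = jnp.zeros((32, 32)).at[12:20, 12:20].set(1.0)

flow = jnp.zeros((2, 32, 32))
grad_fn = jax.jit(jax.grad(loss))
for _ in range(300):  # plain gradient descent on the warp field
    flow = flow - 1.0 * grad_fn(flow, src, dst)
```

Intermediate morph frames then come roughly for free by scaling the optimized field, e.g. warping with `t * flow` for `t` between 0 and 1.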
#machinelearning
https://arxiv.org/abs/2007.04504
Learning Differential Equations that are Easy to Solve
Jacob Kelly, Jesse Bettencourt, Matthew James Johnson, David Duvenaud
Differential equations parameterized by neural networks become expensive to solve numerically as training progresses. We propose a remedy that encourages learned dynamics to be easier to solve. Specifically, we introduce a differentiable surrogate for the time cost of standard numerical solvers, using higher-order derivatives of solution trajectories. These derivatives are efficient to compute with Taylor-mode automatic differentiation. Optimizing this additional objective trades model performance against the time cost of solving the learned dynamics. We demonstrate our approach by training substantially faster, while nearly as accurate, models in supervised classification, density estimation, and time-series modelling tasks.
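The regularizer itself fits in a few lines. Below is a condensed sketch, adapted from the recursion in the authors' released code (https://github.com/jacobjinkelly/easy-neural-ode) but simplified here to autonomous dynamics; the toy f, the order, and the weight are placeholders.

```python
# Taylor-mode AD via jax.experimental.jet: recursively obtain the
# higher-order Taylor coefficients of the ODE solution z(t) at z0,
# then penalize the magnitude of the highest one.
import jax.numpy as jnp
from jax.experimental.jet import jet

def f(z):
    # Toy autonomous dynamics standing in for the neural network.
    return jnp.tanh(z)

def trajectory_coefficients(f, z0, order=4):
    """Taylor coefficients of the solution of dz/dt = f(z) around z0."""
    (y0, [y1h]) = jet(f, (z0,), ((jnp.ones_like(z0),),))
    coeffs = (y0, y1h)
    for _ in range(order - 2):
        # Each jet call through f extends the series by one order.
        (y0, ys) = jet(f, (z0,), (coeffs,))
        coeffs = (y0, *ys)
    return coeffs

z0 = jnp.array([0.1, -0.3])
coeffs = trajectory_coefficients(f, z0)
# The extra objective: squared size of the highest-order coefficient,
# added to the task loss with some weight (a hyperparameter).
reg = jnp.mean(coeffs[-1] ** 2)
```

Adaptive solvers choose step sizes that shrink as the higher derivatives of the trajectory grow, which is why keeping those derivatives small makes the learned dynamics cheaper to integrate.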