Machine learning and other gibberish
See also: https://sharing.leima.is
Archives: https://datumorphism.leima.is/amneumarkt/
#dl

Park, Chanwook, Sourav Saha, Jiachen Guo, Hantao Zhang, Xiaoyu Xie, Miguel A. Bessa, Dong Qian, et al. 2025. “Unifying Machine Learning and Interpolation Theory via Interpolating Neural Networks.” Nature Communications 16 (1): 1–12.
https://www.nature.com/articles/s41467-025-63790-8
#dl

Sure, nobody uses it anyway.
#dl

There is a new library called SCALE. It lets you compile CUDA code to run on AMD GPUs.

https://docs.scale-lang.com/manual/how-to-use/


I don't know who is more pissed off, NVIDIA or AMD.
#dl

Google & USC benchmarked a prompt-based forecasting method, and the results are impressive.

Cao D, Jia F, Arik SO, Pfister T, Zheng Y, Ye W, et al. TEMPO: Prompt-based Generative Pre-trained Transformer for time series forecasting. arXiv [cs.LG]. 2023. Available: http://arxiv.org/abs/2310.04948
#dl
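The core idea in TEMPO is to decompose each time series into trend, seasonal, and residual components before feeding them, along with soft prompts, to a pretrained transformer. A toy sketch of the decomposition step only, using a plain moving-average trend (the paper uses an STL-style decomposition; this is not the authors' code):

```python
import numpy as np

def decompose(series: np.ndarray, window: int = 4):
    """Split a 1-D series into a moving-average trend and a residual.

    Toy stand-in for the decomposition TEMPO applies before
    prompting a pretrained transformer.
    """
    kernel = np.ones(window) / window
    # 'same'-mode convolution yields a trend estimate at every step
    trend = np.convolve(series, kernel, mode="same")
    residual = series - trend
    return trend, residual

t = np.arange(16, dtype=float)
trend, residual = decompose(t)
# By construction the two components reconstruct the input exactly
assert np.allclose(trend + residual, t)
```

In the actual model each component is tokenized separately, so errors in the trend estimate don't contaminate the seasonal representation.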

https://github.com/Lightning-AI/lightning/releases/tag/2.0.0

With torch 2.0, you can now compile a LightningModule.

import torch
import lightning as L

# LitModel stands for your own LightningModule subclass
class LitModel(L.LightningModule):
    ...

model = LitModel()
# This will compile forward and {training,validation,test,predict}_step
compiled_model = torch.compile(model)
trainer = L.Trainer()
trainer.fit(compiled_model)