digitalmars.D.announce - grain - D Language for Deep Learning
- Andrei Alexandrescu (3/3) Apr 23 2019 Google Alerts just found these slides:
- Fynn Schröder (7/10) Apr 23 2019 It's an autograd library for dynamic neural networks based on mir
- jmh530 (2/8) Apr 24 2019 Cool. Thanks for the summary.
- jmh530 (7/17) Apr 24 2019 Hmm, it looks like there are comparisons between it and chainer,
- Shigeki Karita (5/24) Apr 24 2019 I see. I'm interested in Stan, which is the best library for
- jmh530 (10/16) Apr 24 2019 Conveniently enough, they just incorporated some GPU support in
- Shigeki Karita (7/25) Apr 25 2019 I hadn't known about the GPU support in Stan. That's cool! Cholesky
- jmh530 (7/14) Apr 26 2019 I think I recall hearing something about Edward. In my
Google Alerts just found these slides: https://speakerdeck.com/shigekikarita/grain-d-language-for-deep-learning Does anyone have more information about this?
Apr 23 2019
On Wednesday, 24 April 2019 at 00:02:42 UTC, Andrei Alexandrescu wrote:
> Google Alerts just found these slides:
> https://speakerdeck.com/shigekikarita/grain-d-language-for-deep-learning
> Does anyone have more information about this?

It's an autograd library for dynamic neural networks based on mir and cuda. See GitHub for more details: https://github.com/ShigekiKarita/grain

I've tried it and it works great -- although it's far from feature complete in comparison to e.g. PyTorch.
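For anyone new to the term "autograd": below is a minimal, self-contained sketch in plain D of what such a library does under the hood. It is not grain's actual API (the Value class and its helpers are made up for illustration); it only shows the define-by-run idea that grain shares with Chainer and PyTorch -- the graph is recorded while the forward computation runs, then walked backwards to accumulate gradients.

import std.stdio;

class Value
{
    double data;              // forward result
    double grad = 0;          // d(output)/d(this), filled in by backward()
    Value[] parents;          // inputs that produced this node
    void delegate() backprop; // pushes this node's grad into its parents

    this(double data, Value[] parents = null)
    {
        this.data = data;
        this.parents = parents;
        this.backprop = delegate() {};
    }

    static Value mul(Value a, Value b)
    {
        auto o = new Value(a.data * b.data, [a, b]);
        // product rule: dz/da = b, dz/db = a
        o.backprop = () { a.grad += b.data * o.grad; b.grad += a.data * o.grad; };
        return o;
    }

    static Value add(Value a, Value b)
    {
        auto o = new Value(a.data + b.data, [a, b]);
        o.backprop = () { a.grad += o.grad; b.grad += o.grad; };
        return o;
    }

    void backward()
    {
        // collect nodes in topological order, then run backprop in
        // reverse, so each grad is complete before being pushed on
        Value[] order;
        bool[Value] seen;
        void visit(Value v)
        {
            if (v in seen) return;
            seen[v] = true;
            foreach (p; v.parents) visit(p);
            order ~= v;
        }
        visit(this);
        grad = 1; // seed: d(this)/d(this) = 1
        foreach_reverse (v; order) v.backprop();
    }
}

void main()
{
    auto x = new Value(3);
    auto y = new Value(4);
    auto z = Value.add(Value.mul(x, y), x); // z = x*y + x, graph built here
    z.backward();
    writeln(x.grad); // 5 (= y + 1)
    writeln(y.grad); // 3 (= x)
}

A real library like grain does the same thing with mir tensors instead of scalars, and CUDA kernels for the heavy operations.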
Apr 23 2019
On Wednesday, 24 April 2019 at 06:13:13 UTC, Fynn Schröder wrote:
> [snip]
> It's an autograd library for dynamic neural networks based on mir and cuda. See GitHub for more details: https://github.com/ShigekiKarita/grain
>
> I've tried it and it works great -- although it's far from feature complete in comparison to e.g. PyTorch.

Cool. Thanks for the summary.
Apr 24 2019
On Wednesday, 24 April 2019 at 10:51:08 UTC, jmh530 wrote:
> On Wednesday, 24 April 2019 at 06:13:13 UTC, Fynn Schröder wrote:
>> [snip]
> Cool. Thanks for the summary.

Hmm, it looks like there are comparisons between it and chainer, pytorch, and tensorflow. It might be interesting to compare it to some other static autograd libraries. The only one I can think of off the top of my head is Stan's [1], though that's designed more for probabilistic programming than neural networks.

[1] https://github.com/stan-dev/math
Apr 24 2019
On Wednesday, 24 April 2019 at 10:56:54 UTC, jmh530 wrote:
> [snip]
> Hmm, it looks like there are comparisons between it and chainer, pytorch, and tensorflow. It might be interesting to compare it to some other static autograd libraries. The only one I can think of off the top of my head is Stan's [1], though that's designed more for probabilistic programming than neural networks.
>
> [1] https://github.com/stan-dev/math

I see. I'm interested in Stan, which is the best library for probabilistic models, but it lacks GPU computation. Therefore, I plan to add a probabilistic programming paradigm to grain, like pytorch (pyro) and tensorflow (tf probability) have.
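To make that concrete for other readers: operationally, a probabilistic programming library lets the user write down an (unnormalized) log-density and leaves the sampling machinery to the library. Here is a toy sketch in plain D -- hypothetical code, not grain's or Stan's API -- with the simplest possible sampler, a random-walk Metropolis step:

import std.math : log;
import std.random;
import std.stdio;

// toy model: unnormalized log-density of a standard normal
double logDensity(double x) { return -0.5 * x * x; }

void main()
{
    auto rng = Random(42);
    double x = 0, sum = 0, sumSq = 0;
    enum steps = 100_000;
    foreach (i; 0 .. steps)
    {
        // propose a local move; accept it with the Metropolis rule
        const p = x + uniform(-1.0, 1.0, rng);
        if (log(uniform(0.0, 1.0, rng)) < logDensity(p) - logDensity(x))
            x = p;
        sum += x;
        sumSq += x * x;
    }
    // for this model the posterior mean is ~0 and the variance ~1
    writefln("mean %.2f, var %.2f", sum / steps,
             sumSq / steps - (sum / steps) ^^ 2);
}

Stan, pyro, and tf probability replace the naive sampler above with gradient-based ones (HMC/NUTS, variational inference), which is where autograd comes back in: the library differentiates the user's log-density automatically.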
Apr 24 2019
On Wednesday, 24 April 2019 at 16:33:00 UTC, Shigeki Karita wrote:
> [snip]
> I see. I'm interested in Stan, which is the best library for probabilistic models, but it lacks GPU computation. Therefore, I plan to add a probabilistic programming paradigm to grain, like pytorch (pyro) and tensorflow (tf probability) have.

Conveniently enough, they just incorporated some GPU support in the release in March [1]. Here's an earlier status update [2]. The initial work was focused on Cholesky decompositions because they were a big source of slowdown for some types of models. It probably still has a ways to go before reaching TensorFlow's maturity on the GPU.

[1] https://github.com/stan-dev/math/releases/tag/v2.19.0
[2] https://discourse.mc-stan.org/t/gpu-update-whats-up-and-where-we-are-going/6015
Apr 24 2019
On Wednesday, 24 April 2019 at 17:31:03 UTC, jmh530 wrote:
> [snip]
> Conveniently enough, they just incorporated some GPU support in the release in March [1]. Here's an earlier status update [2]. The initial work was focused on Cholesky decompositions because they were a big source of slowdown for some types of models. It probably still has a ways to go before reaching TensorFlow's maturity on the GPU.
>
> [1] https://github.com/stan-dev/math/releases/tag/v2.19.0
> [2] https://discourse.mc-stan.org/t/gpu-update-whats-up-and-where-we-are-going/6015

I hadn't known about the GPU support in Stan. That's cool! Cholesky decomposition always gives me trouble when I'm working with covariance matrices and the like. If you are interested in GPU acceleration in probabilistic programming, see also this paper (Table 2) on Edward (the previous name of TensorFlow Probability): https://arxiv.org/pdf/1701.03757.pdf
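For readers following along: a covariance matrix is symmetric positive-definite, which is exactly what the Cholesky factorization A = L*L^T requires, and e.g. sampling a multivariate normal boils down to computing that L. Here is a plain-D sketch of the classic Cholesky-Banachiewicz loop -- purely illustrative; real code would call LAPACK through mir/lubeck, and the Stan work above runs this same operation on the GPU:

import std.math : sqrt;
import std.stdio;

// factor a symmetric positive-definite matrix A into L * L^T,
// with L lower-triangular (Cholesky-Banachiewicz ordering)
double[][] cholesky(const double[][] a)
{
    const n = a.length;
    auto l = new double[][](n, n);
    foreach (ref row; l) row[] = 0;
    foreach (i; 0 .. n)
        foreach (j; 0 .. i + 1)
        {
            double s = 0;
            foreach (k; 0 .. j) s += l[i][k] * l[j][k];
            l[i][j] = (i == j) ? sqrt(a[i][i] - s) : (a[i][j] - s) / l[j][j];
        }
    return l;
}

void main()
{
    auto a = [[4.0, 2.0], [2.0, 3.0]]; // a small covariance-like matrix
    writeln(cholesky(a)); // [[2, 0], [1, 1.41421]]
}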
Apr 25 2019
On Friday, 26 April 2019 at 06:35:42 UTC, Shigeki Karita wrote:
> I hadn't known about the GPU support in Stan. That's cool! Cholesky decomposition always gives me trouble when I'm working with covariance matrices and the like. If you are interested in GPU acceleration in probabilistic programming, see also this paper (Table 2) on Edward (the previous name of TensorFlow Probability): https://arxiv.org/pdf/1701.03757.pdf

I think I recall hearing something about Edward. In my experience, Bayesian modelling can be quite finicky... you might do something to get faster results, but then the results may not make sense, particularly as the model becomes more complicated. While I often prefer the Bayesian approach, faster doesn't necessarily mean better.
Apr 26 2019