Commit d222700

tweaks

Committed Feb 18, 2018
1 parent 989adcd commit d222700

2 files changed (+8 -5)


paper/paper.bib (+5 -2)
@@ -1,6 +1,9 @@
 @misc{Julia,
-  author = {{Jeff Bezanson, Alan Edelman, Stefan Karpinski and Viral B. Shah}},
+  author = {Jeff Bezanson and Alan Edelman and Stefan Karpinski and Viral B. Shah},
   title = {Julia: A Fresh Approach to Numerical Computing},
+  journal = {SIAM Review},
+  volume = {59},
+  year = {2017},
   doi = {10.1137/141000671},
   howpublished = {\url{julialang.org/publications/julia-fresh-approach-BEKS.pdf}}
 }
@@ -23,7 +26,7 @@ @online{CuArrays
 }
 
 @online{MLPL,
-  author = {Mike Innes, et al.},
+  author = {Mike Innes and others},
   title = {On Machine Learning and Programming Languages},
   year = 2017,
   url = {https://julialang.org/blog/2017/12/ml&pl},

paper/paper.md (+3 -3)
@@ -1,5 +1,5 @@
 ---
-title: 'Flux: Seamless machine learning with Julia'
+title: 'Flux: Elegant machine learning with Julia'
 tags:
   - deep learning
   - machine learning
@@ -22,9 +22,9 @@ bibliography: paper.bib
 
 # Summary
 
-Flux is library for machine learning (ML), written using the numerical computing language Julia [@Julia]. The package allows models to be written using Julia's simple mathematical syntax, and applies automatic differentiation (AD) to seamlessly calculate derivatives and train the model. Meanwhile, it makes heavy use of Julia's advanced language features to carry out code analysis and make optimisations. For example, Julia's GPU compilation support [@besard:2017] can be used to JIT-compile custom GPU kernels for model layers [@CuArrays].
+Flux is library for machine learning (ML), written using the numerical computing language Julia [@Julia]. The package allows models to be written using Julia's simple mathematical syntax, and applies automatic differentiation (AD) to seamlessly calculate derivatives and train the model. Meanwhile, it makes heavy use of Julia's language and compiler features to carry out code analysis and make optimisations. For example, Julia's GPU compilation support [@besard:2017] can be used to JIT-compile custom GPU kernels for model layers [@CuArrays].
 
-The machine learning community has traditionally been divided between "static" and "dynamic" frameworks that are easy to optimise and easy to use, respectively [@MLPL]. Flux blurs the line between these two approaches, combining a highly intuitive programming model with the advanced compiler techniques needed by ML. As a result of this approach, it already supports several features not available in any other dynamic framework, such as kernel fusion [@Fusion], memory usage optimisations, importing of models via ONNX, and deployment of models to JavaScript for running in the browser.
+The machine learning community has traditionally been divided between "static" and "dynamic" frameworks that are easy to optimise and easy to use, respectively [@MLPL]. Flux blurs the line between these two approaches, combining a highly intuitive programming model with the compiler techniques needed by ML. As a result of this approach, it already supports several features not available in any other dynamic framework, such as kernel fusion [@Fusion], memory usage optimisations, importing of models via ONNX, and deployment of models to JavaScript for running in the browser.
 
 Flux has been used heavily for natural language processing, but can also support state-of-the-art research models in areas like computer vision, reinforcement learning and robotics.
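For context, the summary paragraphs above describe Flux's core workflow: write the model in plain Julia syntax, then let AD compute the derivatives. A minimal sketch of that workflow, not part of this commit, assuming a recent Flux release (the API has changed since 2018) and using illustrative names (model, loss):

    using Flux  # assumes a recent Flux release; the 2018-era API differs in places

    # A model is ordinary Julia code: two dense layers with a relu nonlinearity.
    model = Chain(Dense(10 => 5, relu), Dense(5 => 2))

    x = rand(Float32, 10)   # dummy input
    y = rand(Float32, 2)    # dummy target
    loss(m, x, y) = Flux.mse(m(x), y)

    # Automatic differentiation computes gradients of the loss with respect
    # to every model parameter; no derivatives are written by hand.
    grads = gradient(m -> loss(m, x, y), model)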