The speed of the language matters, because it fundamentally changes how you can build a machine learning library. You get a kind of composability that is impossible in TensorFlow.

In simple terms, TensorFlow has to compete with what the whole Julia ecosystem can do. Almost any Julia package can in principle be used with Flux.

You cannot take, say, a PyTorch activation function and stick it into a TensorFlow graph.

But on the Julia side of the fence you can do essentially the equivalent of that. You can take a generic activation function and reuse it across many different machine learning libraries.
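To make this concrete, here is a minimal sketch. The function name `myrelu` is my own invention, but the pattern is real: any plain Julia function works on ordinary arrays via broadcasting, and the same function can be handed to a Flux layer unchanged.

```julia
# A plain Julia function — nothing library-specific about it.
myrelu(x) = max(zero(x), x)

# It broadcasts over ordinary arrays (and, by the same mechanism,
# over GPU arrays, dual numbers for autodiff, and so on).
myrelu.([-1.0, 0.5, 2.0])  # → [0.0, 0.5, 2.0]

# The very same function could be passed straight into a Flux layer:
#   using Flux
#   layer = Dense(4 => 2, myrelu)   # layer sizes are just for illustration
```

Because Julia compiles `myrelu` to native code and specializes it for whatever types flow through it, a library like Flux does not need its own blessed catalog of activation functions.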

That is really only possible because of Julia's high native speed, which profoundly changes how you design these kinds of libraries.

I give a more detailed explanation here:

