The language's speed matters, because it entirely changes how you can build a machine learning library. You get a kind of composability that is impossible in TensorFlow.
In simple terms, TensorFlow on its own has to match what the whole Julia ecosystem can do. Almost any Julia package can in principle be used with Flux.
You cannot take, say, a PyTorch activation function and stick it into a TensorFlow graph.
But on the Julia side of the fence you can basically do the equivalent of that. You can write a generic activation function once and reuse it across many different machine learning libraries.
That is really only possible because of Julia's high native speed, which profoundly changes how you design these kinds of libraries.
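As a minimal sketch of what that reuse looks like: the `leaky` function below is hypothetical (not from any library), just an ordinary Julia function with no framework-specific types. Because Julia compiles a fast specialized method for each input type, the same definition works on scalars, arrays, or anything else, and Flux layers accept any plain Julia callable as an activation.

```julia
# A generic "leaky ReLU"-style activation written as plain Julia.
# `zero(x)` and `oftype(x, 0.01)` keep it generic over numeric types.
leaky(x) = x > zero(x) ? x : oftype(x, 0.01) * x

# Works on an ordinary scalar:
leaky(-2.0)               # -0.02

# Broadcasts over any array, still compiled to fast native code:
leaky.([-1.0, 0.5, 3.0])  # [-0.01, 0.5, 3.0]

# In Flux (import not shown) the very same function can be passed
# straight into a layer, e.g. Dense(10 => 5, leaky), because Flux
# treats the activation as just another Julia function.
```

Nothing about `leaky` knows it will be used for machine learning, which is exactly the composability point: the library and the function only have to agree on ordinary Julia semantics, not on a shared graph format.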
I give a more detailed explanation here: https://medium.com/@Jernfrost/flux-vs-tensorflow-misconceptions-2737a8b464fb