At some point last summer, I decided Julia was going to win.
I had been coding up a time series algorithm that involved a number of loops, and despite all my best NumPy magic, I couldn’t avoid them. Numba didn’t help. I coded the same algorithm in Julia with very little trouble (and more than a few suboptimal patterns, in retrospect), and I got a tenfold speedup. Like that.
So yes, it’s rough around the edges. Yes, there are breaking changes ahead. But 1.0 is coming, and although Python remains our go-to solution for data analysis in P[λ]ab, we are increasingly using Julia for our machine learning efforts. In fact, last month, John spoke at the third annual JuliaCon, and we’ve been doing our part to contribute to the ecosystem surrounding it. We’re excited for what’s coming and hope to see lots of new labs, particularly in computational neuroscience, come on board.