Bundle for Julia

Would be nice to have out-of-the-box support for Julia via a bundle.

Seems like a perfect fit given both CL’s and Julia’s focus on high performance.

We had a Julia bundle in the past, but due to some updates it’s not currently working. It’s on the list to be supported, but it’s going to take some work to get it packaged nicely.

TBH for newer / rapidly moving projects like Julia and Rust, my preference is to install the upstream supported binaries rather than anything from any Linux distro, including Clear. Intel has done a tremendous amount of work on the Python / R / machine learning / computer vision stacks, so that’s what I’m using. For other things, containers are my friends. :wink:

I’ve spent a few hours on the new code, and I am very, very disappointed in this project. It is almost as if they do not want users to actually use it. The barriers to adoption are unusually high, and the project places unreasonable expectations on integrators.

That’s an industry trend in open source: smaller projects like RStudio, Anaconda, and Julia are looking to monetize early by controlling the brand, selling support from day one, and getting free QA from folks like me. :wink:

It’s much more likely that this was not intentional, but caused by people just not understanding how to make adoption of the project easier.

From previous experience building and updating Julia, it seemed maintainable only if you let it bundle its own LLVM (currently version 6), or built an old (possibly patched) LLVM specifically for Julia.

Any movement on bundling Julia?

From what I’ve read (statements by folks from the Julia team)…
Julia keeps running into LLVM bugs, so they patch it and upstream the patches. But it takes much too long for the upstreamed patches to make it into a release version of LLVM, and by then a new host of bugs has inevitably appeared. This is why they use their own patched LLVM.

The creators of Julia also prioritize (scientific) reproducibility. They want everyone running Julia to link against the same libraries, rather than system libraries that may give slightly different results. For example, they build a 64-bit (ILP64) OpenBLAS so that Julia is compatible with matrices that have more than 2^31 (roughly two billion) elements, while default system OpenBLAS libraries are not.

While the release build uses LLVM 6, you can build from source with LLVM 8 (I haven’t run into issues with LLVM_VER = 8.0.0 in my Make.user). More significantly, building from source lets you link MKL for BLAS/LAPACK functionality, which delivers substantially better performance (roughly 2x) than OpenBLAS on AVX-512 hardware across a range of array sizes. Small matrix multiplication and LAPACK are much faster even on AVX2, which OpenBLAS does support well.

If you’re chasing performance, I’d recommend going that route; MKL may not be free as in freedom, but it is at least free as in beer.
If you want to build from source but use the default OpenBLAS, be sure to edit deps/blas.mk, adding F_COMPILER=GFORTRAN to the compiler flags. Otherwise, the build will mistake Clear Linux’s gfortran for ifort and pass it incorrect compiler flags (e.g., -i8 instead of -fdefault-integer-8). It detects the compiler by reading its version output, and gfortran on Clear Linux includes the word “Intel” in that output.
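For reference, the edit described above would be a one-line addition along these lines; the exact surrounding context in deps/blas.mk varies by Julia version, and the variable name for the build-options list is a sketch, not copied from the file:

```make
# deps/blas.mk (sketch; exact variable and context vary by Julia version)
# Force the vendored OpenBLAS build to treat the Fortran compiler as
# gfortran. Clear Linux's gfortran prints "Intel" in its version string
# and would otherwise be misdetected as ifort, receiving ifort-only flags.
OPENBLAS_BUILD_OPTS += F_COMPILER=GFORTRAN
```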
You can also use the official binaries, which is the recommended means of installing Julia.

Julia will benefit less from optimizations like the profile-guided optimization Clear does for Python, for the same reason that a faster (clan)g++ won’t make the C++ code you compile with it any faster.

Pretty similar story with Rust, except they typically include a much more recent version of LLVM in the default build.

I don’t actually use Julia myself, but I have experience integrating it into a distribution (though that was 0.5.0, so things have likely changed a bit). Make.user looks interesting, as it would work around a bug I was seeing when passing the changes via the make install command, so thanks for the tip. New LLVM releases were mainly a problem in the weeks after a major release, before builds were fixed upstream. I imagine things have stabilised a lot more as Julia matured to 1.0 and beyond.

For what it’s worth, my Make.user:

> cat Make.user
MARCH=native
USE_INTEL_MKL=1
LLVM_VER=8.0.0
USE_BINARYBUILDER=0

You will need to source the intel compilervars.sh file that came with the MKL install the first time you build Julia.
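Put together, the first build would look roughly like this. The compilervars.sh path below is an assumption based on the default MKL install location, not taken from the thread; adjust it to wherever your MKL landed:

```shell
# Build Julia against MKL using the Make.user above.
# /opt/intel/bin/compilervars.sh is the default MKL install path;
# yours may differ.
source /opt/intel/bin/compilervars.sh intel64
make -j"$(nproc)"
```

Subsequent builds pick up the linked MKL libraries without re-sourcing, but it’s harmless to source the script each time.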

> Pretty similar story with Rust, except they typically include a much more recent version of LLVM in the default build.

Rust’s development branch also tracks LLVM’s development branch, IIRC. Julia’s Makefile accepts an LLVM_VER=svn argument, but I have never known that to actually compile.
I’d like to try Polly support in Julia; there was some past work to enable it, but it seems unmaintained.

> I don’t actually use Julia myself, but I have experience integrating it into a distribution (though that was 0.5.0, so things have likely changed a bit).

Worth pointing out that Julia still applies a bunch of patches to these releases: the deps/llvm.mk file applies patches to LLVM 6, 7, and 8.
LLVM 7 and 8 have much shorter patch lists, but only one patch was dropped in the move from 7 to 8, and only one patch for 8 has a note saying to drop it for LLVM 9.
If the LLVM patch situation is stabilizing, it is doing so slowly.

Any updates on this? I’d love to get up and running with CL but I use Julia on a daily basis…

The official binaries are the recommended way to install Julia.

Even on OSes that do bundle Julia in their package manager, users are discouraged from installing it that way. So I really don’t think getting swupd bundle-add julia support should be a high priority.

I can verify that the official binaries work on Clear Linux, and they’re a very easy way to get up and running.
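For anyone wanting to take that route, the steps are roughly the following. The version placeholder and download URL pattern here are illustrative, not taken from the thread; check julialang.org/downloads for the current release and its real URL:

```shell
# Download and unpack an official Julia binary tarball, then put it on PATH.
# Substitute a real version for 1.x.y; the URL pattern below is the one
# the downloads page links to, but verify it there before relying on it.
VER=1.x.y
curl -LO "https://julialang-s3.julialang.org/bin/linux/x64/${VER%.*}/julia-${VER}-linux-x86_64.tar.gz"
tar -xzf "julia-${VER}-linux-x86_64.tar.gz"
export PATH="$PWD/julia-${VER}/bin:$PATH"
julia --version
```

No compilation, no system libraries involved; the tarball is self-contained, which is exactly why it works unmodified on Clear Linux.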

Confirmed.


I realize my wording was ambiguous.
By “I can verify”, I didn’t mean “I can test if anyone wants me to”, but that I’ve already tested several of them and had zero issues.

It’s good to have more confirmations anyway.