Running `sudo R CMD javareconf` works as it should and reports that Java is configured correctly, but the rJava package install in RStudio fails.
I think the issue is similar to the one here, but the context is slightly different. Is something missing from the swupd jre/jdk bundles? I’ve tried all the combinations of bundles, but still can’t get rJava running in RStudio; I need it (rJava) to access my Spark cluster.
Checked these resources first for help:
- DARS image and [documentation](https://docs.01.org/clearlinux/latest/guides/stacks/dars.html#using-apache-spark-in-dars) for reference only; using bare metal for Spark, not Docker
- setting JAVA_HOME here, reviewing documentation and here
- reviewing compile error from Rstudio console here and generalized similar error and here
- reviewing possible issue with conflicting/custom libraries here
Below is the output from attempting the rJava install in RStudio:

```
/usr/bin/ld: cannot find -lpcre2-8… -llzma… -lbz2
collect2: error: ld returned 1 exit status
make[2]: *** [Makefile.all:35: libjri.so] Error 1
make[2]: Leaving directory ‘/tmp/Rtmp55q2q1/R.INSTALL3d42a322337/rJava/jri/src’
make[1]: *** [Makefile.all:19: src/JRI.jar] Error 2
make[1]: Leaving directory ‘/tmp/Rtmp55q2q1/R.INSTALL3d42a322337/rJava/jri’
make: *** [Makevars:14: jri] Error 2
```
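The `cannot find -lpcre2-8` line means the linker could not locate those libraries at build time. As a rough diagnostic (assuming `gcc` is on the PATH; this is just a sketch, not part of the rJava build), you can check which of the three libraries the linker can actually resolve by linking a trivial program against each one:

```shell
# Try to link an empty program against each library the rJava build asked for.
# A "MISSING" line means the linker cannot find that library on this system.
for lib in pcre2-8 lzma bz2; do
  if echo 'int main(void){return 0;}' | gcc -x c - -l"$lib" -o /dev/null 2>/dev/null; then
    echo "found   -l$lib"
  else
    echo "MISSING -l$lib"
  fi
done
```

Any library reported as MISSING would need its development files installed before the rJava compile can succeed.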
Spark is up and running, but because of the above errors I’m at a loss for how to connect the Spark cluster to RStudio.
Any suggestions what to try next?
As it stands, `which java` and `java -version` display as they should, and the Spark bash scripts pick up the JAVA_HOME variable just fine; Spark itself gives no trouble, only getting it to connect with RStudio does.
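For completeness, a quick way to confirm that the shell and the Spark scripts are seeing the same Java (purely illustrative; `readlink -f` just resolves any symlinks to the real binary):

```shell
# Print the exported JAVA_HOME and resolve where `java` on the PATH actually lives.
echo "JAVA_HOME=$JAVA_HOME"
command -v java                     # the java found on PATH
readlink -f "$(command -v java)"    # follow symlinks to the real binary
```

If the resolved path does not sit under `$JAVA_HOME`, R and Spark may be configured against different Java installations.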