maven - Zeppelin running but process is dead


Good afternoon.

I've been having trouble with Zeppelin lately. This is my first attempt at installing it, and I have been at it for the past week with no success. Any help or suggestions would be appreciated.

As background info, the OS is CentOS 7 and, on the cluster, I am running Spark 2.0.1 on Hadoop 2.7.2, with Hive 2.1.0 and HBase 1.2.4. Other products installed are Anaconda2 4.2.0, Scala 2.11.8, R 3.3.1 and Maven 3.3.9. My .bash_profile is as follows:

# added by Anaconda2 4.2.0 installer
export PATH="/opt/hadoop/anaconda2/bin:$PATH"

## Java env variables
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/jre/lib:$JAVA_HOME/lib:$JAVA_HOME/lib/tools.jar

## Hadoop env variables
export HADOOP_HOME=/opt/hadoop
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_CONF_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

## HBase env variables
export HBASE_HOME=$HADOOP_HOME/hbase-current
export HBASE_PID_DIR=$HADOOP_HOME/hbase-current/pids
export PATH=$HBASE_HOME/bin:$PATH

## Spark env variables
export SPARK_HOME=$HADOOP_HOME/spark-current
export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH

## Hive env variables
export HIVE_HOME=$HADOOP_HOME/hive-current
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:$HADOOP_HOME/lib/native/*:.
export CLASSPATH=$CLASSPATH:$HIVE_HOME/lib/*:.

## Scala env variables
export SCALA_HOME=/usr/bin/scala

## RHadoop env variables
export HADOOP_CMD=/opt/hadoop/bin/hadoop
export HADOOP_STREAMING=/opt/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.7.2.jar
export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/opt/hadoop/thrift-0.8.0/lib/cpp/.libs:/usr/local/lib/pkgconfig

## Spark Notebook env variables
export PATH=$PATH:$HADOOP_HOME/spark-notebook-current/bin

## Zeppelin env variables
export ZEPPELIN_HOME=$HADOOP_HOME/zeppelin
export PATH=$PATH:$ZEPPELIN_HOME/bin

## Maven env variables
export M2_HOME=/opt/hadoop/maven
export MAVEN_HOME=/opt/hadoop/maven
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=1024m"
export PATH=$PATH:$MAVEN_HOME/bin
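As a sanity check (a rough sketch; the variable list is just the subset from the profile above that the rest of the setup depends on), each home variable can be verified to point at a real directory before starting anything:

```shell
# check_env: report each named environment variable as MISSING (unset or
# empty), BROKEN (set, but not an existing directory) or OK.
check_env() {
  for var in "$@"; do
    val=$(eval "printf '%s' \"\${$var}\"")
    if [ -z "$val" ]; then
      echo "MISSING: $var"
    elif [ ! -d "$val" ]; then
      echo "BROKEN: $var=$val"
    else
      echo "OK: $var=$val"
    fi
  done
}

# The home directories the services above rely on:
check_env JAVA_HOME HADOOP_HOME SPARK_HOME HIVE_HOME HBASE_HOME ZEPPELIN_HOME MAVEN_HOME
```

Note that .bash_profile is only read by login shells, so a freshly edited profile needs a new login (or `source ~/.bash_profile`) before the daemons see it.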

As per my research, I installed Zeppelin as follows:

git clone https://github.com/apache/zeppelin.git
cd zeppelin
./dev/change_scala_version.sh 2.11
mvn clean package -Pspark-2.0 -Dspark.version=2.0.1 -Phadoop-2.7 -Dhadoop.version=2.7.2 -Pyarn -Ppyspark -Psparkr -Pr -Pscala-2.11 -DskipTests

which results in:

[info] reactor summary: [info] [info] zeppelin ........................................... success [  9.615 s] [info] zeppelin: interpreter .............................. success [ 20.536 s] [info] zeppelin: zengine .................................. success [ 20.312 s] [info] zeppelin: display system apis ...................... success [ 37.047 s] [info] zeppelin: spark dependencies ....................... success [01:51 min] [info] zeppelin: spark .................................... success [01:15 min] [info] zeppelin: markdown interpreter ..................... success [  1.860 s] [info] zeppelin: angular interpreter ...................... success [  0.646 s] [info] zeppelin: shell interpreter ........................ success [  0.621 s] [info] zeppelin: livy interpreter ......................... success [ 25.541 s] [info] zeppelin: hbase interpreter ........................ success [ 11.993 s] [info] zeppelin: apache pig interpreter ................... success [ 10.638 s] [info] zeppelin: postgresql interpreter ................... success [  9.383 s] [info] zeppelin: jdbc interpreter ......................... success [  4.049 s] [info] zeppelin: file system interpreters ................. success [  2.293 s] [info] zeppelin: flink .................................... success [ 19.473 s] [info] zeppelin: apache ignite interpreter ................ success [  3.967 s] [info] zeppelin: kylin interpreter ........................ success [  1.507 s] [info] zeppelin: python interpreter ....................... success [  0.963 s] [info] zeppelin: lens interpreter ......................... success [  7.390 s] [info] zeppelin: apache cassandra interpreter ............. success [01:31 min] [info] zeppelin: elasticsearch interpreter ................ success [  7.759 s] [info] zeppelin: bigquery interpreter ..................... success [  3.033 s] [info] zeppelin: alluxio interpreter ...................... 
success [  9.319 s] [info] zeppelin: web application .......................... success [08:56 min] [info] zeppelin: server ................................... success [ 50.740 s] [info] zeppelin: packaging distribution ................... success [  3.289 s] [info] zeppelin: r interpreter ............................ success [01:33 min] [info] ------------------------------------------------------------------------ [info] build success [info] ------------------------------------------------------------------------ [info] total time: 19:32 min [info] finished at: 2016-11-13t12:11:25+00:00 [info] final memory: 233m/921m [info] ------------------------------------------------------------------------ 

I then modified the zeppelin-env.sh file in /conf to:

export MASTER=spark://master.home:7077
export ZEPPELIN_PORT=9080
export ZEPPELIN_LOG_DIR=/opt/hadoop/zeppelin/logs
export ZEPPELIN_NOTEBOOK_DIR=/opt/hadoop/zeppelin/notebook    # where notebooks are saved
export SPARK_HOME=/opt/hadoop/spark-current
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
export PYSPARK_PYTHON=$PYSPARK_PYTHON:/opt/hadoop/anaconda2/bin/python    # path to the python command; must be the same on the driver (Zeppelin) and the workers
export PYTHONPATH=$PYTHONPATH:/opt/hadoop/anaconda2/lib/python2.7:/opt/hadoop/spark-current/python/lib/py4j-0.10.3-src.zip
export HBASE_HOME=/opt/hadoop/hbase_current

Please note that I had to change the port to 9080, because Zeppelin's default port 8080 is already taken by the Spark 2.x master web UI.
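To double-check that assumption, a quick probe (a sketch; ss ships with CentOS 7 as part of iproute2) shows whether anything is actually listening on 8080:

```shell
# check_port: succeed if something is already listening on the given TCP
# port, by grepping the listening sockets that ss reports.
check_port() {
  ss -tln 2>/dev/null | grep -q ":$1 "
}

if check_port 8080; then
  echo "8080 busy (likely the Spark master web UI) - keep ZEPPELIN_PORT=9080"
else
  echo "8080 free"
fi
```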

Now, when I start Zeppelin using zeppelin-daemon.sh start in /bin and interrogate its status, it returns:

Zeppelin running but process is dead                       [FAILED]

When I go to http://localhost:9080, the page is blank.
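Since the daemon dies right after starting, the freshest file under ZEPPELIN_LOG_DIR is the one worth reading; a small helper (a sketch; the path is the one from my zeppelin-env.sh above) picks it out:

```shell
# latest_log: print the name of the most recently modified file in a
# directory (ls -t sorts newest first); each failed start writes fresh
# zeppelin-*.out / *.log files, so the newest one holds the crash.
latest_log() {
  ls -t "$1" 2>/dev/null | head -n 1
}

# Inspect the newest log from the directory configured in zeppelin-env.sh:
# tail -n 40 "/opt/hadoop/zeppelin/logs/$(latest_log /opt/hadoop/zeppelin/logs)"
```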

My log file is as follows:

Zeppelin restarting
ZEPPELIN_CLASSPATH: :.:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-1.b15.el7_2.x86_64/jre/lib:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-1.b15.el7_2.x86_64/lib:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-1.b15.el7_2.x86_64/lib/tools.jar:/opt/hadoop/lib/native/*:.:/opt/hadoop/hive-current/lib/*:.:/opt/hadoop/zeppelin/zeppelin-server/target/lib/*:/opt/hadoop/zeppelin/zeppelin-zengine/target/lib/*:/opt/hadoop/zeppelin/zeppelin-interpreter/target/lib/*:/opt/hadoop/zeppelin/*::/opt/hadoop/zeppelin/conf:/opt/hadoop/zeppelin/zeppelin-interpreter/target/classes:/opt/hadoop/zeppelin/zeppelin-zengine/target/classes:/opt/hadoop/zeppelin/zeppelin-server/target/classes
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-server/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-server/target/lib/zeppelin-interpreter-0.7.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-zengine/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-zengine/target/lib/zeppelin-interpreter-0.7.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-interpreter/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.eclipse.jetty.server.ServerConnector.<init>(ServerConnector.java:96)
    at org.apache.zeppelin.server.ZeppelinServer.setupJettyServer(ZeppelinServer.java:207)
    at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:128)
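I suspect a classpath clash: an IncompatibleClassChangeError raised while constructing Jetty's ServerConnector usually means two incompatible Jetty/servlet-api versions are loaded at once, and the classpath in the log inherits $HIVE_HOME/lib/* from my .bash_profile alongside Zeppelin's own target/lib jars. Here is a rough helper I can use to spot jar names that appear under more than one classpath entry (the helper name and the version-stripping pattern are my own):

```shell
# find_dupes: given a colon-separated classpath (entries may be globs like
# "lib/*"), list jar basenames that appear more than once after stripping
# version suffixes -- the usual source of IncompatibleClassChangeError.
find_dupes() {
  echo "$1" | tr ':' '\n' | while read -r entry; do
    for f in $entry; do               # unquoted on purpose: expand the glob
      [ -f "$f" ] && basename "$f"
    done
  done | sed 's/-[0-9][0-9.]*.*\.jar$/.jar/' | sort | uniq -d
}

# e.g. find_dupes "$CLASSPATH" -- on this box I expect it to show jetty and
# servlet jars supplied by both Hive's lib directory and Zeppelin's target/lib.
```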

Any idea why this is happening? Any help and/or suggestions would be appreciated. I forgot to mention that Hadoop and Spark are running in the background.

Regards,

Christian

