
Fabric8 Maven Plugin Example #206

Open
YegorMaksymchuk opened this issue Jun 25, 2018 · 4 comments
YegorMaksymchuk commented Jun 25, 2018

Hello!

I've found an error in the Fabric8 Maven plugin example application.

Preconditions:

  1. Install Maven
  2. Install OpenShift
  3. Execute `oc cluster up` or use a remote server with OpenShift
  4. Log in to OpenShift and select an empty project in the terminal (`oc login` and `oc new-project <project_name>`)

Steps to reproduce:

  1. Execute `oc create -f https://radanalytics.io/resources.yaml`
  2. Clone the Kafka, WebUI, and Spark repositories
  3. Go to the Kafka folder and run `mvn clean fabric8:deploy`
  4. Go to the WebUI folder and run `mvn clean fabric8:deploy`
  5. Go to the Spark folder and run `mvn clean fabric8:deploy`
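The steps above can be sketched as a single dry-run script. It only prints each command; drop the `echo` to actually execute them. The repository directory names here are my assumption based on the project's artifact naming:

```shell
# Dry-run sketch of the reproduction steps above. Each command is echoed
# rather than executed, so this is safe to run anywhere. Directory names
# (voxxed-bigdata-*) are assumptions based on the project's artifact names.
for step in \
  "oc create -f https://radanalytics.io/resources.yaml" \
  "cd voxxed-bigdata-kafka && mvn clean fabric8:deploy" \
  "cd voxxed-bigdata-webui && mvn clean fabric8:deploy" \
  "cd voxxed-bigdata-spark && mvn clean fabric8:deploy"
do
  echo "$step"
done
```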

Actual result:
With Spark versions 2.1.0 and 2.2.0 I got the same result:
all parts of the application were deployed, but the integration between Spark and the WebUI is broken,
with the following errors in the Spark pod log:
    + [[ /usr/local/s2i/run == /\u\s\r/\l\o\c\a\l/\s\2\i ]]

  • exec /usr/local/s2i/run
    oshinko v0.5.3
    Default spark image: radanalyticsio/openshift-spark:2.3-latest
    grep: /etc/podinfo/labels: No such file or directory
    Found cluster mycluster
    Using shared cluster mycluster
    Waiting for spark master http://mycluster-ui:8080 to be available ...
    Waiting for spark workers (2/2 alive) ...
    All spark workers alive
    Cluster configuration is
    [
    {
    "namespace": "fabric8-maven-plugin",
    "name": "mycluster",
    "href": "/clusters/mycluster",
    "image": "radanalyticsio/openshift-spark:2.3-latest",
    "masterUrl": "spark://mycluster:7077",
    "masterWebUrl": "http://mycluster-ui:8080 ",
    "masterWebRoute": "mycluster-ui-route-fabric8-maven-plugin.127.0.0.1.nip.io",
    "status": "Running",
    "workerCount": 2,
    "masterCount": 1,
    "Config": {
    "MasterCount": 1,
    "WorkerCount": 2,
    "Name": "myconfig",
    "SparkMasterConfig": "",
    "SparkWorkerConfig": "",
    "SparkImage": "radanalyticsio/openshift-spark:2.3-latest",
    "ExposeWebUI": "true",
    "Metrics": "false"
    },
    "ephemeral": "\u003cshared\u003e"
    }
    ]
    spark-submit --class com.voxxed.bigdata.spark.Stream --master spark://mycluster:7077 /opt/app-root/src/voxxed-bigdata-spark-1.0-SNAPSHOT.jar
    18/06/25 10:46:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    18/06/25 10:46:06 INFO SparkContext: Running Spark version 2.3.0
    18/06/25 10:46:06 INFO SparkContext: Submitted application: voxxed-bigdata-spark
    18/06/25 10:46:06 INFO SecurityManager: Changing view acls to: default
    18/06/25 10:46:06 INFO SecurityManager: Changing modify acls to: default
    18/06/25 10:46:06 INFO SecurityManager: Changing view acls groups to:
    18/06/25 10:46:06 INFO SecurityManager: Changing modify acls groups to:
    18/06/25 10:46:06 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(default); groups with view permissions: Set(); users with modify permissions: Set(default); groups with modify permissions: Set()
    18/06/25 10:46:07 INFO Utils: Successfully started service 'sparkDriver' on port 42485.
    18/06/25 10:46:07 INFO SparkEnv: Registering MapOutputTracker
    18/06/25 10:46:07 INFO SparkEnv: Registering BlockManagerMaster
    18/06/25 10:46:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
    18/06/25 10:46:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
    18/06/25 10:46:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-66ae1f57-b63c-4479-b78a-e292fdf8db26
    18/06/25 10:46:07 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
    18/06/25 10:46:07 INFO SparkEnv: Registering OutputCommitCoordinator
    18/06/25 10:46:07 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    18/06/25 10:46:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://voxxed-bigdata-spark-1-tftwx:4040
    18/06/25 10:46:07 INFO SparkContext: Added JAR file:/opt/app-root/src/voxxed-bigdata-spark-1.0-SNAPSHOT.jar at spark://voxxed-bigdata-spark-1-tftwx:42485/jars/voxxed-bigdata-spark-1.0-SNAPSHOT.jar with timestamp 1529923567555
    18/06/25 10:46:07 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://mycluster:7077...
    18/06/25 10:46:07 INFO TransportClientFactory: Successfully created connection to mycluster/172.30.111.85:7077 after 24 ms (0 ms spent in bootstraps)
    18/06/25 10:46:07 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20180625104607-0004
    18/06/25 10:46:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37657.
    18/06/25 10:46:07 INFO NettyBlockTransferService: Server created on voxxed-bigdata-spark-1-tftwx:37657
    18/06/25 10:46:07 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
    18/06/25 10:46:07 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, voxxed-bigdata-spark-1-tftwx, 37657, None)
    18/06/25 10:46:07 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20180625104607-0004/0 on worker-20180625104347-172.18.0.2-41937 (172.18.0.2:41937) with 4 core(s)
    18/06/25 10:46:07 INFO StandaloneSchedulerBackend: Granted executor ID app-20180625104607-0004/0 on hostPort 172.18.0.2:41937 with 4 core(s), 1024.0 MB RAM
    18/06/25 10:46:07 INFO BlockManagerMasterEndpoint: Registering block manager voxxed-bigdata-spark-1-tftwx:37657 with 366.3 MB RAM, BlockManagerId(driver, voxxed-bigdata-spark-1-tftwx, 37657, None)
    18/06/25 10:46:07 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20180625104607-0004/1 on worker-20180625104347-172.18.0.11-35055 (172.18.0.11:35055) with 4 core(s)
    18/06/25 10:46:07 INFO StandaloneSchedulerBackend: Granted executor ID app-20180625104607-0004/1 on hostPort 172.18.0.11:35055 with 4 core(s), 1024.0 MB RAM
    18/06/25 10:46:07 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, voxxed-bigdata-spark-1-tftwx, 37657, None)
    18/06/25 10:46:07 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, voxxed-bigdata-spark-1-tftwx, 37657, None)
    18/06/25 10:46:08 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20180625104607-0004/0 is now RUNNING
    18/06/25 10:46:08 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20180625104607-0004/1 is now RUNNING
    18/06/25 10:46:08 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
    18/06/25 10:46:08 INFO ZkEventThread: Starting ZkClient event thread.
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:host.name=voxxed-bigdata-spark-1-tftwx
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.version=1.8.0_161
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.vendor=Oracle Corporation
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.el7_4.x86_64/jre
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.class.path=/opt/spark/conf/:/opt/spark/jars/commons-compress-1.4.1.jar:/opt/spark/jars/aopalliance-repackaged-2.4.0-b34.jar:/opt/spark/jars/macro-compat_2.11-1.1.1.jar:/opt/spark/jars/spark-mesos_2.11-2.3.0.jar:/opt/spark/jars/hadoop-client-2.7.3.jar:/opt/spark/jars/spark-streaming_2.11-2.3.0.jar:/opt/spark/jars/jersey-container-servlet-core-2.22.2.jar:/opt/spark/jars/orc-mapreduce-1.4.1-nohive.jar:/opt/spark/jars/jackson-mapper-asl-1.9.13.jar:/opt/spark/jars/flatbuffers-1.2.0-3f79e055.jar:/opt/spark/jars/orc-core-1.4.1-nohive.jar:/opt/spark/jars/breeze_2.11-0.13.2.jar:/opt/spark/jars/joda-time-2.9.3.jar:/opt/spark/jars/parquet-encoding-1.8.2.jar:/opt/spark/jars/metrics-graphite-3.1.5.jar:/opt/spark/jars/parquet-common-1.8.2.jar:/opt/spark/jars/jackson-module-scala_2.11-2.6.7.1.jar:/opt/spark/jars/scala-library-2.11.8.jar:/opt/spark/jars/hadoop-mapreduce-client-core-2.7.3.jar:/opt/spark/jars/jersey-client-2.22.2.jar:/opt/spark/jars/json4s-jackson_2.11-3.2.11.jar:/opt/spark/jars/spark-mllib-local_2.11-2.3.0.jar:/opt/spark/jars/hadoop-yarn-server-web-proxy-2.7.3.jar:/opt/spark/jars/antlr4-runtime-4.7.jar:/opt/spark/jars/datanucleus-api-jdo-3.2.6.jar:/opt/spark/jars/mesos-1.4.0-shaded-protobuf.jar:/opt/spark/jars/calcite-linq4j-1.2.0-incubating.jar:/opt/spark/jars/spark-tags_2.11-2.3.0.jar:/opt/spark/jars/stream-2.7.0.jar:/opt/spark/jars/commons-net-2.2.jar:/opt/spark/jars/janino-3.0.8.jar:/opt/spark/jars/chill-java-0.8.4.jar:/opt/spark/jars/hadoop-annotations-2.7.3.jar:/opt/spark/jars/jsr305-1.3.9.jar:/opt/spark/jars/spark-core_2.11-2.3.0.jar:/opt/spark/jars/hadoop-yarn-common-2.7.3.jar:/opt/spark/jars/slf4j-api-1.7.16.jar:/opt/spark/jars/xbean-asm5-shaded-4.4.jar:/opt/spark/jars/stax-api-1.0.1.jar:/opt/spark/jars/activation-1.1.1.jar:/opt/spark/jars/commons-crypto-1.0.0.jar:/opt/spark/jars/hk2-api-2.4.0-b34.jar:/opt/spark/jars/jersey-common-2.22.2.jar:/opt/spark/jars/commons-beanutils-core-1.8.0.jar:/opt/s
park/jars/javax.servlet-api-3.1.0.jar:/opt/spark/jars/leveldbjni-all-1.8.jar:/opt/spark/jars/bcprov-jdk15on-1.58.jar:/opt/spark/jars/RoaringBitmap-0.5.11.jar:/opt/spark/jars/commons-compiler-3.0.8.jar:/opt/spark/jars/zookeeper-3.4.6.jar:/opt/spark/jars/jackson-core-2.6.7.jar:/opt/spark/jars/protobuf-java-2.5.0.jar:/opt/spark/jars/spark-catalyst_2.11-2.3.0.jar:/opt/spark/jars/commons-pool-1.5.4.jar:/opt/spark/jars/hk2-locator-2.4.0-b34.jar:/opt/spark/jars/log4j-1.2.17.jar:/opt/spark/jars/hive-metastore-1.2.1.spark2.jar:/opt/spark/jars/spark-hive-thriftserver_2.11-2.3.0.jar:/opt/spark/jars/snakeyaml-1.15.jar:/opt/spark/jars/javassist-3.18.1-GA.jar:/opt/spark/jars/univocity-parsers-2.5.9.jar:/opt/spark/jars/hive-jdbc-1.2.1.spark2.jar:/opt/spark/jars/paranamer-2.8.jar:/opt/spark/jars/jsp-api-2.1.jar:/opt/spark/jars/jta-1.1.jar:/opt/spark/jars/hadoop-common-2.7.3.jar:/opt/spark/jars/jcl-over-slf4j-1.7.16.jar:/opt/spark/jars/aopalliance-1.0.jar:/opt/spark/jars/hadoop-yarn-api-2.7.3.jar:/opt/spark/jars/kubernetes-model-2.0.0.jar:/opt/spark/jars/ivy-2.4.0.jar:/opt/spark/jars/hadoop-mapreduce-client-shuffle-2.7.3.jar:/opt/spark/jars/arrow-memory-0.8.0.jar:/opt/spark/jars/commons-logging-1.1.3.jar:/opt/spark/jars/commons-httpclient-3.1.jar:/opt/spark/jars/guava-14.0.1.jar:/opt/spark/jars/libfb303-0.9.3.jar:/opt/spark/jars/spark-sql_2.11-2.3.0.jar:/opt/spark/jars/automaton-1.11-8.jar:/opt/spark/jars/xercesImpl-2.9.1.jar:/opt/spark/jars/httpclient-4.5.4.jar:/opt/spark/jars/guice-servlet-3.0.jar:/opt/spark/jars/javax.inject-1.jar:/opt/spark/jars/zstd-jni-1.3.2-2.jar:/opt/spark/jars/logging-interceptor-3.8.1.jar:/opt/spark/jars/parquet-hadoop-1.8.2.jar:/opt/spark/jars/antlr-2.7.7.jar:/opt/spark/jars/calcite-avatica-1.2.0-incubating.jar:/opt/spark/jars/hadoop-yarn-client-2.7.3.jar:/opt/spark/jars/jersey-container-servlet-2.22.2.jar:/opt/spark/jars/parquet-format-2.3.1.jar:/opt/spark/jars/hadoop-mapreduce-client-jobclient-2.7.3.jar:/opt/spark/jars/netty-3.9.9.Final.jar:/opt/spark/j
ars/jersey-guava-2.22.2.jar:/opt/spark/jars/guice-3.0.jar:/opt/spark/jars/validation-api-1.1.0.Final.jar:/opt/spark/jars/kryo-shaded-3.0.3.jar:/opt/spark/jars/okio-1.13.0.jar:/opt/spark/jars/arrow-vector-0.8.0.jar:/opt/spark/jars/curator-recipes-2.7.1.jar:/opt/spark/jars/jpam-1.1.jar:/opt/spark/jars/hive-cli-1.2.1.spark2.jar:/opt/spark/jars/metrics-jvm-3.1.5.jar:/opt/spark/jars/netty-all-4.1.17.Final.jar:/opt/spark/jars/minlog-1.3.0.jar:/opt/spark/jars/jackson-xc-1.9.13.jar:/opt/spark/jars/commons-configuration-1.6.jar:/opt/spark/jars/commons-dbcp-1.4.jar:/opt/spark/jars/osgi-resource-locator-1.0.1.jar:/opt/spark/jars/base64-2.3.8.jar:/opt/spark/jars/json4s-core_2.11-3.2.11.jar:/opt/spark/jars/java-xmlbuilder-1.1.jar:/opt/spark/jars/antlr-runtime-3.4.jar:/opt/spark/jars/spark-launcher_2.11-2.3.0.jar:/opt/spark/jars/jersey-media-jaxb-2.22.2.jar:/opt/spark/jars/compress-lzf-1.0.3.jar:/opt/spark/jars/generex-1.0.1.jar:/opt/spark/jars/slf4j-log4j12-1.7.16.jar:/opt/spark/jars/jtransforms-2.4.0.jar:/opt/spark/jars/lz4-java-1.4.0.jar:/opt/spark/jars/pyrolite-4.13.jar:/opt/spark/jars/calcite-core-1.2.0-incubating.jar:/opt/spark/jars/metrics-json-3.1.5.jar:/opt/spark/jars/spark-network-shuffle_2.11-2.3.0.jar:/opt/spark/jars/core-1.1.2.jar:/opt/spark/jars/spire-macros_2.11-0.13.0.jar:/opt/spark/jars/datanucleus-core-3.2.10.jar:/opt/spark/jars/parquet-hadoop-bundle-1.6.0.jar:/opt/spark/jars/stringtemplate-3.2.1.jar:/opt/spark/jars/curator-framework-2.7.1.jar:/opt/spark/jars/bonecp-0.8.0.RELEASE.jar:/opt/spark/jars/arpack_combined_all-0.1.jar:/opt/spark/jars/jul-to-slf4j-1.7.16.jar:/opt/spark/jars/jaxb-api-2.2.2.jar:/opt/spark/jars/eigenbase-properties-1.1.5.jar:/opt/spark/jars/avro-1.7.7.jar:/opt/spark/jars/jackson-dataformat-yaml-2.6.7.jar:/opt/spark/jars/spark-kubernetes_2.11-2.3.0.jar:/opt/spark/jars/chill_2.11-0.8.4.jar:/opt/spark/jars/hive-beeline-1.2.1.spark2.jar:/opt/spark/jars/spark-graphx_2.11-2.3.0.jar:/opt/spark/jars/commons-beanutils-1.7.0.jar:/opt/spark/jars/xz-1.
0.jar:/opt/spark/jars/spire_2.11-0.13.0.jar:/opt/spark/jars/javax.inject-2.4.0-b34.jar:/opt/spark/jars/okhttp-3.8.1.jar:/opt/spark/jars/commons-collections-3.2.2.jar:/opt/spark/jars/jackson-core-asl-1.9.13.jar:/opt/spark/jars/libthrift-0.9.3.jar:/opt/spark/jars/commons-io-2.4.jar:/opt/spark/jars/jackson-jaxrs-1.9.13.jar:/opt/spark/jars/machinist_2.11-0.6.1.jar:/opt/spark/jars/spark-repl_2.11-2.3.0.jar:/opt/spark/jars/hadoop-hdfs-2.7.3.jar:/opt/spark/jars/metrics-core-3.1.5.jar:/opt/spark/jars/api-util-1.0.0-M20.jar:/opt/spark/jars/jackson-annotations-2.6.7.jar:/opt/spark/jars/shapeless_2.11-2.3.2.jar:/opt/spark/jars/spark-network-common_2.11-2.3.0.jar:/opt/spark/jars/scalap-2.11.8.jar:/opt/spark/jars/jackson-module-paranamer-2.7.9.jar:/opt/spark/jars/scala-compiler-2.11.8.jar:/opt/spark/jars/scala-parser-combinators_2.11-1.0.4.jar:/opt/spark/jars/zjsonpatch-0.3.0.jar:/opt/spark/jars/commons-codec-1.10.jar:/opt/spark/jars/snappy-java-1.1.2.6.jar:/opt/spark/jars/parquet-jackson-1.8.2.jar:/opt/spark/jars/commons-lang3-3.5.jar:/opt/spark/jars/commons-digester-1.8.jar:/opt/spark/jars/spark-sketch_2.11-2.3.0.jar:/opt/spark/jars/spark-kvstore_2.11-2.3.0.jar:/opt/spark/jars/htrace-core-3.1.0-incubating.jar:/opt/spark/jars/apacheds-i18n-2.0.0-M15.jar:/opt/spark/jars/apache-log4j-extras-1.2.17.jar:/opt/spark/jars/commons-lang-2.6.jar:/opt/spark/jars/breeze-macros_2.11-0.13.2.jar:/opt/spark/jars/hadoop-mapreduce-client-app-2.7.3.jar:/opt/spark/jars/py4j-0.10.6.jar:/opt/spark/jars/jline-2.12.1.jar:/opt/spark/jars/spark-yarn_2.11-2.3.0.jar:/opt/spark/jars/snappy-0.2.jar:/opt/spark/jars/hive-exec-1.2.1.spark2.jar:/opt/spark/jars/httpcore-4.4.8.jar:/opt/spark/jars/stax-api-1.0-2.jar:/opt/spark/jars/super-csv-2.2.0.jar:/opt/spark/jars/kubernetes-client-3.0.0.jar:/opt/spark/jars/api-asn1-api-1.0.0-M20.jar:/opt/spark/jars/scala-xml_2.11-1.0.5.jar:/opt/spark/jars/ST4-4.0.4.jar:/opt/spark/jars/javax.annotation-api-1.2.jar:/opt/spark/jars/jersey-server-2.22.2.jar:/opt/spark/jars/spark-u
nsafe_2.11-2.3.0.jar:/opt/spark/jars/spark-hive_2.11-2.3.0.jar:/opt/spark/jars/hadoop-mapreduce-client-common-2.7.3.jar:/opt/spark/jars/jackson-databind-2.6.7.1.jar:/opt/spark/jars/javax.ws.rs-api-2.0.1.jar:/opt/spark/jars/JavaEWAH-0.3.2.jar:/opt/spark/jars/datanucleus-rdbms-3.2.9.jar:/opt/spark/jars/objenesis-2.1.jar:/opt/spark/jars/jdo-api-3.0.1.jar:/opt/spark/jars/jetty-util-6.1.26.jar:/opt/spark/jars/opencsv-2.3.jar:/opt/spark/jars/parquet-column-1.8.2.jar:/opt/spark/jars/gson-2.2.4.jar:/opt/spark/jars/json4s-ast_2.11-3.2.11.jar:/opt/spark/jars/jodd-core-3.5.2.jar:/opt/spark/jars/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/spark/jars/avro-mapred-1.7.7-hadoop2.jar:/opt/spark/jars/avro-ipc-1.7.7.jar:/opt/spark/jars/hppc-0.7.2.jar:/opt/spark/jars/derby-10.12.1.1.jar:/opt/spark/jars/hadoop-yarn-server-common-2.7.3.jar:/opt/spark/jars/jetty-6.1.26.jar:/opt/spark/jars/hadoop-auth-2.7.3.jar:/opt/spark/jars/commons-math3-3.4.1.jar:/opt/spark/jars/jets3t-0.9.4.jar:/opt/spark/jars/oro-2.0.8.jar:/opt/spark/jars/commons-cli-1.2.jar:/opt/spark/jars/javolution-5.5.1.jar:/opt/spark/jars/curator-client-2.7.1.jar:/opt/spark/jars/hk2-utils-2.4.0-b34.jar:/opt/spark/jars/xmlenc-0.52.jar:/opt/spark/jars/aircompressor-0.8.jar:/opt/spark/jars/jackson-module-jaxb-annotations-2.6.7.jar:/opt/spark/jars/scala-reflect-2.11.8.jar:/opt/spark/jars/spark-mllib_2.11-2.3.0.jar:/opt/spark/jars/arrow-format-0.8.0.jar
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.io.tmpdir=/tmp
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:java.compiler=
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:os.name=Linux
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:os.arch=amd64
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:os.version=4.16.15-200.fc27.x86_64
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:user.name=default
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:user.home=/
    18/06/25 10:46:08 INFO ZooKeeper: Client environment:user.dir=/opt/jboss
    18/06/25 10:46:08 INFO ZooKeeper: Initiating client connection, connectString=zookeeper:2181 sessionTimeout=10000 watcher=org.I0Itec.zkclient.ZkClient@273842a6
    18/06/25 10:46:08 INFO ZkClient: Waiting for keeper state SyncConnected
    18/06/25 10:46:08 INFO ClientCnxn: Opening socket connection to server zookeeper.fabric8-maven-plugin.svc.cluster.local/172.30.252.173:2181. Will not attempt to authenticate using SASL (unknown error)
    18/06/25 10:46:08 INFO ClientCnxn: Socket connection established to zookeeper.fabric8-maven-plugin.svc.cluster.local/172.30.252.173:2181, initiating session
    18/06/25 10:46:09 INFO ClientCnxn: Session establishment complete on server zookeeper.fabric8-maven-plugin.svc.cluster.local/172.30.252.173:2181, sessionid = 0x1643688effd0009, negotiated timeout = 10000
    18/06/25 10:46:09 INFO ZkClient: zookeeper state changed (SyncConnected)
    kafka.common.TopicExistsException: Topic "stars" already exists.
    at kafka.admin.AdminUtils$.createOrUpdateTopicPartitionAssignmentPathInZK(AdminUtils.scala:420)
    at kafka.admin.AdminUtils$.createTopic(AdminUtils.scala:404)
    at com.voxxed.bigdata.spark.KafkaSupport$.createTopicIfNotPresent(KafkaSupport.scala:41)
    at com.voxxed.bigdata.spark.Stream$.main(Stream.scala:21)
    at com.voxxed.bigdata.spark.Stream.main(Stream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    18/06/25 10:46:09 INFO ZooKeeper: Initiating client connection, connectString=zookeeper:2181 sessionTimeout=10000 watcher=org.I0Itec.zkclient.ZkClient@1cb19dba
    18/06/25 10:46:09 INFO ZkEventThread: Starting ZkClient event thread.
    18/06/25 10:46:09 INFO ZkClient: Waiting for keeper state SyncConnected
    18/06/25 10:46:09 INFO ClientCnxn: Opening socket connection to server zookeeper.fabric8-maven-plugin.svc.cluster.local/172.30.252.173:2181. Will not attempt to authenticate using SASL (unknown error)
    18/06/25 10:46:09 INFO ClientCnxn: Socket connection established to zookeeper.fabric8-maven-plugin.svc.cluster.local/172.30.252.173:2181, initiating session
    18/06/25 10:46:09 INFO ClientCnxn: Session establishment complete on server zookeeper.fabric8-maven-plugin.svc.cluster.local/172.30.252.173:2181, sessionid = 0x1643688effd000a, negotiated timeout = 10000
    18/06/25 10:46:09 INFO ZkClient: zookeeper state changed (SyncConnected)
    kafka.common.TopicExistsException: Topic "recommendations" already exists.
    at kafka.admin.AdminUtils$.createOrUpdateTopicPartitionAssignmentPathInZK(AdminUtils.scala:420)
    at kafka.admin.AdminUtils$.createTopic(AdminUtils.scala:404)
    at com.voxxed.bigdata.spark.KafkaSupport$.createTopicIfNotPresent(KafkaSupport.scala:41)
    at com.voxxed.bigdata.spark.Stream$.main(Stream.scala:22)
    at com.voxxed.bigdata.spark.Stream.main(Stream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Exception in thread "main" java.lang.AbstractMethodError
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.initializeLogIfNecessary(KafkaUtils.scala:40)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.log(KafkaUtils.scala:40)
    at org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.logWarning(KafkaUtils.scala:40)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.fixKafkaParams(KafkaUtils.scala:208)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.(DirectKafkaInputDStream.scala:66)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:150)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:127)
    at com.voxxed.bigdata.spark.Stream$.main(Stream.scala:27)
    at com.voxxed.bigdata.spark.Stream.main(Stream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    18/06/25 10:46:10 INFO SparkContext: Invoking stop() from shutdown hook
    18/06/25 10:46:10 INFO SparkUI: Stopped Spark web UI at http://voxxed-bigdata-spark-1-tftwx:4040
    18/06/25 10:46:10 INFO StandaloneSchedulerBackend: Shutting down all executors
    18/06/25 10:46:10 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
    18/06/25 10:46:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    18/06/25 10:46:10 INFO MemoryStore: MemoryStore cleared
    18/06/25 10:46:10 INFO BlockManager: BlockManager stopped
    18/06/25 10:46:10 INFO BlockManagerMaster: BlockManagerMaster stopped
    18/06/25 10:46:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    18/06/25 10:46:10 INFO SparkContext: Successfully stopped SparkContext
    18/06/25 10:46:10 INFO ShutdownHookManager: Shutdown hook called
    18/06/25 10:46:10 INFO ShutdownHookManager: Deleting directory /tmp/spark-5d868bdf-ea56-474c-8caf-183aaefe191c
    18/06/25 10:46:10 INFO ShutdownHookManager: Deleting directory /tmp/spark-0d4f9ac1-a242-4587-a858-15270556afae
    Deleting cluster 'mycluster'
    cluster is not ephemeral
    cluster not deleted 'mycluster'
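As an aside on the `TopicExistsException` entries in the log above: they suggest the example's `createTopicIfNotPresent` helper does not actually tolerate pre-existing topics, so every redeploy hits them (the fatal failure here, though, is the later `AbstractMethodError`, which looks like a Spark/Kafka client binary mismatch). A minimal sketch of the idempotent-create pattern, in plain Python with a toy stand-in for the admin client rather than the project's Scala code:

```python
class TopicExistsError(Exception):
    """Toy stand-in for kafka.common.TopicExistsException."""

class FakeAdmin:
    """Toy stand-in for Kafka's AdminUtils, for illustration only."""
    def __init__(self):
        self.topics = set()

    def create_topic(self, name):
        # A bare create (as in the log above) fails on the second deploy.
        if name in self.topics:
            raise TopicExistsError(f'Topic "{name}" already exists.')
        self.topics.add(name)

def create_topic_if_not_present(admin, name):
    # Treat "already exists" as success so redeploys are idempotent.
    try:
        admin.create_topic(name)
    except TopicExistsError:
        pass

admin = FakeAdmin()
create_topic_if_not_present(admin, "stars")
create_topic_if_not_present(admin, "stars")  # no error on redeploy
print(sorted(admin.topics))  # → ['stars']
```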

If I change the Spark version in Spark/pom.xml from 2.1.0 (or 2.2.0) to 2.3.0, Maven can't build the project:

    [ymaks@ymaks test]$ cd voxxed-bigdata-spark/
    [ymaks@ymaks voxxed-bigdata-spark]$ mvn clean fabric8:deploy
    [INFO] Scanning for projects...
    [WARNING]
    [WARNING] Some problems were encountered while building the effective model for com.voxxed.bigdata:voxxed-bigdata-spark:jar:1.0-SNAPSHOT
    [WARNING] The expression ${build.finalName} is deprecated. Please use ${project.build.finalName} instead.
    [WARNING]
    [WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
    [WARNING]
    [WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
    [WARNING]
    [INFO]
    [INFO] --------------< com.voxxed.bigdata:voxxed-bigdata-spark >---------------
    [INFO] Building voxxed-bigdata-spark 1.0-SNAPSHOT
    [INFO] --------------------------------[ jar ]---------------------------------
    Downloading from sonatype-nexus-snapshots: https://oss.sonatype.org/content/repositories/snapshots/commons-codec/commons-codec/maven-metadata.xml
    Downloading from central: https://repo.maven.apache.org/maven2/commons-codec/commons-codec/maven-metadata.xml
    Downloading from apache.snapshots: https://repository.apache.org/snapshots/commons-codec/commons-codec/maven-metadata.xml
    Downloaded from central: https://repo.maven.apache.org/maven2/commons-codec/commons-codec/maven-metadata.xml (642 B at 1.0 kB/s)
    Downloaded from apache.snapshots: https://repository.apache.org/snapshots/commons-codec/commons-codec/maven-metadata.xml (339 B at 236 B/s)
    Downloading from apache.snapshots: https://repository.apache.org/snapshots/commons-codec/commons-codec/2.0-SNAPSHOT/maven-metadata.xml
    Downloading from sonatype-nexus-snapshots: https://oss.sonatype.org/content/repositories/snapshots/commons-codec/commons-codec/2.0-SNAPSHOT/maven-metadata.xml
    Downloaded from apache.snapshots: https://repository.apache.org/snapshots/commons-codec/commons-codec/2.0-SNAPSHOT/maven-metadata.xml (2.5 kB at 2.0 kB/s)
    [INFO]
    [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ voxxed-bigdata-spark ---
    [INFO]
    [INFO] >>> fabric8-maven-plugin:3.2.28:deploy (default-cli) > install @ voxxed-bigdata-spark >>>
    [INFO]
    [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ voxxed-bigdata-spark ---
    [WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
    [INFO] skip non existing resourceDirectory /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/resources
    [INFO]
    [INFO] --- fabric8-maven-plugin:3.2.28:resource (default) @ voxxed-bigdata-spark ---
    [INFO] F8: Running in OpenShift mode
    [INFO] F8: Using docker image name of namespace: fabric8-maven-plugin
    [INFO] F8: Running generator java-exec
    [INFO] F8: java-exec: Using Docker image radanalyticsio/radanalytics-java-spark as base / builder
    [INFO] F8: Using resource templates from /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/fabric8
    [INFO] F8: fmp-service: Adding a default Service with ports [8080]
    [INFO]
    [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ voxxed-bigdata-spark ---
    [INFO] Nothing to compile - all classes are up to date
    [INFO]
    [INFO] --- scala-maven-plugin:3.2.1:compile (default) @ voxxed-bigdata-spark ---
    [INFO] artifact commons-codec:commons-codec: checking for updates from central
    [INFO] artifact commons-codec:commons-codec: checking for updates from spy
    [WARNING] Expected all dependencies to require Scala version: 2.11.8
    [WARNING] com.twitter:chill_2.11:0.8.4 requires scala version: 2.11.8
    [WARNING] org.apache.spark:spark-core_2.11:2.3.0 requires scala version: 2.11.8
    [WARNING] org.json4s:json4s-jackson_2.11:3.2.11 requires scala version: 2.11.0
    [WARNING] Multiple versions of scala libraries detected!
    [INFO] /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/scala:-1: info: compiling
    [INFO] Compiling 5 source files to /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/target/classes at 1529924157678
    [ERROR] /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/scala/com/voxxed/bigdata/spark/KafkaSupport.scala:3: error: object admin is not a member of package kafka
    [ERROR] import kafka.admin.AdminUtils
    [ERROR] ^
    [ERROR] /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/scala/com/voxxed/bigdata/spark/KafkaSupport.scala:4: error: object utils is not a member of package kafka
    [ERROR] import kafka.utils.ZkUtils
    [ERROR] ^
    [ERROR] /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/scala/com/voxxed/bigdata/spark/KafkaSupport.scala:39: error: not found: value ZkUtils
    [ERROR] val zkUtils = ZkUtils("zookeeper:2181", 10000, 10000, isZkSecurityEnabled = false)
    [ERROR] ^
    [ERROR] /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/scala/com/voxxed/bigdata/spark/KafkaSupport.scala:39: error: not found: value isZkSecurityEnabled
    [ERROR] val zkUtils = ZkUtils("zookeeper:2181", 10000, 10000, isZkSecurityEnabled = false)
    [ERROR] ^
    [ERROR] /home/ymaks/sources/IdeaProjects/test/voxxed-bigdata-spark/src/main/scala/com/voxxed/bigdata/spark/KafkaSupport.scala:41: error: not found: value AdminUtils
    [ERROR] AdminUtils.createTopic(zkUtils, name, 1, 1)
    [ERROR] ^
    [ERROR] 5 errors found
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 27.473 s
    [INFO] Finished at: 2018-06-25T13:56:02+03:00
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.1:compile (default) on project voxxed-bigdata-spark: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
    [ERROR]

Expected result:
The application should work in the scenario described in the example's documentation.

elmiko commented Jun 25, 2018

hi @YegorMaksymchuk, thanks for the report!

i will take a look at the tutorial to see if i can reproduce your results.

@elmiko elmiko added the bug label Jun 25, 2018

YegorMaksymchuk commented Jun 26, 2018

Hi @elmiko!
To run all Spark pods with version 2.1.0, you need to specify the tag v0.2.7 in resources.yaml, and set the following in the plugin section of Spark/pom.xml:

    <generator>
        <config>
            <java-exec>
                <!-- The radanalytics java-spark image -->
                <from>radanalyticsio/radanalytics-java-spark:v0.2.7</from>
            </java-exec>
        </config>
    </generator>

After that, each Spark pod will run version 2.1.0.
But the application still does not work properly; you can't get recommendations:
Kafka log: https://pastebin.com/j1MhK4RN
Spark log: https://pastebin.com/ge5eEayq
WebUI:
Part1: https://pastebin.com/3kHx9LtN
Part2: https://pastebin.com/jGcu08Vh
Part3: https://pastebin.com/LRA7M7bV

elmiko commented Jun 28, 2018

After reviewing the source code for this tutorial, i think we should hold off on automated testing for it. i would like to propose that we refactor this code into a new tutorial that just focuses on using the maven fabric8 plugin to deploy spark clusters with oshinko.

elmiko commented Jun 28, 2018

@tmckayus @willb i would love to hear your thoughts on this
