

flink-sql-cookbook-on-zeppelin's People

Contributors

haydenzhourepo, zjffdu


flink-sql-cookbook-on-zeppelin's Issues

Versions are consistent, but following the tutorial from the official WeChat account the setup fails with the error below. Any help would be appreciated.

%flink.ssql

DROP TABLE IF EXISTS orders;

CREATE TABLE orders (
order_uid BIGINT,
product_id BIGINT,
price DOUBLE,
order_time TIMESTAMP(3)
) WITH (
'connector' = 'datagen'
);

org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: java.lang.NoSuchMethodError: org.apache.flink.api.common.ExecutionConfig.disableSysoutLogging()Lorg/apache/flink/api/common/ExecutionConfig;
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:760)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:668)
at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:130)
at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:39)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: java.lang.NoSuchMethodError: org.apache.flink.api.common.ExecutionConfig.disableSysoutLogging()Lorg/apache/flink/api/common/ExecutionConfig;
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:355)
at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:366)
at org.apache.zeppelin.flink.FlinkStreamSqlInterpreter.open(FlinkStreamSqlInterpreter.java:47)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
... 8 more
Caused by: java.lang.NoSuchMethodError: org.apache.flink.api.common.ExecutionConfig.disableSysoutLogging()Lorg/apache/flink/api/common/ExecutionConfig;
at org.apache.zeppelin.flink.FlinkScalaInterpreter.setTableEnvConfig(FlinkScalaInterpreter.scala:444)
at org.apache.zeppelin.flink.FlinkScalaInterpreter.open(FlinkScalaInterpreter.scala:114)
at org.apache.zeppelin.flink.FlinkInterpreter.open(FlinkInterpreter.java:67)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
... 12 more
ERROR
Took 15 sec. Last updated by anonymous at March 09 2021, 9:18:28 AM.
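`ExecutionConfig.disableSysoutLogging()` was removed in newer Flink releases, so a `NoSuchMethodError` here usually indicates that the Zeppelin Flink interpreter and the Flink distribution in `FLINK_HOME` were built against different Flink versions. A minimal sketch of reading the Flink version out of a flink-dist jar name, to compare both sides (the filename below is an example, not taken from this report):

```shell
# Example jar name; substitute the real file from $FLINK_HOME/lib.
jar_name="flink-dist_2.11-1.12.1.jar"

# Extract the trailing x.y.z version from the jar filename.
flink_version=$(echo "$jar_name" | sed -E 's/.*-([0-9]+\.[0-9]+\.[0-9]+)\.jar/\1/')
echo "$flink_version"   # prints 1.12.1 for this sample
```

If the version extracted from `$FLINK_HOME/lib/flink-dist*.jar` differs from the Flink version Zeppelin was built for, aligning the two is the usual fix.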

scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.

After starting the stack with docker-compose, I hit this error when running an example; I tried the CREATE statements from several examples and they all fail the same way.
The full stack trace:

org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:836)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:744)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
	at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
	at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
	at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:355)
	at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:366)
	at org.apache.zeppelin.flink.FlinkStreamSqlInterpreter.open(FlinkStreamSqlInterpreter.java:47)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
	... 8 more
Caused by: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
	at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
	at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
	at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
	at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
	at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
	at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
	at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1390)
	at scala.tools.nsc.Global$Run.<init>(Global.scala:1242)
	at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:439)
	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:862)
	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:820)
	at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:682)
	at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:719)
	at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:720)
	at org.apache.zeppelin.flink.FlinkScalaInterpreter$$anonfun$createFlinkILoop$1.apply(FlinkScalaInterpreter.scala:319)
	at org.apache.zeppelin.flink.FlinkScalaInterpreter$$anonfun$createFlinkILoop$1.apply(FlinkScalaInterpreter.scala:317)
	at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
	at org.apache.zeppelin.flink.FlinkScalaInterpreter.createFlinkILoop(FlinkScalaInterpreter.scala:317)
	at org.apache.zeppelin.flink.FlinkScalaInterpreter.open(FlinkScalaInterpreter.scala:113)
	at org.apache.zeppelin.flink.FlinkInterpreter.open(FlinkInterpreter.java:65)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
	... 12 more

(screenshot)

I checked the Java version inside the Docker container; Scala is not found there — is that expected?

(screenshot)

The docker-compose file:

version: "3.4"

services:
  zeppelin:
    image: apache/zeppelin:0.9.0
    volumes:
      - ./logs:/logs
      - .:/notebook
      - ~/Documents/ps/codes/flink/flink-1.12.1:/flink
    environment:
      ZEPPELIN_LOG_DIR: /logs
      ZEPPELIN_NOTEBOOK_DIR: /notebook
    ports:
    - 8080:8080
    - 8081:8081

Only the Flink path was changed; the Flink distribution used is flink-1.12.1-bin-scala_2.11.tgz.
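The `MissingRequirementError` ("object java.lang.Object in compiler mirror not found") from the Scala compiler is often a symptom of a JDK/Scala mismatch, e.g. an older Scala REPL started on a JDK it does not support. Checking the container's Java major version is a reasonable first step; a sketch of extracting the major version from a `java -version`-style string (the sample value is an assumption):

```shell
# Example: paste the version string reported by `java -version`
# inside the container (e.g. via `docker exec <container> java -version`).
ver="11.0.10"

# Take everything before the first dot to get the major version.
# (Pre-Java-9 JDKs report strings like 1.8.0_x instead.)
major=${ver%%.*}
echo "$major"   # prints 11 for this sample
```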

docker info | grep Registry:

(screenshot)

docker images:

(screenshot)

Error when running 01 Aggregating Time Series Data (TVF)

org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'hive.default.server_logs'.

Table options are:

'connector'='faker'
'fields.client_identity.expression'='-'
'fields.client_ip.expression'='#{Internet.publicIpV4Address}'
'fields.log_time.expression'='#{date.past ''15'',''5'',''SECONDS''}'
'fields.request_line.expression'='#{regexify ''(GET|POST|PUT|PATCH){1}''} #{regexify ''(/search.html|/login.html|/prod.html|cart.html|/order.html){1}''} #{regexify ''(HTTP/1.1|HTTP/2|/HTTP/1.0){1}''}'
'fields.status_code.expression'='#{regexify ''(200|201|204|400|401|403|301){1}''}'
'fields.userid.expression'='-'
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:137)
at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:116)
at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:82)
at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3585)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2507)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2144)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryOrInList(SqlToRelConverter.java:1627)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertExists(SqlToRelConverter.java:1603)
at org.apache.calcite.sql2rel.SqlToRelConverter.substituteSubQuery(SqlToRelConverter.java:1279)
at org.apache.calcite.sql2rel.SqlToRelConverter.replaceSubQueries(SqlToRelConverter.java:1063)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertCollectionTable(SqlToRelConverter.java:2529)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2190)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:169)
at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:161)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:989)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:958)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:283)
at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:101)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:704)
at org.apache.zeppelin.flink.sql.AbstractStreamSqlJob.run(AbstractStreamSqlJob.java:102)
at org.apache.zeppelin.flink.FlinkStreamSqlInterpreter.callInnerSelect(FlinkStreamSqlInterpreter.java:89)
at org.apache.zeppelin.flink.FlinkSqlInterrpeter.callSelect(FlinkSqlInterrpeter.java:503)
at org.apache.zeppelin.flink.FlinkSqlInterrpeter.callCommand(FlinkSqlInterrpeter.java:266)
at org.apache.zeppelin.flink.FlinkSqlInterrpeter.runSqlList(FlinkSqlInterrpeter.java:160)
at org.apache.zeppelin.flink.FlinkSqlInterrpeter.internalInterpret(FlinkSqlInterrpeter.java:112)
at org.apache.zeppelin.interpreter.AbstractInterpreter.interpret(AbstractInterpreter.java:47)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:110)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:852)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:744)
at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:46)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'hive.default.server_logs'.

Table options are:

'connector'='faker'
'fields.client_identity.expression'='-'
'fields.client_ip.expression'='#{Internet.publicIpV4Address}'
'fields.log_time.expression'='#{date.past ''15'',''5'',''SECONDS''}'
'fields.request_line.expression'='#{regexify ''(GET|POST|PUT|PATCH){1}''} #{regexify ''(/search.html|/login.html|/prod.html|cart.html|/order.html){1}''} #{regexify ''(HTTP/1.1|HTTP/2|/HTTP/1.0){1}''}'
'fields.status_code.expression'='#{regexify ''(200|201|204|400|401|403|301){1}''}'
'fields.userid.expression'='-'
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:137)
at org.apache.flink.connectors.hive.HiveDynamicTableFactory.createDynamicTableSource(HiveDynamicTableFactory.java:134)
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:134)
... 45 more
Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option: 'connector'='faker'
at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:467)
at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:441)
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:133)
... 47 more
Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'faker' that implements 'org.apache.flink.table.factories.DynamicTableFactory' in the classpath.

Available factory identifiers are:

blackhole
datagen
filesystem
print
at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:319)
at org.apache.flink.table.factories.FactoryUtil.enrichNoMatchingConnectorError(FactoryUtil.java:463)
... 49 more
ERROR
Took 2 sec. Last updated by anonymous at February 19 2022, 3:45:07 AM.
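The final cause in the trace lists only the built-in factory identifiers (blackhole, datagen, filesystem, print), which means the flink-faker connector jar is simply not on the classpath. One commonly suggested fix is to download the flink-faker jar from its releases page and either drop it into `$FLINK_HOME/lib` or point Zeppelin's `flink.execution.jars` interpreter property at it (the path below is a placeholder, not from this report):

```
flink.execution.jars    /path/to/flink-faker-<version>.jar
```

After changing the property, restart the Flink interpreter so the jar is picked up.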

NPE in FlinkInterpreterLauncher

Hi @zjffdu
I followed this tutorial: https://medium.com/analytics-vidhya/learn-flink-sql-the-easy-way-d9d48a95ae57
and ran the following command to start Zeppelin:
docker run -u $(id -u) -p 8081:8081 -p 8080:8080 --rm -v $PWD/logs:/logs -v /***:/notebook -v /***/flink-1.13.0:/opt/flink -e FLINK_HOME=/opt/flink -e ZEPPELIN_LOG_DIR='/logs' -e ZEPPELIN_NOTEBOOK_DIR='/notebook' --name zeppelin apache/zeppelin:0.10.0

Then I set FLINK_HOME to /opt/flink.

When I try to run a sample from the flink-sql-cookbook, it throws an NPE. Could you help?

java.lang.NullPointerException
	at java.util.Arrays.stream(Arrays.java:5004)
	at org.apache.zeppelin.interpreter.launcher.FlinkInterpreterLauncher.chooseFlinkAppJar(FlinkInterpreterLauncher.java:121)
	at org.apache.zeppelin.interpreter.launcher.FlinkInterpreterLauncher.buildEnvFromProperties(FlinkInterpreterLauncher.java:67)
	at org.apache.zeppelin.interpreter.launcher.StandardInterpreterLauncher.launchDirectly(StandardInterpreterLauncher.java:77)
	at org.apache.zeppelin.interpreter.launcher.InterpreterLauncher.launch(InterpreterLauncher.java:110)
	at org.apache.zeppelin.interpreter.InterpreterSetting.createInterpreterProcess(InterpreterSetting.java:847)
	at org.apache.zeppelin.interpreter.ManagedInterpreterGroup.getOrCreateInterpreterProcess(ManagedInterpreterGroup.java:66)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getOrCreateInterpreterProcess(RemoteInterpreter.java:104)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.internal_create(RemoteInterpreter.java:154)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.open(RemoteInterpreter.java:126)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:271)
	at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:440)
	at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:71)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
	at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
	at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:182)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
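The NPE is raised inside `Arrays.stream(...)` in `chooseFlinkAppJar`, which typically means a `File.listFiles()`/`list()` call returned null, i.e. a directory the launcher scans does not exist inside the container. A hedged sanity check over the mounted layout (the directory list is an assumption; adjust it to your mounts):

```shell
# Report any expected directory that is missing inside the container.
check_dirs() {
  for d in "$@"; do
    [ -d "$d" ] || echo "missing: $d"
  done
}

check_dirs "$FLINK_HOME" "$FLINK_HOME/lib" "$FLINK_HOME/opt"
```

If any of these prints as missing, the `-v .../flink-1.13.0:/opt/flink` mount and the `FLINK_HOME` value passed to `docker run` likely disagree.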

Web page stuck loading; the F12 console shows errors

angular-viewport-watch.js:77 Uncaught ReferenceError: angular is not defined
at angular-viewport-watch.js:77
at Object. (angular-viewport-watch.js:3)
at n (bootstrap:19)
at Object. (app.js:22)
at n (bootstrap:19)
at Object. (index.js:18)
at n (bootstrap:19)
at bootstrap:83
at bootstrap:83
(anonymous) @ angular-viewport-watch.js:77
(anonymous) @ angular-viewport-watch.js:3
n @ bootstrap:19
(anonymous) @ app.js:22
n @ bootstrap:19
(anonymous) @ index.js:18
n @ bootstrap:19
(anonymous) @ bootstrap:83
(anonymous) @ bootstrap:83
[Violation] Forced reflow while executing JavaScript took 32ms

Problems with Zeppelin 0.10 when using the Scala 2.12 build of Flink

env:

jdk 11

scala 2.12.12 (Eclipse OpenJ9 VM, Java 11.0.10)

flink flink-1.13.2-bin-scala_2.12.tgz

zeppelin 0.10

When I use the Flink interpreter I hit an error. I can see the Flink interpreter daemon process running, but running Scala code through the Flink interpreter fails with "Fail to open FlinkInterpreter", which is confusing.

I only import some Flink dependencies:


import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.api.java.typeutils._
import org.apache.flink.api.scala.typeutils._
import org.apache.flink.api.scala._
import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.table.api.{DataTypes, EnvironmentSettings}
import org.apache.flink.table.api.bridge.scala.StreamTableEnvironment
import org.apache.flink.table.descriptors.{Csv, FileSystem, Schema}
import org.apache.flink.table.factories.TableSourceFactory


org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open FlinkInterpreter
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:833)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:741)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
	at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
	at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:42)
	at org.apache.zeppelin.scheduler.FIFOScheduler$$Lambda$337/0x0000000000000000.run(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:836)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open FlinkInterpreter
	at org.apache.zeppelin.flink.FlinkInterpreter.open(FlinkInterpreter.java:80)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
	... 9 more
Caused by: java.lang.IllegalArgumentException: argument type mismatch
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
	at org.apache.zeppelin.flink.FlinkInterpreter.loadFlinkScalaInterpreter(FlinkInterpreter.java:98)
	at org.apache.zeppelin.flink.FlinkInterpreter.open(FlinkInterpreter.java:74)
	... 10 more

(base) [root@centos data]# ps -ef |grep flink
root 3778 3545 1 18:10 pts/1 00:00:07 /data/jdk11/bin/java -Dfile.encoding=UTF-8 -Dlog4j.configuration=file:///data/model/zeppelin-0.10.0-bin-all/conf/log4j.properties -Dlog4j.configurationFile=file:///data/model/zeppelin-0.10.0-bin-all/conf/log4j2.properties -Dzeppelin.log.file=/data/model/zeppelin-0.10.0-bin-all/logs/zeppelin-interpreter-flink-shared_process-root-centos.log -Xmx1024m -cp :/data/model/zeppelin-0.10.0-bin-all/local-repo/flink/*:/data/flink/lib/log4j-slf4j-impl-2.12.1.jar:/data/flink/lib/log4j-core-2.12.1.jar:/data/flink/lib/log4j-api-2.12.1.jar:/data/flink/lib/log4j-1.2-api-2.12.1.jar:/data/flink/lib/flink-table-blink_2.12-1.13.2.jar:/data/flink/lib/flink-table_2.12-1.13.2.jar:/data/flink/lib/flink-shaded-zookeeper-3.4.14.jar:/data/flink/lib/flink-json-1.13.2.jar:/data/flink/lib/flink-dist_2.12-1.13.2.jar:/data/flink/lib/flink-csv-1.13.2.jar:::/data/model/zeppelin-0.10.0-bin-all/interpreter/zeppelin-interpreter-shaded-0.10.0.jar:/data/flink/opt/flink-python_2.12-1.13.2.jar:/data/model/zeppelin-0.10.0-bin-all/interpreter/flink/zeppelin-flink-0.10.0-2.12.jar org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer 125.94.240.172 21116 flink-shared_process :
root 21696 49209 0 18:21 pts/1 00:00:00 grep --color=auto flink
root 34191 1 1 17:17 pts/0 00:00:48 /data/jdk11/bin/java -Xmx1073741824 -Xms1073741824 -XX:MaxMetaspaceSize=268435456 -Dlog.file=/data/flink/log/flink-root-standalonesession-6-centos.log -Dlog4j.configuration=file:/data/flink/conf/log4j.properties -Dlog4j.configurationFile=file:/data/flink/conf/log4j.properties -Dlogback.configurationFile=file:/data/flink/conf/logback.xml -classpath /data/flink/lib/flink-csv-1.13.2.jar:/data/flink/lib/flink-json-1.13.2.jar:/data/flink/lib/flink-shaded-zookeeper-3.4.14.jar:/data/flink/lib/flink-table_2.12-1.13.2.jar:/data/flink/lib/flink-table-blink_2.12-1.13.2.jar:/data/flink/lib/log4j-1.2-api-2.12.1.jar:/data/flink/lib/log4j-api-2.12.1.jar:/data/flink/lib/log4j-core-2.12.1.jar:/data/flink/lib/log4j-slf4j-impl-2.12.1.jar:/data/flink/lib/flink-dist_2.12-1.13.2.jar::: org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint --configDir /data/flink/conf --executionMode cluster -D jobmanager.memory.off-heap.size=134217728b -D jobmanager.memory.jvm-overhead.min=201326592b -D jobmanager.memory.jvm-metaspace.size=268435456b -D jobmanager.memory.heap.size=1073741824b -D jobmanager.memory.jvm-overhead.max=201326592b
root 34563 1 1 17:17 pts/0 00:00:45 /data/jdk11/bin/java -XX:+UseG1GC -Xmx536870902 -Xms536870902 -XX:MaxDirectMemorySize=268435458 -XX:MaxMetaspaceSize=268435456 -Dlog.file=/data/flink/log/flink-root-taskexecutor-6-centos.log -Dlog4j.configuration=file:/data/flink/conf/log4j.properties -Dlog4j.configurationFile=file:/data/flink/conf/log4j.properties -Dlogback.configurationFile=file:/data/flink/conf/logback.xml -classpath /data/flink/lib/flink-csv-1.13.2.jar:/data/flink/lib/flink-json-1.13.2.jar:/data/flink/lib/flink-shaded-zookeeper-3.4.14.jar:/data/flink/lib/flink-table_2.12-1.13.2.jar:/data/flink/lib/flink-table-blink_2.12-1.13.2.jar:/data/flink/lib/log4j-1.2-api-2.12.1.jar:/data/flink/lib/log4j-api-2.12.1.jar:/data/flink/lib/log4j-core-2.12.1.jar:/data/flink/lib/log4j-slf4j-impl-2.12.1.jar:/data/flink/lib/flink-dist_2.12-1.13.2.jar::: org.apache.flink.runtime.taskexecutor.TaskManagerRunner --configDir /data/flink/conf -D taskmanager.memory.network.min=134217730b -D taskmanager.cpu.cores=1.0 -D taskmanager.memory.task.off-heap.size=0b -D taskmanager.memory.jvm-metaspace.size=268435456b -D external-resources=none -D taskmanager.memory.jvm-overhead.min=201326592b -D taskmanager.memory.framework.off-heap.size=134217728b -D taskmanager.memory.network.max=134217730b -D taskmanager.memory.framework.heap.size=134217728b -D taskmanager.memory.managed.size=536870920b -D taskmanager.memory.task.heap.size=402653174b -D taskmanager.numberOfTaskSlots=1 -D taskmanager.memory.jvm-overhead.max=201326592b
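One thing worth ruling out with an "argument type mismatch" from `FlinkInterpreter.loadFlinkScalaInterpreter` is a Scala-version mismatch between the zeppelin-flink interpreter jar and the flink-dist jar. The Scala suffixes can be compared straight from the jar names visible in the ps output above (the filenames below are examples taken from that output):

```shell
# Example jar names as they appear on the interpreter classpath.
zep_jar="zeppelin-flink-0.10.0-2.12.jar"
flink_jar="flink-dist_2.12-1.13.2.jar"

# Pull the Scala version suffix out of each filename.
zep_scala=$(echo "$zep_jar" | sed -E 's/.*-([0-9]+\.[0-9]+)\.jar/\1/')
flink_scala=$(echo "$flink_jar" | sed -E 's/.*_([0-9]+\.[0-9]+)-.*/\1/')

[ "$zep_scala" = "$flink_scala" ] && echo "scala versions match: $zep_scala"
```

Here both suffixes are 2.12, so if they match in your setup the mismatch lies elsewhere (e.g. JDK or Flink version rather than Scala).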
