
docker-hive's Introduction

Hive Metastore

Build

Continuous delivery for this repo runs through Docker Hub. To build the image locally:

docker build -t "IBM/hive-metastore:master" .
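A hypothetical smoke test of the resulting image (the port mapping assumes the default Hive Metastore Thrift port, 9083; nothing in the README confirms the exposed port):

```shell
# Run the freshly built image and publish the metastore's default Thrift
# port on the host; the tag matches the build command above.
docker run --rm -p 9083:9083 "IBM/hive-metastore:master"
```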

docker-hive's People

Contributors

adriannabeck, apreethi13, codyjlin, justineyster, meneal, shawnzhu, stevemar


docker-hive's Issues

Upgrade hive 2.3.5 -> 2.3.7

Hive 2.3.5 no longer exists here: https://www-us.apache.org/dist/hive/hive-2.3.5/apache-hive-2.3.5-bin.tar.gz
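A hedged fix for the Dockerfile's download step: archive.apache.org retains every release after it rotates off the www-us mirror, with the same path layout, so only the host needs to change.

```shell
# Fetch Hive from the permanent Apache archive instead of the mirror;
# the path layout matches the URL in the issue, only the host differs.
HIVE_VERSION=2.3.7
HIVE_URL="https://archive.apache.org/dist/hive/hive-${HIVE_VERSION}/apache-hive-${HIVE_VERSION}-bin.tar.gz"
curl -L "$HIVE_URL" | tar zxf -
```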

How to use this docker image?

Hello,

When I run this image, it doesn't drop into the Hive environment:

[zd@VM-0-5-centos ~]$ sudo docker run --rm meneal/docker-hive
2021-06-27 11:03:37: Starting Hive Metastore Server
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-3.1.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

When I try interactive mode, an error is reported:

[zd@VM-0-5-centos ~]$ sudo docker run --rm -it meneal/docker-hive /bin/bash
[sudo] password for zd: 
bin/hive: line 351: ps: command not found
bin/hive: line 351: ps: command not found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-3.1.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = cbdae5aa-f9e3-4890-9eb1-3e6d2f82ffff

Logging initialized using configuration in jar:file:/opt/apache-hive-3.1.2-bin/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: The dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
        at org.apache.hadoop.hive.ql.exec.Utilities.ensurePathIsWritable(Utilities.java:4501)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:760)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:701)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:627)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:591)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:232)

Any suggestions?

Best,

Shixiang
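A possible workaround, not verified against this image: give the Hive scratch directory the world-writable-with-sticky-bit permissions the exception asks for before starting the CLI.

```shell
# Open up the scratch dir; mode 1777 = rwxrwxrwt, which satisfies the
# "should be writable" check in the stack trace above. On a deployment
# where fs.defaultFS points at a real HDFS, the equivalent would be:
#   hdfs dfs -chmod 1777 /tmp/hive
mkdir -p /tmp/hive
chmod 1777 /tmp/hive
```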

Image fails to start

entrypoint.sh uses start-metastore, which does not exist in Hive 3.1.2, causing the container to fail to start.

org.postgresql.util.PSQLException: ERROR: column am.amcanorder does not exist

When using the container image from the main branch with Hive 2.3.8, I got this error message:

      > [hive-metastore-59bb8fbdf-m4cxv metastore] Caused by: org.postgresql.util.PSQLException: ERROR: column am.amcanorder does not exist
      > [hive-metastore-59bb8fbdf-m4cxv metastore]   Position: 427
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2284)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2003)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:200)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:424)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:321)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.jdbc.PgStatement.executeQuery(PgStatement.java:284)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.postgresql.jdbc.PgDatabaseMetaData.getIndexInfo(PgDatabaseMetaData.java:2948)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableIndexInfoForTable(RDBMSSchemaHandler.java:813)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getRDBMSTableIndexInfoForTable(RDBMSSchemaHandler.java:783)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.schema.RDBMSSchemaHandler.getSchemaData(RDBMSSchemaHandler.java:333)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.table.TableImpl.getExistingCandidateKeys(TableImpl.java:1060)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.table.TableImpl.validateCandidateKeys(TableImpl.java:686)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.table.TableImpl.validateConstraints(TableImpl.java:393)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.table.ClassTable.validateConstraints(ClassTable.java:3576)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3471)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2896)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:119)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1627)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:672)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2088)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1271)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3760)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2267)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:484)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:120)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2079)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1923)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1778)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:724)
      > [hive-metastore-59bb8fbdf-m4cxv metastore] 	... 28 more

I learned from this answer that it requires an upgrade of the Postgres JDBC driver, so I inspected the driver bundled with Hive 2.3.8: it is postgresql-9.4.1208.jre7.jar, while the latest 9.4.x driver is postgresql-9.4.1212.jre7.jar.

After upgrading this JDBC driver, the problem disappeared.

Expected fix

Upgrade the Postgres JDBC driver after extracting the Hive libraries.
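A minimal sketch of that fix; the Hive lib path matches the layout seen in the logs, and the driver is pulled from Maven Central (both the path and the exact download URL are assumptions, not taken from the repo):

```shell
# Swap the bundled 9.4.1208 PostgreSQL driver for 9.4.1212 after the
# Hive tarball has been extracted. HIVE_LIB is an assumed path.
HIVE_LIB=/opt/apache-hive-2.3.8-bin/lib
PG_VERSION=9.4.1212.jre7
rm -f "${HIVE_LIB}"/postgresql-9.4.1208.jre7.jar
curl -L -o "${HIVE_LIB}/postgresql-${PG_VERSION}.jar" \
  "https://repo1.maven.org/maven2/org/postgresql/postgresql/${PG_VERSION}/postgresql-${PG_VERSION}.jar"
```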

Hive 2.3.8 no longer exists

Expected

docker build . runs without error.

Actual:

[+] Building 4.6s (7/12)
 => [internal] load build definition from Dockerfile                                                                                                                                                                                   0.0s
 => => transferring dockerfile: 1.19kB                                                                                                                                                                                                 0.0s
 => [internal] load .dockerignore                                                                                                                                                                                                      0.0s
 => => transferring context: 2B                                                                                                                                                                                                        0.0s
 => [internal] load metadata for docker.io/azul/zulu-openjdk-debian:11.0.11                                                                                                                                                            0.0s
 => [internal] load build context                                                                                                                                                                                                      0.0s
 => => transferring context: 3.21kB                                                                                                                                                                                                    0.0s
 => [1/8] FROM docker.io/azul/zulu-openjdk-debian:11.0.11                                                                                                                                                                              0.0s
 => [2/8] WORKDIR /opt                                                                                                                                                                                                                 0.0s
 => ERROR [3/8] RUN apt-get update &&     apt-get -qqy install curl &&     curl -L https://www-us.apache.org/dist/hive/hive-2.3.8/apache-hive-2.3.8-bin.tar.gz | tar zxf - &&     curl -L https://www-us.apache.org/dist/hadoop/commo  4.5s
------
 > [3/8] RUN apt-get update &&     apt-get -qqy install curl &&     curl -L https://www-us.apache.org/dist/hive/hive-2.3.8/apache-hive-2.3.8-bin.tar.gz | tar zxf - &&     curl -L https://www-us.apache.org/dist/hadoop/common/hadoop-2.10.1/hadoop-2.10.1.tar.gz | tar zxf - &&     apt-get install --only-upgrade openssl libssl1.1 &&     apt-get install -y libk5crypto3 libkrb5-3 libsqlite3-0:
#6 0.239 Get:1 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB]
#6 0.258 Get:2 http://deb.debian.org/debian buster InRelease [121 kB]
#6 0.274 Get:3 http://deb.debian.org/debian buster-updates InRelease [51.9 kB]
#6 0.368 Get:4 http://security.debian.org/debian-security buster/updates/main amd64 Packages [292 kB]
#6 0.406 Get:5 https://repos.azul.com/zulu/deb stable InRelease [3191 B]
#6 0.458 Get:6 http://deb.debian.org/debian buster/main amd64 Packages [7907 kB]
#6 0.700 Get:7 http://deb.debian.org/debian buster-updates/main amd64 Packages [10.9 kB]
#6 0.785 Get:8 https://repos.azul.com/zulu/deb stable/main amd64 Packages [82.7 kB]
#6 1.566 Fetched 8534 kB in 1s (6316 kB/s)
#6 1.566 Reading package lists...
#6 3.283 perl: warning: Setting locale failed.
#6 3.284 perl: warning: Please check that your locale settings:
#6 3.284        LANGUAGE = "en_US:en",
#6 3.284        LC_ALL = "en_US.UTF-8",
#6 3.284        LANG = "en_US.UTF-8"
#6 3.284     are supported and installed on your system.
#6 3.284 perl: warning: Falling back to the standard locale ("C").
#6 3.342 debconf: delaying package configuration, since apt-utils is not installed
#6 3.373 Selecting previously unselected package krb5-locales.
(Reading database ... 8934 files and directories currently installed.)
#6 3.380 Preparing to unpack .../00-krb5-locales_1.17-3+deb10u1_all.deb ...
#6 3.381 Unpacking krb5-locales (1.17-3+deb10u1) ...
#6 3.404 Selecting previously unselected package libkeyutils1:amd64.
#6 3.405 Preparing to unpack .../01-libkeyutils1_1.6-6_amd64.deb ...
#6 3.408 Unpacking libkeyutils1:amd64 (1.6-6) ...
#6 3.426 Selecting previously unselected package libkrb5support0:amd64.
#6 3.426 Preparing to unpack .../02-libkrb5support0_1.17-3+deb10u1_amd64.deb ...
#6 3.428 Unpacking libkrb5support0:amd64 (1.17-3+deb10u1) ...
#6 3.447 Selecting previously unselected package libk5crypto3:amd64.
#6 3.448 Preparing to unpack .../03-libk5crypto3_1.17-3+deb10u1_amd64.deb ...
#6 3.450 Unpacking libk5crypto3:amd64 (1.17-3+deb10u1) ...
#6 3.474 Selecting previously unselected package libkrb5-3:amd64.
#6 3.475 Preparing to unpack .../04-libkrb5-3_1.17-3+deb10u1_amd64.deb ...
#6 3.477 Unpacking libkrb5-3:amd64 (1.17-3+deb10u1) ...
#6 3.520 Selecting previously unselected package libgssapi-krb5-2:amd64.
#6 3.522 Preparing to unpack .../05-libgssapi-krb5-2_1.17-3+deb10u1_amd64.deb ...
#6 3.523 Unpacking libgssapi-krb5-2:amd64 (1.17-3+deb10u1) ...
#6 3.546 Selecting previously unselected package libsasl2-modules-db:amd64.
#6 3.548 Preparing to unpack .../06-libsasl2-modules-db_2.1.27+dfsg-1+deb10u1_amd64.deb ...
#6 3.549 Unpacking libsasl2-modules-db:amd64 (2.1.27+dfsg-1+deb10u1) ...
#6 3.567 Selecting previously unselected package libsasl2-2:amd64.
#6 3.569 Preparing to unpack .../07-libsasl2-2_2.1.27+dfsg-1+deb10u1_amd64.deb ...
#6 3.570 Unpacking libsasl2-2:amd64 (2.1.27+dfsg-1+deb10u1) ...
#6 3.590 Selecting previously unselected package libldap-common.
#6 3.592 Preparing to unpack .../08-libldap-common_2.4.47+dfsg-3+deb10u6_all.deb ...
#6 3.594 Unpacking libldap-common (2.4.47+dfsg-3+deb10u6) ...
#6 3.614 Selecting previously unselected package libldap-2.4-2:amd64.
#6 3.616 Preparing to unpack .../09-libldap-2.4-2_2.4.47+dfsg-3+deb10u6_amd64.deb ...
#6 3.617 Unpacking libldap-2.4-2:amd64 (2.4.47+dfsg-3+deb10u6) ...
#6 3.646 Selecting previously unselected package libnghttp2-14:amd64.
#6 3.648 Preparing to unpack .../10-libnghttp2-14_1.36.0-2+deb10u1_amd64.deb ...
#6 3.649 Unpacking libnghttp2-14:amd64 (1.36.0-2+deb10u1) ...
#6 3.671 Selecting previously unselected package libpsl5:amd64.
#6 3.673 Preparing to unpack .../11-libpsl5_0.20.2-2_amd64.deb ...
#6 3.674 Unpacking libpsl5:amd64 (0.20.2-2) ...
#6 3.691 Selecting previously unselected package librtmp1:amd64.
#6 3.692 Preparing to unpack .../12-librtmp1_2.4+20151223.gitfa8646d.1-2_amd64.deb ...
#6 3.694 Unpacking librtmp1:amd64 (2.4+20151223.gitfa8646d.1-2) ...
#6 3.712 Selecting previously unselected package libssh2-1:amd64.
#6 3.714 Preparing to unpack .../13-libssh2-1_1.8.0-2.1_amd64.deb ...
#6 3.716 Unpacking libssh2-1:amd64 (1.8.0-2.1) ...
#6 3.739 Selecting previously unselected package libcurl4:amd64.
#6 3.741 Preparing to unpack .../14-libcurl4_7.64.0-4+deb10u2_amd64.deb ...
#6 3.742 Unpacking libcurl4:amd64 (7.64.0-4+deb10u2) ...
#6 3.779 Selecting previously unselected package curl.
#6 3.781 Preparing to unpack .../15-curl_7.64.0-4+deb10u2_amd64.deb ...
#6 3.782 Unpacking curl (7.64.0-4+deb10u2) ...
#6 3.814 Selecting previously unselected package libsasl2-modules:amd64.
#6 3.815 Preparing to unpack .../16-libsasl2-modules_2.1.27+dfsg-1+deb10u1_amd64.deb ...
#6 3.817 Unpacking libsasl2-modules:amd64 (2.1.27+dfsg-1+deb10u1) ...
#6 3.838 Selecting previously unselected package publicsuffix.
#6 3.839 Preparing to unpack .../17-publicsuffix_20190415.1030-1_all.deb ...
#6 3.841 Unpacking publicsuffix (20190415.1030-1) ...
#6 3.865 Setting up libkeyutils1:amd64 (1.6-6) ...
#6 3.869 Setting up libpsl5:amd64 (0.20.2-2) ...
#6 3.874 Setting up libsasl2-modules:amd64 (2.1.27+dfsg-1+deb10u1) ...
#6 3.880 Setting up libnghttp2-14:amd64 (1.36.0-2+deb10u1) ...
#6 3.884 Setting up krb5-locales (1.17-3+deb10u1) ...
#6 3.889 Setting up libldap-common (2.4.47+dfsg-3+deb10u6) ...
#6 3.895 Setting up libkrb5support0:amd64 (1.17-3+deb10u1) ...
#6 3.899 Setting up libsasl2-modules-db:amd64 (2.1.27+dfsg-1+deb10u1) ...
#6 3.902 Setting up librtmp1:amd64 (2.4+20151223.gitfa8646d.1-2) ...
#6 3.907 Setting up libk5crypto3:amd64 (1.17-3+deb10u1) ...
#6 3.911 Setting up libsasl2-2:amd64 (2.1.27+dfsg-1+deb10u1) ...
#6 3.914 Setting up libssh2-1:amd64 (1.8.0-2.1) ...
#6 3.919 Setting up libkrb5-3:amd64 (1.17-3+deb10u1) ...
#6 3.922 Setting up publicsuffix (20190415.1030-1) ...
#6 3.926 Setting up libldap-2.4-2:amd64 (2.4.47+dfsg-3+deb10u6) ...
#6 3.930 Setting up libgssapi-krb5-2:amd64 (1.17-3+deb10u1) ...
#6 3.936 Setting up libcurl4:amd64 (7.64.0-4+deb10u2) ...
#6 3.939 Setting up curl (7.64.0-4+deb10u2) ...
#6 3.944 Processing triggers for libc-bin (2.28-10) ...
#6 3.972   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
#6 3.973                                  Dload  Upload   Total   Spent    Left  Speed
100   257  100   257    0     0   3426      0 --:--:-- --:--:-- --:--:--  3426
100   196  100   196    0     0    442      0 --:--:-- --:--:-- --:--:--   442
#6 4.419
#6 4.419 gzip: stdin: not in gzip format
#6 4.419 tar: Child returned status 1
#6 4.419 tar: Error is not recoverable: exiting now
------
executor failed running [/bin/sh -c apt-get update &&     apt-get -qqy install curl &&     curl -L https://www-us.apache.org/dist/hive/hive-2.3.8/apache-hive-2.3.8-bin.tar.gz | tar zxf - &&     curl -L https://www-us.apache.org/dist/hadoop/common/hadoop-2.10.1/hadoop-2.10.1.tar.gz | tar zxf - &&     apt-get install --only-upgrade openssl libssl1.1 &&     apt-get install -y libk5crypto3 libkrb5-3 libsqlite3-0]: exit code: 2

Diagnostic

curl -L https://www-us.apache.org/dist/hive/hive-2.3.8/apache-hive-2.3.8-bin.tar.gz
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL was not found on this server.</p>
</body></html>
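The same tarball is still served from the permanent Apache archive, so swapping the host in the Dockerfile's RUN step is a likely fix (an assumption consistent with the 404 above, not verified against the repo):

```shell
# archive.apache.org retains Hive 2.3.8 after the www-us mirror drops it;
# only the hostname changes relative to the failing command.
curl -L https://archive.apache.org/dist/hive/hive-2.3.8/apache-hive-2.3.8-bin.tar.gz | tar zxf -
```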

I can't figure out how to persist MariaDB data

Hive crashes when trying to create tables at startup and prints this error:

Error: Table 'CTLGS' already exists (state=42S01,code=1050)

Closing: 0: jdbc:mysql://192.168.5.249:3306/hive?createDatabaseIfNotExist=true

org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!

Underlying cause: java.io.IOException : Schema script failed, errorcode 2

org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!

at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:594)

at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:567)

at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1517)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.hadoop.util.RunJar.run(RunJar.java:323)

at org.apache.hadoop.util.RunJar.main(RunJar.java:236)

Caused by: java.io.IOException: Schema script failed, errorcode 2

at org.apache.hive.beeline.HiveSchemaTool.runBeeLine(HiveSchemaTool.java:1226)

at org.apache.hive.beeline.HiveSchemaTool.runBeeLine(HiveSchemaTool.java:1204)

at org.apache.hive.beeline.HiveSchemaTool.doInit(HiveSchemaTool.java:590)

... 8 more

*** schemaTool failed ***
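A hedged sketch addressing both halves of this report (container name, volume name, and password are hypothetical): a named Docker volume keeps the MariaDB data directory across restarts, and guarding schematool avoids re-running -initSchema against an already-initialized schema, which is what produces the `Table 'CTLGS' already exists` failure above.

```shell
# Keep MariaDB data in a named volume so the Hive schema survives
# container restarts. /var/lib/mysql is MariaDB's data directory.
docker volume create hive-mariadb-data
docker run -d --name hive-mariadb \
  -v hive-mariadb-data:/var/lib/mysql \
  -e MYSQL_ROOT_PASSWORD=secret \
  -e MYSQL_DATABASE=hive \
  mariadb:10.5

# In the metastore startup path: initialize only when no schema exists
# yet (schematool -info exits nonzero against an empty database).
schematool -dbType mysql -info || schematool -dbType mysql -initSchema
```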
