eshareditor / ambari-hue-service
Ambari stack service for easily installing and managing Hue on HDP cluster
License: Apache License 2.0
Hi,
I'm trying to install this extension on Ubuntu 14.04 with an Ambari server.
The installation fails with "Unable to locate package krb5-devel".
Googling around, I found that the package is no longer maintained. Is there a workaround?
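For what it's worth, `krb5-devel` is a RHEL/CentOS package name that has never existed in Ubuntu's archives; the equivalent headers ship as `libkrb5-dev`. A hypothetical mapping for the `-devel` packages this service tries to install (the Debian names below are the standard ones, but verify them against your release):

```python
# Hypothetical mapping of the RHEL/CentOS -devel package names used by the
# service scripts to their Debian/Ubuntu equivalents (standard Debian names).
RHEL_TO_DEB = {
    "krb5-devel": "libkrb5-dev",
    "libxml2-devel": "libxml2-dev",
    "libxslt-devel": "libxslt1-dev",
    "openldap-devel": "libldap2-dev",
    "python-devel": "python-dev",
    "sqlite-devel": "libsqlite3-dev",
    "cyrus-sasl-devel": "libsasl2-dev",
    "libffi-devel": "libffi-dev",
}

def deb_name(pkg):
    """Return the Debian/Ubuntu name for a RHEL package (unchanged if no rename)."""
    return RHEL_TO_DEB.get(pkg, pkg)
```

So on Ubuntu 14.04 the failing step becomes `apt-get install libkrb5-dev` rather than `krb5-devel`.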
I used Ambari 2.4.2 to install HDP 2.5.3.0, and used ambari-hue-service to install the Hue service successfully. Hue also starts successfully, but after a period of time it stops on its own, with no error log. I hope you can help.
ambari-hue-service/package/scripts/params.py
Line 362 in 3305ccd
In the case of Spark2, the variable is:
livy2_livyserver_hosts
https://github.com/apache/ambari/blob/e522648233667f8317488bbe84ee3faf4e5e60c9/ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/params.py#L235
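A minimal sketch of the suggested fallback, assuming the hosts are looked up from a dict keyed by those variable names (the exact lookup structure in params.py may differ):

```python
# Sketch: prefer the Spark (Livy) hosts list, fall back to the Spark2
# (Livy2) one. The dict-based lookup here is illustrative; params.py reads
# these values from Ambari's clusterHostInfo.
def livy_server_hosts(cluster_host_info):
    return (cluster_host_info.get("livy_livyserver_hosts")
            or cluster_host_info.get("livy2_livyserver_hosts")
            or [])
```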
CentOS 6.9 + HDP 2.6.1 + Ambari 2.5.1.0 (HDFS HA)
error:
/usr/lib/python2.6/site-packages/resource_management/core/environment.py:165: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6
Logger.info("Skipping failure of {0} due to ignore_failures. Failure reason: {1}".format(resource, ex.message))
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 28, in start
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 23, in configure
setup_hue()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/setup_hue.py", line 49, in setup_hue
add_hdfs_configuration(params.has_ranger_admin, params.security_enabled)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/common.py", line 94, in add_hdfs_configuration
add_configurations(services_configurations)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/common.py", line 156, in add_configurations
Execute(cmd)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/files/configs.sh set yxambari.yixia.com yxcluster httpfs-site 'httpfs.proxyuser.hue.groups' ''' returned 1. [ERROR] "httpfs-site" not found in server response.
[ERROR] Output of curl -k -s -u admin:yxadmin123456 "http://yxambari.yixia.com:8080/api/v1/clusters/yxcluster?fields=Clusters/desired_configs"
is:
[ERROR] { "href" : "http://yxambari.yixia.com:8080/api/v1/clusters/yxcluster?fields=Clusters/desired_configs", "Clusters" : { "cluster_name" : "yxcluster", "version" : "HDP-2.6", "desired_configs" : { "admin-log4j" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "admin-properties" : { "tag" : "version1501988174500", "user" : "admin", "version" : 10 }, "ams-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-grafana-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-grafana-ini" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-hbase-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-hbase-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-hbase-policy" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-hbase-security-site" : { "tag" : "version1502016057801", "user" : "admin", "version" : 4 }, "ams-hbase-site" : { "tag" : "version1502016057477", "user" : "admin", "version" : 4 }, "ams-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-logsearch-conf" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-site" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-ssl-client" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ams-ssl-server" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "atlas-tagsync-ssl" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "beeline-log4j2" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "capacity-scheduler" : { "tag" : "version1502068187846", "user" : "admin", "version" : 6 }, "cluster-env" : { "tag" : "version1502016057916", "user" : "admin", "version" : 4 }, "core-site" : { "tag" : "version1502464540735", "user" : "admin", "version" : 19 }, "dbks-site" : { "tag" : "version1502077245299", "user" : "admin", "version" : 2 }, "hadoop-env" : { "tag" : "version1502064587033", "user" : "admin", "version" : 5 }, 
"hadoop-metrics2.properties" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hadoop-policy" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hcat-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hdfs-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hdfs-logsearch-conf" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hdfs-site" : { "tag" : "version1502446142364", "user" : "admin", "version" : 8 }, "hive-atlas-application.properties" : { "tag" : "version1502016057772", "user" : "admin", "version" : 4 }, "hive-env" : { "tag" : "version1502085711698", "user" : "admin", "version" : 4 }, "hive-exec-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hive-exec-log4j2" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hive-interactive-env" : { "tag" : "version1502458946646", "user" : "admin", "version" : 4 }, "hive-interactive-site" : { "tag" : "version1502446168582", "user" : "admin", "version" : 8 }, "hive-log4j" : { "tag" : "version1502081574947", "user" : "admin", "version" : 2 }, "hive-log4j2" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hive-logsearch-conf" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hive-site" : { "tag" : "version1502085711696", "user" : "admin", "version" : 8 }, "hivemetastore-site" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hiveserver2-interactive-site" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "hiveserver2-site" : { "tag" : "version1502014500536", "user" : "admin", "version" : 2 }, "hue-auth-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-desktop-site" : { "tag" : "version1502466470898", "user" : "admin", "version" : 4 }, "hue-env" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-hadoop-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-hbase-site" : { "tag" : "version1502466227924", "user" : 
"admin", "version" : 2 }, "hue-hive-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-log4j-env" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-notebook-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-oozie-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-pig-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-rdbms-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-solr-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-spark-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-ugsync-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "hue-zookeeper-site" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "kerberos-env" : { "tag" : "version1502422741203", "user" : "admin", "version" : 9 }, "kms-env" : { "tag" : "version1502077246900", "user" : "admin", "version" : 1 }, "kms-log4j" : { "tag" : "version1502077246900", "user" : "admin", "version" : 1 }, "kms-properties" : { "tag" : "version1502077246900", "user" : "admin", "version" : 1 }, "kms-site" : { "tag" : "version1502444450844", "user" : "admin", "version" : 4 }, "krb5-conf" : { "tag" : "version1502422741204", "user" : "admin", "version" : 9 }, "livy2-conf" : { "tag" : "version1502016057947", "user" : "admin", "version" : 4 }, "livy2-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "livy2-log4j-properties" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "livy2-spark-blacklist" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "llap-cli-log4j2" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "llap-daemon-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "mapred-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "mapred-logsearch-conf" : { "tag" : 
"version1", "user" : "admin", "version" : 1 }, "mapred-site" : { "tag" : "version1502016057615", "user" : "admin", "version" : 4 }, "oozie-env" : { "tag" : "version1502444455719", "user" : "admin", "version" : 1 }, "oozie-log4j" : { "tag" : "version1502444455719", "user" : "admin", "version" : 1 }, "oozie-logsearch-conf" : { "tag" : "version1502444455719", "user" : "admin", "version" : 1 }, "oozie-site" : { "tag" : "version1502459031834", "user" : "admin", "version" : 3 }, "pig-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "pig-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "pig-properties" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "pseudo-distributed.ini" : { "tag" : "version1502466227924", "user" : "admin", "version" : 2 }, "ranger-admin-site" : { "tag" : "version1502016057630", "user" : "admin", "version" : 12 }, "ranger-env" : { "tag" : "version1502014500530", "user" : "admin", "version" : 10 }, "ranger-hdfs-audit" : { "tag" : "version1502016057833", "user" : "admin", "version" : 7 }, "ranger-hdfs-plugin-properties" : { "tag" : "version1502014500533", "user" : "admin", "version" : 3 }, "ranger-hdfs-policymgr-ssl" : { "tag" : "version1501849108055", "user" : "admin", "version" : 2 }, "ranger-hdfs-security" : { "tag" : "version1501849108054", "user" : "admin", "version" : 2 }, "ranger-hive-audit" : { "tag" : "version1502016057963", "user" : "admin", "version" : 7 }, "ranger-hive-plugin-properties" : { "tag" : "version1501849108060", "user" : "admin", "version" : 2 }, "ranger-hive-policymgr-ssl" : { "tag" : "version1501849108063", "user" : "admin", "version" : 2 }, "ranger-hive-security" : { "tag" : "version1501849108062", "user" : "admin", "version" : 2 }, "ranger-kms-audit" : { "tag" : "version1502077245283", "user" : "admin", "version" : 2 }, "ranger-kms-logsearch-conf" : { "tag" : "version1502077246900", "user" : "admin", "version" : 1 }, "ranger-kms-policymgr-ssl" : { "tag" : "version1502077246900", "user" 
: "admin", "version" : 1 }, "ranger-kms-security" : { "tag" : "version1502077246900", "user" : "admin", "version" : 1 }, "ranger-kms-site" : { "tag" : "version1502077246900", "user" : "admin", "version" : 1 }, "ranger-logsearch-conf" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "ranger-site" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "ranger-solr-configuration" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "ranger-tagsync-policymgr-ssl" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "ranger-tagsync-site" : { "tag" : "version1502016057601", "user" : "admin", "version" : 9 }, "ranger-ugsync-site" : { "tag" : "version1502016057507", "user" : "admin", "version" : 9 }, "ranger-yarn-audit" : { "tag" : "version1502016057663", "user" : "admin", "version" : 7 }, "ranger-yarn-plugin-properties" : { "tag" : "version1502014500541", "user" : "admin", "version" : 3 }, "ranger-yarn-policymgr-ssl" : { "tag" : "version1501849108059", "user" : "admin", "version" : 2 }, "ranger-yarn-security" : { "tag" : "version1501849108058", "user" : "admin", "version" : 2 }, "slider-client" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "slider-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "slider-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "spark-defaults" : { "tag" : "version1502016057979", "user" : "admin", "version" : 3 }, "spark-thrift-sparkconf" : { "tag" : "version1502016057787", "user" : "admin", "version" : 3 }, "spark2-defaults" : { "tag" : "version1502016057721", "user" : "admin", "version" : 4 }, "spark2-env" : { "tag" : "version1502159640746", "user" : "admin", "version" : 2 }, "spark2-hive-site-override" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "spark2-log4j-properties" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "spark2-logsearch-conf" : { "tag" : "version1", "user" : "admin", "version" : 1 }, 
"spark2-metrics-properties" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "spark2-thrift-fairscheduler" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "spark2-thrift-sparkconf" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "sqoop-atlas-application.properties" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "sqoop-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "sqoop-site" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ssl-client" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "ssl-server" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "tagsync-application-properties" : { "tag" : "version1502016057492", "user" : "admin", "version" : 9 }, "tagsync-log4j" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "tez-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "tez-interactive-site" : { "tag" : "version1502178115256", "user" : "admin", "version" : 2 }, "tez-site" : { "tag" : "version1502016057677", "user" : "admin", "version" : 4 }, "usersync-log4j" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "usersync-properties" : { "tag" : "version1501986852615", "user" : "admin", "version" : 6 }, "webhcat-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "webhcat-log4j" : { "tag" : "version1502081574948", "user" : "admin", "version" : 2 }, "webhcat-site" : { "tag" : "version1502458847046", "user" : "admin", "version" : 6 }, "yarn-env" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "yarn-log4j" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "yarn-logsearch-conf" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "yarn-site" : { "tag" : "version1502085637584", "user" : "admin", "version" : 7 }, "zoo.cfg" : { "tag" : "version1", "user" : "admin", "version" : 1 }, "zookeeper-env" : { "tag" : "version1502016057849", "user" : "admin", "version" : 4 }, "zookeeper-log4j" : { "tag" : 
"version1", "user" : "admin", "version" : 1 }, "zookeeper-logsearch-conf" : { "tag" : "version1", "user" : "admin", "version" : 1 } } } }
stdout: /var/lib/ambari-agent/data/output-5823.txt
How can I solve this? Thanks.
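The `[ERROR] "httpfs-site" not found in server response` message means the cluster's desired_configs (dumped above) contain no `httpfs-site` config type, so `configs.sh` has nothing to update; HttpFS is evidently not configured on this cluster. A small sketch of the pre-check the script could do, assuming the JSON shape shown in the curl output above (`config_type_exists` is a hypothetical helper, not part of the project's scripts):

```python
import json

def config_type_exists(api_response_text, config_type):
    """Return True if the cluster's desired_configs contain the given type,
    given the text of /api/v1/clusters/<name>?fields=Clusters/desired_configs."""
    desired = json.loads(api_response_text)["Clusters"]["desired_configs"]
    return config_type in desired
```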
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/hue_server.py", line 76, in <module>
    HueServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/hue_server.py", line 28, in start
    self.configure(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/hue_server.py", line 23, in configure
    setup_hue()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/setup_hue.py", line 27, in setup_hue
    Link("{0}/desktop/libs/hadoop/java-lib/*".format(params.hue_dir), to = "/usr/hdp/current/hadoop-client/lib")
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_create
    sudo.symlink(self.resource.to, path)
  File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 93, in symlink
    os.symlink(source, link_name)
OSError: [Errno 2] No such file or directory
We put a link on http://gethue.com/ to promote your plugin; are you fine with that?
Hi, I need help. An error occurred when I installed the Hue server through Ambari (Hue version 4.2.0). The log is as follows:
2019-10-14 16:01:35,001 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2019-10-14 16:01:35,007 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-10-14 16:01:35,008 - Group['livy'] {}
2019-10-14 16:01:35,009 - Group['spark'] {}
2019-10-14 16:01:35,010 - Group['ranger'] {}
2019-10-14 16:01:35,010 - Group['hdfs'] {}
2019-10-14 16:01:35,010 - Group['hue'] {}
2019-10-14 16:01:35,010 - Group['zeppelin'] {}
2019-10-14 16:01:35,010 - Group['hadoop'] {}
2019-10-14 16:01:35,010 - Group['users'] {}
2019-10-14 16:01:35,011 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,012 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,014 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,015 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,016 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2019-10-14 16:01:35,017 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,018 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2019-10-14 16:01:35,020 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'ranger'], 'uid': None}
2019-10-14 16:01:35,021 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2019-10-14 16:01:35,022 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2019-10-14 16:01:35,023 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,024 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,026 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,027 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2019-10-14 16:01:35,028 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,029 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,030 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2019-10-14 16:01:35,031 - User['hue'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,033 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,034 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,035 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,036 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,037 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2019-10-14 16:01:35,038 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-10-14 16:01:35,040 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-10-14 16:01:35,052 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-10-14 16:01:35,053 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-10-14 16:01:35,054 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-10-14 16:01:35,056 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-10-14 16:01:35,057 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-10-14 16:01:35,071 - call returned (0, '1016')
2019-10-14 16:01:35,071 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1016'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-10-14 16:01:35,085 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1016'] due to not_if
2019-10-14 16:01:35,086 - Group['hdfs'] {}
2019-10-14 16:01:35,086 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2019-10-14 16:01:35,087 - FS Type:
2019-10-14 16:01:35,088 - Directory['/etc/hadoop'] {'mode': 0755}
2019-10-14 16:01:35,104 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-10-14 16:01:35,105 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-10-14 16:01:35,133 - Repository['HDP-2.6-repo-52'] {'append_to_file': False, 'base_url': 'http://ha01/HDP/centos7/2.6.5.0-292/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-52', 'mirror_list': ''}
2019-10-14 16:01:35,142 - File['/etc/yum.repos.d/ambari-hdp-52.repo'] {'content': '[HDP-2.6-repo-52]\nname=HDP-2.6-repo-52\nbaseurl=http://ha01/HDP/centos7/2.6.5.0-292/\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-10-14 16:01:35,142 - Writing File['/etc/yum.repos.d/ambari-hdp-52.repo'] because contents don't match
2019-10-14 16:01:35,143 - Repository['HDP-2.6-GPL-repo-52'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-52', 'mirror_list': ''}
2019-10-14 16:01:35,147 - File['/etc/yum.repos.d/ambari-hdp-52.repo'] {'content': '[HDP-2.6-repo-52]\nname=HDP-2.6-repo-52\nbaseurl=http://ha01/HDP/centos7/2.6.5.0-292/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-52]\nname=HDP-2.6-GPL-repo-52\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-10-14 16:01:35,148 - Writing File['/etc/yum.repos.d/ambari-hdp-52.repo'] because contents don't match
2019-10-14 16:01:35,148 - Repository['HDP-UTILS-1.1.0.22-repo-52'] {'append_to_file': True, 'base_url': 'http://ha01/HDP-UTILS/', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-52', 'mirror_list': ''}
2019-10-14 16:01:35,154 - File['/etc/yum.repos.d/ambari-hdp-52.repo'] {'content': '[HDP-2.6-repo-52]\nname=HDP-2.6-repo-52\nbaseurl=http://ha01/HDP/centos7/2.6.5.0-292/\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-52]\nname=HDP-2.6-GPL-repo-52\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.5.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-52]\nname=HDP-UTILS-1.1.0.22-repo-52\nbaseurl=http://ha01/HDP-UTILS/\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-10-14 16:01:35,154 - Writing File['/etc/yum.repos.d/ambari-hdp-52.repo'] because contents don't match
2019-10-14 16:01:35,155 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:35,390 - Skipping installation of existing package unzip
2019-10-14 16:01:35,391 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:35,416 - Skipping installation of existing package curl
2019-10-14 16:01:35,418 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:35,444 - Skipping installation of existing package hdp-select
2019-10-14 16:01:35,456 - The repository with version 2.6.5.0-292 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-10-14 16:01:35,474 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
2019-10-14 16:01:36,333 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-10-14 16:01:36,339 - Package['wget'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,499 - Skipping installation of existing package wget
2019-10-14 16:01:36,500 - Package['tar'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,510 - Skipping installation of existing package tar
2019-10-14 16:01:36,511 - Package['asciidoc'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,530 - Skipping installation of existing package asciidoc
2019-10-14 16:01:36,532 - Package['krb5-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,545 - Skipping installation of existing package krb5-devel
2019-10-14 16:01:36,547 - Package['libxml2-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,576 - Skipping installation of existing package libxml2-devel
2019-10-14 16:01:36,577 - Package['libxslt-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,597 - Skipping installation of existing package libxslt-devel
2019-10-14 16:01:36,598 - Package['openldap-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,617 - Skipping installation of existing package openldap-devel
2019-10-14 16:01:36,618 - Package['python-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,640 - Skipping installation of existing package python-devel
2019-10-14 16:01:36,641 - Package['python-simplejson'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:36,660 - Installing package python-simplejson ('/usr/bin/yum -d 0 -e 0 -y install python-simplejson')
2019-10-14 16:01:45,355 - Package['python-setuptools'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:45,368 - Skipping installation of existing package python-setuptools
2019-10-14 16:01:45,369 - Package['python-psycopg2'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:45,380 - Skipping installation of existing package python-psycopg2
2019-10-14 16:01:45,380 - Package['sqlite-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:45,391 - Skipping installation of existing package sqlite-devel
2019-10-14 16:01:45,392 - Package['rsync'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:45,402 - Skipping installation of existing package rsync
2019-10-14 16:01:45,403 - Package['saslwrapper-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:45,415 - Skipping installation of existing package saslwrapper-devel
2019-10-14 16:01:45,415 - Package['pycrypto'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:45,430 - Installing package pycrypto ('/usr/bin/yum -d 0 -e 0 -y install pycrypto')
2019-10-14 16:01:49,167 - Package['gmp-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:49,192 - Skipping installation of existing package gmp-devel
2019-10-14 16:01:49,193 - Package['libyaml-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:49,207 - Skipping installation of existing package libyaml-devel
2019-10-14 16:01:49,208 - Package['cyrus-sasl-plain'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:49,220 - Skipping installation of existing package cyrus-sasl-plain
2019-10-14 16:01:49,221 - Package['cyrus-sasl-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:49,232 - Skipping installation of existing package cyrus-sasl-devel
2019-10-14 16:01:49,233 - Package['cyrus-sasl-gssapi'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:49,247 - Skipping installation of existing package cyrus-sasl-gssapi
2019-10-14 16:01:49,249 - Package['libffi-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-10-14 16:01:49,295 - Skipping installation of existing package libffi-devel
2019-10-14 16:01:49,300 - Downloading Hue Service
2019-10-14 16:01:49,301 - Execute['cat /etc/yum.repos.d/HDP.repo | grep "baseurl" | awk -F '=' '{print $2"hue/hue-4.2.0.tgz"}' | xargs wget -O hue.tgz'] {}
2019-10-14 16:02:11,256 - The repository with version 2.6.5.0-292 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-10-14 16:02:11,266 - Skipping stack-select on HUE because it does not exist in the stack-select package structure.
Command failed after 1 tries
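The last step before the failure is the wget at 16:01:49, which builds the tarball URL from the `baseurl` line in /etc/yum.repos.d/HDP.repo. A sketch of deriving that URL the same way, so you can check by hand whether `hue/hue-4.2.0.tgz` actually exists on your repo host (`hue_tarball_url` is a hypothetical helper):

```python
# Sketch: rebuild the download URL the Execute step constructs, i.e.
# the repo file's baseurl with "hue/hue-4.2.0.tgz" appended.
def hue_tarball_url(repo_file_text, version="4.2.0"):
    for line in repo_file_text.splitlines():
        if line.startswith("baseurl="):
            return line.split("=", 1)[1].strip() + "hue/hue-%s.tgz" % version
    return None  # no baseurl line found
```

If that URL returns 404 on your repo host (`ha01` in this log), the tarball was never mirrored there, and the wget will fail exactly as shown.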
Modify this configuration in pseudo-distributed.ini:
When starting the Hue service:
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/3.1/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/3.1/services/HUE/package/scripts/hue_server.py", line 14, in install
import params
File "/var/lib/ambari-agent/cache/stacks/HDP/3.1/services/HUE/package/scripts/params.py", line 250, in
resourcemanager_host1 = resourcemanager_hosts[0]
IndexError: list index out of range
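The IndexError above means params.py indexed resourcemanager_hosts[0] while the host list coming from the cluster configuration was empty (no ResourceManager registered). A minimal sketch of a defensive lookup; the function name, dictionary shape, and fallback value are assumptions for illustration, not the stack's actual API:

```python
def first_host(cluster_host_info, key, default=None):
    """Return the first host registered under `key`, or `default`
    when the component is absent from the cluster configuration."""
    hosts = cluster_host_info.get(key, [])
    return hosts[0] if hosts else default

# An empty ResourceManager list no longer raises IndexError:
info = {"resourcemanager_hosts": []}
print(first_host(info, "resourcemanager_hosts", default="localhost"))  # prints: localhost
```

Guarding the lookup like this lets the install step proceed (or fail with a clear message) on clusters where YARN is not deployed.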
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 28, in start
self.configure(env)
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 120, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 23, in configure
setup_hue()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/setup_hue.py", line 49, in setup_hue
add_hdfs_configuration(params.has_ranger_admin, params.security_enabled)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/common.py", line 95, in add_hdfs_configuration
add_configurations(services_configurations)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/common.py", line 157, in add_configurations
Execute(cmd)
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
First, I built hue-3.11.0.tar and copied the tarball to HDP baseurl/centos7/hue.
The server installs OK, but it cannot run.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 28, in start
self.configure(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 23, in configure
setup_hue()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/setup_hue.py", line 27, in setup_hue
Link("{0}/desktop/libs/hadoop/java-lib/*".format(params.hue_dir),to = "/usr/hdp/current/hadoop-client/lib")
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_create
sudo.symlink(self.resource.to, path)
File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 93, in symlink
os.symlink(source, link_name)
OSError: [Errno 2] No such file or directory
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 28, in start
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/hue_server.py", line 23, in configure
setup_hue()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.6/services/HUE/package/scripts/setup_hue.py", line 45, in setup_hue
owner = params.hue_user
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 138, in action_create
sudo.create_file(path, content, encoding=self.resource.encoding)
File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 141, in create_file
fp.write(content)
UnicodeEncodeError: 'ascii' codec can't encode character u'\u201c' in position 3462: ordinal not in range(128)
Once I modified the code, the HUE service started properly using the local package installation. The error message points to a character-set problem, but I have limited ability to find out exactly where it comes from. I hope you can give some answers. Thank you.
My friend told me this was due to Chinese quotation marks:
“ ” and ‘ ’ are the full-width quotation marks used in Chinese;
" " and ' ' are the ASCII quotation marks used in English.
What should I do?
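As a workaround, the full-width CJK quotation marks can be normalized to their ASCII equivalents before the content is written out with an ASCII-only codec. A minimal sketch, assuming the UnicodeEncodeError really is limited to the quote characters discussed here:

```python
# Map full-width (Chinese) quotation marks to ASCII equivalents so the
# text survives an ASCII-only encode like the one failing in create_file().
CJK_QUOTES = {
    u"\u201c": u'"',   # left double quotation mark
    u"\u201d": u'"',   # right double quotation mark
    u"\u2018": u"'",   # left single quotation mark
    u"\u2019": u"'",   # right single quotation mark
}

def asciify_quotes(text):
    for cjk, replacement in CJK_QUOTES.items():
        text = text.replace(cjk, replacement)
    return text

print(asciify_quotes(u"say \u201chello\u201d"))  # prints: say "hello"
```

Other non-ASCII characters in the configuration would still fail, so checking the relevant property values in the Ambari UI for stray full-width punctuation is the more complete fix.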
I'm using an Ambari blueprint to automatically install an HDP cluster with HUE, but I need to explicitly execute the Metastoresync command after the installation.
It would be better to do that during the installation process, wouldn't it?
Has Sqoop not been integrated yet?
Hello, I encountered the following problem while testing Hue: when I modify a configuration under a tab, such as the Hue Service Module, and switch the Hue HDFS Module from YES to NO, after saving I am not prompted to restart Hue, and I can still access the HDFS Browser in the Hue web UI.
Changes to Hue User Info and Hue Databases have the same problem.
But when I modify the Advanced configuration properties, I am prompted to restart, and the change takes effect after the restart.
I checked the other components of the cluster for comparison and found no similar problem; modifying their configuration always produces a restart prompt.
I don't know whether you have seen this problem. Please help test it and provide a solution. Thank you very much!
When I execute the installation command, the following error occurs:
'cat /etc/yum.repos.d/HDP.repo | grep "baseurl" | awk -F '=' '{print $2"hue/hue-3.11.0.tgz"}' | xargs wget -O hue.tgz' returned 123. --2017-11-23 13:47:21-- http://192.168.80.5/mirrors/HDP-2.6.2.0-centos7/hue/hue-3.11.0.tgz
Connecting to 192.168.80.5:80... connected.
HTTP request sent, awaiting response... 404 Not Found
The resource probably doesn't exist in that repository; the download URL should be updated.
Package download site: https://pkgs.org/
yum install -y wget
yum install -y tar
yum install -y asciidoc
yum install -y krb5-devel
yum install -y libxml2-devel
yum install -y libxslt-devel
yum install -y openldap-devel
yum install -y python-devel
yum install -y python-simplejson
yum install -y python-setuptools
yum install -y python-psycopg2
yum install -y sqlite-devel
yum install -y rsync
yum install -y saslwrapper-devel
yum install -y pycrypto
yum install -y gmp-devel
yum install -y libyaml-devel
yum install -y cyrus-sasl-plain
yum install -y cyrus-sasl-devel
yum install -y cyrus-sasl-gssapi
yum install -y libffi-devel
I installed Hue using ambari-hue-service, ran the "Metastoresync" and "Usersync" commands to synchronize the data in SQLite and the system users after installing the Hue service, and then turned the "Hue Hive Module" on. There is no table called beeswax_metainstall in the database, and Hue cannot work with Hive. I also cannot synchronize data using the Metastoresync command; it fails with the error "Migration desktop:0007_auto__add_documentpermission__add_documenttag__add_document should not have been applied before beeswax:0008_auto__add_field_queryhistory_query_type but was.". But if I turn the "Hue Hive Module" on first and then execute the "Metastoresync" command, the Hue service works well.
Installation problem
centos7 + hdp 2.5 +ambari Version 2.4.2.0
hue server install failed!
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 16, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 567, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install python-simplejs
yum -y install sasl
libgsasl-1.4.0-4.el6.x86_64
libgsasl-devel-1.4.0-4.el6.x86_64
cyrus-sasl-sql-2.1.23-15.el6_6.2.x86_64
saslwrapper-devel-0.14-1.el6.x86_64
cyrus-sasl-gssapi-2.1.23-15.el6_6.2.x86_64
cyrus-sasl-ldap-2.1.23-15.el6_6.2.x86_64
cyrus-sasl-devel-2.1.23-15.el6_6.2.x86_64
cyrus-sasl-2.1.23-15.el6_6.2.x86_64
cyrus-sasl-lib-2.1.23-15.el6_6.2.x86_64
erlang-sasl-R14B-04.3.el6.x86_64
ruby-saslwrapper-0.14-1.el6.x86_64
python-saslwrapper-0.14-1.el6.x86_64
cyrus-sasl-plain-2.1.23-15.el6_6.2.x86_64
cyrus-sasl-ntlm-2.1.23-15.el6_6.2.x86_64
lua-cyrussasl-1.1.0-1.el6.x86_64
cyrus-sasl-md5-2.1.23-15.el6_6.2.x86_64
saslwrapper-0.14-1.el6.x86_64
yum install cyrus-sasl-lib.x86_64
yum install cyrus-sasl-devel.x86_64
yum install libgsasl-devel.x86_64
yum install saslwrapper-devel.x86_64
centos7 + hdp 2.5 +ambari Version 2.4.2.0
warning:
Consistency Check Failed
The configuration changes could not be validated for consistency due to an unknown error. Your changes have not been saved yet. Would you like to proceed and save the changes?
How can I solve it? Thanks.
Hi,
I am getting the following error during installation
resource_management.core.exceptions.ExecutionFailed: Execution of 'cat /etc/yum.repos.d/HDP.repo | grep "baseurl" | awk -F '=' '{print $2"hue/hue-3.11.0.tgz"}' | xargs wget -O hue.tgz' returned 123. --2018-05-02 19:33:13-- http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.1.0hue/hue-3.11.0.tgz
Resolving public-repo-1.hortonworks.com (public-repo-1.hortonworks.com)... 52.84.128.252, 52.84.128.2, 52.84.128.7, ...
Connecting to public-repo-1.hortonworks.com (public-repo-1.hortonworks.com)|52.84.128.252|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2018-05-02 19:33:13 ERROR 404: Not Found.
Any ideas how to fix it?
Thank you in advance,
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/hue_server.py", line 18, in install
download_hue()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/HUE/package/scripts/common.py", line 57, in download_hue
Execute('{0} | xargs wget -O hue.tgz'.format(params.download_url))
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
tries=tries, try_sleep=try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'cat /etc/yum.repos.d/HDP.repo | grep "baseurl" | awk -F '=' '{print $2"hue/hue-3.11.0.tgz"}' | xargs wget -O hue.tgz' returned 123. --2017-02-01 12:46:09-- http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.4.3.0hue/hue-3.11.0.tgz
Resolving public-repo-1.hortonworks.com... 54.230.191.111, 54.230.191.36, 54.230.191.119, ...
Connecting to public-repo-1.hortonworks.com|54.230.191.111|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2017-02-01 12:46:10 ERROR 404: Not Found.
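The failing URLs in these reports (`...2.4.3.0hue/hue-3.11.0.tgz` here, `...2.6.1.0hue/hue-3.11.0.tgz` in the earlier one) share a symptom: the awk one-liner appends `hue/hue-3.11.0.tgz` directly to the repo's `baseurl` value, so a baseurl without a trailing slash produces a path with the separator missing. A sketch of a safer join; the helper name is an assumption, and note that even a correctly joined URL still 404s unless the tarball has actually been uploaded to `baseurl/hue/`:

```python
def build_hue_url(baseurl, filename="hue/hue-3.11.0.tgz"):
    """Join a repo baseurl and the Hue tarball path with exactly one '/'."""
    return baseurl.rstrip("/") + "/" + filename

base = "http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.4.3.0"
# Plain concatenation (what the awk one-liner effectively does) loses the slash:
print(base + "hue/hue-3.11.0.tgz")   # ...updates/2.4.3.0hue/hue-3.11.0.tgz
print(build_hue_url(base))           # ...updates/2.4.3.0/hue/hue-3.11.0.tgz
```

Hosting the tarball in a local mirror at `baseurl/hue/hue-3.11.0.tgz` (as several commenters do) sidesteps the missing file on the public Hortonworks repo.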
Hi,
I looked into your code and there is a dependency of HUE on the Hadoop HttpFS service. That service is not installed by Ambari in the HDP stack, but I haven't seen it installed as part of HUE either. Does that mean you have to install it manually before HUE?
EsharEditor:
Hello! I have two questions I need your help with. Question 1: this version of Ambari does not provide an HBase Thrift server, so Hue cannot use the HBase service. Did you start the HBase Thrift server manually? Question 2: after the Hue service is installed through Ambari, the Zookeeper module cannot operate on znodes. How did you solve this?
Thanks!
Hi,
I am trying your service, and the conditions inside the theme do not work when installing on Ambari 2.1. For example:
{
"config": "hue-desktop-site/db_user",
"subsection-name": "subsection-hue-ms-row1-col1",
"depends-on": [
{
"configs":[
"hue-desktop-site/DB_FLAVOR"
],
"if": "${hue-desktop-site/DB_FLAVOR} === sqlite3",
"then": {
"property_value_attributes": {
"visible": false
}
},
"else": {
"property_value_attributes": {
"visible": true
}
}
}
]
},
This does not work, and it does not hide the property.
When I installed from the interface, the prompt said the installation succeeded but the start failed. The log is as follows:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 76, in <module>
HueServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 28, in start
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/hue_server.py", line 23, in configure
setup_hue()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/HUE/package/scripts/setup_hue.py", line 27, in setup_hue
Link("{0}/desktop/libs/hadoop/java-lib/*".format(params.hue_dir),to = "/usr/hdp/current/hadoop-client/lib")
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_create
sudo.symlink(self.resource.to, path)
File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 123, in symlink
os.symlink(source, link_name)
OSError: [Errno 2] No such file or directory
Hi,
I'm trying to install on HDP 2.6 on Ubuntu 16, but when I add the service it raises an error:
2018-02-05 15:48:38,382 - Installing package krb5-devel ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install krb5-devel')
Reading package lists...
Building dependency tree...
Reading state information...
E: Unable to locate package krb5-devel
2018-02-05 15:48:40,482 - Execution of '/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install krb5-devel' returned 100. Reading package lists...
Building dependency tree...
The Kerberos client is already installed on the cluster.
Hi,
I am getting the following error during installation
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 38, in <module>
BeforeAnyHook().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
method(env)
File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py", line 31, in hook
setup_users()
File "/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/shared_initialization.py", line 50, in setup_users
groups = params.user_to_groups_dict[user],
KeyError: u'hue'
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-953.json', '/var/lib/ambari-agent/cache/stack-hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-953.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1_2', '']
Any ideas how to fix it?
Thank you in advance
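The KeyError shows that the before-ANY hook's user_to_groups_dict has no entry for the `hue` user, so the group lookup fails before the user can be created. A minimal sketch of a tolerant lookup; the function name, mapping contents, and default group are assumptions for illustration, not the hook's actual code:

```python
def groups_for_user(user_to_groups_dict, user, default_groups=("hadoop",)):
    """Return the configured group list for `user`, falling back to a
    default instead of raising KeyError for users unknown to the hook."""
    return user_to_groups_dict.get(user, list(default_groups))

mapping = {u"hdfs": [u"hadoop", u"hdfs"]}
print(groups_for_user(mapping, u"hue"))   # prints: ['hadoop']
```

In practice the same effect can usually be achieved without code changes by making sure Ambari's cluster-env user/group settings include the hue user before retrying the install.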
This service gives the ability to install Hue on multiple hosts, I assume, like the other Ambari services, but does it provide the ability to load balance them via Nginx, for example?
I know Cloudera Manager provides an option to set up a load balancer, but could this be done in Ambari?
In a Jinja template for Nginx, you could do something like:
upstream hue {
ip_hash;
# List all the Hue instances here for high availability.
{% for server in serverlist %}
server {{ server }}:8888 max_fails=3;
{% endfor %}
}
After installing, I can change the configuration, but it is not saved to the hue.ini file and does not take effect.
Am I supposed to take the content of the pseudo-distributed config and save it to hue.ini?
Hi,
Thanks for this plugin! We want to have Hue as a service in our cluster and this plugin looks nice. Can you please provide more details about the license for this plugin?
Thanks!
It seems that this does not contain any Hue package, which may mean the related packages will be downloaded from the internet. Is there any method for installing without internet access?
The README.md says:
Version
Hue v3.11.0+
I managed to install Hue 4.0.1 with that stack.
But it shouldn't force the name of the tar.gz to be "hue-3.11.0.tgz";
it would be better to have a generic name such as "hue.tgz".
In fact, it would be better still to use an RPM package, by contributing to Apache Bigtop or creating a new RPM package (forked from https://github.com/apache/bigtop/blob/master/bigtop-packages/src/rpm/hue/SPECS/hue.spec ).
Hello, when integrating Hue 3.11 on Ambari 2.2 I get an error at the "Assign Masters" step; the screenshot is as follows:
The backend ambari-server log reports the following error:
13 December 2016 16:59:04,491 WARN [qtp-ambari-client-167] ServletHandler:563 - /api/v1/stacks/HDP/versions/2.4/recommendations
java.lang.NullPointerException
at org.apache.ambari.server.state.PropertyInfo.getAttributesMap(PropertyInfo.java:145)
at org.apache.ambari.server.state.PropertyInfo.convertToResponse(PropertyInfo.java:128)
at org.apache.ambari.server.controller.AmbariManagementControllerImpl.getStackConfigurations(AmbariManagementControllerImpl.java:3898)
at org.apache.ambari.server.controller.AmbariManagementControllerImpl.getStackConfigurations(AmbariManagementControllerImpl.java:3867)
at org.apache.ambari.server.controller.internal.StackConfigurationResourceProvider$1.invoke(StackConfigurationResourceProvider.java:114)
at org.apache.ambari.server.controller.internal.StackConfigurationResourceProvider$1.invoke(StackConfigurationResourceProvider.java:111)
at org.apache.ambari.server.controller.internal.AbstractResourceProvider.getResources(AbstractResourceProvider.java:302)
at org.apache.ambari.server.controller.internal.StackConfigurationResourceProvider.getResources(StackConfigurationResourceProvider.java:111)
at org.apache.ambari.server.controller.internal.ClusterControllerImpl$ExtendedResourceProviderWrapper.queryForResources(ClusterControllerImpl.java:945)
at org.apache.ambari.server.controller.internal.ClusterControllerImpl.getResources(ClusterControllerImpl.java:132)
at org.apache.ambari.server.api.query.QueryImpl.doQuery(QueryImpl.java:508)
at org.apache.ambari.server.api.query.QueryImpl.queryForSubResources(QueryImpl.java:463)
at org.apache.ambari.server.api.query.QueryImpl.queryForSubResources(QueryImpl.java:482)
at org.apache.ambari.server.api.query.QueryImpl.queryForResources(QueryImpl.java:436)
at org.apache.ambari.server.api.query.QueryImpl.execute(QueryImpl.java:216)
at org.apache.ambari.server.api.handlers.ReadHandler.handleRequest(ReadHandler.java:68)
at org.apache.ambari.server.api.services.BaseRequest.process(BaseRequest.java:135)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:106)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:75)
at org.apache.ambari.server.api.services.stackadvisor.commands.StackAdvisorCommand.getServicesInformation(StackAdvisorCommand.java:356)
at org.apache.ambari.server.api.services.stackadvisor.commands.StackAdvisorCommand.invoke(StackAdvisorCommand.java:247)
at org.apache.ambari.server.api.services.stackadvisor.StackAdvisorHelper.recommend(StackAdvisorHelper.java:109)
at org.apache.ambari.server.controller.internal.RecommendationResourceProvider.createResources(RecommendationResourceProvider.java:92)
at org.apache.ambari.server.controller.internal.ClusterControllerImpl.createResources(ClusterControllerImpl.java:289)
at org.apache.ambari.server.api.services.persistence.PersistenceManagerImpl.create(PersistenceManagerImpl.java:76)
at org.apache.ambari.server.api.handlers.CreateHandler.persist(CreateHandler.java:36)
at org.apache.ambari.server.api.handlers.BaseManagementHandler.handleRequest(BaseManagementHandler.java:72)
at org.apache.ambari.server.api.services.BaseRequest.process(BaseRequest.java:135)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:106)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:75)
at org.apache.ambari.server.api.services.RecommendationService.getRecommendation(RecommendationService.java:59)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:540)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:715)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:770)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1496)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariAuthorizationFilter.doFilter(AmbariAuthorizationFilter.java:196)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
at org.apache.ambari.server.api.MethodOverrideFilter.doFilter(MethodOverrideFilter.java:72)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
at org.apache.ambari.server.api.AmbariPersistFilter.doFilter(AmbariPersistFilter.java:47)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:109)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:82)
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:294)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1467)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:429)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:216)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:205)
at org.apache.ambari.server.controller.AmbariHandlerList.handle(AmbariHandlerList.java:139)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:370)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)
Please help analyze the cause of this error. Thank you.