# Zeppelin install failed on Azure HDP 2.5 standard 3 master 3 node cluster #37
Per the README, "Zeppelin is now officially available via Ambari in HDP 2.5 so as of this release this project is no longer needed"
I don't think so; I don't see it coming in as standard. Also, although the table below is for HDInsight, I have done the standard HDP 2.5 install, which does not include Zeppelin.
I need your help with debugging. That is why I have to bootstrap it from the steps you have highlighted.
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-component-versioning
| Component | HDInsight 3.6 | HDInsight 3.5 (Default) | HDInsight 3.4 | HDInsight 3.3 | HDInsight 3.2 | HDInsight 3.1 | HDInsight 3.0 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Hortonworks Data Platform | 2.6 | 2.5 | 2.4 | 2.3 | 2.2 | 2.1.7 | 2.0 |
| Apache Hadoop & YARN | 2.7.3 | 2.7.3 | 2.7.1 | 2.7.1 | 2.6.0 | 2.4.0 | 2.2.0 |
| Apache Tez | 0.7.0 | 0.7.0 | 0.7.0 | 0.7.0 | 0.5.2 | 0.4.0 | - |
| Apache Pig | 0.16.0 | 0.16.0 | 0.15.0 | 0.15.0 | 0.14.0 | 0.12.1 | 0.12.0 |
| Apache Hive & HCatalog | 1.2.1 | 1.2.1 | 1.2.1 | 1.2.1 | 0.14.0 | 0.13.1 | 0.12.0 |
| Apache Hive2 | 2.1.0 | - | - | - | - | - | - |
| Apache Tez-Hive2 | 0.8.4 | - | - | - | - | - | - |
| Apache Ranger | 0.7.0 | 0.6.0 | - | - | - | - | - |
| Apache HBase | 1.1.2 | 1.1.2 | 1.1.2 | 1.1.1 | 0.98.4 | 0.98.0 | - |
| Apache Sqoop | 1.4.6 | 1.4.6 | 1.4.6 | 1.4.6 | 1.4.5 | 1.4.4 | 1.4.4 |
| Apache Oozie | 4.2.0 | 4.2.0 | 4.2.0 | 4.2.0 | 4.1.0 | 4.0.0 | 4.0.0 |
| Apache Zookeeper | 3.4.6 | 3.4.6 | 3.4.6 | 3.4.6 | 3.4.6 | 3.4.5 | 3.4.5 |
| Apache Storm | 1.1.0 | 1.0.1 | 0.10.0 | 0.10.0 | 0.9.3 | 0.9.1 | - |
| Apache Mahout | 0.9.0+ | 0.9.0+ | 0.9.0+ | 0.9.0+ | 0.9.0 | 0.9.0 | - |
| Apache Phoenix | 4.7.0 | 4.7.0 | 4.4.0 | 4.4.0 | 4.2.0 | 4.0.0.2.1.7.0-2162 | - |
| Apache Spark | 2.1.0 (Linux only) | 1.6.2 + 2.0 (Linux only) | 1.6.0 (Linux only) | 1.5.2 (Linux only/experimental build) | 1.3.1 (Windows only) | - | - |
| Apache Kafka | 0.10.0 | 0.10.0 | 0.9.0 | - | - | - | - |
| Apache Ambari | 2.5.0 | 2.4.0 | 2.2.1 | 2.1.0 | - | - | - |
| Apache Zeppelin | 0.7.0 | - | - | - | - | - | - |
| Mono | 4.2.1 | 4.2.1 | 3.2.8 | - | - | - | - |
> On Apr 21, 2017, at 12:20 AM, Ali Bajwa ***@***.*** wrote:
>
> Per the README, "Zeppelin is now officially available via Ambari in HDP 2.5 so as of this release this project is no longer needed"
On AWS HWX HDP, Zeppelin was installed, but when the service was restarted it would not show as started.
Zeppelin install failed on Azure HDP 2.5 standard 3 master 3 node cluster.
master1.t53hiwxtr3xunlr5uyfutxkiaa.bx.internal.cloudapp.net
Tasks
Zeppelin Notebook Install
stderr: /var/lib/ambari-agent/data/errors-221.txt

```
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 235, in <module>
    Master().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package/scripts/master.py", line 52, in install
    Execute('sudo yum install -y epel-release')
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'sudo yum install -y epel-release' returned 1. sudo: sorry, you must have a tty to run sudo
```
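The stderr pinpoints the root cause: the Ambari agent runs this script without a terminal, and the host's sudoers policy (typically a `Defaults requiretty` line) refuses `sudo` in that case. Below is a minimal sketch of one possible fix, assuming the `resource_management` API shown in the traceback: pass `sudo=True` to `Execute` so the agent handles privilege escalation itself instead of prefixing the command string with `sudo`.

```python
# Hypothetical patch to the install step in master.py (a sketch, not
# the project's official fix): let Ambari's Execute resource handle
# privilege escalation. Prefixing the command string with `sudo`
# fails under `Defaults requiretty` because the agent has no tty.
from resource_management import Execute

Execute('yum install -y epel-release',
        sudo=True)  # agent-managed escalation, no tty needed
```

The agent log below shows the same pattern working elsewhere on this host (the `conf-select` calls run with `'sudo': True`), which is what suggests this route.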
stdout: /var/lib/ambari-agent/data/output-221.txt

```
2017-04-21 02:55:53,398 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.0.0-169
2017-04-21 02:55:53,398 - Checking if need to create versioned conf dir /etc/hadoop/2.4.0.0-169/0
2017-04-21 02:55:53,398 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2017-04-21 02:55:53,420 - call returned (1, '/etc/hadoop/2.4.0.0-169/0 exist already', '')
2017-04-21 02:55:53,420 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.0.0-169 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2017-04-21 02:55:53,441 - checked_call returned (0, '/usr/hdp/2.4.0.0-169/hadoop/conf -> /etc/hadoop/2.4.0.0-169/0')
2017-04-21 02:55:53,441 - Ensuring that hadoop has the correct symlink structure
2017-04-21 02:55:53,442 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-04-21 02:55:53,443 - Group['spark'] {}
2017-04-21 02:55:53,444 - Group['zeppelin'] {}
2017-04-21 02:55:53,444 - Group['hadoop'] {}
2017-04-21 02:55:53,444 - Group['users'] {}
2017-04-21 02:55:53,444 - Group['knox'] {}
2017-04-21 02:55:53,445 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,445 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,446 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,446 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,447 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,447 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,448 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,448 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,449 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,450 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,450 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-04-21 02:55:53,451 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,451 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,452 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,453 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,453 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,454 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,454 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,455 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,455 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-04-21 02:55:53,456 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-21 02:55:53,457 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-04-21 02:55:53,462 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-04-21 02:55:53,462 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2017-04-21 02:55:53,463 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-04-21 02:55:53,463 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-04-21 02:55:53,467 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-04-21 02:55:53,467 - Group['hdfs'] {}
2017-04-21 02:55:53,468 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-04-21 02:55:53,468 - Directory['/etc/hadoop'] {'mode': 0755}
2017-04-21 02:55:53,481 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-04-21 02:55:53,482 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2017-04-21 02:55:53,493 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-04-21 02:55:53,500 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-04-21 02:55:53,501 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-04-21 02:55:53,503 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-04-21 02:55:53,504 - Package['unzip'] {}
2017-04-21 02:55:53,579 - Skipping installation of existing package unzip
2017-04-21 02:55:53,579 - Package['curl'] {}
2017-04-21 02:55:53,589 - Skipping installation of existing package curl
2017-04-21 02:55:53,589 - Package['hdp-select'] {}
2017-04-21 02:55:53,598 - Skipping installation of existing package hdp-select
2017-04-21 02:55:53,768 - Execute['find /var/lib/ambari-agent/cache/stacks/HDP/2.4/services/ZEPPELIN/package -iname "*.sh" | xargs chmod +x'] {}
2017-04-21 02:55:53,775 - Execute['echo platform.linux_distribution:CentOS Linux+7.2.1511+Core'] {}
2017-04-21 02:55:53,777 - Execute['echo Installing python packages for Centos'] {}
2017-04-21 02:55:53,780 - Execute['sudo yum install -y epel-release'] {}
```
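To confirm the diagnosis on the failing host, the check below is a minimal sketch, assuming the CentOS 7.2 image the log reports and read access to `/etc/sudoers` (run as root); it only flags the `requiretty` default and changes nothing. Commenting that line out via `visudo` is the usual manual workaround, alongside the agent-managed sudo change sketched above.

```python
# Minimal sketch: report any active `Defaults requiretty` entries in
# /etc/sudoers. Read-only; requires root to open the file.
with open('/etc/sudoers') as sudoers:
    for line in sudoers:
        entry = line.strip()
        if entry.startswith('Defaults') and 'requiretty' in entry:
            print('tty enforcement found: ' + entry)
```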