
Automatically restarting the HBase RegionServer via a script

  • Environment: HDP 2.5 + HBase 1.2 on Linux, with 5 data nodes

Scenario:

  • The platform is exposed for external use, so there are periods with heavy bursts of data writes and queries, which can occasionally cause an HBase RegionServer to go down. To keep writes and queries unaffected, the RegionServer is checked at a fixed interval: if it is down it is restarted, otherwise nothing is done.

The script is as follows:

[[email protected] project]# more autoRestartHbaseRegionserver.sh
#!/bin/bash
su - hbase -c "/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver"

Script notes: this script rather bluntly just runs the start command; it brings the RegionServer back when it is down, and does nothing when a RegionServer process already exists. It must be run as the hbase user so that Ambari can track the process and sync its state to the web UI.
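A slightly less blunt variant makes the liveness check explicit instead of relying on the daemon script's internal handling. This is only a sketch: the pid-file path is the HDP default that also appears in the Ambari log later in this post, and should be adjusted for other layouts.

```shell
#!/bin/bash
# Sketch: restart the RegionServer only when its recorded pid is no longer
# alive. PID_FILE is the HDP default path; adjust for other layouts.
PID_FILE=${PID_FILE:-/var/run/hbase/hbase-hbase-regionserver.pid}

regionserver_running() {
  # True when the pid file exists and the pid it records maps to a live process.
  [ -f "$1" ] && ps -p "$(cat "$1")" >/dev/null 2>&1
}

if regionserver_running "$PID_FILE"; then
  echo "RegionServer already running, nothing to do"
else
  echo "RegionServer down, restarting"
  su - hbase -c "/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver"
fi
```

The check mirrors the `not_if` guard Ambari itself uses (pid file present and the pid alive), so the script stays idempotent when called from cron.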

The cron job is configured as follows:

# Periodically check whether the RegionServer is down; if so, restart it to
# prevent write/read failures caused by the crash
0 */1 * * * cd /root/project/ && ./autoRestartHbaseRegionserver.sh
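For faster detection the interval can be tightened and the output captured; a sketch (the five-minute interval and the log path are illustrative choices, not part of the original setup):

```shell
# Run the check every 5 minutes and keep a log of any restarts
*/5 * * * * cd /root/project/ && ./autoRestartHbaseRegionserver.sh >> /var/log/autoRestartHbaseRegionserver.log 2>&1
```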

Approach:

  • 1. Ambari does provide a rolling-restart option for HBase, but it only runs a single pass; the interval between hosts is configurable, up to a maximum of roughly 10 hours. Details are shown in the screenshot below:
  • [screenshot: Ambari rolling-restart options]


  • 2. When the HBase RegionServer is restarted from the Ambari UI, inspecting the log shows that the startup mainly executes the following statement:
2018-10-09 14:39:38,918 - Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] {'not_if': 'ambari-sudo.sh  -H -E test -f /var/run/hbase/hbase-hbase-regionserver.pid && ps -p `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-regionserver.pid` >/dev/null 2>&1', 'user': 'hbase'}
2018-10-09 14:39:38,981 - Skipping Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] due to not_if


Based on this log, the script simply executes the same statement directly; testing confirmed it works. To refine the script further, study the log and the Ambari source code in detail.

  • 3. Be sure to switch to the hbase user before running the script; if it was run as another user, the following files need to be deleted manually:
    [[email protected] ~]# cd /var/run/hbase/
    [[email protected] hbase]# ll
    total 8
    -rw-r--r-- 1 hbase hadoop  6 Oct  9 16:10 hbase-hbase-regionserver.pid
    -rw-r--r-- 1 hbase hadoop 60 Oct  9 16:10 hbase-hbase-regionserver.znode
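When the daemon was started as the wrong user, the leftover pid/znode files above can block the next start. Below is a hedged sketch of a cleanup step that only deletes them when the recorded pid is dead, so a healthy daemon is never touched; the directory and file names match the listing above, but verify them on your cluster first.

```shell
#!/bin/bash
# Sketch: remove the RegionServer pid/znode files only when the recorded pid
# is no longer alive.
clean_stale_regionserver_files() {
  local run_dir="${1:-/var/run/hbase}"
  local pid_file="$run_dir/hbase-hbase-regionserver.pid"
  if [ -f "$pid_file" ] && ! ps -p "$(cat "$pid_file")" >/dev/null 2>&1; then
    echo "removing stale RegionServer files in $run_dir"
    rm -f "$pid_file" "$run_dir/hbase-hbase-regionserver.znode"
  fi
}

clean_stale_regionserver_files /var/run/hbase
```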
  • Appendix: the full log produced when restarting from the Ambari UI is shown below:
stdout:   /var/lib/ambari-agent/data/output-9452.txt
2018-10-09 14:39:35,977 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
2018-10-09 14:39:35,977 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
2018-10-09 14:39:35,978 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2018-10-09 14:39:36,016 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
2018-10-09 14:39:36,017 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2018-10-09 14:39:36,060 - checked_call returned (0, '')
2018-10-09 14:39:36,061 - Ensuring that hadoop has the correct symlink structure
2018-10-09 14:39:36,061 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-09 14:39:36,174 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
2018-10-09 14:39:36,174 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
2018-10-09 14:39:36,175 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2018-10-09 14:39:36,215 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
2018-10-09 14:39:36,216 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2018-10-09 14:39:36,254 - checked_call returned (0, '')
2018-10-09 14:39:36,254 - Ensuring that hadoop has the correct symlink structure
2018-10-09 14:39:36,254 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-09 14:39:36,256 - Group['livy'] {}
2018-10-09 14:39:36,257 - Group['spark'] {}
2018-10-09 14:39:36,257 - Group['ranger'] {}
2018-10-09 14:39:36,258 - Group['hadoop'] {}
2018-10-09 14:39:36,258 - Group['users'] {}
2018-10-09 14:39:36,258 - Group['knox'] {}
2018-10-09 14:39:36,258 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,259 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,260 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,260 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2018-10-09 14:39:36,261 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2018-10-09 14:39:36,261 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,262 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,263 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2018-10-09 14:39:36,263 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,264 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,264 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,265 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,266 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,266 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,267 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,267 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,268 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2018-10-09 14:39:36,269 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-09 14:39:36,270 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-10-09 14:39:36,292 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2018-10-09 14:39:36,292 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:36,293 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-10-09 14:39:36,294 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-10-09 14:39:36,317 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2018-10-09 14:39:36,317 - Group['hdfs'] {}
2018-10-09 14:39:36,318 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2018-10-09 14:39:36,318 - FS Type: 
2018-10-09 14:39:36,319 - Directory['/etc/hadoop'] {'mode': 0755}
2018-10-09 14:39:36,334 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2018-10-09 14:39:36,335 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-10-09 14:39:36,348 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2018-10-09 14:39:36,372 - Skipping Execute[('setenforce', '0')] due to not_if
2018-10-09 14:39:36,372 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:36,375 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2018-10-09 14:39:36,375 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2018-10-09 14:39:36,382 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2018-10-09 14:39:36,383 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2018-10-09 14:39:36,384 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:36,394 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2018-10-09 14:39:36,395 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2018-10-09 14:39:36,396 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2018-10-09 14:39:36,400 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2018-10-09 14:39:36,421 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2018-10-09 14:39:36,609 - Stack Feature Version Info: stack_version=2.5, version=2.5.3.0-37, current_cluster_version=2.5.3.0-37 -> 2.5.3.0-37
2018-10-09 14:39:36,610 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.3.0-37
2018-10-09 14:39:36,610 - Checking if need to create versioned conf dir /etc/hadoop/2.5.3.0-37/0
2018-10-09 14:39:36,610 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2018-10-09 14:39:36,649 - call returned (1, '/etc/hadoop/2.5.3.0-37/0 exist already', '')
2018-10-09 14:39:36,649 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.3.0-37', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2018-10-09 14:39:36,688 - checked_call returned (0, '')
2018-10-09 14:39:36,688 - Ensuring that hadoop has the correct symlink structure
2018-10-09 14:39:36,689 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-10-09 14:39:36,691 - checked_call['hostid'] {}
2018-10-09 14:39:36,711 - checked_call returned (0, 'c20a0543')
2018-10-09 14:39:36,715 - Directory['/etc/hbase'] {'mode': 0755}
2018-10-09 14:39:36,719 - Directory['/usr/hdp/current/hbase-regionserver/conf'] {'owner': 'hbase', 'group': 'hadoop', 'create_parents': True}
2018-10-09 14:39:36,720 - Directory['/tmp'] {'create_parents': True, 'mode': 0777}
2018-10-09 14:39:36,720 - Changing permission for /tmp from 1777 to 777
2018-10-09 14:39:36,720 - Directory['/tmp'] {'create_parents': True, 'cd_access': 'a'}
2018-10-09 14:39:36,721 - Execute[('chmod', '1777', '/tmp')] {'sudo': True}
2018-10-09 14:39:36,744 - XmlConfig['hbase-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {}, 'configurations': ...}
2018-10-09 14:39:36,756 - Generating config: /usr/hdp/current/hbase-regionserver/conf/hbase-site.xml
2018-10-09 14:39:36,756 - File['/usr/hdp/current/hbase-regionserver/conf/hbase-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,797 - XmlConfig['core-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {'final': {'fs.defaultFS': 'true'}}, 'configurations': ...}
2018-10-09 14:39:36,805 - Generating config: /usr/hdp/current/hbase-regionserver/conf/core-site.xml
2018-10-09 14:39:36,805 - File['/usr/hdp/current/hbase-regionserver/conf/core-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,831 - XmlConfig['hdfs-site.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {'final': {'dfs.support.append': 'true', 'dfs.datanode.data.dir': 'true', 'dfs.namenode.http-address': 'true', 'dfs.namenode.name.dir': 'true', 'dfs.webhdfs.enabled': 'true', 'dfs.datanode.failed.volumes.tolerated': 'true'}}, 'configurations': ...}
2018-10-09 14:39:36,838 - Generating config: /usr/hdp/current/hbase-regionserver/conf/hdfs-site.xml
2018-10-09 14:39:36,839 - File['/usr/hdp/current/hbase-regionserver/conf/hdfs-site.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,889 - XmlConfig['hdfs-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {'final': {'dfs.support.append': 'true', 'dfs.datanode.data.dir': 'true', 'dfs.namenode.http-address': 'true', 'dfs.namenode.name.dir': 'true', 'dfs.webhdfs.enabled': 'true', 'dfs.datanode.failed.volumes.tolerated': 'true'}}, 'configurations': ...}
2018-10-09 14:39:36,896 - Generating config: /usr/hdp/current/hadoop-client/conf/hdfs-site.xml
2018-10-09 14:39:36,897 - File['/usr/hdp/current/hadoop-client/conf/hdfs-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,946 - XmlConfig['hbase-policy.xml'] {'owner': 'hbase', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'configuration_attributes': {}, 'configurations': {'security.masterregion.protocol.acl': '*', 'security.admin.protocol.acl': '*', 'security.client.protocol.acl': '*'}}
2018-10-09 14:39:36,954 - Generating config: /usr/hdp/current/hbase-regionserver/conf/hbase-policy.xml
2018-10-09 14:39:36,954 - File['/usr/hdp/current/hbase-regionserver/conf/hbase-policy.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-10-09 14:39:36,964 - File['/usr/hdp/current/hbase-regionserver/conf/hbase-env.sh'] {'content': InlineTemplate(...), 'owner': 'hbase', 'group': 'hadoop'}
2018-10-09 14:39:36,965 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2018-10-09 14:39:36,968 - File['/etc/security/limits.d/hbase.conf'] {'content': Template('hbase.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2018-10-09 14:39:36,968 - TemplateConfig['/usr/hdp/current/hbase-regionserver/conf/hadoop-metrics2-hbase.properties'] {'owner': 'hbase', 'template_tag': 'GANGLIA-RS'}
2018-10-09 14:39:36,975 - File['/usr/hdp/current/hbase-regionserver/conf/hadoop-metrics2-hbase.properties'] {'content': Template('hadoop-metrics2-hbase.properties-GANGLIA-RS.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2018-10-09 14:39:36,975 - TemplateConfig['/usr/hdp/current/hbase-regionserver/conf/regionservers'] {'owner': 'hbase', 'template_tag': None}
2018-10-09 14:39:36,977 - File['/usr/hdp/current/hbase-regionserver/conf/regionservers'] {'content': Template('regionservers.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2018-10-09 14:39:36,978 - TemplateConfig['/usr/hdp/current/hbase-regionserver/conf/hbase_regionserver_jaas.conf'] {'owner': 'hbase', 'template_tag': None}
2018-10-09 14:39:36,979 - File['/usr/hdp/current/hbase-regionserver/conf/hbase_regionserver_jaas.conf'] {'content': Template('hbase_regionserver_jaas.conf.j2'), 'owner': 'hbase', 'group': None, 'mode': None}
2018-10-09 14:39:36,980 - Directory['/var/run/hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-10-09 14:39:36,980 - Directory['/var/log/hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0755, 'cd_access': 'a'}
2018-10-09 14:39:36,981 - File['/usr/hdp/current/hbase-regionserver/conf/log4j.properties'] {'content': ..., 'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:36,981 - HBase: Setup ranger: command retry not enabled thus skipping if ranger admin is down !
2018-10-09 14:39:36,982 - call['ambari-python-wrap /usr/bin/hdp-select status hbase-client'] {'timeout': 20}
2018-10-09 14:39:37,019 - call returned (0, 'hbase-client - 2.5.3.0-37')
2018-10-09 14:39:37,020 - RangeradminV2: Skip ranger admin if it's down !
2018-10-09 14:39:37,104 - checked_call['/usr/bin/kinit -c /var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf -kt /etc/security/keytabs/hbase.service.keytab hbase/[email protected] > /dev/null'] {'user': 'hbase'}
2018-10-09 14:39:37,155 - checked_call returned (0, '')
2018-10-09 14:39:37,156 - call['ambari-sudo.sh su hbase -l -s /bin/bash -c 'curl --location-trusted -k --negotiate -u : -b /var/lib/ambari-agent/tmp/cookies/cbd26bc2-f067-4bda-ab73-242aa6458d8b -c /var/lib/ambari-agent/tmp/cookies/cbd26bc2-f067-4bda-ab73-242aa6458d8b -w '"'"'%{http_code}'"'"' http://hdp06.gzbigdata.org.cn:6080/login.jsp --connect-timeout 10 --max-time 12 -o /dev/null 1>/tmp/tmptoVZk8 2>/tmp/tmpEkgdWB''] {'quiet': False, 'env': {'KRB5CCNAME': '/var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf'}}
2018-10-09 14:39:37,222 - call returned (0, '')
2018-10-09 14:39:37,223 - call['/usr/bin/klist -s /var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf'] {'user': 'hbase'}
2018-10-09 14:39:37,275 - call returned (0, '')
2018-10-09 14:39:37,276 - call['ambari-sudo.sh su hbase -l -s /bin/bash -c 'curl --location-trusted -k --negotiate -u : -b /var/lib/ambari-agent/tmp/cookies/45c51089-fe63-400b-95c5-11daa12e7484 -c /var/lib/ambari-agent/tmp/cookies/45c51089-fe63-400b-95c5-11daa12e7484 '"'"'http://hdp06.gzbigdata.org.cn:6080/service/public/v2/api/service?serviceName=GZ_BIG_DATA_PLAT_hbase&serviceType=hbase&isEnabled=true'"'"' --connect-timeout 10 --max-time 12 -X GET 1>/tmp/tmpdsDwLv 2>/tmp/tmphXpq2M''] {'quiet': False, 'env': {'KRB5CCNAME': '/var/lib/ambari-agent/tmp/curl_krb_cache/ranger_admin_calls_hbase_cc_7dd63ebcc890e5c63bdbfa2bd6b51aaf'}}
2018-10-09 14:39:37,406 - call returned (0, '')
2018-10-09 14:39:37,406 - Hbase Repository GZ_BIG_DATA_PLAT_hbase exist
2018-10-09 14:39:37,408 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-security.xml'] {'content': InlineTemplate(...), 'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:37,409 - Writing File['/usr/hdp/current/hbase-regionserver/conf/ranger-security.xml'] because contents don't match
2018-10-09 14:39:37,409 - Directory['/etc/ranger/GZ_BIG_DATA_PLAT_hbase'] {'owner': 'hbase', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:37,410 - Directory['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/policycache'] {'owner': 'hbase', 'group': 'hadoop', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-10-09 14:39:37,410 - File['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/policycache/hbaseMaster_GZ_BIG_DATA_PLAT_hbase.json'] {'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:37,411 - File['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/policycache/hbaseRegional_GZ_BIG_DATA_PLAT_hbase.json'] {'owner': 'hbase', 'group': 'hadoop', 'mode': 0644}
2018-10-09 14:39:37,412 - XmlConfig['ranger-hbase-audit.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hbase', 'configurations': ...}
2018-10-09 14:39:37,421 - Generating config: /usr/hdp/current/hbase-regionserver/conf/ranger-hbase-audit.xml
2018-10-09 14:39:37,421 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-hbase-audit.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2018-10-09 14:39:37,435 - XmlConfig['ranger-hbase-security.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hbase', 'configurations': ...}
2018-10-09 14:39:37,443 - Generating config: /usr/hdp/current/hbase-regionserver/conf/ranger-hbase-security.xml
2018-10-09 14:39:37,443 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-hbase-security.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2018-10-09 14:39:37,449 - XmlConfig['ranger-policymgr-ssl.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hbase-regionserver/conf', 'mode': 0744, 'configuration_attributes': {}, 'owner': 'hbase', 'configurations': ...}
2018-10-09 14:39:37,458 - Generating config: /usr/hdp/current/hbase-regionserver/conf/ranger-policymgr-ssl.xml
2018-10-09 14:39:37,458 - File['/usr/hdp/current/hbase-regionserver/conf/ranger-policymgr-ssl.xml'] {'owner': 'hbase', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0744, 'encoding': 'UTF-8'}
2018-10-09 14:39:37,464 - Execute[('/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/install/lib/*', '-f', '/etc/ranger/GZ_BIG_DATA_PLAT_hbase/cred.jceks', '-k', 'sslKeyStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/local/java/jdk1.8.0_91'}, 'sudo': True}
Using Java:/usr/local/java/jdk1.8.0_91/bin/java
Alias sslKeyStore created successfully!
2018-10-09 14:39:38,193 - Execute[('/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/ranger_credential_helper.py', '-l', '/usr/hdp/2.5.3.0-37/ranger-hbase-plugin/install/lib/*', '-f', '/etc/ranger/GZ_BIG_DATA_PLAT_hbase/cred.jceks', '-k', 'sslTrustStore', '-v', [PROTECTED], '-c', '1')] {'logoutput': True, 'environment': {'JAVA_HOME': '/usr/local/java/jdk1.8.0_91'}, 'sudo': True}
Using Java:/usr/local/java/jdk1.8.0_91/bin/java
Alias sslTrustStore created successfully!
2018-10-09 14:39:38,917 - File['/etc/ranger/GZ_BIG_DATA_PLAT_hbase/cred.jceks'] {'owner': 'hbase', 'group': 'hadoop', 'mode': 0640}
2018-10-09 14:39:38,918 - Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] {'not_if': 'ambari-sudo.sh  -H -E test -f /var/run/hbase/hbase-hbase-regionserver.pid && ps -p `ambari-sudo.sh  -H -E cat /var/run/hbase/hbase-hbase-regionserver.pid` >/dev/null 2>&1', 'user': 'hbase'}
2018-10-09 14:39:38,981 - Skipping Execute['/usr/hdp/current/hbase-regionserver/bin/hbase-daemon.sh --config /usr/hdp/current/hbase-regionserver/conf start regionserver'] due to not_if

Command completed successfully!