1. Download
wget https://grafanarel.s3.amazonaws.com/builds/grafana-4.1.1-1484211277.linux-x64.tar.gz
2. Extract
tar -zxvf grafana-4.1.1-1484211277.linux-x64.tar.gz -C ../app/
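The `-C` flag tells tar to extract into a different directory (here `../app/`) rather than the current one. A self-contained sketch of the same download-and-extract pattern, using a throwaway archive and a temp directory as stand-ins for the real Grafana tarball and `../app/`:

```shell
# Build a tiny archive that mimics the Grafana package layout,
# then extract it into a separate target directory with -C.
workdir=$(mktemp -d)
mkdir -p "$workdir/src/grafana-demo/bin"
echo '#!/bin/sh' > "$workdir/src/grafana-demo/bin/grafana-server"
tar -czf "$workdir/grafana-demo.tar.gz" -C "$workdir/src" grafana-demo

mkdir -p "$workdir/app"                                # stand-in for ../app/
tar -zxvf "$workdir/grafana-demo.tar.gz" -C "$workdir/app"
ls "$workdir/app/grafana-demo/bin"                     # extracted layout mirrors the package
```

After extraction the package root (containing `bin/`, `conf/`, etc.) sits directly under the target directory, which is why the next step can `cd` into `grafana-4.1.1-1484211277` inside `../app/`.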
3. Start
[hadoop@hadoop000 grafana-4.1.1-1484211277]$ sudo ./bin/grafana-server &
[1] 3075
[hadoop@hadoop000 grafana-4.1.1-1484211277]$ INFO[07-09|23:40:10] Starting Grafana logger=main version=4.1.1 commit=v4.1.1 compiled=2017-01-12T16:48:03+0800
INFO[07-09|23:40:10] Config loaded from logger=settings file=/home/hadoop/soul/app/grafana-4.1.1-1484211277/conf/defaults.ini
INFO[07-09|23:40:10] Path Home logger=settings path=/home/hadoop/soul/app/grafana-4.1.1-1484211277
INFO[07-09|23:40:10] Path Data logger=settings path=/home/hadoop/soul/app/grafana-4.1.1-1484211277/data
INFO[07-09|23:40:10] Path Logs logger=settings path=/home/hadoop/soul/app/grafana-4.1.1-1484211277/data/log
INFO[07-09|23:40:10] Path Plugins logger=settings path=/home/hadoop/soul/app/grafana-4.1.1-1484211277/data/plugins
INFO[07-09|23:40:10] Initializing DB logger=sqlstore dbtype=sqlite3
INFO[07-09|23:40:10] Starting DB migration logger=migrator
INFO[07-09|23:40:10] Executing migration logger=migrator id="copy data account to org"
INFO[07-09|23:40:10] Skipping migration condition not fulfilled logger=migrator id="copy data account to org"
INFO[07-09|23:40:10] Executing migration logger=migrator id="copy data account_user to org_user"
INFO[07-09|23:40:10] Skipping migration condition not fulfilled logger=migrator id="copy data account_user to org_user"
INFO[07-09|23:40:10] Starting plugin search logger=plugins
INFO[07-09|23:40:10] Initializing Alerting logger=alerting.engine
INFO[07-09|23:40:10] Initializing CleanUpService logger=cleanup
INFO[07-09|23:40:10] Initializing HTTP Server logger=server address=0.0.0.0:3000 protocol=http subUrl=
INFO[07-09|23:40:14] Request Completed logger=context userId=0 orgId=0 uname= method=GET path=/ status=302 remote_addr=192.168.245.1 time_ms=1ns size=29
INFO[07-09|23:40:14] Request Completed logger=context userId=0 orgId=0 uname= method=GET path=/login status=302 remote_addr=192.168.245.1 time_ms=4ns size=24
4. Access
http://hadoop000:3000/
Username: admin
Password: admin
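The admin/admin default should be changed after the first login. As the startup log shows, this standalone install loads `conf/defaults.ini`; that file is not meant to be edited directly, and Grafana will instead pick up an optional `conf/custom.ini` alongside it. A minimal sketch of such an override (the key names come from Grafana's defaults.ini; the values here are illustrative assumptions):

```ini
; conf/custom.ini — overrides settings from conf/defaults.ini
[server]
http_port = 3000        ; port the HTTP server listens on (0.0.0.0:3000 in the log above)

[security]
admin_user = admin      ; initial admin login name
admin_password = admin  ; change this before exposing the server
```

Restart grafana-server after editing the file for the new settings to take effect.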