
Scheduled Backup Scripts (Website Data and Database Data)

1) Website data backup
Back up the website data under /var/www/vhosts/www.kevin.com and /var/www/vhosts/www.grace.com to:
/Data/code-backup/www.kevin.com and /Data/code-backup/www.grace.com respectively.
 

[[email protected]_web5 code-backup]# cat web_code_backup.sh
#!/bin/bash
   
# Back up the website data
/bin/tar -zvcf /Data/code-backup/www.kevin.com/www.kevin.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.kevin.com
/bin/tar -zvcf /Data/code-backup/www.grace.com/www.grace.com_`date +%Y%m%d_%H%M%S`.tar.gz /var/www/vhosts/www.grace.com
   
# Delete backup files older than one week
find /Data/code-backup/www.kevin.com -type f -mtime +7 -exec rm -f {} \;
find /Data/code-backup/www.grace.com -type f -mtime +7 -exec rm -f {} \;
   
[[email protected]_web5 ~]# crontab -l
# Back up the website data every day at 5:00 AM
0 5 * * * /bin/bash -x /Data/code-backup/web_code_backup.sh > /dev/null 2>&1

The backup results look like this:
[[email protected]_web5 ~]# ls /Data/code-backup/www.kevin.com/
www.kevin.com_20170322_174328.tar.gz
[[email protected]_web5 ~]# ls /Data/code-backup/www.grace.com/
www.grace.com_20170322_174409.tar.gz

2) Database backup (automatically delete backups older than 10 days)
The database service is Alibaba Cloud MySQL. A full backup is taken remotely on a schedule and stored locally, just in case. Data dumped remotely from MySQL is best packed and compressed.

[[email protected] crontab]# pwd
/Data/Mysql_Bakup/crontab
[[email protected] crontab]# cat backup_db_wangshibo.sh
#!/bin/bash
MYSQL="/usr/bin/mysql"
MYSQLDUMP="/usr/bin/mysqldump"
BACKUP_DIR="/Data/Mysql_Bakup"
#DB_SOCKET="/var/lib/mysql/mysql.sock"
DB_hostname="110.120.11.9"
DBNAME="wangshibo"
DB_USER="db_wangshibo"
DB_PASS="mhxzk3rfzh"
TIME=`date +%Y%m%d%H%M%S`
LOCK_FILE="${BACKUP_DIR}/lock_file.tmp"
BKUP_LOG="${BACKUP_DIR}/${TIME}_bkup.log"
DEL_BAK=`date -d '10 days ago' '+%Y%m%d'`
## Check for an existing lock file (exit if a previous run is still in progress)
if [[ -f $LOCK_FILE ]]; then
    exit 255
else
    echo $$ > $LOCK_FILE
fi
 
##dump databases##
echo ${TIME} >> ${BKUP_LOG}
echo "=======Start Bakup============" >>${BKUP_LOG}
#${MYSQLDUMP} -h ${DB_hostname} -u${DB_USER} -p${DB_PASS} --databases ${DBNAME} | gzip -9 > ${BACKUP_DIR}/${TIME}.${DBNAME}.gz
${MYSQLDUMP} -h ${DB_hostname} -u${DB_USER} -p${DB_PASS} --databases ${DBNAME} |gzip -9 > ${BACKUP_DIR}/${TIME}.${DBNAME}.gz
echo "=======Finished Bakup============" >>${BKUP_LOG}
/bin/rm -f ${LOCK_FILE}
 
## Delete the backups taken 10 days ago ##
/bin/rm -f ${BACKUP_DIR}/${DEL_BAK}*.gz

Cron entry for the scheduled backup:

[[email protected] Mysql_Bakup]# crontab -l
10 0,6,12,18 * * * /bin/bash /Data/Mysql_Bakup/crontab/backup_db_wangshibo.sh >/dev/null 2>&1

After the script runs, the backup looks like this:

[[email protected] crontab]# cd /Data/Mysql_Bakup
[[email protected] Mysql_Bakup]# ls
20161202061001.wangshibo.gz

Sync the production database to the beta-environment database (overwriting the beta database):
Copy the scheduled backup archive produced above to the beta machine, unpack it, log in to MySQL, and replay it manually with the source command, as sketched below.
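A minimal sketch of that manual restore, assuming the beta machine can reach the backup server over SSH; the host name backup-server, the account beta_user and the dump file name (taken from the example above) are placeholders to adjust:

#!/bin/bash
# Hypothetical host and account names -- replace with the real beta environment values.
BACKUP_HOST="backup-server"
DUMP="20161202061001.wangshibo.gz"

# 1. Copy the compressed dump from the backup machine to the beta machine
scp ${BACKUP_HOST}:/Data/Mysql_Bakup/${DUMP} /tmp/

# 2. Unpack it (leaves /tmp/20161202061001.wangshibo)
gunzip /tmp/${DUMP}

# 3. Replay the dump on the beta MySQL with the source command. The dump was taken
#    with --databases, so it carries CREATE DATABASE/USE statements and overwrites
#    the existing wangshibo database on beta.
mysql -ubeta_user -p -e "source /tmp/${DUMP%.gz}"

Redirecting the file into the client (mysql ... < dump.sql) works just as well; source is simply the command the original workflow uses.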

--------------------------------------------------------------------Another example-----------------------------------------------------------------------

[[email protected] online_bak]# cat rsync.sh      (the rsync in this script is rate-limited to 3 MB/s and keeps roughly the most recent month of backups)
#!/bin/bash
 
# ehr data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.27/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.27:/data/tomcat7/webapps /data/bak/online_bak/192.168.34.27/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
 
cd /data/bak/online_bak/192.168.34.27/tomcat_data/
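# Retention: file names start with the date (e.g. 20170819.tar.gz), so the default
# alphabetical order of ls is also chronological; keep only the newest 30 archives.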
NUM1=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I1=$( /usr/bin/expr $NUM1 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I1 p"|xargs rm -rf
 
# zp data backup----------------------------------------------------------
cd /data/bak/online_bak/192.168.34.33/tomcat_data/
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/data/tomcat8/webapps /data/bak/online_bak/192.168.34.33/tomcat_data/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
 
cd /data/bak/online_bak/192.168.34.33/tomcat_data/
NUM2=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I2=$( /usr/bin/expr $NUM2 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I2 p"|xargs rm -rf
 
cd /data/bak/online_bak/192.168.34.33/upload
/usr/bin/rsync -e "ssh -p22222" -avpgolr --bwlimit=3072 192.168.34.33:/home/zrx_hr/upload /data/bak/online_bak/192.168.34.33/upload/`date +%Y%m%d`
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
 
cd /data/bak/online_bak/192.168.34.33/upload
NUM3=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I3=$( /usr/bin/expr $NUM3 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I3 p"|xargs rm -rf
 
# zabbix mysql backup----------------------------------------------------------
/bin/mkdir /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`
/data/mysql/bin/mysqldump -hlocalhost -uroot [email protected]@@-12345 --databases zabbix > /data/bak/online_bak/192.168.16.21/mysql_data/`date +%Y%m%d`/zabbix.sql
 
cd /data/bak/online_bak/192.168.16.21/mysql_data/
/bin/tar -zvcf  `date +%Y%m%d`.tar.gz `date +%Y%m%d`
rm -rf `date +%Y%m%d`
 
cd /data/bak/online_bak/192.168.16.21/mysql_data/
NUM4=`ls -l|awk '{print $9}'|grep 2017|wc -l`
I4=$( /usr/bin/expr $NUM4 - 30 )
ls -l|awk '{print $9}'|grep 2017|sed -n "1,$I4 p"|xargs rm -rf
 
[[email protected] online_bak]# pwd
/data/bak/online_bak
[[email protected] online_bak]# ls
192.168.16.21    rsync.sh
192.168.34.27  192.168.34.33 
[[email protected] online_bak]# ll
total 10K
drwxr-xr-x   3 root root   23 Aug 19 17:47 192.168.16.21
drwxr-xr-x   4 root root   41 Aug 19 18:30 192.168.34.27
drwxr-xr-x   4 root root   37 Aug 19 18:17 192.168.34.33
-rwxr-xr-x   1 root root 6.3K Aug 19 19:20 rsync.sh
 
[[email protected] online_bak]# ll 192.168.16.21/
total 4.0K
drwxr-xr-x  2 root root   28 Aug 19 19:43 mysql_data
 
[[email protected] online_bak]# ll 192.168.16.21/mysql_data/
total 1.5G
-rw-r--r-- 1 root root 1.5G Aug 19 19:43 20170819.tar.gz
 
[[email protected] online_bak]# ll 192.168.34.27
total 4.0K
drwxr-xr-x  2 root root 4.0K Aug 19 19:26 tomcat_data
 
[[email protected] online_bak]# ll 192.168.34.27/tomcat_data/
total 3.9G
......
-rw-r--r-- 1 root root 140M Aug 19 11:06 20170818.tar.gz
-rw-r--r-- 1 root root 140M Aug 19 19:26 20170819.tar.gz
 
[[email protected] online_bak]# ll 192.168.34.33
total 8.0K
drwxr-xr-x  2 root root 4.0K Aug 19 19:26 tomcat_data
drwxr-xr-x  2 root root   28 Aug 19 19:30 upload
 
[[email protected] online_bak]# crontab -l
# online backup
0 2 * * * /bin/bash -x /data/bak/online_bak/rsync.sh > /dev/null 2>&1
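One caveat with rsync.sh: the cleanup blocks filter on the literal string 2017 in the file names, so they silently stop matching once the year changes. A minimal, year-independent sketch that keeps roughly the last month of archives by modification time instead, shown only for the first backup directory (the other three blocks would be handled the same way):

#!/bin/bash
# Delete dated .tar.gz archives older than 30 days, independent of the year
# embedded in the file name.
find /data/bak/online_bak/192.168.34.27/tomcat_data/ -maxdepth 1 -type f \
     -name "*.tar.gz" -mtime +30 -exec rm -f {} \;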

---------------------------------------------------------------------------------------------------

Within a directory, sort the files/directories by modification time and pick the most recently modified one:
[[email protected] xcspam]$ ls
bin                    xcspam-20170802145542  xcspam-20170807204545  xcspam-20170814115753  xcspam-20170818115806  xcspam-20170824162641  xcspam-20170831173616 
xcspam                 xcspam-20170802194447  xcspam-20170808163425  xcspam-20170815191150  xcspam-20170821122949  xcspam-20170824165020  xcspam-20170831191347
xcspam-20170731154018  xcspam-20170803113809  xcspam-20170808195340  xcspam-20170815210032  xcspam-20170821153300  xcspam-20170829100941  xcspam-20170904105109
xcspam-20170801190647  xcspam-20170807150022  xcspam-20170809103648  xcspam-20170816141022  xcspam-20170822173600  xcspam-20170831135623  xcspam-20170911120519
xcspam-20170802142921  xcspam-20170807164137  xcspam-20170809111246  xcspam-20170816190704  xcspam-20170823101913  xcspam-20170831160115  xcspam-20170911195802
[[email protected] xcspam]$ ls -rtd xcspam* |tail -1
xcspam-20170911195802
 
[[email protected] xcspam]$ ls -rtd xcspam* |tail -2|head -1   // the second most recently modified entry
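Two equivalent ways to write the same lookup, as a small sketch (the second variant assumes GNU find with -printf):

# Newest first, so head -1 is the most recently modified entry
ls -td xcspam* | head -1

# Sort explicitly by mtime with find, useful when the names themselves do not sort chronologically
find . -maxdepth 1 -name 'xcspam*' -printf '%T@ %p\n' | sort -n | tail -1 | awk '{print $2}'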