Our company runs a dealer management system covering more than 200 dealers. To simplify permission management, the system owner created a separate SQL DB for each dealer. For group-level business users, the biggest pain point today is that analyzing performance across all dealers means consolidating the data from every one of those DBs.
It is not that simple, though: the names of the 200-plus DBs are inconsistent, and so are the table names for the same data, because each table name carries its DB name as a prefix. To make this work, we have to walk through every DB's schema and look up the table names in the system views.
The solution is a stored procedure that uses dynamic SQL to iterate over the DBs, query their table information, and insert the results into a target table.
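Below is a minimal sketch of that dynamic-SQL idea, not the actual [usp_DMS_Data_Consolidation] implementation; the system-database filter, the Dealer_DB column, and the target table [DMS_Item_Ledger_Entry] are illustrative assumptions.
-- Sketch only: walk every dealer DB, find the table whose name ends with the
-- requested object, and copy its rows into one consolidated target table.
DECLARE @Object NVARCHAR(128) = N'$Item Ledger Entry';
DECLARE @db NVARCHAR(128), @tbl NVARCHAR(128), @sql NVARCHAR(MAX);
DECLARE db_cursor CURSOR FOR
    SELECT name FROM sys.databases WHERE database_id > 4;  -- assumption: every user DB is a dealer DB
OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- 1) Look up the actual table name in this DB's system views.
    SET @sql = N'SELECT @t = name FROM ' + QUOTENAME(@db)
             + N'.sys.tables WHERE name LIKE N''%'' + @o;';
    SET @tbl = NULL;
    EXEC sp_executesql @sql, N'@o NVARCHAR(128), @t NVARCHAR(128) OUTPUT',
         @o = @Object, @t = @tbl OUTPUT;
    -- 2) Copy the rows into the target table, tagged with the source DB name.
    IF @tbl IS NOT NULL
    BEGIN
        SET @sql = N'INSERT INTO [VGIC_DWH].[dbo].[DMS_Item_Ledger_Entry] '
                 + N'SELECT ''' + @db + N''' AS Dealer_DB, * FROM '
                 + QUOTENAME(@db) + N'.dbo.' + QUOTENAME(@tbl) + N';';
        EXEC sp_executesql @sql;
    END
    FETCH NEXT FROM db_cursor INTO @db;
END
CLOSE db_cursor;
DEALLOCATE db_cursor;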
We can already export the data as CSV files, but we also need an offsite backup, which is what the batch script below handles.
@echo off
rem Author: Zhu Yue
rem Create Date: 2019-12-04
rem Update Date: 2019-12-05
rem Version Number: v1.0
rem Description: Add password and move to shared folder
rem Set the data object name, the database instance, and the instance suffix
set "Object=Item_Ledger_Entry"
set "Instance=10.120.111.121"
set "suffix=%Instance:~-3%"
rem Set the log folder, the data folder, and the SSIS package path
set "LogFolder=\DataTransfer\VGIC_DWH\Receive\DMS\Logs"
set "DataFolder=\DataTransfer\VGIC_DWH\Receive\DMS"
set "SSIS=\DataTransfer\VGIC_DWH\Receive\DMS\SSIS\%Object%.dtsx"
rem Build a timestamp as the batch number (locale-dependent; assumes MM/DD/YYYY dates)
set "batch_number=%date:~6,4%%date:~0,2%%date:~3,2%%time:~0,2%%time:~3,2%%time:~6,2%"
rem Set the zip password (current date, yyyyMMdd)
set "password=%date:~6,4%%date:~0,2%%date:~3,2%"
echo New batch number is %batch_number%, process begins! > %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
rem Run the stored procedure to consolidate the data
sqlcmd -S %Instance% -C -Q "EXECUTE [VGIC_DWH].[dbo].[usp_DMS_Data_Consolidation] @Object = N'$Item Ledger Entry'" >> %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
rem Run the SSIS package to export the data
rem (%DT% is assumed to point to the SSIS execution utility dtexec, e.g. set "DT=dtexec")
%DT% /f %SSIS% >> %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
rem Switch to the data directory (cd /d also switches the drive)
cd /d D:\VGIC_DWH\DMS
rem Rename the exported data file, appending the timestamp
rename %Object%.csv %Object%_%batch_number%.csv >> %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
rem Compress to zip format with a password
rem (%ZIP% is assumed to point to the 7-Zip command-line executable 7z.exe)
%ZIP% a %Object%_%batch_number%.zip -p%password% %Object%_%batch_number%.csv >> %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
rem Move the zipped file to the shared folder
move /y %Object%_%batch_number%.zip %DataFolder% >> %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
rem Delete the original CSV file
del %Object%_%batch_number%.csv
echo End of process! Please refer to the log file in %LogFolder% >> %LogFolder%\DMS_Exp_%Object%_%batch_number%.log
Although it is not many lines of batch commands, the script covers quite a few steps. It has been efficient to use so far, and we may keep optimizing it later.