
DGA Domain Recognition (Part 1): Vectorized Representation

Author: 西域记 | Published 2018-11-13 15:57

Background

In network security, many botnets use DGA (domain generation algorithm) techniques to keep their connection to the C&C server alive while hiding its domain name: each bot continuously resolves large numbers of algorithmically generated domains, burying the live C&C domain among them to evade blacklist/whitelist mechanisms. The resulting security problem is how to pick out the DGA domains from a large volume of domain resolution records. Many security teams tackle this with machine learning, and the first step of any such pipeline is to turn each domain string into a vector.

- Whitelist: the top one million domains in the Alexa ranking.
- Blacklist: DGA domains published by 360netlab.

1. CountVectorizer() Vectorization

Define a CountVectorizer() as follows:

from sklearn.feature_extraction.text import CountVectorizer

# token_pattern=r'\w' makes every word character its own token, so
# ngram_range=(2, 4) produces character-level 2- to 4-grams such as 'g o'.
CV = CountVectorizer(ngram_range=(2, 4),
                     token_pattern=r'\w',
                     decode_error='ignore',
                     strip_accents='ascii',
                     stop_words='english',
                     max_df=1.0,
                     min_df=1)
x = load_alexa()
url = CV.fit_transform(x)
print(CV.vocabulary_)
print(len(CV.vocabulary_))
n = url.shape[0]  # number of domains; don't shadow the builtin len()
for i in range(n):
    print("the url is: {} , and the vector is: {}".format(x[i], url[i].toarray()))
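The helper load_alexa() is not shown in the post. A minimal sketch, assuming the usual Alexa top-1m.csv format of rank,domain rows; the file name and the number of domains loaded here are assumptions:

import csv

def load_alexa(path='top-1m.csv', num=1000):
    # Each row of the Alexa ranking file has the form "rank,domain".
    domains = []
    with open(path) as f:
        for row in csv.reader(f):
            domains.append(row[1])
            if len(domains) >= num:
                break
    print("LOAD ALEXA")  # matches the log line in the outputs below
    return domains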

With no max_features specified and the n-gram range set to a minimum of 2 and a maximum of 4, the run produces a vocabulary of 34,972 terms. A sample of the resulting vectors:

the url is: google.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: youtube.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: facebook.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: baidu.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: wikipedia.org , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: yahoo.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: reddit.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.co.in , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: qq.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: twitter.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: taobao.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: amazon.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.co.jp , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: sohu.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: live.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: tmall.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: vk.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: instagram.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: sina.com.cn , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: 360.cn , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.de , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: jd.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.co.uk , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: linkedin.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: weibo.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.fr , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.ru , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: yandex.ru , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.com.br , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: yahoo.co.jp , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: netflix.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.com.hk , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: t.co , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: imgur.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: hao123.com , and the vector is: [[0 0 0 ... 0 0 0]]
the url is: google.it , and the vector is: [[0 0 0 ... 0 0 0]]

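The vectors above print as all zeros only because NumPy truncates the display: each domain contributes a couple of dozen character n-grams out of 34,972 dimensions, so almost every entry is zero. To see which n-grams a domain actually yields, the vectorizer's analyzer can be called directly (a sketch reusing CV from above; build_analyzer() is a standard scikit-learn vectorizer method):

analyze = CV.build_analyzer()
print(analyze('google.com'))
# ['g o', 'o o', 'o g', 'g l', 'l e', 'e c', 'c o', 'o m',
#  'g o o', 'o o g', 'o g l', 'g l e', 'l e c', 'e c o', 'c o m',
#  'g o o g', 'o o g l', 'o g l e', 'g l e c', 'l e c o', 'e c o m']

Each of these n-grams maps to one column of the output matrix via CV.vocabulary_.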
As the output shows, without max_features the dimensionality is so large that the vectors cannot be displayed in full. Set max_features=30:
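Only one parameter changes; a minimal sketch of the adjusted definition (mirroring where the TfidfVectorizer definition in section 2 places max_features):

CV = CountVectorizer(ngram_range=(2, 4),
                     token_pattern=r'\w',
                     decode_error='ignore',
                     strip_accents='ascii',
                     max_features=30,  # keep only the 30 most frequent n-grams
                     stop_words='english',
                     max_df=1.0,
                     min_df=1)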

LOAD ALEXA
{'b o': 0, 'u c o': 28, 'o o g l': 26, 'g o': 10, 'o g l e': 21, 'g l e c': 9, 'o m': 22, 'g l e': 8, 'e c o m': 5, 'u c': 27, 'e c o': 4, 'e d': 6, 'g l': 7, 'o n': 23, 'o o': 24, 'o g l': 20, 'o o g': 25, 'o c o': 17, 'c o': 1, 'o g': 19, 'l e c': 14, 'c o m': 2, 'g o o g': 12, 'e c': 3, 'l e': 13, 'u c o m': 29, 'o c o m': 18, 'g o o': 11, 'l e c o': 15, 'o c': 16}
30
the url is: google.com , and the vector is: [[0 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 1 0 1 1 1 0 0 0]]
the url is: youtube.com , and the vector is: [[0 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0]]
the url is: facebook.com , and the vector is: [[1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0]]
the url is: baidu.com , and the vector is: [[0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 1]]
the url is: wikipedia.org , and the vector is: [[0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]]
the url is: yahoo.com , and the vector is: [[0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 1 0 1 0 0 0 0 0]]
the url is: reddit.com , and the vector is: [[0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0]]
the url is: google.co.in , and the vector is: [[0 1 0 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 0 1 1 1 1 0 0 0]]
the url is: qq.com , and the vector is: [[0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0]]
the url is: twitter.com , and the vector is: [[0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0]]
the url is: taobao.com , and the vector is: [[1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0]]
the url is: amazon.com , and the vector is: [[0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0]]
the url is: google.co.jp , and the vector is: [[0 1 0 1 1 0 0 1 1 1 1 1 1 1 1 1 0 0 0 1 1 1 0 0 1 1 1 0 0 0]]
the url is: sohu.com , and the vector is: [[0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 1 1]]

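To read these 30-dimensional vectors, invert CV.vocabulary_ (n-gram → column index) and look up the nonzero columns; a sketch, assuming x[0] is google.com as in the listing above:

index_to_ngram = {i: ng for ng, i in CV.vocabulary_.items()}
vec = url[0].toarray()[0]  # count vector for google.com
print([index_to_ngram[i] for i, c in enumerate(vec) if c > 0])
# ['c o', 'c o m', 'e c', 'e c o', 'e c o m', 'g l', 'g l e', ...]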
2. TfidfVectorizer() Vectorization

Define a TfidfVectorizer() as follows:

from sklearn.feature_extraction.text import TfidfVectorizer

TV = TfidfVectorizer(ngram_range=(2, 4),
                     token_pattern=r'\w',
                     decode_error='ignore',
                     strip_accents='ascii',
                     max_features=30,  # same 30-dimensional feature space as above
                     stop_words='english',
                     max_df=1.0,
                     min_df=1)
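The transform step is not repeated in the post; presumably it mirrors section 1. A minimal sketch:

x = load_alexa()
url = TV.fit_transform(x)
print(TV.vocabulary_)
print(len(TV.vocabulary_))
for i in range(url.shape[0]):
    print("the url is: {} , and the vector is: {}".format(x[i], url[i].toarray()))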

Read in the same file; the results are as follows. (The UserWarning is a side effect of combining stop_words='english' with single-character tokens: tokenizing the stop words yields letters that are not themselves in the stop list.)

LOAD ALEXA
D:\Program Files\Anaconda3\lib\site-packages\sklearn\feature_extraction\text.py:286: UserWarning: Your stop_words may be inconsistent with your preprocessing. Tokenizing the stop words generated tokens ['b', 'c', 'd', 'e', 'f', 'g', 'h', 'k', 'l', 'm', 'n', 'o', 'p', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y'] not in stop_words.
  sorted(inconsistent))
{'g l': 7, 'e c': 3, 'o g l': 20, 'o m': 22, 'g o': 10, 'u c o m': 29, 'o c o m': 18, 'o n': 23, 'o g': 19, 'e c o m': 5, 'u c o': 28, 'o o': 24, 'o o g': 25, 'l e c o': 15, 'o c o': 17, 'l e c': 14, 'e d': 6, 'e c o': 4, 'o g l e': 21, 'c o': 1, 'g l e c': 9, 'o c': 16, 'l e': 13, 'g o o': 11, 'c o m': 2, 'u c': 27, 'b o': 0, 'g o o g': 12, 'o o g l': 26, 'g l e': 8}
30
the url is: google.com , and the vector is: [[0.         0.10749632 0.12299748 0.2110333  0.2110333  0.26240116
  0.         0.23347228 0.23347228 0.23347228 0.23347228 0.23347228
  0.23347228 0.23347228 0.23347228 0.23347228 0.         0.
  0.         0.23347228 0.23347228 0.23347228 0.12299748 0.
  0.19269932 0.23347228 0.23347228 0.         0.         0.        ]]
the url is: youtube.com , and the vector is: [[0.         0.24052746 0.27521195 0.47219574 0.47219574 0.58713344
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.27521195 0.
  0.         0.         0.         0.         0.         0.        ]]
the url is: facebook.com , and the vector is: [[0.68254156 0.27961273 0.31993338 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.31993338 0.
  0.50123747 0.         0.         0.         0.         0.        ]]
the url is: baidu.com , and the vector is: [[0.         0.21569465 0.2467982  0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.2467982  0.
  0.         0.         0.         0.52651594 0.52651594 0.52651594]]
the url is: wikipedia.org , and the vector is: [[0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.
  0. 0. 0. 0. 0. 0.]]
the url is: yahoo.com , and the vector is: [[0.         0.20117971 0.23019019 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.49108463 0.49108463
  0.49108463 0.         0.         0.         0.23019019 0.
  0.36063741 0.         0.         0.         0.         0.        ]]
the url is: reddit.com , and the vector is: [[0.         0.32313599 0.36973278 0.         0.         0.
  0.78878291 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.36973278 0.
  0.         0.         0.         0.         0.         0.        ]]
the url is: google.co.in , and the vector is: [[0.         0.10916042 0.         0.21430022 0.21430022 0.
  0.         0.23708657 0.23708657 0.23708657 0.23708657 0.23708657
  0.23708657 0.23708657 0.23708657 0.23708657 0.         0.
  0.         0.23708657 0.23708657 0.23708657 0.         0.26646328
  0.19568241 0.23708657 0.23708657 0.         0.         0.        ]]
the url is: qq.com , and the vector is: [[0.         0.52570485 0.60151243 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.60151243 0.
  0.         0.         0.         0.         0.         0.        ]]
the url is: twitter.com , and the vector is: [[0.         0.52570485 0.60151243 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.60151243 0.
  0.         0.         0.         0.         0.         0.        ]]
the url is: taobao.com , and the vector is: [[0.46588511 0.19085638 0.21837821 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.46588511 0.46588511
  0.46588511 0.         0.         0.         0.21837821 0.
  0.         0.         0.         0.         0.         0.        ]]
the url is: amazon.com , and the vector is: [[0.         0.32313599 0.36973278 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.36973278 0.78878291
  0.         0.         0.         0.         0.         0.        ]]
the url is: google.co.jp , and the vector is: [[0.         0.11325515 0.         0.22233886 0.22233886 0.
  0.         0.24597995 0.24597995 0.24597995 0.24597995 0.24597995
  0.24597995 0.24597995 0.24597995 0.24597995 0.         0.
  0.         0.24597995 0.24597995 0.24597995 0.         0.
  0.20302268 0.24597995 0.24597995 0.         0.         0.        ]]
the url is: sohu.com , and the vector is: [[0.         0.21569465 0.2467982  0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.2467982  0.
  0.         0.         0.         0.52651594 0.52651594 0.52651594]]
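One detail worth noting: TfidfVectorizer L2-normalizes each row by default (norm='l2'). That is why wikipedia.org, whose only n-gram among the 30 selected features is 'e d' (column 6), gets exactly 1.0 there, and why no entry can exceed 1.0. A quick check (a sketch reusing url from above):

import numpy as np

row = url[4].toarray()[0]  # wikipedia.org, the fifth domain in the listing
print(np.linalg.norm(row))  # -> 1.0; every row has unit Euclidean length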
