kd Tree Search for the k-Nearest-Neighbor Method

By jhttroy | Published 2018-09-05 18:04

I've been reading Li Hang's 《统计学习方法》 (Statistical Learning Methods). Chapter 3 covers the k-nearest-neighbor algorithm, where a linear scan over the training set gets expensive when N >> k; organizing the training data as a kd Tree makes the search much more efficient. I came across a blog post whose illustrated explanation of the idea is quite good, but its code has problems, including:

  • The overall control flow of the code is wrong
  • Nodes that have already been examined are never marked, so they get re-visited (the search walks back over its own path), which defeats the whole purpose of using a kd Tree

So, starting from that code, I produced a corrected version, with some debug output added:

  • Fixed the search logic
  • Cleaned up the code
  • Made it Python 3 compatible
  • Added an up_traced flag so the search never retraces its upward path; repeated searches are supported, with the flag cleared before each new query
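For reference, brute-force scanning of the sample data used below gives [6 6] (distance 2.0) as the nearest neighbor of target [6, 4], and [3 2] (distance 1.0) for target [2, 2]; the kd Tree search should agree, and the demo at the end prints the brute-force distances for comparison.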
# -*- coding: utf-8 -*-

import numpy as np


class Node:
    def __init__(self, data, parent, dim):
        self.data = data
        self.parent = parent
        self.lChild = None
        self.rChild = None
        self.dim = dim
        # marks a node already examined on the upward pass;
        # cleared again before/after each search
        self.up_traced = False

    def setLChild(self, lChild):
        self.lChild = lChild

    def setRChild(self, rChild):
        self.rChild = rChild


class KdTree:
    def __init__(self, train):
        self.root = self.__build(train, 1, None)

    def __build(self, train, depth, parent):  # build the tree recursively
        (m, k) = train.shape

        if m == 0:
            return None

        # sort by the current split dimension; the median point becomes the subtree root
        train = train[train[:, depth % k].argsort()]

        root = Node(train[m//2], parent, depth % k)
        root.setLChild(self.__build(train[:m//2, :], depth+1, root))
        root.setRChild(self.__build(train[m//2+1:, :], depth+1, root))
        return root

    def findNearestPointAndDistance(self, point):  # find the training point nearest to point
        point = np.array(point)
        node = self.__findSmallestSubSpace(point, self.root)
        print("Start node:", node.data)
        return self.__searchUp(point, node, node, np.linalg.norm(point - node.data))

    def __searchUp(self, point, node, nearestPoint, nearestDistance):
        if node.parent is None:
            return [nearestPoint, nearestDistance]

        print("UP:", node.parent.data)
        node.parent.up_traced = True
        distance = np.linalg.norm(node.parent.data - point)
        if distance < nearestDistance:
            nearestDistance = distance
            nearestPoint = node.parent

        # distance from the query point to the parent's splitting hyperplane;
        # both indices must use the parent's split dimension
        distance = np.abs(node.parent.data[node.parent.dim] - point[node.parent.dim])
        if distance < nearestDistance:
            [p, d] = self.__searchDown(point, node.parent)
            if d < nearestDistance:
                nearestDistance = d
                nearestPoint = p

        [p, d] = self.__searchUp(point, node.parent, nearestPoint, nearestDistance)
        if d < nearestDistance:
            nearestDistance = d
            nearestPoint = p

        # this level is done: clear the flag on the way back down, so the next
        # query starts clean even on branches its descent will not pass through
        node.parent.up_traced = False
        return [nearestPoint, nearestDistance]

    def __searchDown(self, point, node):
        nearestDistance = np.linalg.norm(node.data - point)
        nearestPoint = node

        print("DOWN:", node.data)
        if node.lChild is not None and node.lChild.up_traced is False:
            [p, d] = self.__searchDown(point, node.lChild)
            if d < nearestDistance:
                nearestDistance = d
                nearestPoint = p

        if node.rChild is not None and node.rChild.up_traced is False:
            [p, d] = self.__searchDown(point, node.rChild)
            if d < nearestDistance:
                nearestDistance = d
                nearestPoint = p

        print("---- ", nearestPoint.data, nearestDistance)
        return [nearestPoint, nearestDistance]

    def __findSmallestSubSpace(self, point, node):  # descend to the leaf whose cell contains point
        """
        Starting from the root, recursively move down the kd tree: if the
        point's coordinate in the current split dimension is smaller than the
        split point's, go to the left child, otherwise to the right child,
        until a leaf is reached. If the preferred side has no child, follow
        the other one, so the descent always ends at a true leaf.
        """
        # new search: clear the up_traced flag along the descent path
        node.up_traced = False
        if point[node.dim] < node.data[node.dim]:
            nextNode = node.lChild if node.lChild is not None else node.rChild
        else:
            nextNode = node.rChild if node.rChild is not None else node.lChild

        if nextNode is None:
            return node
        return self.__findSmallestSubSpace(point, nextNode)


# train = np.array([[2, 3], [5, 4], [9, 6], [4, 7], [8, 1], [7, 2]])  # alternative sample data
train = np.array([[2, 5], [3, 2], [3, 7], [8, 3], [6, 6], [1, 1], [1, 8]])
kdTree = KdTree(train)
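# For this data set the build above produces the following tree
# (split dimension in parentheses; the root splits on y because
# depth starts at 1):
#
#              [2 5] (y)
#             /         \
#       [3 2] (x)     [3 7] (x)
#       /      \       /      \
#   [1 1]   [8 3]   [1 8]   [6 6]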


target = np.array([6, 4])
print('##### target :', target)
[p, d] = kdTree.findNearestPointAndDistance(target)

print(p.data, d)
print('---------------------')

(m, k) = train.shape
for i in range(m):
    print(train[i], np.linalg.norm(train[i]-target))


target = np.array([2, 2])
print('')
print('##### target :', target)
[p, d] = kdTree.findNearestPointAndDistance(target)

print(p.data, d)
print('---------------------')

for i in range(m):
    print(train[i], np.linalg.norm(train[i]-target))
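
As a further sanity check (my addition, not part of the referenced post), the kd Tree result can be compared against a brute-force scan on random data; running several queries in a row also exercises the up_traced clean-up between searches. A minimal sketch, assuming the KdTree class above is in scope and numpy is imported; the scipy cross-check runs only if scipy happens to be installed:

rng = np.random.default_rng(0)
pts = rng.integers(0, 100, size=(30, 2))
tree = KdTree(pts)

for _ in range(5):
    q = rng.integers(0, 100, size=2)
    [p, d] = tree.findNearestPointAndDistance(q)
    brute = np.min(np.linalg.norm(pts - q, axis=1))
    assert np.isclose(d, brute), (q, d, brute)

try:
    from scipy.spatial import cKDTree  # optional independent reference
    ref_d, ref_i = cKDTree(pts).query(q)
    assert np.isclose(d, ref_d)
except ImportError:
    pass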
