Course 1, Week 2 Assignment

Author: 安于此生__ | Published 2018-04-27 11:06

1 - Building basic functions with numpy

1.1 - sigmoid function, np.exp()

math.exp() only works on a single number, while np.exp() can be applied element-wise to an entire numpy array (vector or matrix).
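
A minimal sketch of the difference (the test values below are just illustrative):

import math
import numpy as np

# math.exp only accepts a single number
print(math.exp(1))            # 2.718281828459045

# np.exp applies the exponential element-wise to a whole array
x = np.array([1, 2, 3])
print(np.exp(x))              # [ 2.71828183  7.3890561  20.08553692]

# so a sigmoid that works on vectors is written with np.exp
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

print(sigmoid(x))             # [0.73105858 0.88079708 0.95257413]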

1.2 - Sigmoid gradient

# GRADED FUNCTION: sigmoid_derivative

def sigmoid_derivative(x):
    """
    Compute the gradient (also called the slope or derivative) of the sigmoid function with respect to its input x.
    You can store the output of the sigmoid function into variables and then use it to calculate the gradient.
    
    Arguments:
    x -- A scalar or numpy array

    Return:
    ds -- Your computed gradient.
    """
    
    ### START CODE HERE ### (≈ 2 lines of code)
    s = 1 / (1 + np.exp(-x))    # sigmoid(x)
    ds = s * (1 - s)            # its derivative: sigmoid(x) * (1 - sigmoid(x))
    ### END CODE HERE ###
    
    return ds
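
A quick check on a small array (the test vector here is just an example; since sigmoid'(x) = s(1 - s), the values follow directly from the sigmoid outputs):

x = np.array([1, 2, 3])
print("sigmoid_derivative(x) = " + str(sigmoid_derivative(x)))
# sigmoid_derivative(x) = [0.19661193 0.10499359 0.04517666]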

1.3 - Reshaping arrays

For example, in computer science, an image is represented by a 3D array of shape (length, height, depth = 3). However, when you read an image as the input of an algorithm, you convert it to a vector of shape (length*height*3, 1).
If you would like to reshape an array v of shape (a, b, c) into an array of shape (a*b, c), you would do:

v = v.reshape((v.shape[0] * v.shape[1], v.shape[2]))  # v.shape[0] = a ; v.shape[1] = b ; v.shape[2] = c
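
A small sketch of that (a, b, c) → (a*b, c) reshape with made-up dimensions, just to see the shapes:

import numpy as np

v = np.zeros((2, 3, 4))                                # shape (a, b, c) = (2, 3, 4)
v = v.reshape((v.shape[0] * v.shape[1], v.shape[2]))
print(v.shape)                                         # (6, 4)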

# GRADED FUNCTION: image2vector
def image2vector(image):
    """
    Argument:
    image -- a numpy array of shape (length, height, depth)
    
    Returns:
    v -- a vector of shape (length*height*depth, 1)
    """
    
    ### START CODE HERE ### (≈ 1 line of code)
    v = image.reshape(image.shape[0] * image.shape[1] * image.shape[2], 1)
    ### END CODE HERE ###
    
    return v
# This is a 3 by 3 by 2 array, typically images will be (num_px_x, num_px_y,3) where 3 represents the RGB values
image = np.array([[[ 0.67826139,  0.29380381],
        [ 0.90714982,  0.52835647],
        [ 0.4215251 ,  0.45017551]],

       [[ 0.92814219,  0.96677647],
        [ 0.85304703,  0.52351845],
        [ 0.19981397,  0.27417313]],

       [[ 0.60659855,  0.00533165],
        [ 0.10820313,  0.49978937],
        [ 0.34144279,  0.94630077]]])

print ("image2vector(image) = " + str(image2vector(image)))
result:
image2vector(image) = [[ 0.67826139]
 [ 0.29380381]
 [ 0.90714982]
 [ 0.52835647]
 [ 0.4215251 ]
 [ 0.45017551]
 [ 0.92814219]
 [ 0.96677647]
 [ 0.85304703]
 [ 0.52351845]
 [ 0.19981397]
 [ 0.27417313]
 [ 0.60659855]
 [ 0.00533165]
 [ 0.10820313]
 [ 0.49978937]
 [ 0.34144279]
 [ 0.94630077]]

1.4 - Normalizing rows

# GRADED FUNCTION: normalizeRows

def normalizeRows(x):
    """
    Implement a function that normalizes each row of the matrix x (to have unit length).
    
    Argument:
    x -- A numpy matrix of shape (n, m)
    
    Returns:
    x -- The normalized (by row) numpy matrix. You are allowed to modify x.
    """
    
    ### START CODE HERE ### (≈ 2 lines of code)
    # Compute x_norm as the norm 2 of x. Use np.linalg.norm(..., ord = 2, axis = ..., keepdims = True)
    x_norm = np.linalg.norm(x, axis=1, keepdims=True)  # L2 norm of each row (ord=2 is the default)
    # Divide x by its norm (broadcasting divides each row by its own norm).
    x = x / x_norm
    ### END CODE HERE ###

    return x
Parameter notes: axis=1 takes the norm row by row, and keepdims=True keeps x_norm as an (n, 1) column vector so that the division x / x_norm broadcasts correctly; ord defaults to 2, i.e. the L2 norm.
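
A quick check with a small matrix (the input is just an example; each output row should have unit L2 norm):

x = np.array([[0., 3., 4.],
              [1., 6., 4.]])
print("normalizeRows(x) = " + str(normalizeRows(x)))
# normalizeRows(x) = [[0.         0.6        0.8       ]
#                     [0.13736056 0.82416338 0.54944226]]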

1.5 - Broadcasting and the softmax function

def softmax(x):
    """Calculates the softmax for each row of the input x.

    Your code should work for a row vector and also for matrices of shape (n, m).

    Argument:
    x -- A numpy matrix of shape (n,m)

    Returns:
    s -- A numpy matrix equal to the softmax of x, of shape (n,m)
    """
    
    ### START CODE HERE ### (≈ 3 lines of code)
    # Apply exp() element-wise to x. Use np.exp(...).
    x_exp = np.exp(x)

    # Create a vector x_sum that sums each row of x_exp. Use np.sum(..., axis = 1, keepdims = True).
    x_sum = np.sum(x_exp,axis=1,keepdims = True)
    
    # Compute softmax(x) by dividing x_exp by x_sum. It should automatically use numpy broadcasting.
    s = x_exp/x_sum

    ### END CODE HERE ###
    
    return s
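
A quick sanity check (the input matrix is arbitrary; whatever the values, each row of the result should sum to 1):

x = np.array([[9, 2, 5, 0, 0],
              [7, 5, 0, 0, 0]])
s = softmax(x)
print(s.shape)                  # (2, 5)
print(np.sum(s, axis=1))        # [1. 1.] -- each row sums to 1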

What you need to remember:

  • np.exp(x) works for any np.array x and applies the exponential function to every coordinate
  • the sigmoid function and its gradient
  • image2vector is commonly used in deep learning
  • np.reshape is widely used. In the future, you'll see that keeping your matrix/vector dimensions straight will go toward eliminating a lot of bugs.
  • numpy has efficient built-in functions
  • broadcasting is extremely useful

2) Vectorization

Note that np.dot() performs a matrix-matrix or matrix-vector multiplication. This is different from np.multiply() and the * operator (which is equivalent to .* in Matlab/Octave), both of which perform element-wise multiplication.
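
A minimal sketch of the distinction (the matrices are arbitrary examples):

import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(np.dot(A, B))       # matrix product:        [[19 22] [43 50]]
print(np.multiply(A, B))  # element-wise product:  [[ 5 12] [21 32]]
print(A * B)              # same as np.multiply(A, B)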
