import tensorflow as tf

with tf.GradientTape() as tape:
    L = tf.reduce_sum(tf.square(tf.matmul(X, w) + b - y))  # sum of squared errors over all samples
w_grad, b_grad = tape.gradient(L, [w, b])  # dL/dw and dL/db
Since L involves a sum (tf.reduce_sum), how does tape.gradient compute the partial derivatives here?
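For intuition: the sum simply adds the per-sample squared errors, and differentiation is linear, so the gradient of the sum is the sum of the per-sample gradients, giving the closed form dL/dw = 2·Xᵀ(Xw + b − y) and dL/db = 2·Σᵢ(xᵢ·w + b − yᵢ). A minimal NumPy sketch (toy data; the shapes of X, w, b, y are assumptions chosen to mirror the snippet) checks this closed form against a finite-difference approximation:

```python
import numpy as np

# Toy data mirroring the snippet: X is (n, d), w is (d, 1), y is (n, 1).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
w = rng.normal(size=(3, 1))
b = 0.5
y = rng.normal(size=(5, 1))

def L(w):
    # L = sum_i (x_i . w + b - y_i)^2  -- the same loss as in the snippet
    return np.sum((X @ w + b - y) ** 2)

# Closed-form gradient: dL/dw = 2 X^T (Xw + b - y),
# i.e. the sum over samples of each per-sample gradient.
analytic = 2 * X.T @ (X @ w + b - y)

# Central finite-difference check of the first entry of the gradient.
eps = 1e-6
e0 = np.zeros_like(w)
e0[0, 0] = eps
numeric = (L(w + e0) - L(w - e0)) / (2 * eps)

print(np.isclose(numeric, analytic[0, 0]))  # the two estimates should agree
```

This is the same rule autodiff applies: `tf.reduce_sum` backpropagates a gradient of 1 to every summand, so each sample's contribution is differentiated independently and the contributions add up.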