
Commit ccd24a8

ElaineBao authored and wkcn committed
avoid testing relu at the origin due to discontinuous gradient (#16133)

* avoid testing relu at the origin, where its gradient is discontinuous
* retrigger CI
1 parent c5383f7 · commit ccd24a8
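Why the change helps: a central finite difference taken at a point within eps of ReLU's kink straddles the two linear regimes and returns a slope near 0.5, which matches neither analytic gradient value (0 or 1). The following standalone NumPy sketch (not part of the commit; the relu helper and sample points are illustrative) reproduces the effect:

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

eps = 1e-5  # same step size the test passes as numeric_eps
for x in (0.5, 1e-7, 0.0):
    # central difference, as used by finite-difference gradient checkers
    numeric = (relu(x + eps) - relu(x - eps)) / (2 * eps)
    analytic = 1.0 if x > 0 else 0.0
    print(f"x={x}: numeric ~= {numeric:.3f}, analytic = {analytic}")

# Away from the origin the two agree (x=0.5 gives 1.0 vs 1.0); for
# |x| < eps the estimate is ~0.5, so the check fails spuriously even
# though the ReLU implementation under test is correct.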

File tree

1 file changed (+5, -1)


tests/python/mkl/test_mkldnn.py

Lines changed: 5 additions & 1 deletion
@@ -337,13 +337,17 @@ def check_pooling_training(stype):
 def test_activation():
     def check_activation_training(stype):
         for shape in [(2, 3, 3), (2, 3, 2, 2)]:
+            eps = 1e-5
             data_tmp = np.random.normal(-0.1, 1, size=shape)
+            # Avoid finite difference method inaccuracies due to discontinuous gradient at the origin.
+            # Here we replace small problematic inputs with 1.0. Repro issue with seed 851486559.
+            data_tmp[abs(data_tmp) < eps] = 1.0
 
             data = mx.symbol.Variable('data', stype=stype)
             in_location = [mx.nd.array(data_tmp).tostype(stype)]
 
             test = mx.symbol.Activation(data, act_type="relu")
-            check_numeric_gradient(test, in_location, numeric_eps=1e-5, rtol=0.16, atol=1e-4)
+            check_numeric_gradient(test, in_location, numeric_eps=eps, rtol=0.16, atol=1e-4)
 
     stypes = ['row_sparse', 'default']
     for stype in stypes:
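The patch makes the test seed-independent by nudging any input the finite difference could straddle out of the dead zone, reusing the same eps for the mask threshold and for numeric_eps. A minimal self-contained sketch of that masking idea (shape and distribution copied from the diff; runnable with plain NumPy):

import numpy as np

eps = 1e-5
data_tmp = np.random.normal(-0.1, 1, size=(2, 3, 2, 2))
# Replace inputs within eps of the origin with 1.0, a point safely inside
# ReLU's linear region, so every perturbed point x +/- eps used by the
# gradient checker stays on one side of the kink.
data_tmp[abs(data_tmp) < eps] = 1.0
assert (abs(data_tmp) >= eps).all()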
