test_ops_dropout_ext.py::test_func_dropout_normal failed in daily version
The test on the 2.3 branch has been lowered from gate level0 to level1; it must be restored to the gate run once the fix is regression-verified.
Hardware Environment (Ascend/GPU/CPU) (Mandatory):
/device ascend910B
Software Environment (Mandatory):
-- MindSpore version (e.g., 1.7.0.Bxxx) :
-- Python version (e.g., Python 3.7.5) :
-- OS platform and distribution (e.g., Linux Ubuntu 16.04):
-- GCC/Compiler version (if compiled from source):
Execution Mode (PyNative/Graph) (Mandatory):
/mode pynative
/mode graph
[gate failed] tests/st/ops/test_ops_dropout_ext.py::test_func_dropout_normal
[2024-05-06T02:52:21.904Z] [task_id] 9df29dfe0b4f11efa1938c2a8e86bdd7
[2024-05-06T02:52:21.904Z] [collect_file_path] /home/****/sault/sault_component/Angel/run_case_log/9df29dfe0b4f11efa1938c2a8e86bdd7/test_ops_dropout_ext_test_func_dropout_normal_8_92_9_215_5.tar.gz
[2024-05-06T02:52:21.904Z] test_ops_dropout_ext.py (ASCEND_ARM_EULEROS_910B(1P)) :
[2024-05-06T02:52:21.904Z] [Command] echo 9df29dfe0b4f11efa1938c2a8e86bdd7; export SAULT_ENV_TYPE=ASCEND_ARM_EULEROS_910B; source /home/****/sault/sault/3rdparty/config/env_testcase.sh -p mindspore -b master ; cd /home/****/mindspore/testcases/testcases/tests/st/ops ; /usr/local/python/python375/bin/pytest -c /home/****/sault/virtual_test/virtualenv_006/sault/config/pytest.ini -s test_ops_dropout_ext.py::test_func_dropout_normal -m level0 > /home/****/sault/sault_component/yosemite/case_run_log/9df29dfe0b4f11efa1938c2a8e86bdd7/test_ops_dropout_ext_test_func_dropout_normal.log 2>&1 && echo 'sault run mindspore case successful'; echo 'sault run case finished'; export SAULT_ENV_TYPE=ASCEND_ARM_EULEROS_910B; source /home/****/sault/sault/3rdparty/config/env_testcase.sh -p mindspore -b master
[2024-05-06T02:52:21.904Z] [failure_reason] run case fail
[2024-05-06T02:52:21.904Z] ============================= test session starts ==============================
[2024-05-06T02:52:21.904Z] platform linux -- Python 3.7.5, pytest-5.4.3, py-1.11.0, pluggy-0.13.1
[2024-05-06T02:52:21.904Z] ****dir: /home/****/mindspore/testcases/testcases/tests/st/ops, inifile: /home/****/sault/virtual_test/virtualenv_006/sault/config/pytest.ini
[2024-05-06T02:52:21.904Z] plugins: mock-3.11.1, xdist-1.32.0, forked-1.6.0
[2024-05-06T02:52:21.904Z] collected 2 items
[2024-05-06T02:52:21.904Z]
[2024-05-06T02:52:21.904Z] test_ops_dropout_ext.py F[WARNING] ME(164646:281473326269504,MainProcess):2024-05-06-10:50:35.916.3 [mindspore/context.py:1108] For 'context.set_context' in Ascend backend, the backend is already initialized, please set it before the definition of any Tensor and Parameter, and the instantiation and execution of any operation and net, otherwise the settings may not take effect.
[2024-05-06T02:52:21.904Z] .
[2024-05-06T02:52:21.904Z]
[2024-05-06T02:52:21.904Z] =================================== FAILURES ===================================
[2024-05-06T02:52:21.904Z] _____________________ test_func_dropout_normal[float16-1] ______________________
[2024-05-06T02:52:21.904Z]
[2024-05-06T02:52:21.904Z] context_mode = 1, dtype = <class 'numpy.float16'>
[2024-05-06T02:52:21.904Z]
[2024-05-06T02:52:21.904Z] @pytest.mark.level0
[2024-05-06T02:52:21.904Z] @pytest.mark.env_onecard
[2024-05-06T02:52:21.904Z] @pytest.mark.platform_arm_ascend_training
[2024-05-06T02:52:21.904Z] @pytest.mark.platform_arm_ascend910b_training
[2024-05-06T02:52:21.904Z] @pytest.mark.parametrize('context_mode', [ms.PYNATIVE_MODE])
[2024-05-06T02:52:21.904Z] @pytest.mark.parametrize('dtype', [np.float16, np.float32])
[2024-05-06T02:52:21.904Z] def test_func_dropout_normal(context_mode, dtype):
[2024-05-06T02:52:21.904Z] """
[2024-05-06T02:52:21.904Z] Feature: pyboost function.
[2024-05-06T02:52:21.904Z] Description: test function dropout normal.
[2024-05-06T02:52:21.904Z] Expectation: expect correct result.
[2024-05-06T02:52:21.904Z] """
[2024-05-06T02:52:21.904Z] ms.context.set_context(mode=context_mode)
[2024-05-06T02:52:21.904Z] if context_mode == ms.GRAPH_MODE:
[2024-05-06T02:52:21.904Z] os.environ['GRAPH_OP_RUN'] = "1"
[2024-05-06T02:52:21.904Z] x = generate_random_input((128, 128), dtype)
[2024-05-06T02:52:21.904Z] p = 0.4
[2024-05-06T02:52:21.904Z] output = dropout_forward_func(ms.Tensor(x), p)
[2024-05-06T02:52:21.904Z] compare_output(x, p, output)
[2024-05-06T02:52:21.904Z]
[2024-05-06T02:52:21.904Z] x1 = generate_random_input((64, 64), dtype)
[2024-05-06T02:52:21.904Z] p1 = 0.3
[2024-05-06T02:52:21.904Z] grad = dropout_backward_func(ms.Tensor(x1), p1)
[2024-05-06T02:52:21.905Z] > compare_grad(x1, p1, grad)
[2024-05-06T02:52:21.905Z]
[2024-05-06T02:52:21.905Z] test_ops_dropout_ext.py:96:
[2024-05-06T02:52:21.905Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2024-05-06T02:52:21.905Z]
[2024-05-06T02:52:21.905Z] x = array([[0.833 , 0.3357 , 0.01714, ..., 0.8867 , 0.951 , 0.0835 ],
[2024-05-06T02:52:21.905Z] [0.11725, 0.7495 , 0.0621 , ..., 0.7593 , 0..., 0.5146 , 0.782 , 0.4685 ],
[2024-05-06T02:52:21.905Z] [0.01187, 0.3403 , 0.334 , ..., 0.9272 , 0.537 , 0.772 ]],
[2024-05-06T02:52:21.905Z] dtype=float16)
[2024-05-06T02:52:21.905Z] p = 0.3
[2024-05-06T02:52:21.905Z] grad = Tensor(shape=[64, 64], dtype=Float16, value=
[2024-05-06T02:52:21.905Z] [[ 1.4277e+00, 1.4277e+00, 1.4277e+00 ... 1.4277e+00, 0.0000e+00, 0....00e+00, 0.0000e+00, 1.4277e+00],
[2024-05-06T02:52:21.905Z] [ 1.4277e+00, 1.4277e+00, 0.0000e+00 ... 0.0000e+00, 0.0000e+00, 0.0000e+00]])
[2024-05-06T02:52:21.905Z]
[2024-05-06T02:52:21.905Z] def compare_grad(x, p, grad):
[2024-05-06T02:52:21.905Z] # check grad
[2024-05-06T02:52:21.905Z] keep_prob = 1 - p
[2024-05-06T02:52:21.905Z] if grad.dtype == mstype.bfloat16:
[2024-05-06T02:52:21.905Z] grad_np = grad.astype(mstype.float32).asnumpy()
[2024-05-06T02:52:21.905Z] else:
[2024-05-06T02:52:21.905Z] grad_np = grad.asnumpy()
[2024-05-06T02:52:21.905Z] elem_count = x.size
[2024-05-06T02:52:21.905Z] nonzero_count = np.count_nonzero(grad_np)
[2024-05-06T02:52:21.905Z] assert (elem_count * (keep_prob - 0.1)) < nonzero_count < (elem_count * (keep_prob + 0.1))
[2024-05-06T02:52:21.905Z] grad_sum = np.sum(grad_np)
[2024-05-06T02:52:21.905Z] > np.testing.assert_allclose(grad_sum * keep_prob, nonzero_count, rtol=1e-3)
[2024-05-06T02:52:21.905Z] E AssertionError:
[2024-05-06T02:52:21.905Z] E Not equal to tolerance rtol=0.001, atol=0
[2024-05-06T02:52:21.905Z] E
[2024-05-06T02:52:21.905Z] E Mismatched elements: 1 / 1 (100%)
[2024-05-06T02:52:21.905Z] E Max absolute difference: 3.
[2024-05-06T02:52:21.905Z] E Max relative difference: 0.00103914
[2024-05-06T02:52:21.905Z] E x: array(2884.)
[2024-05-06T02:52:21.905Z] E y: array(2887)
[2024-05-06T02:52:21.905Z]
[2024-05-06T02:52:21.905Z] test_ops_dropout_ext.py:70: AssertionError
[2024-05-06T02:52:21.905Z] =============================== warnings summary ===============================
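For reference, the failure is marginal. The per-element grad value is stored as 1.4277 in float16 (just below 1/0.7 ≈ 1.42857), and `np.sum` over a float16 array also accumulates in float16, so `grad_sum * keep_prob` can drift just past the fixed `rtol=1e-3`. A minimal NumPy sketch of the two error sources, plugging in the counts from the log above:

```python
import numpy as np

keep_prob = 0.7                      # p1 = 0.3 in the failing test
scale = np.float16(1.4277)           # per-element grad value seen in the log
nonzero_count = 2873                 # nonzero elements reported by the log

# Error source 1: float16 quantization of 1/keep_prob.
# Even with exact summation, the product misses nonzero_count by ~5.9e-4,
# which already consumes over half of the rtol=1e-3 budget.
exact_product = nonzero_count * float(scale) * keep_prob
rel_quant = abs(exact_product - nonzero_count) / nonzero_count

# Error source 2: summing ~2.9k float16 values with a float16 accumulator
# adds further rounding on top of the quantization bias.
grad = np.full(nonzero_count, scale, dtype=np.float16)
sum_f16 = float(grad.sum())                      # float16 accumulator
sum_f64 = float(np.sum(grad, dtype=np.float64))  # exact reference

print(rel_quant)            # ~5.9e-4 relative error from quantization alone
print(sum_f16 != sum_f64)   # True: float16 accumulation adds the rest
```

Together the two effects can push the relative mismatch past 1e-3 on some seeds, which matches the observed 0.00103914 and 0.0010442 deviations.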
#68865 https://build.mindspore.cn/blue/rest/organizations/jenkins/pipelines/Gitee_Gate_Compile_And_Test/runs/123940/nodes/170/log
#69005 https://build.mindspore.cn/blue/rest/organizations/jenkins/pipelines/Gitee_Gate_Compile_And_Test/runs/123942/nodes/170/log
#69082 https://build.mindspore.cn/blue/rest/organizations/jenkins/pipelines/Gitee_Gate_Compile_And_Test/runs/124099/nodes/170/log
Thanks for your question. You can comment //mindspore-assistant to get help faster:
The same random failure also occurs on the master branch:
https://build.mindspore.cn/blue/organizations/jenkins/Gitee_Gate_Compile_And_Test/detail/Gitee_Gate_Compile_And_Test/134980/pipeline/759
[2024-05-20T10:05:18.881Z] test_ops_dropout_ext.py F.
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] =================================== FAILURES ===================================
[2024-05-20T10:05:18.881Z] _____________________ test_func_dropout_normal[float16-1] ______________________
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] context_mode = 1, dtype = <class 'numpy.float16'>
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] @pytest.mark.level0
[2024-05-20T10:05:18.881Z] @pytest.mark.env_onecard
[2024-05-20T10:05:18.881Z] @pytest.mark.platform_arm_ascend_training
[2024-05-20T10:05:18.881Z] @pytest.mark.platform_arm_ascend910b_training
[2024-05-20T10:05:18.881Z] @pytest.mark.parametrize('context_mode', [ms.PYNATIVE_MODE])
[2024-05-20T10:05:18.881Z] @pytest.mark.parametrize('dtype', [np.float16, np.float32])
[2024-05-20T10:05:18.881Z] def test_func_dropout_normal(context_mode, dtype):
[2024-05-20T10:05:18.881Z] """
[2024-05-20T10:05:18.881Z] Feature: pyboost function.
[2024-05-20T10:05:18.881Z] Description: test function dropout normal.
[2024-05-20T10:05:18.881Z] Expectation: expect correct result.
[2024-05-20T10:05:18.881Z] """
[2024-05-20T10:05:18.881Z] ms.context.set_context(mode=context_mode)
[2024-05-20T10:05:18.881Z] if context_mode == ms.GRAPH_MODE:
[2024-05-20T10:05:18.881Z] os.environ['GRAPH_OP_RUN'] = "1"
[2024-05-20T10:05:18.881Z] x = generate_random_input((128, 128), dtype)
[2024-05-20T10:05:18.881Z] p = 0.4
[2024-05-20T10:05:18.881Z] output = dropout_forward_func(ms.Tensor(x), p)
[2024-05-20T10:05:18.881Z] compare_output(x, p, output)
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] x1 = generate_random_input((64, 64), dtype)
[2024-05-20T10:05:18.881Z] p1 = 0.3
[2024-05-20T10:05:18.881Z] grad = dropout_backward_func(ms.Tensor(x1), p1)
[2024-05-20T10:05:18.881Z] > compare_grad(x1, p1, grad)
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] test_ops_dropout_ext.py:96:
[2024-05-20T10:05:18.881Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] x = array([[0.459 , 0.4756 , 0.4907 , ..., 0.93 , 0.6313 , 0.9336 ],
[2024-05-20T10:05:18.881Z] [0.3662 , 0.9995 , 0.09283, ..., 0.8257 , 0..., 0.0631 , 0.03876, 0.3381 ],
[2024-05-20T10:05:18.881Z] [0.832 , 0.3032 , 0.536 , ..., 0.3079 , 0.1775 , 0.8833 ]],
[2024-05-20T10:05:18.881Z] dtype=float16)
[2024-05-20T10:05:18.881Z] p = 0.3
[2024-05-20T10:05:18.881Z] grad = Tensor(shape=[64, 64], dtype=Float16, value=
[2024-05-20T10:05:18.881Z] [[ 1.4277e+00, 0.0000e+00, 1.4277e+00 ... 1.4277e+00, 0.0000e+00, 0....77e+00, 1.4277e+00, 1.4277e+00],
[2024-05-20T10:05:18.881Z] [ 1.4277e+00, 1.4277e+00, 1.4277e+00 ... 1.4277e+00, 0.0000e+00, 0.0000e+00]])
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] def compare_grad(x, p, grad):
[2024-05-20T10:05:18.881Z] # check grad
[2024-05-20T10:05:18.881Z] keep_prob = 1 - p
[2024-05-20T10:05:18.881Z] if grad.dtype == mstype.bfloat16:
[2024-05-20T10:05:18.881Z] grad_np = grad.astype(mstype.float32).asnumpy()
[2024-05-20T10:05:18.881Z] else:
[2024-05-20T10:05:18.881Z] grad_np = grad.asnumpy()
[2024-05-20T10:05:18.881Z] elem_count = x.size
[2024-05-20T10:05:18.881Z] nonzero_count = np.count_nonzero(grad_np)
[2024-05-20T10:05:18.881Z] assert (elem_count * (keep_prob - 0.1)) < nonzero_count < (elem_count * (keep_prob + 0.1))
[2024-05-20T10:05:18.881Z] grad_sum = np.sum(grad_np)
[2024-05-20T10:05:18.881Z] > np.testing.assert_allclose(grad_sum * keep_prob, nonzero_count, rtol=1e-3)
[2024-05-20T10:05:18.881Z] E AssertionError:
[2024-05-20T10:05:18.881Z] E Not equal to tolerance rtol=0.001, atol=0
[2024-05-20T10:05:18.881Z] E
[2024-05-20T10:05:18.881Z] E Mismatched elements: 1 / 1 (100%)
[2024-05-20T10:05:18.881Z] E Max absolute difference: 3.
[2024-05-20T10:05:18.881Z] E Max relative difference: 0.0010442
[2024-05-20T10:05:18.881Z] E x: array(2870.)
[2024-05-20T10:05:18.881Z] E y: array(2873)
[2024-05-20T10:05:18.881Z]
[2024-05-20T10:05:18.881Z] test_ops_dropout_ext.py:70: AssertionError
[2024-05-20T10:05:18.881Z] =============================== warnings summary ===============================
[2024-05-20T10:05:18.881Z] /home/****/.local/lib/python3.7/site-packages/mindspore/ops/_op_impl/_custom_op/batchnorm
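One way to remove the flakiness (a hedged sketch, not the project's actual fix): accumulate the gradient sum in float64 and derive the count tolerance from the binomial statistics of the keep mask, leaving headroom for the float16 quantization of the 1/keep_prob scale. The helper name `compare_grad_robust` is hypothetical:

```python
import numpy as np

def compare_grad_robust(x, p, grad_np):
    """Statistically tolerant version of the compare_grad check in the log."""
    keep_prob = 1.0 - p
    elem_count = x.size
    nonzero_count = np.count_nonzero(grad_np)

    # The keep mask is Bernoulli(keep_prob) per element, so the nonzero
    # count has std sqrt(n * p * (1-p)); use a 4-sigma band instead of
    # the fixed +/- 0.1 * elem_count window.
    std = np.sqrt(elem_count * keep_prob * (1.0 - keep_prob))
    assert abs(nonzero_count - elem_count * keep_prob) < 4.0 * std

    # Sum in float64 so float16 accumulation error cannot dominate, and
    # widen rtol to cover float16 quantization of the 1/keep_prob scale.
    grad_sum = np.sum(grad_np, dtype=np.float64)
    np.testing.assert_allclose(grad_sum * keep_prob, nonzero_count, rtol=2e-3)
```

With the float64 accumulator, the only residual error is the ~6e-4 quantization of the stored scale, which sits comfortably inside `rtol=2e-3`.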