ENH: support additional dtypes in pad_nd #8672
base: dev
Conversation
Prefer the PyTorch padding backend when supported and safely fall back to NumPy on error. Add unit tests to validate backend selection and ensure output dtype is preserved. Signed-off-by: Shubham Chandravanshi <shubham.chandravanshi378@gmail.com>
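For readers skimming the conversation, a minimal sketch of the "prefer PyTorch, fall back to NumPy" pattern the commit message describes. This is not the MONAI implementation (that lives in monai/transforms/croppad/functional.py as pad_nd/_pt_pad/_np_pad and is reviewed below); the helper name and structure here are purely illustrative.

    import numpy as np
    import torch

    def pad_constant_with_fallback(img: torch.Tensor, pad_width, value=0) -> torch.Tensor:
        """Sketch: try the PyTorch padding op first, fall back to NumPy if it fails."""
        # torch.nn.functional.pad takes a flat pad list ordered from the last dimension backwards
        pt_pad = [p for pair in reversed(pad_width) for p in pair]
        try:
            return torch.nn.functional.pad(img, pt_pad, mode="constant", value=value)
        except (ValueError, TypeError, RuntimeError, NotImplementedError):
            # NumPy pads a wider range of dtypes; round-trip through it and restore dtype/device
            out = np.pad(img.cpu().numpy(), pad_width, mode="constant", constant_values=value)
            return torch.as_tensor(out, dtype=img.dtype, device=img.device)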
📝 Walkthrough
This change removes the dtype-based guard that previously prevented PyTorch padding for bool and certain integer dtypes in pad_nd, preferring the PyTorch backend with a NumPy fallback on error.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~12 minutes
Pre-merge checks and finishing touches: ❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
monai/transforms/croppad/functional.py (1)
99-110: Critical: NotImplementedError not caught by except clause.
Line 103 catches (ValueError, TypeError, RuntimeError) but line 104 checks isinstance(err, NotImplementedError). NotImplementedError would propagate uncaught, breaking the fallback mechanism. The test test_pad_falls_back_to_np_if_pt_raises expects this fallback but would fail in real execution.
🔎 Proposed fix

    - except (ValueError, TypeError, RuntimeError) as err:
    + except (ValueError, TypeError, RuntimeError, NotImplementedError) as err:
          if isinstance(err, NotImplementedError) or any(
              k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
          ):
🧹 Nitpick comments (1)
tests/transforms/croppad/test_pad_nd_dtypes.py (1)
49-58: Consider testing additional modes.
Current tests only cover "constant" mode. The updated code supports {"reflect", "edge", "replicate", "wrap", "circular"} via PyTorch. Testing dtype preservation across these modes would strengthen coverage.
Optional enhancement
@pytest.mark.parametrize("mode", ["constant", "reflect", "replicate"])
@pytest.mark.parametrize("dtype", [torch.bool, torch.int8, torch.float32])
def test_pad_modes_with_dtypes(mode, dtype):
    """Test that pad_nd handles various modes and dtypes correctly."""
    img = torch.ones((1, 4, 4), dtype=dtype)
    to_pad = [(0, 0), (1, 1), (2, 2)]
    out = pad_nd(img, to_pad, mode=mode, value=0)
    assert out.shape == (1, 6, 8)
    assert out.dtype == img.dtype
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge base: Disabled due to Reviews -> Disable Knowledge Base setting
📒 Files selected for processing (2)
monai/transforms/croppad/functional.py
tests/transforms/croppad/test_pad_nd_dtypes.py
🧰 Additional context used
📓 Path-based instructions (1)
**/*.py
⚙️ CodeRabbit configuration file
Review the Python code for quality and correctness. Ensure variable names adhere to PEP8 style guides and are sensible and informative in regards to their function, though permitting simple names for loop and comprehension variables. Ensure routine names are meaningful in regards to their function and use verbs, adjectives, and nouns in a semantically appropriate way. Docstrings should be present for all definitions, describing each variable, return value, and raised exception in the appropriate section of the Google style of docstrings. Examine code for logical errors or inconsistencies, and suggest what may be changed to address these. Suggest any enhancements for code improving efficiency, maintainability, comprehensibility, and correctness. Ensure new or modified definitions will be covered by existing or new unit tests.
Files:
monai/transforms/croppad/functional.py
tests/transforms/croppad/test_pad_nd_dtypes.py
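The configuration above asks for Google-style docstrings that document arguments, return values, and raised exceptions; for reference, a generic example of that layout (hypothetical function, not part of this PR):

    def clamp_ratio(value: float, lower: float = 0.0, upper: float = 1.0) -> float:
        """Clamp a ratio to a closed interval.

        Args:
            value: the number to clamp.
            lower: inclusive lower bound of the interval.
            upper: inclusive upper bound of the interval.

        Returns:
            ``value`` limited to the range [lower, upper].

        Raises:
            ValueError: if ``lower`` is greater than ``upper``.
        """
        if lower > upper:
            raise ValueError("lower must not exceed upper")
        return max(lower, min(value, upper))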
🧬 Code graph analysis (2)
monai/transforms/croppad/functional.py (1)
monai/transforms/utils_pytorch_numpy_unification.py (1)
mode(426-438)
tests/transforms/croppad/test_pad_nd_dtypes.py (1)
monai/transforms/croppad/functional.py (3)
pad_nd (73-110)
_pt_pad (59-70)
_np_pad (45-56)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (19)
- GitHub Check: flake8-py3 (pytype)
- GitHub Check: quick-py3 (ubuntu-latest)
- GitHub Check: flake8-py3 (mypy)
- GitHub Check: min-dep-pytorch (2.7.1)
- GitHub Check: quick-py3 (windows-latest)
- GitHub Check: quick-py3 (macOS-latest)
- GitHub Check: min-dep-pytorch (2.5.1)
- GitHub Check: flake8-py3 (codeformat)
- GitHub Check: min-dep-pytorch (2.6.0)
- GitHub Check: build-docs
- GitHub Check: min-dep-py3 (3.10)
- GitHub Check: min-dep-os (ubuntu-latest)
- GitHub Check: min-dep-pytorch (2.8.0)
- GitHub Check: min-dep-os (windows-latest)
- GitHub Check: min-dep-os (macOS-latest)
- GitHub Check: min-dep-py3 (3.12)
- GitHub Check: min-dep-py3 (3.11)
- GitHub Check: packaging
- GitHub Check: min-dep-py3 (3.9)
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import annotations

from unittest.mock import Mock, patch

import pytest
import torch

import monai.transforms.croppad.functional as F
from monai.transforms.croppad.functional import pad_nd
🛠️ Refactor suggestion | 🟠 Major
Add module docstring.
Per coding guidelines, all modules should have docstrings describing their purpose.
🔎 Suggested module docstring
     # limitations under the License.
    +"""
    +Tests for pad_nd dtype support and backend selection.
    +Validates PyTorch padding preference and NumPy fallback behavior.
    +"""
     from __future__ import annotations

As per coding guidelines, docstrings are required for all definitions.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Tests for pad_nd dtype support and backend selection.
Validates PyTorch padding preference and NumPy fallback behavior.
"""
from __future__ import annotations

from unittest.mock import Mock, patch

import pytest
import torch

import monai.transforms.croppad.functional as F
from monai.transforms.croppad.functional import pad_nd
🤖 Prompt for AI Agents
In tests/transforms/croppad/test_pad_nd_dtypes.py lines 1 to 22, the module is
missing a top-level docstring; add a concise module-level docstring at the very
top (immediately after the future import or before any imports if preferred)
that states the test module’s purpose (e.g., verifies pad_nd behavior across
dtypes), any important context, and optionally references related functions
under test; keep it short, one or two sentences, and follow project docstring
style.
def test_pad_uses_pt_for_bool():
    img = torch.ones((1, 4, 4), dtype=torch.bool)
    to_pad = [(0, 0), (1, 1), (2, 2)]
    with patch.object(F, "_pt_pad", wraps=F._pt_pad) as mock_pt, patch.object(F, "_np_pad", wraps=F._np_pad) as mock_np:
        out = pad_nd(img, to_pad, mode="constant", value=0)

    assert mock_pt.called
    assert not mock_np.called
    assert out.dtype == img.dtype
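The with-statement above relies on patch.object(..., wraps=...), which swaps in a mock that records calls while still delegating to the real function, so backend selection can be asserted without changing behavior. A standalone illustration of that mechanism (toy namespace, nothing MONAI-specific):

    from unittest.mock import patch
    import types

    toy = types.SimpleNamespace(double=lambda x: 2 * x)  # stand-in for a module attribute

    with patch.object(toy, "double", wraps=toy.double) as mock_double:
        assert toy.double(3) == 6               # the real callable still runs
        mock_double.assert_called_once_with(3)  # and the call was recorded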
🛠️ Refactor suggestion | 🟠 Major
Add function docstring.
Per coding guidelines, test functions should have docstrings describing what they test.
🔎 Suggested docstring
     def test_pad_uses_pt_for_bool():
    +    """Test that pad_nd uses PyTorch backend for bool dtype in constant mode."""
         img = torch.ones((1, 4, 4), dtype=torch.bool)

As per coding guidelines, docstrings are required for all definitions.
🤖 Prompt for AI Agents
In tests/transforms/croppad/test_pad_nd_dtypes.py around lines 24 to 32, the
test function test_pad_uses_pt_for_bool lacks a docstring; add a one-line
docstring immediately below the def line that succinctly states what the test
verifies (e.g., that padding boolean tensors uses the PyTorch implementation and
preserves dtype) so it follows the project's docstring guideline for test
functions.
def test_pad_falls_back_to_np_if_pt_raises():
    img = torch.ones((1, 4, 4), dtype=torch.bool)
    to_pad = [(0, 0), (1, 1), (2, 2)]
    with (
        patch.object(F, "_pt_pad", new=Mock(side_effect=NotImplementedError("no"))) as mock_pt,
        patch.object(F, "_np_pad", wraps=F._np_pad) as mock_np,
    ):
        out = pad_nd(img, to_pad, mode="constant", value=0)

    assert mock_pt.called
    assert mock_np.called
    assert out.dtype == img.dtype
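In this test, _pt_pad is replaced outright with Mock(side_effect=NotImplementedError("no")), so any call to the PyTorch backend raises and the NumPy fallback is forced. The side_effect mechanism in isolation:

    from unittest.mock import Mock

    failing = Mock(side_effect=NotImplementedError("no"))
    try:
        failing()
    except NotImplementedError as exc:
        # side_effect turns every call on the mock into a raised exception
        print("call raised:", exc)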
🛠️ Refactor suggestion | 🟠 Major
Add function docstring.
Per coding guidelines, test functions should have docstrings.
🔎 Suggested docstring
     def test_pad_falls_back_to_np_if_pt_raises():
    +    """Test that pad_nd falls back to NumPy when PyTorch raises NotImplementedError."""
         img = torch.ones((1, 4, 4), dtype=torch.bool)

As per coding guidelines, docstrings are required for all definitions.
🤖 Prompt for AI Agents
In tests/transforms/croppad/test_pad_nd_dtypes.py around lines 35 to 46, the
test function test_pad_falls_back_to_np_if_pt_raises is missing a docstring; add
a concise one-line docstring immediately under the def line that explains the
purpose of the test (e.g., that PyTorch padding fallback to NumPy is exercised
and dtype is preserved), keeping it brief and following project docstring style.
@pytest.mark.parametrize(
    "dtype", [torch.bool, torch.int8, torch.int16, torch.int32, torch.int64, torch.uint8, torch.float32]
)
def test_pad_dtype_no_error_and_dtype_preserved(dtype):
    img = torch.ones((1, 4, 4), dtype=dtype)
    to_pad = [(0, 0), (1, 1), (2, 2)]
    out = pad_nd(img, to_pad, mode="constant", value=0)

    assert out.shape == (1, 6, 8)
    assert out.dtype == img.dtype
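To run just the new parametrized tests locally, a plain pytest invocation over the added file should be enough (standard pytest usage, no project-specific runner assumed):

    python -m pytest tests/transforms/croppad/test_pad_nd_dtypes.py -v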
🛠️ Refactor suggestion | 🟠 Major
Add function docstring.
Per coding guidelines, test functions should have docstrings.
🔎 Suggested docstring
     @pytest.mark.parametrize(
         "dtype", [torch.bool, torch.int8, torch.int16, torch.int32, torch.int64, torch.uint8, torch.float32]
     )
     def test_pad_dtype_no_error_and_dtype_preserved(dtype):
    +    """Test that pad_nd handles various dtypes without error and preserves dtype."""
         img = torch.ones((1, 4, 4), dtype=dtype)

As per coding guidelines, docstrings are required for all definitions.
🤖 Prompt for AI Agents
In tests/transforms/croppad/test_pad_nd_dtypes.py around lines 49 to 58, the
test function test_pad_dtype_no_error_and_dtype_preserved is missing a
docstring; add a concise one-line docstring immediately beneath the def that
states the test verifies pad_nd accepts various dtypes without error and
preserves the input dtype and shape (e.g., "Verify pad_nd does not error for
various dtypes and preserves dtype and output shape.").
Prefer the PyTorch padding backend when supported and safely fall back
to NumPy on error. Add unit tests to validate backend selection and
ensure output dtype is preserved.
Fixes #7842
Description
This pull request relaxes dtype restrictions in pad_nd and prefers the
PyTorch padding backend when supported, with a safe fallback to NumPy on
error. This enables support for additional dtypes (e.g. bool) that are
already handled correctly by recent PyTorch versions.
Unit tests are added to validate backend selection and ensure dtype
preservation.
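The claim that recent PyTorch releases already handle these dtypes is easy to check against a given install; a quick probe using only torch.nn.functional.pad (whether the first branch or the except branch runs depends on the installed version):

    import torch
    import torch.nn.functional as nnf

    x = torch.ones((1, 4, 4), dtype=torch.bool)
    try:
        y = nnf.pad(x, (2, 2, 1, 1), mode="constant", value=0)  # pads the last two dims
        print("bool constant pad supported:", tuple(y.shape), y.dtype)  # (1, 6, 8) torch.bool
    except (RuntimeError, NotImplementedError) as exc:
        print("bool constant pad unsupported on this build:", exc)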
Types of changes