HDFS-17796. WebHDFS: map storage-capacity exceptions to HTTP 507 #8460

Open
1fanwang wants to merge 1 commit into apache:trunk from 1fanwang:HDFS-17796

Conversation

@1fanwang

Description of PR

JIRA: HDFS-17796

WebHDFS write failures caused by storage-capacity limits today surface as a generic HTTP 403 (or, when the channel times out before the handler runs, 500). Clients have no way to distinguish "out of space" from permission denial or transient server failure.
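Once the status codes are distinct, a caller can branch on them directly instead of guessing. A minimal sketch of what client-side handling could look like after this change; the `classify` helper and its categories are illustrative, not part of WebHDFS:

```java
// Hypothetical client-side classification of a WebHDFS write failure
// by HTTP status code, once 507 is emitted for capacity exhaustion.
public class WriteFailure {
  enum Cause { OUT_OF_SPACE, PERMISSION_DENIED, TRANSIENT, OTHER }

  static Cause classify(int httpStatus) {
    switch (httpStatus) {
      case 507: return Cause.OUT_OF_SPACE;       // new: capacity exhausted
      case 403: return Cause.PERMISSION_DENIED;  // no longer conflated with out-of-space
      case 500: return Cause.TRANSIENT;          // e.g. channel timed out before the handler ran
      default:  return Cause.OTHER;
    }
  }

  public static void main(String[] args) {
    System.out.println(classify(507)); // prints OUT_OF_SPACE
  }
}
```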

Both ExceptionHandlers — NameNode-side (o.a.h.hdfs.web.resources.ExceptionHandler) and DataNode-side (o.a.h.hdfs.server.datanode.web.webhdfs.ExceptionHandler) — currently fall through to the generic IOException -> FORBIDDEN (403) mapping for the storage-capacity exception family.

Fix

Add a typed mapping ahead of the generic IOException branch:

  • ClusterStorageCapacityExceededException (and subclasses, including DSQuotaExceededException / NSQuotaExceededException)
  • DiskChecker.DiskOutOfSpaceException

both map to HTTP 507 Insufficient Storage (RFC 4918).
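The ordering is the essential point: the typed checks must run before the generic IOException fallback, or they keep falling through to 403. A simplified sketch of the dispatch logic (the class and exception types below are self-contained stand-ins for the Hadoop ones, not the actual patch):

```java
// Illustrative exception-to-status dispatch, modeled on the two
// ExceptionHandlers. Plain nested classes stand in for Hadoop's
// ClusterStorageCapacityExceededException family and
// DiskChecker.DiskOutOfSpaceException so the example compiles alone.
import java.io.FileNotFoundException;
import java.io.IOException;

public class StatusMapping {
  static class ClusterStorageCapacityExceededException extends IOException {}
  static class DSQuotaExceededException extends ClusterStorageCapacityExceededException {}
  static class DiskOutOfSpaceException extends IOException {}

  static int toHttpStatus(Exception e) {
    // Typed capacity checks MUST precede the generic IOException branch.
    if (e instanceof ClusterStorageCapacityExceededException
        || e instanceof DiskOutOfSpaceException) {
      return 507; // Insufficient Storage (RFC 4918)
    } else if (e instanceof FileNotFoundException) {
      return 404;
    } else if (e instanceof IllegalArgumentException) {
      return 400;
    } else if (e instanceof IOException) {
      return 403; // existing generic fallback
    }
    return 500;
  }

  public static void main(String[] args) {
    System.out.println(toHttpStatus(new DSQuotaExceededException())); // prints 507
    System.out.println(toHttpStatus(new IOException("denied")));      // prints 403
  }
}
```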

JAX-RS 2.1's Response.Status enum doesn't include 507, so the NN-side handler defines a small Response.StatusType constant. The DN-side handler uses Netty's built-in HttpResponseStatus.INSUFFICIENT_STORAGE.
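The shape of that NameNode-side constant can be sketched as follows. To stay self-contained, the `StatusType` interface here is a pared-down stand-in for `javax.ws.rs.core.Response.StatusType`, and the names are illustrative rather than the actual patch:

```java
// Sketch of defining a 507 status constant when the framework's enum
// stops short of it. The interface below mimics the relevant part of
// JAX-RS's Response.StatusType so this compiles without the dependency.
public class InsufficientStorage {
  interface StatusType {
    int getStatusCode();
    String getReasonPhrase();
  }

  static final StatusType INSUFFICIENT_STORAGE = new StatusType() {
    @Override public int getStatusCode() { return 507; }
    @Override public String getReasonPhrase() { return "Insufficient Storage"; }
  };

  public static void main(String[] args) {
    System.out.println(INSUFFICIENT_STORAGE.getStatusCode() + " "
        + INSUFFICIENT_STORAGE.getReasonPhrase()); // prints 507 Insufficient Storage
  }
}
```

On the DataNode side no such shim is needed, since Netty already ships `HttpResponseStatus.INSUFFICIENT_STORAGE`.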

How was this patch tested?

Two new unit tests, one per handler package:

  • org.apache.hadoop.hdfs.web.resources.TestExceptionHandler — 6 tests
  • org.apache.hadoop.hdfs.server.datanode.web.webhdfs.TestExceptionHandler — 6 tests

[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0

Each suite covers DSQuotaExceededException, NSQuotaExceededException, DiskOutOfSpaceException, plus regression checks for the existing mappings (IOException -> 403, FileNotFound -> 404, IllegalArgument -> 400).

Out of scope

HDFS-17796 also asks for cleanup of incomplete files left behind by failed writes. That part is intentionally not included here — it requires plumbing the file path through HdfsWriter so the handler can attempt a best-effort delete on failure, and is best handled in a follow-up PR so the status-code fix can land independently. Happy to follow up.

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0? (no new dependencies)

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 12m 57s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 40m 43s trunk passed
+1 💚 compile 1m 46s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 compile 1m 47s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 checkstyle 1m 49s trunk passed
+1 💚 mvnsite 1m 55s trunk passed
+1 💚 javadoc 1m 32s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 1m 30s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 4m 13s trunk passed
+1 💚 shadedclient 31m 43s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 21s the patch passed
+1 💚 compile 1m 13s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javac 1m 13s the patch passed
+1 💚 compile 1m 17s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 javac 1m 17s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 14s the patch passed
+1 💚 mvnsite 1m 27s the patch passed
+1 💚 javadoc 0m 59s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 0m 58s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 3m 49s the patch passed
+1 💚 shadedclient 30m 28s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 215m 39s hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 47s The patch does not generate ASF License warnings.
357m 13s
Subsystem Report/Notes
Docker ClientAPI=1.54 ServerAPI=1.54 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8460/1/artifact/out/Dockerfile
GITHUB PR #8460
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 2f7ce6222f80 5.15.0-164-generic #174-Ubuntu SMP Fri Nov 14 20:25:16 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 48f9878
Default Java Ubuntu-17.0.18+8-Ubuntu-124.04.1
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.10+7-Ubuntu-124.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.18+8-Ubuntu-124.04.1
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8460/1/testReport/
Max. process+thread count 3498 (vs. ulimit of 10000)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8460/1/console
versions git=2.43.0 maven=3.9.11 spotbugs=4.9.7
Powered by Apache Yetus 0.14.1 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 13m 23s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 41m 16s trunk passed
+1 💚 compile 1m 44s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 compile 1m 51s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 checkstyle 1m 50s trunk passed
+1 💚 mvnsite 1m 56s trunk passed
+1 💚 javadoc 1m 28s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 1m 28s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 4m 11s trunk passed
+1 💚 shadedclient 32m 1s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 1m 20s the patch passed
+1 💚 compile 1m 16s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javac 1m 16s the patch passed
+1 💚 compile 1m 16s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 javac 1m 16s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 1m 14s the patch passed
+1 💚 mvnsite 1m 24s the patch passed
+1 💚 javadoc 0m 59s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 1m 0s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 3m 48s the patch passed
+1 💚 shadedclient 30m 46s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 217m 32s hadoop-hdfs in the patch passed.
+1 💚 asflicense 0m 49s The patch does not generate ASF License warnings.
360m 36s
Subsystem Report/Notes
Docker ClientAPI=1.54 ServerAPI=1.54 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8460/2/artifact/out/Dockerfile
GITHUB PR #8460
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 8172fc582fd8 5.15.0-168-generic #178-Ubuntu SMP Fri Jan 9 19:05:03 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 248d478
Default Java Ubuntu-17.0.18+8-Ubuntu-124.04.1
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.10+7-Ubuntu-124.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.18+8-Ubuntu-124.04.1
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8460/2/testReport/
Max. process+thread count 3123 (vs. ulimit of 10000)
modules C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8460/2/console
versions git=2.43.0 maven=3.9.11 spotbugs=4.9.7
Powered by Apache Yetus 0.14.1 https://yetus.apache.org

This message was automatically generated.
