[SPARK-54883][MINOR] Clean up error messages for CLI and add new error mode DEBUG #53659
Conversation
JIRA Issue Information: Improvement SPARK-54883 (this comment was automatically generated by GitHub Actions)
```scala
// - DEBUG format: Always print stack traces (for debugging)
// - PRETTY format: Only for internal errors (SQLSTATE XX***)
// - MINIMAL/STANDARD formats: Never print stack traces (JSON only)
val shouldPrintStackTrace = format match {
```
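For context, a minimal sketch of how such a decision could look end to end, assuming `format` is an `ErrorMessageFormat` value and `e` is the caught throwable; the helper name and the exact internal-error check are illustrative, not the PR's actual code.

```scala
import org.apache.spark.{ErrorMessageFormat, SparkThrowable}

// Hypothetical helper, not the PR's exact code: decide whether the CLI should
// print a stack trace for a given error message format and throwable.
def shouldPrintStackTrace(format: ErrorMessageFormat.Value, e: Throwable): Boolean =
  format match {
    case ErrorMessageFormat.DEBUG => true // always print, for debugging
    case ErrorMessageFormat.PRETTY =>
      // Only internal errors, i.e. SQLSTATE values starting with "XX".
      e match {
        case st: SparkThrowable => Option(st.getSqlState).exists(_.startsWith("XX"))
        case _ => false
      }
    case _ => false // MINIMAL / STANDARD: structured JSON output only
  }
```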
QQ: is the change for spark-sql only? What about spark-shell and pyspark?
Also, shall we add a note in https://spark.apache.org/docs/latest/sql-migration-guide.html#upgrading-from-spark-sql-40-to-41 for the behavior change and the configuration to restore the old behavior?
@gengliangwang Right now it's only spark-sql. If there is a desire, I can extend it (with some pointers). But for spark-shell it's a bit more invasive; I think we always return a stack trace there (?)
ok, on second thought, the inconsistency seems reasonable.
However, the new log format DEBUG seems to work only for spark-sql, not for the other CLIs and applications. Shall we simply introduce a new configuration for the log output of spark-sql?
Good point. I think it would be nice to have it generally.
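If a spark-sql-specific switch were preferred instead, a hypothetical sketch of such an entry inside SQLConf might look like the following; the key spark.sql.cli.printErrorStackTrace, its default, and the version string are all made up for illustration and are not part of this PR.

```scala
// Hypothetical config entry, not part of this PR. Assumes it is declared
// inside org.apache.spark.sql.internal.SQLConf, where buildConf is in scope.
val CLI_PRINT_ERROR_STACK_TRACE = buildConf("spark.sql.cli.printErrorStackTrace")
  .doc("When true, the spark-sql CLI prints full stack traces for errors " +
    "in addition to the formatted error message.")
  .version("4.1.0")
  .booleanConf
  .createWithDefault(false)
```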
+1 for making the output of …
thanks, merging to master!
What changes were proposed in this pull request?
Two changes in this pull request:
1. Clean up the spark-sql CLI error output so that stack traces are no longer printed unconditionally: with the PRETTY format they are printed only for internal errors (SQLSTATE XX***), while the MINIMAL and STANDARD formats keep emitting JSON only.
2. Add a new error message format, DEBUG, which always prints stack traces (see the sketch below).
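For reference, a sketch of where the new value could sit in the existing ErrorMessageFormat enumeration in org.apache.spark; PRETTY, MINIMAL and STANDARD already exist in Spark, and DEBUG is the addition this PR describes.

```scala
// Sketch only: the existing enumeration plus the new DEBUG value.
object ErrorMessageFormat extends Enumeration {
  val PRETTY, MINIMAL, STANDARD, DEBUG = Value
}
```

Presumably the new format is selected the same way as the existing ones, e.g. via the spark.sql.error.messageFormat configuration, though the diff is the authority on the exact wiring.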
Why are the changes needed?
This makes the experience of using the CLI much more consistent without losing capability. It also clearly separates "consumer" output from developer output.
Does this PR introduce any user-facing change?
Yes, in the CLI the error message output is more consistent by default.
How was this patch tested?
Added new tests
Was this patch authored or co-authored using generative AI tooling?
Claude