Conversation

@Yuvraj-cyborg
Contributor

Closes #19169

Rationale for this change:

The current implementation of the SparkAscii UDF uses the default is_nullable, which always returns true. This is incorrect: the output should only be nullable if the input argument is nullable. This change implements proper null propagation by using return_field_from_args.

Changes in PR:

  • Implemented return_field_from_args for SparkAscii to compute output nullability from the input argument's nullability
  • Changed return_type to return internal_err! since return_field_from_args is now used (following the pattern of other Spark functions such as ilike, concat, and elt)
  • Added unit tests verifying the nullability behavior:
    • Output is nullable when input is nullable
    • Output is non-nullable when input is non-nullable
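The propagation rule the PR describes can be sketched in isolation. This is not the actual DataFusion code: the Field struct and return_field_from_args function below are minimal stand-ins (the real implementation works with Arrow's Field type and DataFusion's ReturnFieldArgs), used here only to illustrate deriving output nullability from the input instead of defaulting to nullable.

```rust
// Minimal stand-in for an Arrow-style field: only the parts needed
// to show nullability propagation (hypothetical, not DataFusion's type).
#[derive(Debug, Clone, PartialEq)]
struct Field {
    name: String,
    nullable: bool,
}

// Sketch of the return_field_from_args idea: the output field is
// nullable if and only if the (single) input argument is nullable,
// rather than unconditionally returning nullable = true.
fn return_field_from_args(args: &[Field]) -> Result<Field, String> {
    let input = args.first().ok_or("ascii expects exactly one argument")?;
    Ok(Field {
        name: "ascii".to_string(),
        nullable: input.nullable, // propagate input nullability
    })
}

fn main() {
    let nullable_in = Field { name: "s".into(), nullable: true };
    let non_null_in = Field { name: "s".into(), nullable: false };

    // Mirrors the two unit tests described above.
    assert!(return_field_from_args(&[nullable_in]).unwrap().nullable);
    assert!(!return_field_from_args(&[non_null_in]).unwrap().nullable);
    println!("ok");
}
```

Returning internal_err! from return_type then simply asserts that nullability-aware resolution goes through return_field_from_args and the old path is never taken.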

Test Coverage:

Yes, tests are included to verify the change.

User-facing Changes:

No user-facing changes.

@github-actions github-actions bot added the spark label Dec 28, 2025
@Yuvraj-cyborg
Contributor Author

cc: @Jefffrey @rluvaton

@Jefffrey Jefffrey added this pull request to the merge queue Dec 29, 2025
Merged via the queue into apache:main with commit 3aa0ab7 Dec 29, 2025
28 checks passed
@Jefffrey
Contributor

Thanks @Yuvraj-cyborg
