[SOFIE] Fix ELU alpha parentheses in Generate()#21543

Open
Neeraj-x0 wants to merge 3 commits intoroot-project:masterfrom
Neeraj-x0:fix/elu-alpha-parentheses

Conversation

@Neeraj-x0

The Generate() function emitted alpha * std::exp(x) - 1, which, due to
operator precedence, evaluates as (alpha * exp(x)) - 1 rather than the
correct ELU formula alpha * (exp(x) - 1). The bug only affects alpha != 1.0.

Fixes #21539

This Pull Request:

Changes or fixes:

Fixed incorrect parentheses in ROperator_Elu::Generate() in
tmva/sofie/inc/TMVA/ROperator_Elu.hxx. The negative branch of the ELU
operator now correctly emits alpha * (std::exp(x) - 1) instead of
alpha * std::exp(x) - 1.

Checklist:

  • tested changes locally
  • updated the docs (if necessary)

This PR fixes #21539


Copilot AI left a comment


Pull request overview

Fixes incorrect ELU code generation in SOFIE where the negative branch for non-default alpha was emitted without parentheses, changing the math due to operator precedence (fixes #21539).

Changes:

  • Correct ELU negative-branch codegen to emit alpha * (exp(x) - 1) instead of (alpha * exp(x)) - 1.
  • Minor formatting/whitespace adjustments in ROperator_Elu.hxx.


Comment on lines +72 to +73
out << SP << SP << "tensor_" << fNY << "[id] = ((tensor_" << fNX << "[id] >= 0 )? tensor_" << fNX
<< "[id] : " << OpName << "_alpha * (std::exp(tensor_" << fNX << "[id]) - 1));\n";

Copilot AI Mar 9, 2026


Test coverage: the existing SOFIE ONNX ELU test appears to cover only the default alpha=1.0, which wouldn’t have caught this precedence bug. Consider adding a regression test/model with a non-default alpha (e.g. 0.5 or 2.0) so future changes to ELU codegen are validated for alpha != 1.0.

Comment on lines 70 to +73
out << "\n//------ ELU \n";
out << SP << "for (int id = 0; id < " << length << " ; id++){\n";
out << SP << SP << "tensor_" << fNY << "[id] = ((tensor_" << fNX << "[id] >= 0 )? tensor_" << fNX << "[id] : "<< OpName << "_alpha * std::exp(tensor_"<< fNX<<"[id]) - 1);\n";
out << SP << SP << "tensor_" << fNY << "[id] = ((tensor_" << fNX << "[id] >= 0 )? tensor_" << fNX
<< "[id] : " << OpName << "_alpha * (std::exp(tensor_" << fNX << "[id]) - 1));\n";

Copilot AI Mar 9, 2026


Generate() emits generated C++ that calls std::exp(...), but ROperator_Elu doesn’t currently advertise the need for <cmath> (e.g., via GetStdLibs() like Sigmoid/Selu/Erf do, or by calling model.AddNeededStdLib("cmath") in Initialize). This can make generated model headers fail to compile for ELU-only models when <cmath> isn’t otherwise included upstream.

@Neeraj-x0
Author

Added a regression test with EluAlpha.onnx (alpha=0.5) in a follow-up commit to cover the non-default alpha case. Expected outputs validated manually with numpy.

@siliataider
Contributor

Hi, thanks for the PR! It would be easier to review if there were one commit with the fix and a separate one with only the formatting changes.

@Neeraj-x0 Neeraj-x0 force-pushed the fix/elu-alpha-parentheses branch 3 times, most recently from 347a45a to 5ff7408 Compare March 10, 2026 03:26
@Neeraj-x0
Author

@siliataider Updated, please check now.

Without parentheses, 'alpha * std::exp(x) - 1' evaluates as
'(alpha * exp(x)) - 1' due to operator precedence, which differs
from the correct ELU formula 'alpha * (exp(x) - 1)' when alpha != 1.

Fixes root-project#21539
Adds EluAlpha.onnx (alpha=0.5) and a corresponding test case to
prevent regressions for non-default alpha values in ELU codegen.

Refs root-project#21539
@Neeraj-x0 Neeraj-x0 force-pushed the fix/elu-alpha-parentheses branch from 5ff7408 to 19dd22e Compare March 10, 2026 03:33
@siliataider
Contributor

Running the builds now

@siliataider siliataider requested a review from sanjibansg March 10, 2026 09:01
@github-actions

Test Results

    22 files, 22 suites, 3d 4h 36m 10s ⏱️
    3 816 tests: 3 814 ✅, 1 💤, 1 ❌
    76 462 runs: 76 440 ✅, 9 💤, 13 ❌

For more details on these failures, see this check.

Results for commit 19dd22e.


Successfully merging this pull request may close these issues.

ROperator_Elu.hxx generates incorrect C++ inference code when alpha != 1.0.
