last minute fixes to coherence leaderboard #4679
📝 Walkthrough

The update command now weights leaderboard scores based on per-question coverage instead of unused weights. Key changes include modifying the gather_data flow, updating entry construction with a squared score transformation and coverage calculation, computing minimum prize percentages, and deferring persistence until prize allocation is complete.
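The coverage weighting described in the walkthrough can be sketched as follows. This is a minimal illustration, not the command's actual code: the function name, arguments, and the exact way squaring and coverage combine are assumptions.

```python
from decimal import Decimal

def weighted_entry_score(question_scores, total_questions):
    """Illustrative sketch: square each per-question score, then weight
    the entry total by coverage (the fraction of tournament questions
    the forecaster actually answered). Names are hypothetical."""
    answered = len(question_scores)
    if not answered or not total_questions:
        return Decimal("0")
    coverage = Decimal(answered) / Decimal(total_questions)
    total = sum(Decimal(str(s)) ** 2 for s in question_scores)
    return total * coverage
```

Under this sketch, a forecaster who answered 2 of 4 questions with scores 2 and 3 would get (4 + 9) × 0.5 = 6.5, so partial coverage directly scales the entry down.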
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 3 passed | ❌ 1 failed (inconclusive)
Actionable comments posted: 1
🧹 Nitpick comments (1)
scoring/management/commands/update_coherence_tournament_leaderboard.py (1)
267-270: ⚡ Quick win: Keep the prize-threshold math in `Decimal`. Casting both values back to `float` reintroduces binary rounding right before prize allocation, which can shift borderline entries around the minimum-prize cutoff. If the downstream helpers accept it, keep this calculation in `Decimal` and only convert at the boundary that needs a float.

Suggested fix:

    - minimum_prize_percent = (
    -     float(leaderboard.minimum_prize_amount) / float(prize_pool) if prize_pool else 0
    - )
    + minimum_prize_percent = (
    +     leaderboard.minimum_prize_amount / prize_pool if prize_pool else Decimal("0")
    + )

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scoring/management/commands/update_coherence_tournament_leaderboard.py` around lines 267-270, the current calculation casts to float, which reintroduces binary rounding; compute minimum_prize_percent using Decimal arithmetic instead: convert leaderboard.minimum_prize_amount and prize_pool to Decimal and divide (or use Decimal('0') when prize_pool is falsy) so the downstream call assign_prize_percentages_ receives a Decimal; if assign_prize_percentages_ requires a float, defer the float conversion to the final boundary inside that helper rather than here. Reference: minimum_prize_percent, leaderboard.minimum_prize_amount, prize_pool, and assign_prize_percentages_.
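The rounding hazard this comment describes is easy to reproduce in isolation. The values below are illustrative (not from the PR): dividing the two amounts as `Decimal` stays exact, while the same division after casting to `float` lands just below the cutoff.

```python
from decimal import Decimal

# Illustrative money values chosen so the true ratio is exactly 0.1.
minimum_prize_amount = Decimal("10.10")
prize_pool = Decimal("101")

# Decimal division is exact here: 10.10 / 101 == 0.1.
exact = minimum_prize_amount / prize_pool
print(exact >= Decimal("0.1"))  # True

# Float division rounds in binary: 10.1 is not representable exactly,
# and the quotient ends up strictly below the float 0.1.
approx = float(minimum_prize_amount) / float(prize_pool)
print(approx >= 0.1)  # False
```

An entry sitting exactly on the minimum-prize threshold would pass the `Decimal` comparison but fail the `float` one, which is the borderline shift the review warns about.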
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@scoring/management/commands/update_coherence_tournament_leaderboard.py`:
- Around lines 209-221: The current call to
Leaderboard.objects.get_or_create(...) only sets score_type, prize_pool, and
display_config on creation, leaving existing rows unchanged; change this to
ensure existing leaderboards are updated: use
Leaderboard.objects.update_or_create(...) with the same lookup keys (project and
name) and pass the desired defaults for score_type, prize_pool, bot_status and
display_config, or alternately keep get_or_create(...) and if not created assign
leaderboard.score_type, leaderboard.prize_pool, leaderboard.bot_status,
leaderboard.display_config = ... and call leaderboard.save() so the new
scoring/prize/display settings are applied on reruns.
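The behavioral difference this comment relies on can be shown with a plain-Python analogue of Django's `get_or_create` vs `update_or_create` semantics. This is not Django's implementation, just a dict-based sketch of why reruns leave stale rows with the former:

```python
def get_or_create(table, lookup, defaults):
    """Analogue of QuerySet.get_or_create: defaults are only
    applied when a new row is created, never on a match."""
    for row in table:
        if all(row.get(k) == v for k, v in lookup.items()):
            return row, False  # existing row: defaults ignored
    row = {**lookup, **defaults}
    table.append(row)
    return row, True

def update_or_create(table, lookup, defaults):
    """Analogue of QuerySet.update_or_create: defaults are
    applied to the matched row as well."""
    for row in table:
        if all(row.get(k) == v for k, v in lookup.items()):
            row.update(defaults)  # existing row: defaults applied
            return row, False
    row = {**lookup, **defaults}
    table.append(row)
    return row, True
```

With an existing "leaderboard" row, `get_or_create` returns it unchanged even when the defaults carry new prize/display settings, while `update_or_create` overwrites those fields on every rerun — the fix the comment asks for.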
---
Nitpick comments:
In `@scoring/management/commands/update_coherence_tournament_leaderboard.py`:
- Around lines 267-270: The current calculation casts to float, which reintroduces
binary rounding; compute minimum_prize_percent using Decimal arithmetic instead:
convert leaderboard.minimum_prize_amount and prize_pool to Decimal and divide
(or use Decimal('0') when prize_pool is falsy) so the downstream call
assign_prize_percentages_ receives a Decimal; if assign_prize_percentages_
requires a float, defer the float conversion to the final boundary inside that
helper rather than here. Reference: minimum_prize_percent,
leaderboard.minimum_prize_amount, prize_pool, and assign_prize_percentages_.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 1c45998f-e8cb-4efb-82d4-f6375c3eba18
📒 Files selected for processing (1)
scoring/management/commands/update_coherence_tournament_leaderboard.py
Cleanup: Preview Environment Removed
The preview environment for this PR has been destroyed. Cleanup triggered by PR close at 2026-05-07T14:24:26Z.