Transform your pytest HTML reports into beautiful, interactive dashboards!
A comprehensive pytest plugin that enhances HTML reports with interactive charts, intelligent error classification, and modern styling.
- 🎯 Interactive Charts - Visualize test results with Chart.js powered donut and pie charts
- 🎨 Modern UI - Beautiful gradient styling with responsive design
- 🔍 Smart Error Analysis - Automatic error classification with suggested fixes
- 📋 Comprehensive Tables - Detailed test information with expandable error details
- 🔧 Highly Configurable - YAML, CLI, or programmatic configuration
- ⚡ Zero Config - Works out of the box with sensible defaults
- 📱 Mobile Responsive - Looks great on all devices
- 🎭 Custom Branding - Add your logo, colors, and company name
📊 View Live Interactive Report → Click to see charts, filters, and interactive features!
- 📊 Test Status Distribution - Visual breakdown of passed/failed/skipped tests with interactive charts
- 📈 Pass Rate Charts - Overall test success metrics with data labels
- 🔍 Error Analysis - Categorized failures with remediation suggestions
- 📋 Comprehensive Test Table - Filterable, sortable results with expandable error details
- ⚡ Step Execution Summary - Detailed test step information with status tracking
- 🎨 Modern UI - Beautiful purple gradient design with hover effects
```bash
pip install pytest-html-dashboard
pytest --html=report.html --self-contained-html
```

That's it! The plugin automatically enhances your HTML report with all features enabled.
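Any ordinary test module works as input; there is nothing plugin-specific to write. A minimal, purely illustrative module (the file name and tests are examples, not part of the plugin):

```python
# test_demo.py - an illustrative test module; the plugin needs no special
# markers and enhances whatever pytest collects.

def test_addition():
    assert 1 + 1 == 2

def test_string_upper():
    assert "dashboard".upper() == "DASHBOARD"

def test_failure_example():
    # An intentional failure here would populate the error-analysis section:
    # assert "passed" == "failed"
    pass
```

Running `pytest --html=report.html --self-contained-html` against a module like this produces the enhanced report.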
Create pytest_html_dashboard.yaml:
```yaml
branding:
  company_name: "My Company"
  report_title: "Test Execution Dashboard"
  logo_url: "path/to/logo.png"  # Or base64 encoded
  primary_color: "#667eea"
  secondary_color: "#764ba2"
charts:
  enable_charts: true
  chart_height: 300
  chart_animation: true
report:
  enable_enhanced_reporting: true
  enable_error_classification: true
  show_timestamps: true
```

Or pass the same options on the command line:

```bash
pytest --html=report.html \
  --dashboard-company-name="My Company" \
  --dashboard-report-title="Test Dashboard" \
  --dashboard-primary-color="#667eea"
```

Or configure programmatically:

```python
# conftest.py
from pytest_html_dashboard import ReporterConfig, BrandingConfig

def pytest_configure(config):
    branding = BrandingConfig(
        company_name="My Company",
        report_title="Custom Dashboard",
        primary_color="#667eea",
    )
    reporter_config = ReporterConfig(branding=branding)
    config._dashboard_config = reporter_config
```

| Option | Default | Description |
|---|---|---|
| `company_name` | `"Test Automation Framework"` | Your company/project name |
| `report_title` | `"Test Execution Dashboard"` | Report header title |
| `logo_url` | `None` | Logo image (URL or base64) |
| `primary_color` | `"#004488"` | Primary theme color |
| `secondary_color` | `"#0066CC"` | Secondary theme color |
| `success_color` | `"#4CAF50"` | Success indicator color |
| `failure_color` | `"#f44336"` | Failure indicator color |
| Option | Default | Description |
|---|---|---|
| `enable_charts` | `true` | Enable/disable charts |
| `chart_height` | `300` | Chart height in pixels |
| `chart_animation` | `true` | Animated chart transitions |
| `show_pass_rate_chart` | `true` | Display pass rate visualization |
| `show_status_distribution_chart` | `true` | Display status breakdown |
| Option | Default | Description |
|---|---|---|
| `enable_enhanced_reporting` | `true` | Enable all enhanced features |
| `enable_error_classification` | `true` | Categorize and analyze errors |
| `enable_comprehensive_table` | `true` | Show detailed test table |
| `max_error_message_length` | `100` | Truncate long error messages |
| `show_timestamps` | `true` | Display test execution times |
| `show_duration` | `true` | Show test durations |
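Since the same options can come from defaults, the YAML file, or CLI flags, it helps to picture the resolution order. The sketch below assumes the common precedence (defaults overridden by YAML, overridden by CLI); the helper name and exact merge behavior are illustrative, not the plugin's documented internals:

```python
# Presumed precedence: built-in defaults < YAML file < CLI flags.
# resolve_config and its key names are illustrative.
DEFAULTS = {"company_name": "Test Automation Framework", "chart_height": 300}

def resolve_config(yaml_opts: dict, cli_opts: dict) -> dict:
    merged = dict(DEFAULTS)
    merged.update({k: v for k, v in yaml_opts.items() if v is not None})
    merged.update({k: v for k, v in cli_opts.items() if v is not None})
    return merged

config = resolve_config(
    yaml_opts={"company_name": "My Company"},
    cli_opts={"chart_height": 400},
)
# company_name comes from YAML, chart_height from the CLI, the rest from defaults
```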
GitHub Actions:

```yaml
- name: Run tests
  run: pytest --html=report.html --self-contained-html
- name: Upload report
  uses: actions/upload-artifact@v3
  with:
    name: test-report
    path: report.html
```

Jenkins:

```groovy
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'pytest --html=report.html --self-contained-html'
            }
        }
    }
    post {
        always {
            publishHTML([
                reportDir: '.',
                reportFiles: 'report.html',
                reportName: 'Test Dashboard'
            ])
        }
    }
}
```

GitLab CI:

```yaml
test:
  script:
    - pytest --html=report.html --self-contained-html
  artifacts:
    when: always
    paths:
      - report.html
    expire_in: 30 days
```

The repository contains:
- `tests/test_dashboard_features.py` - Comprehensive test suite demonstrating all features
- `config/sample_config.yaml` - Sample configuration file
- `reports/complete_dashboard_report.html` - Sample generated report
- `examples/` - Additional examples and demos
Run the tests:
```bash
pytest tests/test_dashboard_features.py --html=reports/report.html --self-contained-html
```

- ✅ Click column headers to sort table data
- ✅ Filter tests by status (passed/failed/skipped)
- ✅ Filter by error category
- ✅ Search tests by name
- ✅ Click "View Error" buttons for detailed error information
- ✅ Hover over charts for detailed statistics
- ✅ Modern gradient backgrounds
- ✅ Animated charts with data labels
- ✅ Color-coded test status indicators
- ✅ Responsive layout for mobile devices
- ✅ Professional typography and spacing
- ✅ Sticky table headers for easy navigation
Automatically categorizes errors into:
- 🔴 Assertion Failures - Test logic issues
- ⏱️ Timeout Errors - Performance problems
- 🔌 Connection Errors - Network/API issues
- ⚙️ Configuration Errors - Setup problems
- 📦 Import Errors - Dependency issues
- 💥 Runtime Errors - Execution failures
Each error includes:
- Error type and message
- Full stack trace
- Suggested remediation steps
- Context and timestamp
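The classification step can be approximated with simple pattern matching over the exception text. The categories and regexes below are an illustration of the idea, not the plugin's actual rules:

```python
import re

# Illustrative category patterns - the plugin's real rules may differ.
CATEGORIES = [
    ("Assertion Failure", re.compile(r"\bAssertionError\b")),
    ("Timeout Error", re.compile(r"\b(Timeout|TimeoutError)\b")),
    ("Connection Error", re.compile(r"\b(ConnectionError|ConnectionRefusedError)\b")),
    ("Import Error", re.compile(r"\b(ImportError|ModuleNotFoundError)\b")),
]

def classify(error_text: str) -> str:
    """Return the first matching category, falling back to a generic bucket."""
    for name, pattern in CATEGORIES:
        if pattern.search(error_text):
            return name
    return "Runtime Error"  # fallback bucket

print(classify("ModuleNotFoundError: No module named 'requests'"))  # Import Error
```

A real implementation would also attach the suggested remediation text per category, as the report does.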
| Feature | pytest-html | pytest-html-dashboard |
|---|---|---|
| Basic HTML reports | ✅ | ✅ |
| Interactive charts | ❌ | ✅ |
| Error classification | ❌ | ✅ |
| Custom branding | Limited | ✅ Full |
| Filter & sort | ❌ | ✅ |
| Suggested actions | ❌ | ✅ |
| Mobile responsive | Partial | ✅ Full |
| Configuration | Limited | ✅ Extensive |
```bash
git clone https://github.com/nireshs/pytest-html-dashboard.git
cd pytest-html-dashboard
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -e ".[dev]"
```

Run the test suite with coverage:

```bash
pytest tests/ --cov=pytest_html_dashboard --cov-report=html
```

Build the package:

```bash
python -m build
```

Contributions welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes with tests
- Submit a pull request
MIT License - see LICENSE file for details.
- Built on pytest and pytest-html
- Charts by Chart.js
- Modern design inspired by contemporary dashboard UIs
- 📧 Email: niresh.shanmugam@gmail.com
- 🐛 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions
- Historical test trend analysis ✅ v1.2.0
- Real-time test execution dashboard ✅ v1.2.0
- AI-powered error analysis ✅ v1.2.0
- Test comparison between runs
- PDF export capability
- Additional chart types (bar, line, scatter)
- Custom theme marketplace
- Integration with test management tools
Track test results over time with SQLite database storage:
- Automatic tracking of all test runs with `--enable-history`
- Trend analysis showing pass rate changes and duration trends
- Flaky test detection identifies tests with inconsistent behavior
- Database storage with configurable path (`--history-db PATH`)
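To make the storage model concrete, here is a minimal sketch of what per-run SQLite storage and flaky-test detection could look like. The schema, table, and column names are assumptions for illustration; the plugin's actual schema is not documented here:

```python
import sqlite3

# Illustrative schema - not the plugin's actual tables.
conn = sqlite3.connect(":memory:")  # the plugin would use test-history.db
conn.execute(
    """CREATE TABLE IF NOT EXISTS runs (
           run_id INTEGER PRIMARY KEY AUTOINCREMENT,
           nodeid TEXT NOT NULL,
           outcome TEXT NOT NULL,
           duration REAL
       )"""
)
conn.executemany(
    "INSERT INTO runs (nodeid, outcome, duration) VALUES (?, ?, ?)",
    [
        ("tests/test_app.py::test_login", "passed", 0.12),
        ("tests/test_app.py::test_login", "failed", 0.15),
        ("tests/test_app.py::test_signup", "passed", 0.08),
    ],
)

# A test with more than one distinct outcome across runs is a flakiness signal.
flaky = [
    nodeid
    for (nodeid,) in conn.execute(
        "SELECT nodeid FROM runs GROUP BY nodeid "
        "HAVING COUNT(DISTINCT outcome) > 1"
    )
]
print(flaky)  # ['tests/test_app.py::test_login']
```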
```bash
pytest --enable-history --html=report.html
```

Intelligent error pattern detection and suggestions:
- Pattern-based analysis (local, no API key required)
- Error categorization by type (AssertionError, TypeError, etc.)
- Actionable insights with root cause, quick fixes, prevention tips
- Optional AI providers (OpenAI, Anthropic) for deeper analysis
```bash
# Local analysis (default)
pytest --html=report.html

# With OpenAI
pytest --ai-provider=openai --ai-api-key=sk-... --html=report.html
```

Live test execution monitoring via WebSocket:
- WebSocket server on port 8888 (configurable)
- Live updates as tests execute
- Session events (start, test results, finish)
- Clean lifecycle management with automatic startup/shutdown
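A client consuming the session events would receive messages over the WebSocket and dispatch on the event type. The JSON payload shape below is a hypothetical example, since the wire format is not specified here:

```python
import json

# Hypothetical event payload - the actual message format of the dashboard's
# WebSocket server is not specified in this README.
raw = json.dumps(
    {"event": "test_result", "nodeid": "tests/test_api.py::test_get", "outcome": "passed"}
)

def handle_message(message: str) -> str:
    """Sketch of a client-side dispatcher for dashboard events."""
    payload = json.loads(message)
    if payload["event"] == "test_result":
        return f'{payload["nodeid"]}: {payload["outcome"]}'
    return f'session event: {payload["event"]}'

print(handle_message(raw))  # tests/test_api.py::test_get: passed
```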
```bash
pytest --realtime-dashboard --html=report.html
pytest --enable-history --realtime-dashboard --html=report.html
```

| Flag | Description | Default |
|---|---|---|
| `--enable-history` | Enable historical tracking | `False` |
| `--disable-history` | Disable historical tracking | - |
| `--history-db PATH` | Custom database path | `test-history.db` |
| Flag | Description | Default |
|---|---|---|
| `--realtime-dashboard` | Enable WebSocket server | `False` |
| `--realtime-port PORT` | WebSocket port | `8888` |
| Flag | Description | Default |
|---|---|---|
| `--ai-provider PROVIDER` | AI provider (`local`/`openai`/`anthropic`) | `local` |
| `--ai-api-key KEY` | API key for external AI providers | - |
```yaml
historical:
  enable_tracking: true
  database_path: "test-history.db"
  show_trends: true
  flaky_detection: true
  retention_days: 90
realtime:
  enable_realtime: false
  websocket_port: 8888
  poll_interval: 1.0
ai:
  enable_ai_analysis: true
  provider: "local"  # or "openai", "anthropic"
  api_key: ""
  pattern_matching: true
```

| Feature | Overhead | Impact |
|---|---|---|
| Historical Tracking | ~5-10ms per test run | Minimal |
| AI Pattern Analysis | ~50-100ms total | Low |
| Real-Time WebSocket | ~1-2ms per test | Very Low |
| Total | <1% of test time | Negligible |
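A few of the advanced options have obvious invariants (valid port range, known provider names, an API key when an external provider is chosen). This sketch sanity-checks a parsed config dict mirroring the sample above; the validation rules themselves are illustrative, not enforced by the plugin as far as this README states:

```python
# Assumes the YAML has already been parsed into a dict (e.g. via PyYAML).
parsed = {
    "historical": {"enable_tracking": True, "retention_days": 90},
    "realtime": {"enable_realtime": False, "websocket_port": 8888},
    "ai": {"provider": "local", "api_key": ""},
}

def validate(cfg: dict) -> list[str]:
    """Return a list of human-readable problems; empty means the config looks sane."""
    problems = []
    realtime = cfg.get("realtime", {})
    ai = cfg.get("ai", {})
    port = realtime.get("websocket_port", 8888)
    if not (1 <= port <= 65535):
        problems.append(f"websocket_port out of range: {port}")
    if ai.get("provider", "local") not in ("local", "openai", "anthropic"):
        problems.append("unknown ai provider")
    if ai.get("provider", "local") != "local" and not ai.get("api_key"):
        problems.append("external ai provider requires api_key")
    return problems

print(validate(parsed))  # []
```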
- 🗄️ Historical Tracking: SQLite database for test trends and flaky detection
- 🤖 AI Error Analysis: Pattern-based error detection with actionable insights
- 📡 Real-Time Dashboard: WebSocket server for live test monitoring
- 🎯 CLI Options: Complete command-line control for all features
- ⚙️ Config Override: CLI flags override config file settings
- 📈 Trend Visualization: Historical pass rate and duration charts
- 🔍 Flaky Detection: Automatic identification of inconsistent tests
- ✨ Complete dashboard enhancement system
- 📊 Interactive Chart.js visualizations
- 🎨 Modern gradient styling with responsive design
- 🔍 Intelligent error classification
- ⚙️ Comprehensive configuration system
- 📋 Enhanced test tables with filter/sort
- 🔄 Automatic enhancement via pytest hooks
⭐ Star us on GitHub if you find this useful! ⭐

Made with ❤️ for the pytest community
