PECOS automatically tests code examples in documentation to ensure they remain correct as the codebase evolves.
The documentation testing system:
- Extracts Python and Rust code blocks from Markdown files
- Generates pytest test files from the extracted code
- Runs the tests as part of CI
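The extraction step can be pictured with a minimal sketch (a hypothetical `extract_code_blocks` helper, not the actual PECOS implementation):

```python
import re

# Minimal sketch of the extraction step (hypothetical helper, not the
# actual PECOS implementation). FENCE is built at runtime so this example
# doesn't contain literal fence characters.
FENCE = "`" * 3
FENCE_RE = re.compile(FENCE + r"(\w[\w-]*)\n(.*?)" + FENCE, re.DOTALL)

def extract_code_blocks(markdown: str) -> list[tuple[str, str]]:
    """Return (language, code) pairs for each fenced, language-tagged block."""
    return [(m.group(1), m.group(2)) for m in FENCE_RE.finditer(markdown)]

doc = "# Example\n" + FENCE + "python\nprint('hi')\n" + FENCE + "\n"
blocks = extract_code_blocks(doc)  # [("python", "print('hi')\n")]
```

Each extracted pair can then be turned into a pytest test function (for Python) or a rustc/cargo invocation (for Rust).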
```shell
# Generate tests from documentation
uv run python scripts/docs/generate_doc_tests.py

# Run all documentation tests
uv run pytest python/quantum-pecos/tests/docs/generated -v

# Run tests for a specific document
uv run pytest python/quantum-pecos/tests/docs/generated/user-guide/test_getting_started.py -v
```

Markers are HTML comments placed immediately before code blocks to control test behavior.
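Conceptually, a marker line splits into a directive name and an optional argument after a colon. A rough sketch of that parsing (hypothetical `parse_marker` helper, not the real parser):

```python
import re

# Hypothetical sketch of marker parsing: an HTML comment on the line
# before a fence yields a directive and an optional argument.
MARKER_RE = re.compile(r"<!--\s*([\w.-]+)(?::\s*(.*?))?\s*-->")

def parse_marker(line: str):
    m = MARKER_RE.match(line.strip())
    return (m.group(1), m.group(2)) if m else None

parse_marker("<!--skip: requires external hardware-->")
# -> ("skip", "requires external hardware")
parse_marker("<!--expect-error: TypeError-->")
# -> ("expect-error", "TypeError")
```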
Skip a single code block:
<!--skip-->
```python
# This code won't be tested
from some_unavailable_module import thing
```

Skip with a reason (shown in test output):
<!--skip: requires external hardware-->
```python
connect_to_quantum_computer()
```

Skip all code blocks in a document by placing a skip marker at the beginning of the file (after the heading):
```markdown
# Document Title

<!--skip: All examples require external files-->

Content here...
```

Skip tests when CUDA is not available:
<!--skip-if-no-cuda-->
```python
from pecos.simulators import CudaSim
sim = CudaSim()
```

Test that code raises a specific error:
<!--expect-error: TypeError-->
```python
"string" + 123  # This will raise TypeError
```

The test passes if the code raises an error containing the specified text.
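The pass/fail rule can be sketched as follows (an illustrative `run_expect_error` helper, not the generator's actual output):

```python
# Sketch (an assumption, not the generator's exact output): run the block
# and check that the raised error matches the expected text.
def run_expect_error(code: str, expected: str) -> bool:
    try:
        exec(code)
    except Exception as exc:
        # Match against both the error type name and its message.
        return expected in f"{type(exc).__name__}: {exc}"
    return False  # no error raised -> test fails

assert run_expect_error('"string" + 123', "TypeError")
```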
Verify code produces specific output:
<!--expect-output: Hello, World!-->
```python
print("Hello, World!")
```

Give a test a specific name for easier identification:
<!--test-name: bell_state_example-->
```python
# This test will be named test_bell_state_example
```

Add pytest marks to tests:
<!--mark.slow-->
```python
# This test will have @pytest.mark.slow
```

Copy test data files to the test directory (for Rust cargo tests):
<!--test-data: repetition_code.hugr-->
```rust
fn main() {
    let data = std::fs::read("repetition_code.hugr").unwrap();
}
```

Test data files should be placed in `docs/assets/test-data/`.
## Hidden Preambles
Use hidden code blocks to provide imports or setup code that should be included in tests but not shown in documentation:
```hidden-python
from pecos.slr import Main, QReg, CReg
from pecos.slr.qeclib import qubit as qb
```
Now the visible code can use these imports:
```python
prog = Main(
    q := QReg("q", 2),
    qb.H(q[0]),
)
```

The hidden preamble is prepended to all subsequent code blocks in the document until a `<!--preamble-reset-->` marker is encountered.
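The accumulation rule can be sketched like this (a hypothetical `apply_preambles` helper, not the generator's actual code):

```python
# Sketch of the preamble-accumulation rule: hidden blocks append to a
# preamble that is prepended to later blocks until a reset appears.
def apply_preambles(blocks):
    """blocks: list of (kind, text); kind in {'hidden', 'code', 'reset'}."""
    preamble, out = [], []
    for kind, text in blocks:
        if kind == "hidden":
            preamble.append(text)
        elif kind == "reset":
            preamble.clear()
        else:  # a visible code block to be tested
            out.append("\n".join(preamble + [text]))
    return out

tests = apply_preambles([
    ("hidden", "from pecos.slr import Main"),
    ("code", "prog = Main()"),
    ("reset", ""),
    ("code", "x = 1"),
])
# tests == ["from pecos.slr import Main\nprog = Main()", "x = 1"]
```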
Reset the accumulated preamble:
<!--preamble-reset-->
```python
# This code starts fresh without any preamble
```

Python code blocks are tested using `exec()`:
```python
from pecos import sim
result = sim(Qasm("...")).run(10)
```

Rust code blocks can be tested in two ways:
- Simple Rust (rustc): code with `fn main()` that doesn't use external crates
- Cargo Rust: code using `pecos` crates (detected by `use pecos*::`)

Incomplete Rust snippets (no `fn main()`, traits, or impls without full context) are automatically skipped.
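The classification above amounts to a simple decision rule. A sketch (the detection strings are assumptions, not the exact implementation):

```python
# Hypothetical sketch of snippet classification: cargo if the snippet
# uses pecos crates, rustc if it has a main function, otherwise skip.
def classify_rust(code: str) -> str:
    if "use pecos" in code:
        return "cargo"
    if "fn main()" in code:
        return "rustc"
    return "skip"

assert classify_rust('fn main() { println!("hi"); }') == "rustc"
assert classify_rust("use pecos::prelude::*;\nfn main() {}") == "cargo"
assert classify_rust("impl Foo for Bar {}") == "skip"
```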
For Rust code that uses PECOS crates:
```rust
use pecos::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let results = sim(Qasm::from_string("...")).run(10)?;
    Ok(())
}
```

This creates a temporary Cargo project with the necessary dependencies.
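The temporary-project step might look roughly like this (the manifest below is an assumption, not PECOS's actual dependency list):

```python
import tempfile
from pathlib import Path

# Sketch of building a throwaway Cargo project for one snippet; the
# manifest contents here are assumptions, not PECOS's real dependencies.
def make_cargo_project(code: str) -> Path:
    root = Path(tempfile.mkdtemp())
    (root / "src").mkdir()
    (root / "Cargo.toml").write_text(
        '[package]\n'
        'name = "doc_test"\n'
        'version = "0.1.0"\n'
        'edition = "2021"\n\n'
        '[dependencies]\n'
        'pecos = "*"\n'
    )
    (root / "src" / "main.rs").write_text(code)
    return root  # the generator would then run `cargo run` in this directory

proj = make_cargo_project("fn main() {}")
```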
Standard Rust documentation markers are also supported:
```rust,skip
// This code is skipped
```

```rust,ignore
// This code is also skipped
```

```rust,no_run
// This code is also skipped
```

- Add hidden preambles for common imports to keep visible code focused
- Use meaningful skip reasons to explain why code can't be tested
- Test error conditions with `<!--expect-error-->` markers
- Keep code examples self-contained when possible
- Don't skip code unless necessary
- Don't rely on state from previous code blocks (each block is tested independently)
- Don't include real API keys or credentials in examples
- Don't use `exec()` in your actual code examples (it's only used internally for testing)
The test generator creates files in:
```
python/quantum-pecos/tests/docs/generated/
    conftest.py       # Shared fixtures
    test_README.py    # Tests from docs/README.md
    user-guide/       # Tests from docs/user-guide/
    development/      # Tests from docs/development/
```
These files are gitignored and regenerated before tests run.
When a documentation test fails:
- Check the test docstring for the source file and line number
- The test name includes the source file and block number
- Run the specific test with `-v` for verbose output
- Check if the code requires imports that should be in a hidden preamble
Example:
```shell
# Run with verbose output
uv run pytest python/quantum-pecos/tests/docs/generated/user-guide/test_gates.py::test_gates_block_5 -v

# The test docstring shows the source:
# """Test from docs/user-guide/gates.md:142."""
```

When adding new documentation:
- Add code examples in fenced code blocks
- Add hidden preambles for common imports if needed
- Add skip markers for code that can't be tested
- Run
uv run python scripts/docs/generate_doc_tests.pyto generate tests - Run the generated tests to verify examples work
- Commit the documentation (generated tests are gitignored)