New modules: llamacpp-python/run and huggingface/download for running simple text workloads with local LLMs #11053

toniher wants to merge 22 commits into nf-core:master
Conversation
famosab
left a comment
Thank you for your contribution to nf-core! We really appreciate it. I added a few comments to your PR.
We usually recommend having one module per PR. That makes the review process easier, and it's more likely that someone will review your PR. Please keep that in mind for future PRs.
{ assert process.out.output.size() == 1 },
{ assert process.out.output[0][0] == [ id:'test_model' ] },
{ assert file(process.out.output[0][1]).name == "gemma-3-1b-it-Q4_K_M.gguf" },
{ assert file(process.out.output[0][1]).size() > 0 },
{ assert snapshot(process.out.findAll { key, val -> key.startsWith('versions') }).match() }
Can you add other files to this snapshot as well? Ideally we want all outputs to be at least present by name in the snapshot.
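To make the request concrete, one way the assertion could look (a sketch only; the `output` and `versions` channel names are taken from the excerpt above, and snapshotting file names rather than contents is assumed because the model binary is large and not byte-stable):

```groovy
// Sketch: snapshot every output by name, plus versions, in one assertion.
// Channel names and meta map are assumptions based on the excerpt above.
{ assert snapshot(
    process.out.output.collect { meta, f -> [ meta, file(f).name ] },
    process.out.versions
).match() }
```

This keeps the snapshot stable across runs while still recording that each declared output was produced.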
@@ -0,0 +1,7 @@
nextflow.enable.moduleBinaries = true

process {
We want the ext.args to be present in the main.nf.test file. That keeps everything readable in one place. See the docs for more info.
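A sketch of what this could look like (the module name, input, and config path are assumptions): nf-test lets a test reference its config from within the test file, so the ext.args stay discoverable from main.nf.test.

```groovy
// main.nf.test (sketch; names and inputs are placeholders)
nextflow_process {
    name "Test huggingface/download"
    // the test-specific config (containing ext.args) lives next to the test
    // and is referenced here, so a reader finds it from main.nf.test
    config "./nextflow.config"

    test("download gguf model") {
        when {
            process {
                """
                input[0] = [ [ id:'test_model' ], 'gemma-3-1b-it-Q4_K_M.gguf' ]
                """
            }
        }
    }
}
```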
task.ext.when == null || task.ext.when

script:
def hf_home_resolved = hf_home ?: "${workflow.projectDir}/hf_cache"
I don't think the module should write to a hard-coded path in the projectDir.
It is a cache directory needed for hf to work. It is a bit annoying when using Docker. The user can choose another location. What would you suggest instead? Maybe ${workDir}, similar to how Singularity images are handled? Any other approach?
I must admit that is a bit of my bias: I normally prefer to place Singularity images in projectDir instead of workDir 😛
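Following the suggestion in the thread, a minimal sketch of the fallback (assuming the module already takes an optional `hf_home` input, as in the excerpt above; the `hf_cache` subdirectory name is a placeholder):

```groovy
// Sketch: default the HuggingFace cache to the pipeline work directory
// instead of a hard-coded path under projectDir. The user-supplied
// hf_home input still wins when provided.
def hf_home_resolved = hf_home ?: "${workflow.workDir}/hf_cache"
```

This keeps the cache alongside other transient pipeline artifacts, which is what workDir is for, while leaving the override in the user's hands.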
  - ai
tools:
  - "huggingface_hub":
      description: "HuggingFace Hub CLI interface"
Can this be described a bit better?
I tried to provide more details here: 6dfff97
Both files in this folder should be added to test-datasets and not to the modules repo.
They are somewhat dummy files (especially the model for the stub, since it was an empty file). I now generate them on the fly, unless you think it may be worth having a prompt JSON on test-datasets.
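For reference, generating the dummy inputs on the fly could look roughly like this (a sketch; the file name, JSON shape, and input structure are all assumptions, not the PR's actual code):

```groovy
// Sketch (nf-test when block): create the dummy prompt JSON at test time
// instead of committing it to the repo.
when {
    process {
        """
        def prompt = file("prompt.json")
        prompt.text = '{"prompt": "Say hello"}'
        input[0] = [ [ id:'test' ], prompt ]
        """
    }
}
```

Generating the files this way keeps the repo free of placeholder binaries while still exercising the same code paths in the test.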
Hi @famosab. Thanks for the feedback; I will go through your comments! I was told about the one-module-per-PR recommendation, but since you need the output of one of the processes for the other, I thought keeping them together would help potential users once the PR is eventually accepted. But it is certainly more work for everyone. Sorry about this, and I will avoid it in future PRs.
Not so many assertions for stub test
Co-authored-by: Famke Bäuerle <45968370+famosab@users.noreply.github.com>
This pull request, contributed jointly with @lucacozzuto, provides a simple workload for running text inference tasks using llamacpp-python against local LLMs.
This effort was worked on during the nf-core Hackathon in March 2026.
PR checklist
- Closes #XXX
- topic: versions (see version_topics label)
- nf-core modules test <MODULE> --profile docker
- nf-core modules test <MODULE> --profile singularity
- nf-core modules test <MODULE> --profile conda
- nf-core subworkflows test <SUBWORKFLOW> --profile docker
- nf-core subworkflows test <SUBWORKFLOW> --profile singularity
- nf-core subworkflows test <SUBWORKFLOW> --profile conda