Add 'models' page #1271
Conversation
Signed-off-by: Cal <35017184+CallumWalley@users.noreply.github.com>
Test deployment successful!! (2026-05-15T03:16:08Z) See all deployed demo sites
Code Review
This pull request introduces new documentation for running Ollama on REANNZ GPUs and a centralized list of available large language models. The feedback identifies several violations of the repository's style guide, including the requirement to use tabs as delimiters in Slurm headers, the necessity of performing a module purge before loading modules, and the inclusion of specific version macros in module load commands. Additionally, the review points out minor formatting issues such as a double space in a command, a missing quote in an admonition title, a spelling error, and an inconsistent file path in the models table.
```sh
#SBATCH --account nesi99991
#SBATCH --job-name ollama-test
#SBATCH --time 01:00:00
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1
```
Avast! Those Slurm headers be lookin' like they were typed by a landlubber with no sense of order. Use tabs for your delimiters, as the code of the sea—and this repository—demands! Don't make me tell ye again, or ye'll be walkin' the plank.
```diff
-#SBATCH --account nesi99991
-#SBATCH --job-name ollama-test
-#SBATCH --time 01:00:00
-#SBATCH --mem 10G
-#SBATCH --gpus-per-node l4:1
+#SBATCH --account	nesi99991
+#SBATCH --job-name	ollama-test
+#SBATCH --time	01:00:00
+#SBATCH --mem	10G
+#SBATCH --gpus-per-node	l4:1
```
References
- Use tab for Slurm header delimiter. (link)
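To make the tab rule easy to enforce mechanically, here is a minimal sketch of a lint check. The function name `check_sbatch_tabs` is purely illustrative and not part of this repository's tooling:

```shell
#!/usr/bin/env bash
# Hypothetical helper: report #SBATCH lines that separate the option
# from its value with a space instead of the required tab.
check_sbatch_tabs() {
  local file="$1"
  # Lines like "#SBATCH --mem 10G" (space delimiter) are flagged;
  # tab-delimited lines do not match and pass silently.
  grep -nE '^#SBATCH[[:space:]]+--[a-z-]+ [^[:space:]]' "$file" || true
}
```

Run against a job script, it prints the offending lines with their line numbers, which is enough for a CI check or a pre-commit hook.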
```sh
PORT=16000 # please choose your own port number between 1024 and 49151

module purge
module load ollama
```
Shiver me timbers! Ye've forgotten to module purge before loadin' your gear. And where be the version? Don't leave it to chance, or ye'll find yourself in Davy Jones' locker. Use the version macro like a proper pirate who knows his tools.
```diff
-module load ollama
+module purge
+module load ollama/{{ app.default }}
```
References
- Always module purge before module load and include version in module load. (link)
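This rule can also be checked mechanically. Below is a minimal, hedged sketch; the name `check_module_versions` is invented for illustration, and on the real site the `{{ app.default }}` macro expands to a concrete version before publication, so published docs always carry a `/`:

```shell
#!/usr/bin/env bash
# Hypothetical helper: flag "module load <name>" lines that carry no
# version suffix (no "/" after the module name), so that docs pin
# versions, e.g. "module load ollama/{{ app.default }}".
check_module_versions() {
  grep -nE '^[[:space:]]*module load [^/[:space:]]+[[:space:]]*$' "$1" || true
}
```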
```sh
module purge
module load ollama
```
```sh
#SBATCH --account nesi99991
#SBATCH --job-name ollama-batch
#SBATCH --time 00:30:00
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1
```
Again with the spaces in the Slurm headers? Ye be as stubborn as a mule on a sandbar. Use tabs, or ye'll be scrubbin' the decks for a month!
```diff
-#SBATCH --account nesi99991
-#SBATCH --job-name ollama-batch
-#SBATCH --time 00:30:00
-#SBATCH --mem 10G
-#SBATCH --gpus-per-node l4:1
+#SBATCH --account	nesi99991
+#SBATCH --job-name	ollama-batch
+#SBATCH --time	00:30:00
+#SBATCH --mem	10G
+#SBATCH --gpus-per-node	l4:1
```
References
- Use tab for Slurm header delimiter. (link)
```sh
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1

module load ollama
```
Ye've missed the module purge and the version again. Do ye enjoy sailin' into storms? Add 'em in, and use the macro so we don't have to update this every time the tide turns.
```diff
-module load ollama
+module purge
+module load ollama/{{ app.default }}
```
References
- Always module purge before module load and include version in module load. (link)
```
Will assign a random free port number to `PORT`

before starting `ollama`.

!!! tip Debugging
```
Where be the quotes for your 'Debugging' title? Even a blind parrot knows that titles in these boxes need their wrappings. Don't be lazy.
```diff
-!!! tip Debugging
+!!! tip "Debugging"
```
References
- Admonition with optional title should use quotes. (link)
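For completeness, a hedged sketch of how unquoted admonition titles (mkdocs-material syntax) could be caught automatically; `check_admonition_titles` is an illustrative name, not existing repo tooling:

```shell
#!/usr/bin/env bash
# Hypothetical helper: flag '!!! <type> Title' admonitions whose title
# is not wrapped in double quotes, e.g. '!!! tip Debugging'.
# Title-less admonitions ('!!! note') are fine and are not matched.
check_admonition_titles() {
  grep -nE '^!!! [a-z]+ [^"[:space:]]' "$1" || true
}
```

Quoting matters here because the admonition extension treats extra unquoted words as additional CSS classes rather than as a title.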
```
hide: toc
---

To avoid unneccicery storage use, we maintain readonly versions of popular models in `/opt/nesi/models`, if you can use this please do.
```
Ye've managed to mangle the word 'unnecessary' quite impressively. I've seen barnacles with better spellin' than you. Fix it, ye scallywag!
```diff
-To avoid unneccicery storage use, we maintain readonly versions of popular models in `/opt/nesi/models`, if you can use this please do.
+To avoid unnecessary storage use, we maintain read-only versions of popular models in `/opt/nesi/models`; please use these where possible.
```
```html
<tr>
<td rowspan="2"><a href="https://ollama.com/library/llama3.1">Llama 3.1</a></td>
<td rowspan="2"><a href="https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE">Meta Llama 3.1</a></td>
<td><pre><code>/opt/nesi/model/gguf/llama3.1/llama3.1-8b.gguf</code></pre></td>
```
Is the path /opt/nesi/model or /opt/nesi/models? Ye've got more inconsistencies than a pirate's tall tale. The description says 'models', so make sure the whole table follows suit, or ye'll confuse the whole crew!
```diff
-<td><pre><code>/opt/nesi/model/gguf/llama3.1/llama3.1-8b.gguf</code></pre></td>
+<td><pre><code>/opt/nesi/models/gguf/llama3.1/llama3.1-8b.gguf</code></pre></td>
```
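A quick consistency check along the lines the review asks for might look like the following sketch; the helper name is hypothetical:

```shell
#!/usr/bin/env bash
# Hypothetical helper: flag the singular "/opt/nesi/model/" so every
# table entry uses the documented "/opt/nesi/models/" prefix.
check_model_paths() {
  grep -n '/opt/nesi/model/' "$1" || true
}
```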