Add 'models' page #1271

Open
CallumWalley wants to merge 5 commits into main from add-'models'-page-

Conversation

@CallumWalley
Member

No description provided.

@github-actions
Contributor

github-actions Bot commented May 15, 2026

Test deployment successful!! (2026-05-15T03:16:08Z)
Preview available at https://callumwalley.github.io/mkdocs-demo-deploy/nesi/support-docs/add-'models'-page-

It seems the following pages differ:



See all deployed demo sites

Contributor

@gemini-code-assist Bot left a comment


Code Review

This pull request introduces new documentation for running Ollama on REANNZ GPUs and a centralized list of available large language models. The feedback identifies several violations of the repository's style guide, including the requirement to use tabs as delimiters in Slurm headers, the necessity of performing a module purge before loading modules, and the inclusion of specific version macros in module load commands. Additionally, the review points out minor formatting issues such as a double space in a command, a missing quote in an admonition title, a spelling error, and an inconsistent file path in the models table.

Comment on lines +24 to +28
#SBATCH --account nesi99991
#SBATCH --job-name ollama-test
#SBATCH --time 01:00:00
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1
Contributor


low

Avast! Those Slurm headers be lookin' like they were typed by a landlubber with no sense of order. Use tabs for your delimiters, as the code of the sea—and this repository—demands! Don't make me tell ye again, or ye'll be walkin' the plank.

Suggested change
Before (space delimiter):
#SBATCH --account nesi99991
#SBATCH --job-name ollama-test
#SBATCH --time 01:00:00
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1
After (tab delimiter):
#SBATCH --account	nesi99991
#SBATCH --job-name	ollama-test
#SBATCH --time	01:00:00
#SBATCH --mem	10G
#SBATCH --gpus-per-node	l4:1
References
  1. Use tab for Slurm header delimiter. (link)

PORT=16000 # please choose your own port number between 1024 and 49151

module purge
module load ollama
Contributor


low

Shiver me timbers! Ye've forgotten to module purge before loadin' your gear. And where be the version? Don't leave it to chance, or ye'll find yourself in Davy Jones' locker. Use the version macro like a proper pirate who knows his tools.

Suggested change
Before:
module load ollama
After:
module purge
module load ollama/{{ app.default }}
References
  1. Always module purge before module load and include version in module load. (link)
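Once the server side of the script above is in place, the `PORT` chosen there is how clients reach the Ollama server. A minimal sketch (assuming the server runs on the same node and the example port 16000 from the snippet) of pointing the `ollama` client at it via the `OLLAMA_HOST` environment variable:

```shell
# Sketch only: reuse the port chosen in the job script above (16000 is the
# example value; pick your own between 1024 and 49151).
PORT=16000
# The ollama CLI reads OLLAMA_HOST to find the server it should talk to.
export OLLAMA_HOST="127.0.0.1:$PORT"
echo "$OLLAMA_HOST"
```

With `OLLAMA_HOST` exported, subsequent `ollama run`/`ollama pull` commands in the same shell will target the job's server rather than the default local socket.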

```sh
module purge
module load  ollama
```
Contributor


low

Ye've got a double space there in your command. Is your thumb stuck on the bar, or did a sea breeze blow your fingers off course? Fix it before the captain sees.

Suggested change
Before:
module load  ollama
After:
module load ollama

Comment on lines +60 to +64
#SBATCH --account nesi99991
#SBATCH --job-name ollama-batch
#SBATCH --time 00:30:00
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1
Contributor


low

Again with the spaces in the Slurm headers? Ye be as stubborn as a mule on a sandbar. Use tabs, or ye'll be scrubbin' the decks for a month!

Suggested change
Before (space delimiter):
#SBATCH --account nesi99991
#SBATCH --job-name ollama-batch
#SBATCH --time 00:30:00
#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1
After (tab delimiter):
#SBATCH --account	nesi99991
#SBATCH --job-name	ollama-batch
#SBATCH --time	00:30:00
#SBATCH --mem	10G
#SBATCH --gpus-per-node	l4:1
References
  1. Use tab for Slurm header delimiter. (link)

#SBATCH --mem 10G
#SBATCH --gpus-per-node l4:1

module load ollama
Contributor


low

Ye've missed the module purge and the version again. Do ye enjoy sailin' into storms? Add 'em in, and use the macro so we don't have to update this every time the tide turns.

Suggested change
Before:
module load ollama
After:
module purge
module load ollama/{{ app.default }}
References
  1. Always module purge before module load and include version in module load. (link)

This will assign a random free port number to `PORT` before starting `ollama`.
!!! tip Debugging
Contributor


low

Where be the quotes for your 'Debugging' title? Even a blind parrot knows that titles in these boxes need their wrappings. Don't be lazy.

Suggested change
Before:
!!! tip Debugging
After:
!!! tip "Debugging"
References
  1. Admonition with optional title should use quotes. (link)
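The random-port assignment described above can be sketched in shell; this is an illustrative approach (not necessarily what the documented script does), using `shuf` to draw from the unprivileged user range the script's comment recommends:

```shell
# Sketch only: draw a random port from the registered/user range 1024-49151,
# matching the range suggested in the job-script comment.
PORT=$(shuf -i 1024-49151 -n 1)
echo "PORT=$PORT"
```

In practice the chosen port should also be checked for collisions (for example with `ss -tln`) before starting the server, since another job on the node may already be listening on it.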

Comment thread docs/Storage/Models.md
hide: toc
---

To avoid unneccicery storage use, we maintain readonly versions of popular models in `/opt/nesi/models`, if you can use this please do.
Contributor


low

Ye've managed to mangle the word 'unnecessary' quite impressively. I've seen barnacles with better spellin' than you. Fix it, ye scallywag!

Suggested change
Before:
To avoid unneccicery storage use, we maintain readonly versions of popular models in `/opt/nesi/models`, if you can use this please do.
After:
To avoid unnecessary storage use, we maintain readonly versions of popular models in `/opt/nesi/models`, if you can use this please do.

Comment thread docs/Storage/Models.md
<tr>
<td rowspan="2"><a href="https://ollama.com/library/llama3.1">Llama 3.1</a></td>
<td rowspan="2"><a href="https://huggingface.co/meta-llama/Llama-3.1-8B/blob/main/LICENSE">Meta Llama 3.1</a></td>
<td><pre><code>/opt/nesi/model/gguf/llama3.1/llama3.1-8b.gguf</code></pre></td>
Contributor


low

Is the path /opt/nesi/model or /opt/nesi/models? Ye've got more inconsistencies than a pirate's tall tale. The description says 'models', so make sure the whole table follows suit, or ye'll confuse the whole crew!

Suggested change
Before:
<td><pre><code>/opt/nesi/model/gguf/llama3.1/llama3.1-8b.gguf</code></pre></td>
After:
<td><pre><code>/opt/nesi/models/gguf/llama3.1/llama3.1-8b.gguf</code></pre></td>
