TuneSalon AI
Resources

GPU Guide

Last updated April 10, 2026

What Are Cloud GPUs?

When you train or chat with a model on TuneSalon, the work runs on powerful cloud hardware called GPUs. You don't need to own one or set anything up. TuneSalon handles it for you. You just pick which GPU tier to use.

A100 vs H200

TuneSalon offers two GPU tiers:

A100 (included with all plans)
  • 80 GB VRAM
  • Models up to 24B parameters
  • 1x credit cost
  • 6 models available
H200 (Pro & Max plans)
  • 141 GB VRAM
  • Models up to 35B parameters
  • 2x credit cost
  • 2 largest models (32B, 35B)

The H200 has significantly more memory than the A100, which means it can run larger models. It also trains faster. The trade-off is higher credit cost: every operation costs 2x as many credits.
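As a rough rule of thumb (a general sizing heuristic, not a TuneSalon specification), a model's weights in 16-bit precision take about 2 bytes per parameter, plus extra headroom for activations and the KV cache during inference. A minimal sketch of why the tiers break where they do:

```python
def fits_in_vram(params_billion: float, vram_gb: float, overhead: float = 1.2) -> bool:
    """Rough check: ~2 bytes per parameter at 16-bit precision, with ~20%
    headroom for activations and KV cache (illustrative assumptions)."""
    weight_gb = params_billion * 2  # 1e9 params x 2 bytes = 2 GB per billion params
    return weight_gb * overhead <= vram_gb

# A 24B model needs roughly 24 x 2 x 1.2 = ~58 GB, within the A100's 80 GB.
# A 35B model needs roughly 35 x 2 x 1.2 = ~84 GB, which only the H200's 141 GB fits.
```

By this estimate, a 24B model fits comfortably on the A100, while the 32B and 35B models exceed its 80 GB and need the H200. Actual usage varies with precision, context length, and training versus chat.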

Which Models Run Where?

A100: Qwen3-4B, Mistral-7B, Qwen3-8B, Ministral-8B, Qwen3-14B, Mistral-Small-24B
H200: Qwen3-32B (32B), Qwen3.5-35B (35B)
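If you are scripting around these tiers, the mapping above can be expressed as a simple lookup. This is an illustrative snippet (a hypothetical helper, not part of any TuneSalon API), restating the table in code:

```python
# Which GPU tier each model runs on, per the table above.
MODEL_GPU = {
    "Qwen3-4B": "A100", "Mistral-7B": "A100", "Qwen3-8B": "A100",
    "Ministral-8B": "A100", "Qwen3-14B": "A100", "Mistral-Small-24B": "A100",
    "Qwen3-32B": "H200", "Qwen3.5-35B": "H200",
}

def gpu_for(model: str) -> str:
    """Return the GPU tier for a model name, e.g. gpu_for("Qwen3-14B") -> "A100"."""
    return MODEL_GPU[model]
```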

Which Should I Pick?

Starting out or on a budget? The A100 handles most use cases. The six models available on it cover everything from quick chatbots to professional-grade content.

Need the largest, most capable models? Upgrade to Pro or Max for H200 access. Qwen3-32B and Qwen3.5-35B are only available on H200.

Not sure which model to pick? Use our Model Guide to find the right one for your use case.

How to Switch

Go to the System tab and click the GPU card you want to use. Your choice is saved automatically and applies to training and chat.