
Domain-Specific Small Language Models
Want LLM power without the LLM price tag? Crave models that fit your data, laptop, and budget? Stop renting GPUs you cannot afford. Start building Domain-Specific Small Language Models today. Own your AI stack, end to end.
- Model sizing best practices: pick the smallest architecture that still delivers top-tier accuracy.
- Open-source toolchain: leverage Hugging Face, PEFT, and quantization libraries for zero-license freedom.
- Fine-tuning workflows: adapt existing checkpoints to niche datasets in hours, not weeks.
- Commodity hardware deployment: run chat, code, or bio models locally on a single GPU or CPU.
- Retrieval-augmented generation: fuse SLMs with RAG pipelines for grounded, up-to-date answers.
- Cost-control checklists: slash cloud spend and eliminate dependence on expensive foundation-model APIs.
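To give a flavor of one technique on the list, here is a minimal, library-free sketch of symmetric int8 quantization, the idea behind shrinking model weights so they fit on commodity hardware. It is an illustrative toy, not the book's code; real workflows use libraries such as bitsandbytes through the Hugging Face stack.

```python
def quantize_int8(weights):
    """Symmetric per-tensor quantization: map floats onto the int8 range [-127, 127]."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 127.0  # one float multiplier recovers approximate values
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [qi * scale for qi in q]

weights = [0.5, -1.2, 0.03, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each restored weight is within half a quantization step of the original.
```

Storing each weight as one signed byte plus a shared scale is what cuts memory roughly 4x versus float32, at the cost of the small rounding error shown above.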
Domain-Specific Small Language Models, by AI director Guglielmo Iozzia, is a field guide packed with runnable Python code and real-world engineering insight.
Step-by-step chapters demystify transformer architecture, quantization, and PEFT fine-tuning, then walk you through building RAG systems and autonomous agents that rely solely on SLMs. Clear diagrams, annotated notebooks, and troubleshooting tips keep learning smooth.
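The RAG pattern the chapters build up to can be sketched in a few lines: retrieve the documents most relevant to a question, then pack them into the prompt so the model answers from that context. The keyword-overlap scorer below is a deliberately simple stand-in for the embedding-based search a production pipeline would use; all names here are illustrative, not from the book.

```python
def retrieve(query, docs, k=2):
    """Rank documents by word overlap with the query (stand-in for embedding search)."""
    q_words = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, docs):
    """Assemble a grounded prompt: retrieved context first, then the question."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "SLMs can run locally on a single GPU.",
    "Quantization shrinks model weights to 8 bits or fewer.",
    "RAG grounds answers in retrieved documents.",
]
prompt = build_prompt("How does RAG ground answers?", docs)
```

The resulting prompt would then be sent to a local SLM; because the model sees only retrieved text, its answers stay grounded and up to date without retraining.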
You will finish with reusable templates, deployment scripts, and the confidence to deliver performant language models under tight hardware and budget constraints.
Perfect for Python-savvy machine-learning engineers, data scientists, and technical leads who need domain-tuned AI now.
- Author: Guglielmo Iozzia
- ISBN: 9781633436701
- Language: English
- Weight: 446 grams
- Publication date: April 6, 2026
- Publisher: Manning Publications
- Pages: 300