
Building Natural Language and LLM Pipelines: Build production-grade RAG, tool contracts, and context engineering with Haystack and LangGraph
Author: Laura Funderburk
- Publisher: Packt Publishing
- Publication Date: December 30, 2025
- Edition: 1st
- Language: English
- Print length: 521 pages
- ASIN: B0DM2JPDC5
- ISBN-13: 9781835467008
Stop LLM applications from breaking in production. Build deterministic pipelines, enforce strict tool contracts, engineer high-signal context for RAG, and orchestrate resilient multi-agent workflows using two foundational frameworks: Haystack for pipelines and LangGraph for low-level agent orchestration.
DRM-free PDF version + access to Packt’s next-gen Reader*
Key Features
- Design reproducible LLM pipelines using typed components and strict tool contracts
- Build resilient multi-agent systems with LangGraph and modular microservices
- Evaluate and monitor pipeline performance with Ragas and Weights & Biases
Book Description
Modern LLM applications often break in production due to brittle pipelines, loose tool definitions, and noisy context. This book shows you how to build production-ready, context-aware systems using Haystack and LangGraph. You’ll learn to design deterministic pipelines with strict tool contracts and deploy them as microservices. Through structured context engineering, you’ll orchestrate reliable agent workflows and move beyond simple prompt-based interactions.
You’ll start by understanding LLM behavior—tokens, embeddings, and transformer models—and see how prompt engineering has evolved into a full context engineering discipline. Then, you’ll build retrieval-augmented generation (RAG) pipelines with retrievers, rankers, and custom components using Haystack’s graph-based architecture. You’ll also create knowledge graphs, synthesize unstructured data, and evaluate system behavior using Ragas and Weights & Biases. In LangGraph, you’ll orchestrate agents with supervisor-worker patterns, typed state machines, retries, fallbacks, and safety guardrails.
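To make the Haystack side of that workflow concrete, here is a minimal sketch of a graph-based RAG pipeline, assuming Haystack 2.x, its in-memory document store, and an OpenAI API key in the environment; the documents, prompt template, and model name are illustrative placeholders rather than examples taken from the book.

```python
from haystack import Document, Pipeline
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator

# Illustrative corpus written to an in-memory document store.
store = InMemoryDocumentStore()
store.write_documents([
    Document(content="Haystack pipelines are directed graphs of typed components."),
    Document(content="Retrievers fetch candidate documents; rankers reorder them."),
])

# Prompt template that grounds the generator in the retrieved context.
template = """Answer the question using only the context below.
Context:
{% for doc in documents %}- {{ doc.content }}
{% endfor %}
Question: {{ query }}
Answer:"""

pipeline = Pipeline()
pipeline.add_component("retriever", InMemoryBM25Retriever(document_store=store))
pipeline.add_component("prompt_builder", PromptBuilder(template=template))
pipeline.add_component("llm", OpenAIGenerator(model="gpt-4o-mini"))

# Wire component outputs to inputs to form the pipeline graph.
pipeline.connect("retriever.documents", "prompt_builder.documents")
pipeline.connect("prompt_builder.prompt", "llm.prompt")

question = "What are Haystack pipelines?"
result = pipeline.run({"retriever": {"query": question},
                       "prompt_builder": {"query": question}})
print(result["llm"]["replies"][0])
```

Rankers and the custom components covered in the later chapters would slot into the same graph through additional add_component and connect calls.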
By the end of the book, you’ll have the skills to design scalable, testable LLM pipelines and multi-agent systems that remain robust as the AI ecosystem evolves.
*Email sign-up and proof of purchase required
What you will learn
- Build structured retrieval pipelines with Haystack
- Apply context engineering to improve agent performance
- Serve pipelines as LangGraph-compatible microservices
- Use LangGraph to orchestrate multi-agent workflows (see the sketch after this list)
- Deploy REST APIs using FastAPI and Hayhooks
- Track cost and quality with Ragas and Weights & Biases
- Implement retries, circuit breakers, and observability
- Design sovereign agents for high-volume local execution
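As a companion to the LangGraph bullet above, the following sketch shows a typed state machine with a supervisor-worker loop and a simple retry cap; the node logic is a stub standing in for real tool or LLM calls, and the state fields and retry budget are assumptions for illustration only.

```python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

# Illustrative typed state shared by every node in the graph.
class AgentState(TypedDict):
    question: str
    draft: str
    approved: bool
    retries: int

def worker(state: AgentState) -> dict:
    # Stub worker: a real node would call a tool or an LLM here.
    return {"draft": f"Draft answer to: {state['question']}"}

def supervisor(state: AgentState) -> dict:
    # Stub review step: accept any non-empty draft, otherwise ask for a retry.
    if state["draft"]:
        return {"approved": True}
    return {"approved": False, "retries": state["retries"] + 1}

def route(state: AgentState) -> str:
    # Fall back to END once the draft is approved or the retry budget is spent.
    if state["approved"] or state["retries"] >= 2:
        return "done"
    return "retry"

graph = StateGraph(AgentState)
graph.add_node("worker", worker)
graph.add_node("supervisor", supervisor)
graph.add_edge(START, "worker")
graph.add_edge("worker", "supervisor")
graph.add_conditional_edges("supervisor", route, {"retry": "worker", "done": END})

app = graph.compile()
print(app.invoke({"question": "What is context engineering?",
                  "draft": "", "approved": False, "retries": 0}))
```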
Who this book is for
This book is for LLM engineers, NLP developers, and data scientists who want to build production-grade pipelines, agentic workflows, or RAG systems. It is also ideal for tech leads moving beyond prototypes to scalable, testable solutions, and for teams modernizing legacy NLP pipelines into orchestration-ready microservices. Proficiency in Python and familiarity with core NLP concepts are recommended.
Table of Contents
- Introduction to Natural Language Processing Pipelines
- Diving Deep into Large Language Models
- Introduction to Haystack by deepset
- Bringing Components Together – Haystack Pipelines for Different Use Cases
- Haystack Pipeline Development with Custom Components
- Building Reproducible and Production-Ready RAG Systems
- Deploying Haystack-Based Applications
- Hands-on Projects
- Future Trends and Beyond
- Epilogue: The Architecture of Agentic AI
