
Quantization Methods for Large Language Models: From Theory to Real-World Implementations

Anand Vemula

Paperback

299 SEK


  • 76 pages
  • 2024
The book provides an in-depth understanding of quantization techniques and their impact on model efficiency, performance, and deployment.

It starts with a foundational overview of quantization, explaining its significance in reducing the computational and memory requirements of LLMs. It then delves into various quantization methods, including uniform and non-uniform quantization, per-layer and per-channel quantization, and hybrid approaches. Each technique is examined for its applicability and trade-offs, helping readers select the best method for their specific needs.

The guide further explores advanced topics such as quantization for edge devices and multilingual models. It contrasts dynamic and static quantization strategies and discusses emerging trends in the field. Practical examples, use cases, and case studies illustrate how these techniques are applied in real-world scenarios, including the quantization of popular models like GPT and BERT.
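To make the per-channel uniform quantization mentioned above concrete, here is a minimal NumPy sketch (illustrative only, not taken from the book): each output channel of a weight matrix gets its own scale derived from its maximum magnitude, and weights are rounded to int8. All function and variable names are assumptions for this example.

```python
import numpy as np

def quantize_per_channel(weights, num_bits=8):
    """Symmetric uniform per-channel quantization: each row
    (output channel) gets its own scale from its max magnitude."""
    qmax = 2 ** (num_bits - 1) - 1  # 127 for int8
    # One scale per channel; guard against all-zero rows
    scales = np.abs(weights).max(axis=1, keepdims=True) / qmax
    scales = np.where(scales == 0, 1.0, scales)
    q = np.clip(np.round(weights / scales), -qmax - 1, qmax).astype(np.int8)
    return q, scales

def dequantize(q, scales):
    """Map int8 codes back to approximate float weights."""
    return q.astype(np.float32) * scales

# Toy 2x4 weight matrix with very different per-row ranges
w = np.array([[0.5, -1.0, 0.25, 0.1],
              [10.0, -20.0, 5.0, 2.0]], dtype=np.float32)
q, s = quantize_per_channel(w)
w_hat = dequantize(q, s)
```

Because each channel has its own scale, the small-magnitude first row is not crushed by the large dynamic range of the second row, which is the main trade-off per-channel schemes address versus per-layer (single-scale) quantization.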

  • Author: Anand Vemula
  • Format: Paperback
  • ISBN: 9798227328335
  • Language: English
  • Pages: 76
  • Publication date: 2024-08-19
  • Publisher: Anand Vemula