DeepLearn 2025
12th International School on Deep Learning
(with a special focus on Large Language Models, Foundation Models and Generative AI)
Porto - Maia, Portugal · July 21-25, 2025
Xia “Ben” Hu

Rice University

[introductory/advanced] Efficient LLM Serving: Algorithms and Systems

Summary

Large Language Models (LLMs) have demonstrated strong performance across a wide range of areas. However, their generality and robustness are largely attributable to their vast scale, which makes deploying and serving them costly. To address the high computational demands and improve accessibility, various techniques have been proposed to make LLMs more efficient, such as model compression. In this lecture, we will examine the fundamental challenges and opportunities in large language model serving. In particular, we will study techniques such as model quantization and weight pruning, and see how these “lossy” compression methods reduce computational demands with minimal drop in performance. We will also explore the main families of KV cache compression techniques, discuss their characteristics and trade-offs, present benchmark results for some exemplar methods, and highlight important caveats in conducting proper long-context evaluations.
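
To make the “lossy” compression idea concrete, the sketch below applies plain round-to-nearest INT8 quantization with one scale per output channel to a linear-layer weight matrix. This is a simplified illustration only, not the AWQ or KIVI algorithms covered in the lecture; the function names and tensor shapes are chosen for the example.

```python
import torch

def quantize_per_channel_int8(weight: torch.Tensor):
    """Round-to-nearest symmetric INT8 quantization with one scale per output channel.

    weight: (out_features, in_features), as in a torch.nn.Linear layer.
    """
    # One scale per row, chosen so the largest-magnitude weight maps to 127.
    scale = (weight.abs().amax(dim=1, keepdim=True) / 127.0).clamp_min(1e-8)
    q = torch.clamp(torch.round(weight / scale), -128, 127).to(torch.int8)
    return q, scale

def dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Reconstruct an approximate FP32 weight from the INT8 tensor and per-row scales."""
    return q.to(torch.float32) * scale

if __name__ == "__main__":
    w = torch.randn(4096, 4096)
    q, s = quantize_per_channel_int8(w)
    w_hat = dequantize(q, s)
    # Storage drops from 4 bytes to 1 byte per weight (plus one scale per row);
    # the reconstruction error printed here is the "lossy" part.
    print("mean abs error:", (w - w_hat).abs().mean().item())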

Syllabus

  • Introduction to large language models
  • Model quantization
  • Weight pruning (a minimal pruning sketch follows this list)
  • KV cache compression
  • Extending to long-context scenarios
  • Evaluation of compression methods
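
As a companion to the weight-pruning item above, here is a minimal sketch of global magnitude pruning: the smallest-magnitude weights are zeroed until a target sparsity is reached. This is the simplest possible criterion, not the activation-aware weight-times-activation metric of Wanda (Sun et al., listed in the references); the sparsity level and function names are illustrative.

```python
import torch

def magnitude_prune(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Zero out the smallest-magnitude entries until `sparsity` fraction are zero."""
    k = int(weight.numel() * sparsity)          # number of weights to remove
    if k == 0:
        return weight.clone()
    # k-th smallest absolute value serves as the pruning threshold.
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = weight.abs() > threshold             # keep only large-magnitude weights
    return weight * mask

if __name__ == "__main__":
    w = torch.randn(1024, 1024)
    w_pruned = magnitude_prune(w, sparsity=0.5)
    print("fraction zero:", (w_pruned == 0).float().mean().item())
```

Note that zeros only translate into actual speedups when the sparsity pattern is structured (e.g., 2:4) or supported by the serving kernels, which is part of why pruning is as much a systems question as an algorithmic one.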

References

Brown, Tom B., et al. “Language models are few-shot learners.” arXiv:2005.14165 (2020).

Sun, Mingjie, et al. “A simple and effective pruning approach for large language models.” arXiv:2306.11695 (2023).

Lin, Ji, et al. “AWQ: Activation-aware Weight Quantization for On-Device LLM Compression and Acceleration.” Proceedings of Machine Learning and Systems 6 (2024): 87-100.

Liu, Zirui, et al. “KIVI: A tuning-free asymmetric 2-bit quantization for KV cache.” arXiv:2402.02750 (2024).

Yuan, Jiayi, et al. “KV cache compression, but what must we give in return? A comprehensive benchmark of long context capable approaches.” arXiv:2407.01527 (2024).

Jin, Hongye, et al. “LLM maybe LongLM: Self-extend LLM context window without tuning.” arXiv:2401.01325 (2024).

Xiao, Guangxuan, et al. “Efficient streaming language models with attention sinks.” arXiv:2309.17453 (2023).

Pre-requisites

Machine learning and deep neural network fundamentals, linear algebra, transformers, and Large Language Model basics.

Short bio

Dr. Xia “Ben” Hu is an Associate Professor in the Department of Computer Science at Rice University. He has published over 200 papers in major academic venues, including NeurIPS, ICLR, ICML, KDD, and IJCAI. An open-source package developed by his group, AutoKeras, has become the most widely used automated deep learning system on GitHub (with over 9,000 stars and 1,000 forks). His work on LLM efficiency, deep collaborative filtering, anomaly detection, knowledge graphs, and fast interpretation has been incorporated into production systems at Hugging Face, TensorFlow, Apple, Bing, and Meta, respectively. His papers have received several Best Paper (Candidate) awards from venues such as ICML, WWW, WSDM, ICDM, AMIA, and INFORMS. He is the recipient of an NSF CAREER Award and an ACM SIGKDD Rising Star Award. His work has been cited more than 30,000 times, with an h-index of 70. He served as General Co-Chair for WSDM 2020 and ICHI 2023, and as Program Co-Chair for AIHC 2024 and CHASE 2025.

Other Courses

Yonina Eldar
Manuela Veloso
Pierre Baldi
Sean Benson
Xavier Bresson
Nello Cristianini
Mark Derdzinski
Samira Ebrahimi Kahou
Elena Giusarma
Shih-Chieh Hsu
Lu Jiang
Jayashree Kalpathy-Cramer
Yingbin Liang
Chen Change Loy
Fenglong Ma & Cao (Danica) Xiao
Evan Shelhamer
Atlas Wang
Xiang Wang
Rex Ying


CO-ORGANIZERS

University of Maia

Institute for Research Development, Training and Advice – IRDTA, Brussels/London
