Commit 73e7970 (1 parent: 6fa30b5)

Update publish date

Signed-off-by: Chris Abraham <[email protected]>

File tree: 2 files changed (+1 −1 lines changed)


_community_blog/optimize-llms.md

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 ---
 title: "Optimize LLMs for Efficiency & Sustainability"
 ext_url: /blog/optimize-llms/
-date: Feb 18, 2025
+date: Feb 19, 2025
 ---
 
 The rapid growth of large language model (LLM) applications is linked to rapid growth in energy demand. According to the International Energy Agency (IEA), data center electricity consumption is projected to roughly double by 2026, driven primarily by AI. This is largely due to the energy-intensive training requirements for massive LLMs; however, the increase in AI inference workloads also plays a role. For example, compared with a traditional search query, a single AI inference can consume about [10x more energy](https://www.weforum.org/stories/2024/07/generative-ai-energy-emissions/).
