<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Megatron-LM on AI Tech Blog</title>
    <link>https://jesamkim.github.io/ai-tech-blog/tags/megatron-lm/</link>
    <description>Recent content in Megatron-LM on AI Tech Blog</description>
    <generator>Hugo -- 0.147.6</generator>
    <language>ko</language>
    <lastBuildDate>Wed, 15 Apr 2026 13:00:00 +0900</lastBuildDate>
    <atom:link href="https://jesamkim.github.io/ai-tech-blog/tags/megatron-lm/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Understanding Distributed Training Part 4 - Tensor/Hybrid Parallelism and MoE</title>
      <link>https://jesamkim.github.io/ai-tech-blog/posts/2026-04-16-distributed-training-part4-tensor-hybrid-moe/</link>
      <pubDate>Wed, 15 Apr 2026 13:00:00 +0900</pubDate>
      <guid>https://jesamkim.github.io/ai-tech-blog/posts/2026-04-16-distributed-training-part4-tensor-hybrid-moe/</guid>
      <description>Covers the Row/Column split principle of Tensor Parallelism, Megatron-LM's alternating scheme, combination strategies for 2D/3D Hybrid Parallelism, and MoE with Expert Parallelism. Includes a comprehensive comparison of the four major parallelization techniques and a decision guide.</description>
    </item>
  </channel>
</rss>
