Silicon Analysts
AI Accelerator Manufacturing Cost (COGS) — Historical Time Series Data

Estimated manufacturing cost breakdown for NVIDIA A100, H100, H200, B200 and AMD MI300X. Shows HBM shift from 33% to 45%+ of BOM.


15 data points · Unit: $ · Last updated: Mar 2026

Data Table

Period         Series             Value
A100 (2020)    Logic Die          $400
A100 (2020)    HBM                $1,350
A100 (2020)    Packaging & Test   $750
H100 (2023)    Logic Die          $300
H100 (2023)    HBM                $1,350
H100 (2023)    Packaging & Test   $1,670
H200 (2024)    Logic Die          $300
H200 (2024)    HBM                $1,500
H200 (2024)    Packaging & Test   $2,450
MI300X (2024)  Logic Die          $600
MI300X (2024)  HBM                $2,900
MI300X (2024)  Packaging & Test   $1,800
B200 (2025)    Logic Die          $850
B200 (2025)    HBM                $2,900
B200 (2025)    Packaging & Test   $2,650
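The per-accelerator totals implied by the table can be checked with a few lines of Python. The figures below are transcribed from the table above; they are published estimates, not disclosed costs:

```python
# BOM estimates transcribed from the table above (USD per unit)
bom = {
    "A100 (2020)":   {"Logic Die": 400, "HBM": 1350, "Packaging & Test": 750},
    "H100 (2023)":   {"Logic Die": 300, "HBM": 1350, "Packaging & Test": 1670},
    "H200 (2024)":   {"Logic Die": 300, "HBM": 1500, "Packaging & Test": 2450},
    "MI300X (2024)": {"Logic Die": 600, "HBM": 2900, "Packaging & Test": 1800},
    "B200 (2025)":   {"Logic Die": 850, "HBM": 2900, "Packaging & Test": 2650},
}

# Sum each chip's line items to get total estimated COGS per unit
for chip, parts in bom.items():
    print(f"{chip}: total ${sum(parts.values()):,}")
```

The H100 total of $3,320 produced this way matches the figure quoted in the FAQ.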

Methodology & Sources

Sources: Silicon Analysts (H100, Feb 2026), Epoch.ai (B200, Dec 2025), Deep Research (102 sources). NVIDIA and AMD do not disclose COGS; these estimates are derived from teardown analysis. Gross-margin estimates are based on analyst consensus.

Citations:

  • Deep Research (102 sources) (Mar 2026)
  • Silicon Analysts (Feb 2026)
  • Silicon Analysts (Mar 2026)
  • Silicon Analysts / SemiAnalysis (Mar 2026)
  • Epoch.ai (Dec 2025)

Frequently Asked Questions

How much does it cost NVIDIA to manufacture an H100?
The estimated total manufacturing cost (COGS) for an H100 SXM5 is approximately $3,320: roughly $300 for the 814mm² TSMC 4N logic die, $1,350 for HBM3 memory, $750 for CoWoS-S packaging, and $920 for test/assembly.
Why is the B200 logic die cheaper than the H100 despite being more advanced?
The B200 uses a chiplet design with two ~400mm² N4P dies (~800mm² total) instead of a single monolithic 814mm² die like the H100. Each smaller chiplet has a significantly better yield, driving the combined logic cost down to roughly $850 (per the table above) despite the more advanced node.
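The yield advantage of smaller dies can be illustrated with a simple Poisson defect model, Y = exp(-A·D0). The defect density used below is an assumed, illustrative figure, not a disclosed TSMC number:

```python
import math

def poisson_yield(die_area_mm2: float, d0_defects_per_cm2: float) -> float:
    """Poisson yield model: fraction of good dies, Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * d0_defects_per_cm2)

D0 = 0.1  # assumed defect density in defects/cm^2 -- illustrative only

y_monolithic = poisson_yield(814, D0)  # one H100-class monolithic die
y_chiplet    = poisson_yield(400, D0)  # one B200-class chiplet

# Silicon consumed per good logic assembly:
# monolithic needs 814 mm^2 / yield; the chiplet design needs
# 2 * 400 mm^2 / yield, since chiplets are tested individually
# and only known-good dies are paired.
mm2_monolithic = 814 / y_monolithic
mm2_chiplet    = 2 * 400 / y_chiplet

print(f"monolithic yield: {y_monolithic:.1%}")
print(f"chiplet yield:    {y_chiplet:.1%}")
print(f"silicon per good assembly: {mm2_monolithic:.0f} vs {mm2_chiplet:.0f} mm^2")
```

Under these assumptions the chiplet approach wastes markedly less wafer area per good assembly, which is the mechanism behind the lower logic cost.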
What percentage of AI chip cost is HBM memory?
HBM as a share of total BOM has grown from roughly 14% (A100) to 33% (H100) to 43% (H200) to 45%+ (MI300X). The B200 reverses this trend slightly at ~38% because its chiplet logic die is relatively cheap, but the absolute HBM cost continues to rise.