DataCrunch Blog