Build and deploy your AI apps on our infrastructure.
Build and deploy your AI apps in your own infrastructure.
This pricing calculator offers an estimate of your Anyscale cost. Actual cost will depend on your usage.
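The calculator's arithmetic reduces to rate × nodes × hours. The sketch below illustrates that; `estimate_cost` is a hypothetical helper (not an Anyscale API), and the example rate is taken from the m5.2xlarge row in the tables on this page. Note that every row lists identical AC/hr and $/hr values.

```python
# Hypothetical helper mirroring the calculator's arithmetic: hourly
# instance rate multiplied by node count and hours. Rates come from
# the tables on this page; actual cost depends on real usage.

def estimate_cost(rate_per_hr: float, nodes: int, hours: float) -> float:
    """Estimated on-demand cost in dollars (equivalently Anyscale
    Credits, since the tables list AC/hr == $/hr)."""
    return rate_per_hr * nodes * hours

# Example: four AWS m5.2xlarge workers ($0.5130/hr each) for 10 hours.
print(round(estimate_cost(0.5130, 4, 10), 2))  # → 20.52
```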
Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
AWS | m5.2xlarge | 8 | 32 | - | 0.5130 | 0.5130
AWS | m5.4xlarge | 16 | 64 | - | 1.0395 | 1.0395
AWS | m5.8xlarge | 32 | 128 | - | 2.0790 | 2.0790
AWS | m5.12xlarge | 48 | 192 | - | 3.1050 | 3.1050
AWS | m5.16xlarge | 64 | 256 | - | 4.1445 | 4.1445
AWS | m5.24xlarge | 96 | 384 | - | 6.2235 | 6.2235
GCP | n2-standard-8 | 8 | 32 | - | 0.5245 | 0.5245
GCP | n2-standard-16 | 16 | 64 | - | 1.0488 | 1.0488
GCP | n2-standard-32 | 32 | 128 | - | 2.0978 | 2.0978
GCP | n2-standard-48 | 48 | 192 | - | 3.1466 | 3.1466
GCP | n2-standard-64 | 64 | 256 | - | 4.1955 | 4.1955
GCP | n2-standard-96 | 96 | 384 | - | 6.2933 | 6.2933

Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
GCP | a2-ultragpu-1g-nvidia-a100-80gb-1 | 12 | 170 | 1 | 6.7874 | 6.7874
GCP | a2-ultragpu-2g-nvidia-a100-80gb-2 | 24 | 340 | 2 | 13.5748 | 13.5748
GCP | a2-ultragpu-4g-nvidia-a100-80gb-4 | 48 | 680 | 4 | 27.1496 | 27.1496
GCP | a2-ultragpu-8g-nvidia-a100-80gb-8 | 96 | 1360 | 8 | 54.2992 | 54.2992

Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
AWS | g5.2xlarge | 8 | 32 | 1 | 1.6335 | 1.6335
AWS | g5.4xlarge | 16 | 64 | 1 | 2.1870 | 2.1870
AWS | g5.8xlarge | 32 | 128 | 1 | 3.3075 | 3.3075
AWS | g5.12xlarge | 48 | 192 | 4 | 7.6545 | 7.6545
AWS | g5.48xlarge | 192 | 768 | 8 | 21.9915 | 21.9915

Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
GCP | a3-highgpu-8g-nvidia-h100-80gb-8 | 208 | 1872 | 8 | 119.4615 | 119.4615

Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
AWS | g6.xlarge | 4 | 16 | 1 | 1.0868 | 1.0868
AWS | g6.2xlarge | 8 | 32 | 1 | 1.3203 | 1.3203
AWS | g6.4xlarge | 16 | 64 | 1 | 1.7861 | 1.7861
AWS | g6.8xlarge | 32 | 128 | 1 | 2.7189 | 2.7189
AWS | g6.12xlarge | 48 | 192 | 4 | 6.2127 | 6.2127
AWS | g6.16xlarge | 64 | 256 | 1 | 4.5860 | 4.5860
AWS | g6.24xlarge | 96 | 384 | 4 | 9.0113 | 9.0113
AWS | g6.48xlarge | 192 | 768 | 8 | 18.0225 | 18.0225
GCP | g2-standard-8-nvidia-l4-1 | 8 | 32 | 1 | 1.1524 | 1.1524
GCP | g2-standard-16-nvidia-l4-1 | 16 | 64 | 1 | 1.5487 | 1.5487
GCP | g2-standard-32-nvidia-l4-1 | 32 | 128 | 1 | 2.3414 | 2.3414
GCP | g2-standard-48-nvidia-l4-4 | 48 | 192 | 4 | 5.4023 | 5.4023
GCP | g2-standard-96-nvidia-l4-8 | 96 | 384 | 8 | 10.8045 | 10.8045

Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
AWS | g4dn.2xlarge | 8 | 32 | 1 | 1.0125 | 1.0125
AWS | g4dn.4xlarge | 16 | 64 | 1 | 1.6200 | 1.6200
AWS | g4dn.8xlarge | 32 | 128 | 1 | 2.9430 | 2.9430
AWS | g4dn.12xlarge | 48 | 192 | 4 | 5.2785 | 5.2785
AWS | g4dn.16xlarge | 64 | 256 | 1 | 5.8725 | 5.8725
GCP | n1-standard-8-nvidia-t4-16gb-1 | 8 | 30 | 1 | 0.9855 | 0.9855
GCP | n1-standard-16-nvidia-t4-16gb-1 | 16 | 60 | 1 | 1.4985 | 1.4985
GCP | n1-standard-32-nvidia-t4-16gb-1 | 32 | 120 | 1 | 2.5245 | 2.5245
GCP | n1-standard-32-nvidia-t4-16gb-4 | 32 | 120 | 4 | 3.9420 | 3.9420
GCP | n1-highmem-32-nvidia-t4-16gb-1 | 32 | 208 | 1 | 3.0299 | 3.0299

Provider | Name | vCPUs | Memory (GiB) | GPUs | AC/hr | $/hr
---|---|---|---|---|---|---
AWS | p3.2xlarge | 8 | 61 | 1 | 4.1310 | 4.1310
AWS | p3.8xlarge | 32 | 244 | 4 | 16.5240 | 16.5240
AWS | p3.16xlarge | 64 | 488 | 8 | 33.0480 | 33.0480
GCP | n1-highmem-8-nvidia-v100-16gb-1 | 8 | 52 | 1 | 3.9874 | 3.9874
GCP | n1-highmem-32-nvidia-v100-16gb-4 | 32 | 208 | 4 | 15.9494 | 15.9494
GCP | n1-highmem-64-nvidia-v100-16gb-8 | 64 | 416 | 8 | 31.8989 | 31.8989
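The tables show that multi-GPU instances do not always price linearly per GPU (for example, the AWS g5 family ranges from $1.6335/hr for one GPU on g5.2xlarge to $21.9915/hr for eight on g5.48xlarge). A quick per-GPU-hour comparison can help when sizing a cluster; `per_gpu_hour` is a hypothetical helper, and the rates are copied from the g5 rows above.

```python
# Compare effective cost per GPU-hour across instance sizes.
# Rates are the AWS g5 (A10G) rows from the tables above; the helper
# name is illustrative, not part of any Anyscale API.

def per_gpu_hour(rate_per_hr: float, gpus: int) -> float:
    """Hourly rate divided evenly across the instance's GPUs."""
    return rate_per_hr / gpus

rates = {
    "g5.2xlarge":  (1.6335, 1),
    "g5.12xlarge": (7.6545, 4),
    "g5.48xlarge": (21.9915, 8),
}
for name, (rate, gpus) in rates.items():
    print(f"{name}: ${per_gpu_hour(rate, gpus):.4f}/GPU-hr")
```

Note that the larger instances also bundle proportionally more vCPUs and memory, so a higher per-GPU rate is not pure overhead.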