tensordock GPU Cloud Provider

Marketplace GPU cloud with industry-leading pricing. No waitlists, quotas, or price gouging. Pay-per-second billing starting from $5 deposit. 100+ locations across 20+ countries. 30,000+ GPUs available including 45 different models. 99.99% uptime standard.

Provider Overview

Name: tensordock
Total Instances: 10
Minimum Price: $0.12/hr
Maximum VRAM: 80 GB
Available GPU Models: 9


Get Started with tensordock

Ready to rent GPUs from tensordock? Sign up now to explore available instances and start your AI workloads.

Visit Provider Website →

About the Provider

Regions: North America, Europe, Asia, and Oceania (United States, Canada, Germany, United Kingdom, France, Netherlands, Singapore, Japan, Australia)
GPU Models: 9
Instances: 10

Pricing Plans

On-Demand: $0 base fee. Pay-per-second GPU cloud with no waitlists.

Available Instances

Accelerator      Price/Hour   VRAM    Type
H100             $2.27        80 GB   GPU
A800 80GB        $1.81        80 GB   GPU
A800 80GB        $1.51        80 GB   GPU
Tesla V100       $0.18        16 GB   GPU
L40S             $0.96        48 GB   GPU
RTX 4090         $0.36        24 GB   GPU
RTX 3090         $0.21        24 GB   GPU
RTX 6000 Ada     $0.77        48 GB   GPU
A6000            $0.47        48 GB   GPU
RTX A4000        $0.12        16 GB   GPU
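One rough way to compare the instances above is VRAM per dollar per hour. The short sketch below ranks the listed instances by that metric; the prices are the on-demand rates from the table, and since this is a marketplace they fluctuate, so treat the figures as illustrative.

```python
# Rank tensordock's listed instances by GB of VRAM per $/hr.
# Prices and VRAM figures are copied from the table above; actual
# marketplace rates change with supply, so this is illustrative only.
instances = [
    ("H100", 2.27, 80),
    ("A800 80GB", 1.81, 80),
    ("A800 80GB", 1.51, 80),
    ("Tesla V100", 0.18, 16),
    ("L40S", 0.96, 48),
    ("RTX 4090", 0.36, 24),
    ("RTX 3090", 0.21, 24),
    ("RTX 6000 Ada", 0.77, 48),
    ("A6000", 0.47, 48),
    ("RTX A4000", 0.12, 16),
]

# Sort by VRAM / hourly price, best value first.
ranked = sorted(instances, key=lambda i: i[2] / i[1], reverse=True)
for name, price, vram in ranked:
    print(f"{name:14s} {vram / price:6.1f} GB of VRAM per $/hr")
```

By this metric the budget cards (RTX A4000, RTX 3090) lead, while the H100 trails; raw VRAM per dollar ignores compute throughput, so it is only a first-pass filter, not a workload benchmark.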

Related Resources

GPU Comparison

Compare GPUs side-by-side to find the best match for your workload

Compare GPUs →

Compute Capability

Check CUDA compute capability and AI feature support for different GPUs

View Reference →

All Providers

Browse and compare all GPU cloud providers in one place

Browse Providers →

About tensordock GPU Cloud

tensordock is a leading GPU cloud provider offering 10 instances across 9 different GPU models. With pricing starting at $0.12/hour, they provide competitive options for AI training, inference, and high-performance computing workloads.

Their infrastructure spans 13 regions, making it easy to deploy GPU instances close to your users or data sources. The provider supports popular NVIDIA GPUs including the A6000, A800 80GB, and H100, enabling a wide range of AI/ML applications from deep learning training to real-time inference.

When choosing tensordock for your GPU cloud needs, consider factors like pricing, regional availability, and supported GPU models. Their platform integrates with popular ML frameworks like PyTorch, TensorFlow, and JAX, making it straightforward to migrate existing workloads or start new projects.

For cost optimization, compare tensordock's pricing with other providers using our cost estimator tool. Many users find that tensordock offers competitive rates for long-running training jobs or high-throughput inference workloads, especially when utilizing their spot or preemptible instance options.
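Per-second billing makes job costs straightforward to estimate: cost = hourly rate / 3600 × billed seconds. A minimal sketch, assuming only the listed on-demand rates (spot discounts, storage, and network egress are not modeled):

```python
def job_cost(hourly_rate: float, seconds: float) -> float:
    """Estimate the cost of a job under pay-per-second billing.

    hourly_rate: advertised price in $/hr (e.g. 2.27 for the H100 above)
    seconds: billed wall-clock duration of the job
    """
    return hourly_rate / 3600.0 * seconds

# Example: a 90-minute run on an H100 at the listed $2.27/hr
cost = job_cost(2.27, 90 * 60)
print(f"Estimated cost: ${cost:.2f}")
```

Because billing is per second rather than per hour, short or bursty jobs are not rounded up to a full hour, which is where pay-per-second pricing tends to beat hourly-billed providers.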

External Resources

Learn more about GPUs from these authoritative sources:

NVIDIA CUDA Documentation →

Official CUDA programming guide

NVIDIA GPU Specifications →

Official NVIDIA GPU specs

TechPowerUp GPU Database →

Comprehensive GPU specifications

CUDA Compute Capability Guide →

GPU compute capability reference

Updated April 21, 2026 • 2026 Edition

Ready to Get Started?

Visit tensordock's website to create an account and start using their GPU instances.

Visit tensordock →