Red Hat Virtualization • Re: Feature Request: Proxmox

@Gostev here's some info on our use case for Proxmox. We're an MSP-type company serving a large number of SMB customers, 100% of which currently run VMware. We deploy VMware on every server as a best practice, abstracting the hardware away to give us flexibility during migrations and hardware refreshes, even for environments where 1 server = 1 VM.

We usually buy the lowest tier of licensing, just enough to enable the use of Veeam, for 1 to 3 hosts max with 1 to 20 VMs each.

We also have some large enterprise customers who are less likely to switch, but as our experience with Proxmox in the SMB space grows, we will likely start proposing it instead of VMware during the consulting/sales phase, or when hardware/software refreshes and migrations happen. So within the next 5-7 years we will be shifting basically everyone. VMware is firing us as a partner, so we're not particularly motivated to continue giving them business.

We estimate about 3,500 hosts and 20k VMs across SMB customers alone; including enterprise, maybe 50% more than that.

Our sales teams are going crazy right now trying to figure things out, but the estimate is that over the next 2-3 years most SMB customers will need to transition to something else as their licenses or support contracts expire, while a substantial number will actually need to take action this year (2024). Enterprise-sized customers likely won't feel a need to change immediately, but many will still want to save money and will do so eventually.

For more technical specifics on Proxmox usage, these are the use cases being tested by our R&D:


Scenario 1 (very common): single host, 1-5 VMs, local storage backed by ZFS; a dataset/zvol is created automatically by Proxmox for each virtual disk.
Scenario 2 (very common): single host, shared storage accessed via NFS, qcow2 disks.
Scenario 3 (less common): 2-3 hosts, ~20 VMs, clustered; NFS shared storage, qcow2 disks.
Scenario 4 (rare/unlikely at current levels of testing): multiple hosts, clustered, Ceph storage, qcow2 disks.
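For reference, the three storage backends in the scenarios above correspond to entries in Proxmox's storage configuration file. This is an illustrative sketch only; the storage IDs, server names, and pool names are made up, and the exact options would differ per deployment:

```
# /etc/pve/storage.cfg -- illustrative example, names are hypothetical
zfspool: local-zfs
        pool rpool/data
        content images,rootdir
        sparse 1

nfs: shared-nfs
        server nas.example.lan
        export /export/vmstore
        content images

rbd: ceph-vms
        pool vmpool
        content images
```

A backup integration mostly needs to care about which storage type backs a given virtual disk, since that determines whether snapshots come from ZFS, qcow2, or Ceph RBD.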

The use of Ceph may become more relevant for enterprise customers who use vSAN, as it's the replacement for that feature.

Some general info about Proxmox that may be relevant for Veeam integration, specifically how it differs from VMware vSphere:

1) There is no equivalent of vCenter to deploy. A standalone host is managed the same way as a cluster: in a cluster, each host's web UI (you can connect to any of them) manages every other host. What many people do is deploy a reverse proxy so that something like proxmox.domain.com points to each host in a round-robin fashion, so many people may be entering that FQDN into Veeam as the management address rather than a single host. It's all just one big API, so this should be fine as long as you're expecting it.
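To illustrate the "one big API" point: a client authenticates against whatever address it's given, whether that's a single node or a round-robin FQDN fronting the whole cluster, and the flow is identical. A minimal standard-library sketch, where the hostname and credentials are placeholder assumptions:

```python
import json
import ssl
import urllib.parse
import urllib.request

API_PORT = 8006  # Proxmox VE's web UI / REST API port


def build_api_url(host: str, path: str) -> str:
    """Build a Proxmox REST endpoint URL. Works the same whether `host`
    is one node or a round-robin FQDN fronting the cluster."""
    return f"https://{host}:{API_PORT}/api2/json/{path.lstrip('/')}"


def get_ticket(host: str, username: str, password: str) -> dict:
    """POST to /access/ticket and return the auth ticket + CSRF token.
    Self-signed certs are common on Proxmox, hence the relaxed context."""
    data = urllib.parse.urlencode(
        {"username": username, "password": password}
    ).encode()
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # lab/self-signed use only
    ctx.verify_mode = ssl.CERT_NONE
    url = build_api_url(host, "access/ticket")
    with urllib.request.urlopen(url, data=data, context=ctx) as resp:
        return json.load(resp)["data"]


# Usage (against a real host; FQDN and credentials are placeholders):
#   auth = get_ticket("proxmox.domain.com", "root@pam", "secret")
#   -> subsequent GETs send the ticket as a PVEAuthCookie cookie,
#      and write operations add the CSRFPreventionToken header.
```

The key takeaway for a backup product is that there is no separate management-plane address to discover; any reachable node (or a proxy in front of all of them) answers for the cluster.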

2) The VM configs are stored separately from the virtual disks and other VM files. So in a Proxmox cluster, your equivalent of a .vmx lives on dedicated clustered storage accessible to all hosts, while your disk files are stored wherever you decided to put them.
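To make the split concrete: the clustered configuration filesystem is mounted at /etc/pve on every node, and a VM's config is a small text file there, while its disks live on whatever storage the config references. A hypothetical config for VMID 100 might look like this (all values invented for illustration):

```
# /etc/pve/qemu-server/100.conf -- illustrative example
boot: order=scsi0
cores: 4
memory: 8192
name: app-server-01
net0: virtio=BC:24:11:AA:BB:CC,bridge=vmbr0
ostype: l26
scsi0: local-zfs:vm-100-disk-0,size=64G
```

Note how the scsi0 line points at a storage ID plus a volume name, not a file path; resolving that reference through the storage configuration is how you find the actual disk.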

3) Proxmox has its own built-in backup system that works very well for simple backup & restore of whole VMs. It just doesn't have anything beyond that, which is where Veeam needs to come in. It may be possible to leverage these backups to do the things Veeam does, instead of reinventing the wheel at the backup layer. Worth checking out.
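For context, that built-in backup system is the vzdump tool, which the web UI wraps. A typical invocation might look like the following, where the target storage name is an assumption:

```
# Snapshot-mode backup of VMID 100 to a storage named "backup-nfs",
# compressed with zstd; the result is one archive per VM.
vzdump 100 --mode snapshot --compress zstd --storage backup-nfs
```

Those archives are whole-VM images, which is why the paragraph above notes that anything finer-grained (file-level or application-aware restore) would have to come from a product like Veeam processing them.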

4) In addition to the built-in backups, they also offer a backup server (Proxmox Backup Server) used as a backup destination; it ties into the built-in backups but offers better retention management, data integrity checks, etc. Still none of the features Veeam offers, though. This may actually be the component you want to compete with: a drop-in replacement for Proxmox Backup Server, where the backups are sent to it, and then using those backup files Veeam can work its magic for file-level recovery, Active Directory/Exchange/SQL, and all the good stuff.

Statistics: Posted by MelanieTanaka — Jan 13, 2024 6:03 pm


