This repository has been archived by the owner. It is now read-only.
A Container Storage Interface (CSI) Storage Plug-in (SP) for VMware vSphere (deprecated)

CSI Storage Plug-in (SP) for VMware vSphere

CSI-vSphere is a Container Storage Interface (CSI) plug-in that provides VMDK support to vSphere virtual machines (VMs).

Runtime Dependencies

The CSI-vSphere SP has the following requirements:

  1. The SP supports the CSI decentralized model and must be deployed to all VMs that require access to storage.
  2. A VM must be located on a host running the ESXi component of the VMware vSphere Docker Volume Service (vDVS). The SP's RPCs ControllerProbe and NodeProbe will only succeed if the VM is able to access the back-end ESXi component.

The following command may be used to verify a VM can access the ESXi vDVS component:

$ docker run -it golang sh -c \
  "go get && vmdkops"

The above command will complete without error if the VM is able to successfully communicate with the service running on the ESXi host.


Installation

CSI-vSphere may be installed with the following command:

$ go get

The resulting binary is located at $GOPATH/bin/csi-vsphere.

Starting the Plug-in

Before starting the plug-in please set the environment variable CSI_ENDPOINT to a valid Go network address such as csi.sock:

$ CSI_ENDPOINT=csi.sock csi-vsphere
INFO[0000] serving                                       address="unix://csi.sock"

The server can be shutdown by using Ctrl-C or sending the process any of the standard exit signals.
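GoCSI-style plug-ins typically interpret CSI_ENDPOINT as a Go network address: a bare path such as csi.sock implies a UNIX domain socket, while an explicit scheme such as tcp://127.0.0.1:7777 selects the network directly. The following is an illustrative sketch of that convention, not the SP's actual parsing code:

```go
package main

import (
	"fmt"
	"strings"
)

// parseEndpoint interprets a CSI_ENDPOINT value as a (network, address)
// pair. A value without a scheme is assumed to be a UNIX socket path.
// Illustrative sketch only; the real plug-in's parsing may differ.
func parseEndpoint(ep string) (network, addr string) {
	if i := strings.Index(ep, "://"); i >= 0 {
		return ep[:i], ep[i+3:]
	}
	return "unix", ep
}

func main() {
	n, a := parseEndpoint("csi.sock")
	fmt.Println(n, a) // unix csi.sock
	n, a = parseEndpoint("tcp://127.0.0.1:7777")
	fmt.Println(n, a) // tcp 127.0.0.1:7777
}
```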

Using the Plug-in

The CSI specification uses the gRPC protocol for plug-in communication. The easiest way to interact with a CSI plug-in is via the Container Storage Client (csc) program provided via the GoCSI project:

$ go get


Configuring the Plug-in

The CSI-vSphere SP is configured via environment variables:

| Name                    | Default | Description |
|-------------------------|---------|-------------|
| X_CSI_VSPHERE_PORT      | 1019    | The port used to connect to the ESX service |
| X_CSI_VSPHERE_DATASTORE |         | The datastore from which VMDKs are listed and the default datastore in/from which VMDKs are created/removed |

This SP is built using the GoCSI CSP package and as such may be configured with any of its configuration properties. The following table is a list of the global configuration properties for which CSI-vSphere provides a default value:

| Name                            | Value        | Description |
|---------------------------------|--------------|-------------|
| X_CSI_IDEMP                     | true         | Enables idempotency |
| X_CSI_IDEMP_REQUIRE_VOL         | true         | Instructs the idempotency interceptor to validate the existence of a volume before allowing an operation to proceed |
| X_CSI_CREATE_VOL_ALREADY_EXISTS | true         | Indicates that a CreateVolume request with a result of AlreadyExists will be changed to success |
| X_CSI_DELETE_VOL_NOT_FOUND      | true         | Indicates that a DeleteVolume request with a result of NotFound will be changed to success |
| X_CSI_SUPPORTED_VERSIONS        | 0.0.0, 0.1.0 | A list of the CSI versions this SP supports |
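The error-remapping properties above can be sketched as follows: when X_CSI_CREATE_VOL_ALREADY_EXISTS is enabled, an AlreadyExists result from CreateVolume is treated as success. This is a sketch of the documented behavior, not GoCSI's actual interceptor code, and errAlreadyExists here merely stands in for a gRPC AlreadyExists status:

```go
package main

import (
	"errors"
	"fmt"
)

// errAlreadyExists stands in for a gRPC AlreadyExists status code.
var errAlreadyExists = errors.New("volume already exists")

// mapCreateVolumeResult illustrates X_CSI_CREATE_VOL_ALREADY_EXISTS:
// when the option is enabled, an AlreadyExists result from CreateVolume
// is remapped to success (a nil error). Sketch only.
func mapCreateVolumeResult(err error, alreadyExistsOK bool) error {
	if alreadyExistsOK && errors.Is(err, errAlreadyExists) {
		return nil
	}
	return err
}

func main() {
	fmt.Println(mapCreateVolumeResult(errAlreadyExists, true))  // <nil>
	fmt.Println(mapCreateVolumeResult(errAlreadyExists, false)) // volume already exists
}
```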

Access Modes

The CSI-vSphere SP supports the following CSI volume access modes:

| Access Mode              | Supported | Description |
|--------------------------|-----------|-------------|
| SINGLE_NODE_WRITER       | ✓         | Can only be published once as read/write on a single node at any given time |
| SINGLE_NODE_READER_ONLY  | ✓         | Can only be published once as read-only on a single node at any given time |
| MULTI_NODE_READER_ONLY   | ✗         | Can be published as read-only at multiple nodes simultaneously |
| MULTI_NODE_SINGLE_WRITER | ✗         | Can be published at multiple nodes simultaneously; only one node can be used as read/write, the rest will be read-only |
| MULTI_NODE_MULTI_WRITER  | ✗         | Can be published as read/write at multiple nodes simultaneously |

While the SP does not support any of the MULTI_NODE access modes, the SP will allow a single volume to be mounted at multiple target paths on a single host.
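A check like the one the SP performs against this table can be sketched as a lookup of the requested mode against the supported single-node set. This is an illustration based on the table above, not the plug-in's actual validation code:

```go
package main

import "fmt"

// supportedModes lists the access modes the SP supports, per the
// table above: only the single-node modes.
var supportedModes = map[string]bool{
	"SINGLE_NODE_WRITER":      true,
	"SINGLE_NODE_READER_ONLY": true,
}

// validateAccessMode returns an error for any requested access mode
// the SP does not support. Illustrative sketch only.
func validateAccessMode(mode string) error {
	if !supportedModes[mode] {
		return fmt.Errorf("access mode %s is not supported", mode)
	}
	return nil
}

func main() {
	fmt.Println(validateAccessMode("SINGLE_NODE_WRITER") == nil)      // true
	fmt.Println(validateAccessMode("MULTI_NODE_MULTI_WRITER") == nil) // false
}
```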


For any questions or concerns, please file an issue with the CSI-vSphere project or join the Slack channel #project-rexray.
