@@ -10,37 +10,36 @@ pip install dstack --upgrade

If you only plan to run workflows locally and do not want to share artifacts with others outside your machine, you do
not need to configure anything else.

- ## (Optional) Configure a remote
+ ## Configure a remote

By default, workflows are run locally. If you want to be able to run workflows remotely (e.g. in a configured cloud account),
- you have to configure a remote using the `dstack config` command. The configuration will be saved in the `~/.dstack/config.yaml` file.
- The exact configuration steps vary depending on the remote type.
+ you have to configure a remote using the `dstack config` command.

- !!! info "NOTE:"
-     Currently, `dstack` supports AWS and GCP as remotes.
+ Please refer to the instructions below for configuring a remote with your desired cloud provider.

- Once a remote is configured, you can run workflows remotely and push and pull artifacts.
+ !!! info "NOTE:"
+     Currently, you can configure only AWS and GCP as remotes. Support for Azure and Hub[^1] is coming soon.

### AWS

#### Create an S3 bucket

- Before you can use the `dstack config` command, you have to create an S3 bucket in your AWS account
- that you'll use to store workflow artifacts and metadata.
+ In order to use AWS as a remote, you first have to create an S3 bucket in your AWS account.
+ This bucket will be used to store workflow artifacts and metadata.

!!! info "NOTE:"
    Make sure to create an S3 bucket in the AWS region where you'd like to run your workflows.

#### Configure AWS credentials

- The next step is to configure AWS credentials on your local machine so the `dstack` CLI
- may perform actions on `s3`, `logs`, `secretsmanager`, `ec2`, and `iam` services.
-
- If you'd like to limit the permissions to the most narrow scope, feel free to use the IAM policy template
- below.
+ The next step is to configure AWS credentials on your local machine. The credentials should grant
+ permissions to perform actions on the `s3`, `logs`, `secretsmanager`, `ec2`, and `iam` services.

??? info "IAM policy template"
-     If you're using this template, make sure to replace the `{bucket_name}` and `{bucket_name_under_score}` variables
+     If you'd like to limit the permissions to the narrowest scope, feel free to use the IAM policy template
+     below.
+
+     Replace the `{bucket_name}` and `{bucket_name_under_score}` variables in the template below
    with the values that correspond to your S3 bucket.

    For `{bucket_name}`, use the name of the S3 bucket.
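As a hedged aside (not part of the diff itself): the bucket-creation and credentials steps above can be performed with the AWS CLI, assuming it is installed and authenticated. The bucket name and region below are hypothetical placeholders.

```shell
# Sanity-check that AWS credentials are configured on this machine
aws sts get-caller-identity

# Create the artifacts bucket in the region where you'd like to run workflows
# (bucket name is a placeholder; S3 bucket names must be globally unique)
aws s3 mb s3://my-dstack-artifacts --region eu-west-1
```

If no credentials are configured yet, `aws configure` sets them up interactively and stores them under `~/.aws/`.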
@@ -176,7 +175,7 @@ below.

#### Configure the CLI

- Once the AWS credentials are configured, you can configure the CLI:
+ Once the AWS credentials are configured on your local machine, you can configure the CLI:

```shell hl_lines="1"
dstack config
@@ -186,62 +185,63 @@ This command will ask you to choose an AWS profile (to take the AWS credentials
an AWS region (must be the same for the S3 bucket), and the name of the S3 bucket.

```shell
+ Backend: aws
AWS profile: default
AWS region: eu-west-1
S3 bucket: dstack-142421590066-eu-west-1
EC2 subnet: none
```

- That's it! Your've configured AWS as a remote.
+ That's it! You've configured AWS as a remote.

### GCP

- #### Create a service account key
+ !!! info "NOTE:"
+     Support for GCP is experimental. In order to try it, make sure to install the latest pre-release version of `dstack`:

- `dstack` needs a service account key to access and manage GCP resources.
- This tutorial demonstrates how to create such a key using [the `gcloud` CLI](https://cloud.google.com/sdk/docs/install).
+     ```shell hl_lines="1"
+     pip install dstack --pre --upgrade
+     ```

- First, create a new service account:
+ #### 1. Create a project

- ```shell
- gcloud iam service-accounts create ${MY_SERVICE_ACCOUNT}
- ```
+ In order to use GCP as a remote, you first have to create a project in your GCP account
+ and make sure that the required APIs are enabled for it.

- Grant IAM roles to the service account. The following roles are sufficient for `dstack`:
+ ??? info "Required APIs"
+     Here's the list of APIs that have to be enabled for the project.

- ```shell
- gcloud projects add-iam-policy-binding dstack --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com" --role="roles/iam.serviceAccountUser"
- gcloud projects add-iam-policy-binding dstack --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com" --role="roles/compute.admin"
- gcloud projects add-iam-policy-binding dstack --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com" --role="roles/storage.admin"
- gcloud projects add-iam-policy-binding dstack --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com" --role="roles/secretmanager.admin"
- gcloud projects add-iam-policy-binding dstack --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com" --role="roles/logging.admin"
- ```
+     ```
+     cloudapis.googleapis.com
+     compute.googleapis.com
+     logging.googleapis.com
+     secretmanager.googleapis.com
+     storage-api.googleapis.com
+     storage-component.googleapis.com
+     storage.googleapis.com
+     ```
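The removed lines in this diff mention `gcloud services list --enabled` and `gcloud services enable`; as a sketch, the APIs listed above could be enabled in one command with the `gcloud` CLI (assumes `gcloud` is installed and authenticated; the project ID is a placeholder).

```shell
# Enable the required APIs for the project (project ID is hypothetical)
gcloud services enable \
    cloudapis.googleapis.com \
    compute.googleapis.com \
    logging.googleapis.com \
    secretmanager.googleapis.com \
    storage-api.googleapis.com \
    storage-component.googleapis.com \
    storage.googleapis.com \
    --project my-awesome-project

# Verify which APIs are now enabled
gcloud services list --enabled --project my-awesome-project
```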
- Create a service account key:
+ #### 2. Create a storage bucket

- ```shell
- gcloud iam service-accounts keys create ${MY_KEY_PATH} --iam-account="${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com"
- ```
+ Once the project is created, you can proceed and create a storage bucket in that project. This bucket
+ will be used to store workflow artifacts and metadata.
+
+ !!! info "NOTE:"
+     Make sure to create the bucket in the location where you'd like to run your workflows.

- The key will be saved as a json file specified by `MY_KEY_PATH`, e.g. `~/my-sa-key.json`.
+ #### 3. Create a service account

- Before you configure `dstack`, you also need to ensure that the following APIs are enabled in your GCP project:
+ The next step is to create a service account in the created project and assign it the
+ following roles: `Service Account User`, `Compute Admin`, `Storage Admin`, `Secret Manager Admin`, and `Logging Admin`.

- ```
- cloudapis.googleapis.com
- compute.googleapis.com
- logging.googleapis.com
- secretmanager.googleapis.com
- storage-api.googleapis.com
- storage-component.googleapis.com
- storage.googleapis.com
- ```
+ #### 4. Create a service account key

- Use `gcloud services list --enabled` and `gcloud services enable` to list and enable APIs.
+ Once the service account is set up, create a key for it, and download the corresponding JSON file
+ to your local machine (e.g. to `~/Downloads/my-awesome-project-d7735ca1dd53.json`).
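Steps 3 and 4 can also be done from the command line; the `gcloud` commands removed by this diff did essentially that. A sketch under the same roles, with the service account name, project ID, and key path as placeholders:

```shell
# Create the service account (names are placeholders)
gcloud iam service-accounts create dstack-sa --project my-awesome-project

# Assign the roles listed in step 3, one binding per role
for role in roles/iam.serviceAccountUser roles/compute.admin \
            roles/storage.admin roles/secretmanager.admin roles/logging.admin; do
  gcloud projects add-iam-policy-binding my-awesome-project \
      --member="serviceAccount:dstack-sa@my-awesome-project.iam.gserviceaccount.com" \
      --role="$role"
done

# Create and download a JSON key for the service account (step 4)
gcloud iam service-accounts keys create ~/Downloads/dstack-sa-key.json \
    --iam-account="dstack-sa@my-awesome-project.iam.gserviceaccount.com"
```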

- #### Configure the CLI
+ #### 5. Configure the CLI

- Once you have a service account key, you can configure GCP as a remote for `dstack`:
+ Once the service account key JSON file is on your machine, you can configure the CLI:

```shell
dstack config
@@ -250,10 +250,16 @@ dstack config

The command will ask you for a path to the service account key, GCP region and zone, and storage bucket name. For example:

```
- Path to credentials file: ~/Projects/dstack/my-sa-key.json
+ Backend: gcp
+ Path to credentials file: ~/Downloads/my-awesome-project-d7735ca1dd53.json
GCP region: us-central1
GCP zone: us-central1-c
- Storage bucket: dstack-test
+ Storage bucket: dstack-my-awesome-project
```

- That's it! Your've configured GCP as a remote.
+ That's it! You've configured GCP as a remote.
+
+ [^1]:
+     Use the `dstack hub start --port PORT` command (coming soon) to host a web application that provides a UI for configuring cloud
+     accounts and managing user tokens. Configure this hub as a remote for the CLI to enable the hub to act as a proxy
+     between the CLI and the configured account. This setup offers improved security and collaboration.