
Commit 0820033

- [Docs]: Updated Installation
1 parent 56e4c02 commit 0820033

1 file changed (+58, -52 lines)

docs/installation.md

@@ -10,37 +10,36 @@ pip install dstack --upgrade

If you only plan to run workflows locally and do not want to share artifacts with others outside your machine, you do
not need to configure anything else.

## Configure a remote

By default, workflows are run locally. If you want to be able to run workflows remotely (e.g. in a configured cloud account),
you have to configure a remote using the `dstack config` command.

Please refer to the specific instructions below for configuring a remote, based on your desired cloud provider.

!!! info "NOTE:"
    Currently, you can configure only AWS and GCP as remotes. Support for Azure and Hub[^1] is coming soon.

### AWS

#### Create an S3 bucket

In order to use AWS as a remote, you first have to create an S3 bucket in your AWS account.
This bucket will be used to store workflow artifacts and metadata.

!!! info "NOTE:"
    Make sure to create the S3 bucket in the AWS region where you'd like to run your workflows.
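If you prefer the command line, the bucket can be created with the AWS CLI. This is a sketch; the bucket name and region below are placeholders for your own values:

```shell
aws s3 mb s3://my-dstack-bucket --region eu-west-1
```

The region used here must match the AWS region you later pass to `dstack config`.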

#### Configure AWS credentials

The next step is to configure AWS credentials on your local machine. The credentials should grant
the permissions to perform actions on the `s3`, `logs`, `secretsmanager`, `ec2`, and `iam` services.

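The credentials can be set up with `aws configure`, which writes a profile to `~/.aws/credentials`. A sketch of the resulting file, with placeholder values:

```
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxx
```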
??? info "IAM policy template"
    If you'd like to limit the permissions to the narrowest scope, feel free to use the IAM policy template below.

    Replace the `{bucket_name}` and `{bucket_name_under_score}` variables in the template below
    with the values that correspond to your S3 bucket.

    For `{bucket_name}`, use the name of the S3 bucket.
@@ -176,7 +175,7 @@ below.

#### Configure the CLI

Once the AWS credentials are configured on your local machine, you can configure the CLI:

```shell hl_lines="1"
dstack config
```
@@ -186,62 +185,63 @@

This command will ask you to choose an AWS profile (to take the AWS credentials from),
an AWS region (must be the same for the S3 bucket), and the name of the S3 bucket.

```shell
Backend: aws
AWS profile: default
AWS region: eu-west-1
S3 bucket: dstack-142421590066-eu-west-1
EC2 subnet: none
```

That's it! You've configured AWS as a remote.

### GCP

!!! info "NOTE:"
    Support for GCP is experimental. In order to try it, make sure to install the latest pre-release version of `dstack`:

    ```shell hl_lines="1"
    pip install dstack --pre --upgrade
    ```

#### 1. Create a project

In order to use GCP as a remote, you first have to create a project in your GCP account
and make sure that the required APIs are enabled for it.

??? info "Required APIs"
    Here's the list of APIs that have to be enabled for the project:

    ```
    cloudapis.googleapis.com
    compute.googleapis.com
    logging.googleapis.com
    secretmanager.googleapis.com
    storage-api.googleapis.com
    storage-component.googleapis.com
    storage.googleapis.com
    ```
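If you use [the `gcloud` CLI](https://cloud.google.com/sdk/docs/install), the APIs can be enabled from the command line as well; `gcloud services list --enabled` shows what is already enabled. The project ID below is a placeholder:

```shell
gcloud services enable compute.googleapis.com --project=${MY_PROJECT}
```

Repeat the command for each API in the list above.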

#### 2. Create a storage bucket

Once the project is created, you can proceed and create a storage bucket in the created project. This bucket
will be used to store workflow artifacts and metadata.

!!! info "NOTE:"
    Make sure to create the bucket in the location where you'd like to run your workflows.
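One way to create the bucket is with the `gsutil` CLI; the project, location, and bucket name below are placeholders:

```shell
gsutil mb -p ${MY_PROJECT} -l us-central1 gs://dstack-my-awesome-project
```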

#### 3. Create a service account

The next step is to create a service account in the created project and grant it the
following roles: `Service Account User`, `Compute Admin`, `Storage Admin`, `Secret Manager Admin`, and `Logging Admin`.
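If you prefer the `gcloud` CLI over the console, the service account can be created and granted a role like this (a sketch; `MY_SERVICE_ACCOUNT` and `MY_PROJECT` are placeholders, and the `add-iam-policy-binding` call has to be repeated for each of the roles listed above):

```shell
gcloud iam service-accounts create ${MY_SERVICE_ACCOUNT}
gcloud projects add-iam-policy-binding ${MY_PROJECT} --member="serviceAccount:${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com" --role="roles/compute.admin"
```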
#### 4. Create a service account key

Once the service account is set up, create a key for it and download the corresponding JSON file
to your local machine (e.g. to `~/Downloads/my-awesome-project-d7735ca1dd53.json`).
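With the `gcloud` CLI, the key can be created and saved to a local JSON file (`MY_KEY_PATH`, `MY_SERVICE_ACCOUNT`, and `MY_PROJECT` are placeholders):

```shell
gcloud iam service-accounts keys create ${MY_KEY_PATH} --iam-account="${MY_SERVICE_ACCOUNT}@${MY_PROJECT}.iam.gserviceaccount.com"
```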

#### 5. Configure the CLI

Once the service account key JSON file is on your machine, you can configure the CLI:

```shell
dstack config
```
@@ -250,10 +250,16 @@ dstack config

The command will ask you for the path to the service account key, a GCP region and zone, and a storage bucket name. For example:

```
Backend: gcp
Path to credentials file: ~/Downloads/my-awesome-project-d7735ca1dd53.json
GCP region: us-central1
GCP zone: us-central1-c
Storage bucket: dstack-my-awesome-project
```

That's it! You've configured GCP as a remote.

[^1]:
    Use the `dstack hub start --port PORT` command (coming soon) to host a web application that provides a UI for configuring cloud
    accounts and managing user tokens. Configure this hub as a remote for the CLI so that the hub acts as a proxy
    between the CLI and the configured account. This setup offers improved security and collaboration.
