
Commit 93453a2

Update Readme (#294)
* update readme
* update picture name
* update issue link
* update link
* update links
* update
* update pictures
1 parent 5126c3a commit 93453a2

File tree

4 files changed

+177
-7
lines changed


cczoo/confidential_ai/README.md

Lines changed: 177 additions & 7 deletions
@@ -65,7 +65,7 @@ A cloud-based service that verifies the proofness of the remote model serving en
| Component | Version | Purpose |
| -------------------------- | ------------- | --------------------------------------------------------------------------------------------------------- |
| **Ollama** | `v0.5.7` | Framework for running language models on confidential VMs |
| **DeepSeek-R1** | | High-performance reasoning model for the inference service |
| **open-webui** | `v0.5.20` | Self-hosted AI interface for user interaction, running on the same confidential VM to simplify deployment |
| **Confidential AI (cc-zoo)** | | Patches and components from cc-zoo |
@@ -80,18 +80,188 @@ Here we use deepseek-llm-7b-chat model, please refer to the [guide](https://www.
Please refer to [ollama installation guide](https://github.com/ollama/ollama/blob/main/docs/linux.md).

### 4.3 Build openwebui

4.3.1 System Requirements

- **Operating System**: Linux
- **Python Version**: Python 3.11+
- **Node.js Version**: 20.18+
4.3.2 Development Setup Instructions

4.3.2.1 Clone the Repository

```bash
git clone https://github.com/your-org/open-webui.git  # Replace with the actual repository URL, then `git apply xxx.patch` to add openwebUI support for TDX
cd open-webui
```

4.3.2.2 Install Node.js

- Ensure Node.js ≥20.18.1 is installed:

```bash
# Install n (a Node.js version manager)
sudo npm install -g n

# Install the required Node.js version
sudo n 20.18.1

# If you hit connection errors, install via nvm instead.
# Install nvm (Node Version Manager):
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash

# Install Node.js by version
nvm install 20.18.1

# Switch Node.js versions if needed
nvm use 20.18.1
```
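The version requirement above can be checked mechanically. A minimal sketch, assuming `node --version` prints a string like `v20.18.1` (the sample value below is hardcoded for illustration; substitute `$(node --version)` on a real machine):

```shell
# Sample output of `node --version` (illustrative)
ver="v20.18.1"
ver=${ver#v}          # strip the leading "v" -> 20.18.1
major=${ver%%.*}      # keep only the major version -> 20
if [ "$major" -ge 20 ]; then
    echo "Node.js $ver satisfies the 20.18+ requirement"
else
    echo "Node.js $ver is too old" >&2
fi
```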
4.3.3 Install Miniconda

- Download and install Miniconda:

```bash
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh
# During installation, press q to skip the license text and accept the default choices to finish
```

4.3.3.1 Configure environment paths:

```bash
# Add Miniconda to PATH (replace /path/to/ with the actual installation path)
export PATH="/path/to/miniconda3/bin:$PATH"  # default path: /root/miniconda3/bin

# Initialize Conda
conda init
source ~/.bashrc

# Verify the installation
conda --version
```
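As a sanity check that the `export PATH=...` line took effect, you can confirm the directory now leads the lookup order. A sketch, with `/tmp/conda_demo/bin` as a stand-in for the real Miniconda path:

```shell
# Stand-in for the real miniconda3/bin directory
demo_bin=/tmp/conda_demo/bin
mkdir -p "$demo_bin"

# Prepend it, exactly as the Miniconda step does
export PATH="$demo_bin:$PATH"

# The directory should now be the first PATH entry
first=${PATH%%:*}
[ "$first" = "$demo_bin" ] && echo "PATH configured"
```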
4.3.4 Frontend Build and Test

4.3.4.1 Enter open-webui & create a `.env` file:

```bash
cd open-webui
cp -RPp .env.example .env
```
4.3.4.2 Update the Ollama serving address in `.env`: modify the file to configure the **Ollama backend URL**, so that requests to `/ollama` are correctly redirected to the specified backend:

```ini
# Ollama URL for the backend to connect
OLLAMA_BASE_URL='http://ip_address:port'

# OpenAI API Configuration (leave empty if not used)
OPENAI_API_BASE_URL=''
OPENAI_API_KEY=''

# AUTOMATIC1111 API (uncomment if needed)
# AUTOMATIC1111_BASE_URL="http://localhost:7860"

# Disable Tracking & Telemetry
SCARF_NO_ANALYTICS=true
DO_NOT_TRACK=true
ANONYMIZED_TELEMETRY=false
```

Make sure to replace `ip_address:port` with the actual IP address and port of your **Ollama server**.
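A quick way to confirm which backend URL the `.env` file actually carries is to extract the key and strip the quotes. A sketch using a throwaway copy of the file (`/tmp/demo.env` and the sample address are illustrative only):

```shell
# Write a minimal sample .env (illustrative values)
cat > /tmp/demo.env <<'EOF'
OLLAMA_BASE_URL='http://127.0.0.1:11434'
OPENAI_API_BASE_URL=''
EOF

# Take everything after the first '=', then drop the surrounding single quotes
url=$(grep '^OLLAMA_BASE_URL=' /tmp/demo.env | cut -d= -f2- | tr -d "'")
echo "backend URL: $url"
```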
163+
164+
4.3.4.3 Build frontend server(if error occured,please goto [here](#issue_note)):
165+
166+
```bash
167+
npm run build
168+
```
169+
+ After building the frontend, copy the generated `build` directory to the backend and rename it to `frontend`:
170+
171+
```bash
172+
cp -r build ./backend/open-webui/frontend
173+
174+
```
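After the `cp -r` above, the backend should be able to find the built assets. A sketch that reproduces the copy-and-rename step and checks for `index.html`, using stand-in paths under `/tmp` so it does not depend on a real build:

```shell
# Stand-ins for the real build output and backend tree
src=/tmp/demo-open-webui/build
dst=/tmp/demo-open-webui/backend/open-webui/frontend
mkdir -p "$src" "$(dirname "$dst")"
echo '<html></html>' > "$src/index.html"   # placeholder for a real build artifact

# Same copy-and-rename step as in the instructions
cp -r "$src" "$dst"

# Verify the entry page landed in the expected location
[ -f "$dst/index.html" ] && echo "frontend assets in place"
```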
4.3.4.4 Backend Build and Setup

- Navigate to the backend:

```bash
cd backend
```

- Use **Conda** for environment setup:

```bash
conda create --name open-webui python=3.11
conda activate open-webui
```
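If Conda is not available on the machine, a standard `venv` gives an equivalent isolated environment. A sketch (not part of the original instructions; the path is illustrative, and the project still expects Python 3.11+):

```shell
# Create and activate a virtual environment
python3 -m venv /tmp/open-webui-venv
. /tmp/open-webui-venv/bin/activate

# The interpreter inside the environment should be Python 3.x
python --version
```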
4.3.4.5 Install Python dependencies ([Tips](#tips)):

```bash
pip install -r requirements.txt -U
```

4.3.4.6 Enable the TDX quote-parse feature:

```bash
cd quote_generator
python setup.py install
```
### 4.4 Run openwebui
- Run ollama + the AI model

```bash
ollama run xxxx  # replace xxxx with the model name
```

- Configure the `Attestation Service`

Build steps:

```bash
cd confidential_ai/attestation_service/ && ./build.sh
```

- Check the attestation status

```bash
./attest_service
```

This starts the service and waits for connections: "Starting TDX Attestation Service on port 8443..."
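When scripting against the service, it can help to wait until port 8443 actually accepts connections before proceeding. A sketch of a readiness probe (uses bash's `/dev/tcp`; under other shells the probe simply reports the service as unreachable):

```shell
# Poll the attestation service port a few times before giving up
port=8443
up=no
for _ in 1 2 3; do
    if (echo > "/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
        up=yes
        break
    fi
    sleep 1
done
echo "attestation service reachable: $up"
```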
218+
219+
220+
- Run openwebui
221+
222+
1.open backend service
223+
```bash
224+
conda create --name open-webui python=3.11
225+
conda activate open-webui
226+
cd /path/to/open-webui/backend/ && ./dev.sh
227+
```
228+
![backend service](./images/openwebui-backend.png)
229+
230+
2.open frontend service
231+
232+
```bash
233+
cd utilities/tdx/restful_as/restful_tdx_att_service && ./attest_service
234+
```
235+
![backend service](./images/openwebui-fronted.png)
236+
237+
3.open browser and goto address: https://{ip_address}:18080/(The ip address is your server ip)
238+
239+
4.Example:
240+
get quote data and parse
241+
242+
![backend service](./images/parse.png)
243+
244+
<h3 id="issue_note">Issue Note</h3>

- If the build fails with a `Cannot find package` error, try:

```bash
npm install pyodide
```
<h3 id="tips">Tips</h3>

- Downloading packages from remote sites can be slow. To speed up the process, you can specify a local mirror such as **Aliyun** when installing packages:

```bash
pip install torch -i https://mirrors.aliyun.com/pypi/simple/
```

Alternatively, set Aliyun as the default mirror by adding the following lines to `~/.pip/pip.conf`; this method is recommended:

```ini
[global]
index-url = https://mirrors.aliyun.com/pypi/simple/
```
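The `pip.conf` snippet above can be applied with a couple of commands. A sketch that writes the file to a scratch location first (`/tmp/demo-pip` stands in for `~/.pip`) so you can inspect it before installing it for real:

```shell
# Stand-in for ~/.pip; change to "$HOME/.pip" to apply for real
conf_dir=/tmp/demo-pip
mkdir -p "$conf_dir"

cat > "$conf_dir/pip.conf" <<'EOF'
[global]
index-url = https://mirrors.aliyun.com/pypi/simple/
EOF

# Show the result
cat "$conf_dir/pip.conf"
```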
### Prerequisites: