add intel xpu support for TGI #1475
Merged
Changes from all commits (10 commits):
49cd0ce  add intel xpu support for TGI (sywangyi)
bc069db  fix review comments (sywangyi)
515a0ed  add xpu smi support in env runtime (sywangyi)
23a1cb0  align to ipex llm ops (sywangyi)
02537ec  update docker file (sywangyi)
0343a4b  update the API and dockerfile (sywangyi)
878101e  no requirements_common.txt, update dockerfile (sywangyi)
8815379  re-enable xpu (sywangyi)
b67ce71  Add build-and-push-image for Intel GPUs (mfuntowicz)
1f79e8c  Fix use_v1 after rebase. (Narsil)
Workflow diff hunk @@ -274,12 +274,105 @@ jobs:

```yaml
      cache-from: type=registry,ref=registry.internal.huggingface.tech/api-inference/community/text-generation-inference:cache-rocm,mode=min
      cache-to: type=registry,ref=registry.internal.huggingface.tech/api-inference/community/text-generation-inference:cache-rocm,mode=min

  build-and-push-image-intel:
    concurrency:
      group: ${{ github.workflow }}-build-and-push-image-intel-${{ github.head_ref || github.run_id }}
      cancel-in-progress: true
    needs:
      - start-runner
      - build-and-push-image # Wait for the main docker image to be built
      - integration-tests # Wait for the main integration-tests
    runs-on: ${{ needs.start-runner.outputs.label }} # run the job on the newly created runner
    permissions:
      contents: write
      packages: write
      # This is used to complete the identity challenge
      # with sigstore/fulcio when running outside of PRs.
      id-token: write
      security-events: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
      - name: Initialize Docker Buildx
        uses: docker/[email protected]
        with:
          install: true
      - name: Inject slug/short variables
        uses: rlespinasse/[email protected]
      - name: Tailscale
        uses: tailscale/github-action@7bd8039bf25c23c4ab1b8d6e2cc2da2280601966
        with:
          authkey: ${{ secrets.TAILSCALE_AUTHKEY }}
      - name: Login to GitHub Container Registry
        if: github.event_name != 'pull_request'
        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Login to internal Container Registry
        uses: docker/[email protected]
        with:
          username: ${{ secrets.TAILSCALE_DOCKER_USERNAME }}
          password: ${{ secrets.TAILSCALE_DOCKER_PASSWORD }}
          registry: registry.internal.huggingface.tech
      - name: Login to Azure Container Registry
        if: github.event_name != 'pull_request'
        uses: docker/[email protected]
        with:
          username: ${{ secrets.AZURE_DOCKER_USERNAME }}
          password: ${{ secrets.AZURE_DOCKER_PASSWORD }}
          registry: db4c2190dd824d1f950f5d1555fbadf0.azurecr.io
      # If pull request
      - name: Extract metadata (tags, labels) for Docker
        if: ${{ github.event_name == 'pull_request' }}
        id: meta-pr
        uses: docker/[email protected]
        with:
          images: |
            registry.internal.huggingface.tech/api-inference/community/text-generation-inference
          tags: |
            type=raw,value=sha-${{ env.GITHUB_SHA_SHORT }}-intel
      # If main, release or tag
      - name: Extract metadata (tags, labels) for Docker
        if: ${{ github.event_name != 'pull_request' }}
        id: meta
        uses: docker/[email protected]
        with:
          flavor: |
            latest=false
          images: |
            registry.internal.huggingface.tech/api-inference/community/text-generation-inference
            ghcr.io/huggingface/text-generation-inference
            db4c2190dd824d1f950f5d1555fbadf0.azurecr.io/text-generation-inference
          tags: |
            type=semver,pattern={{version}}-intel
            type=semver,pattern={{major}}.{{minor}}-intel
            type=raw,value=latest-intel,enable=${{ github.ref == format('refs/heads/{0}', github.event.repository.default_branch) }}
            type=raw,value=sha-${{ env.GITHUB_SHA_SHORT }}-intel
      - name: Build and push Docker image
        id: build-and-push
        uses: docker/build-push-action@v4
        with:
          context: .
          file: Dockerfile_intel
          push: true
          platforms: 'linux/amd64'
          build-args: |
            GIT_SHA=${{ env.GITHUB_SHA }}
            DOCKER_LABEL=sha-${{ env.GITHUB_SHA_SHORT }}-intel
          tags: ${{ steps.meta.outputs.tags || steps.meta-pr.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels || steps.meta-pr.outputs.labels }}
          cache-from: type=registry,ref=registry.internal.huggingface.tech/api-inference/community/text-generation-inference:cache-intel,mode=min
          cache-to: type=registry,ref=registry.internal.huggingface.tech/api-inference/community/text-generation-inference:cache-intel,mode=min

  stop-runner:
    name: Stop self-hosted EC2 runner
    needs:
      - start-runner
      - build-and-push-image
      - build-and-push-image-rocm
      - build-and-push-image-intel
      - integration-tests
    runs-on: ubuntu-latest
    env:
```
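The two `Extract metadata` steps above decide which tags the Intel image gets: an immutable `sha-…-intel` tag on every build, plus semver and `latest-intel` tags on releases. As a rough illustration of how those patterns expand, here is a hypothetical helper that mimics the behavior of `docker/metadata-action` for this configuration (a sketch, not the action's real code):

```python
def intel_tags(version: str, is_default_branch: bool, sha_short: str) -> list[str]:
    """Mimic the tag patterns configured for the Intel image above."""
    major, minor, _patch = version.split(".")
    tags = [
        f"{version}-intel",        # type=semver,pattern={{version}}-intel
        f"{major}.{minor}-intel",  # type=semver,pattern={{major}}.{{minor}}-intel
    ]
    if is_default_branch:
        tags.append("latest-intel")  # type=raw,value=latest-intel,enable=...
    tags.append(f"sha-{sha_short}-intel")  # type=raw,value=sha-...-intel
    return tags

# A release tagged v1.4.2 built from commit 1f79e8c on the default branch:
print(intel_tags("1.4.2", True, "1f79e8c"))
# → ['1.4.2-intel', '1.4-intel', 'latest-intel', 'sha-1f79e8c-intel']
```

Pull-request builds go through the `meta-pr` step instead, which emits only the `sha-…-intel` tag and pushes to the internal registry alone.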
Dockerfile_intel (new file, @@ -0,0 +1,105 @@):

```dockerfile
FROM lukemathwalker/cargo-chef:latest-rust-1.75 AS chef
WORKDIR /usr/src

ARG CARGO_REGISTRIES_CRATES_IO_PROTOCOL=sparse

FROM chef as planner
COPY Cargo.toml Cargo.toml
COPY rust-toolchain.toml rust-toolchain.toml
COPY proto proto
COPY benchmark benchmark
COPY router router
COPY launcher launcher
RUN cargo chef prepare --recipe-path recipe.json

FROM chef AS builder

ARG GIT_SHA
ARG DOCKER_LABEL

RUN PROTOC_ZIP=protoc-21.12-linux-x86_64.zip && \
    curl -OL https://github.com/protocolbuffers/protobuf/releases/download/v21.12/$PROTOC_ZIP && \
    unzip -o $PROTOC_ZIP -d /usr/local bin/protoc && \
    unzip -o $PROTOC_ZIP -d /usr/local 'include/*' && \
    rm -f $PROTOC_ZIP

COPY --from=planner /usr/src/recipe.json recipe.json
RUN cargo chef cook --release --recipe-path recipe.json

COPY Cargo.toml Cargo.toml
COPY rust-toolchain.toml rust-toolchain.toml
COPY proto proto
COPY benchmark benchmark
COPY router router
COPY launcher launcher
RUN cargo build --release

# Text Generation Inference base image for Intel
FROM intel/intel-extension-for-pytorch:2.1.10-xpu as base

USER root
# libssl.so.1.1 is not installed on Ubuntu 22.04 by default, install it
RUN wget http://nz2.archive.ubuntu.com/ubuntu/pool/main/o/openssl/libssl1.1_1.1.1f-1ubuntu2_amd64.deb && \
    dpkg -i ./libssl1.1_1.1.1f-1ubuntu2_amd64.deb

RUN wget -O- https://apt.repos.intel.com/intel-gpg-keys/GPG-PUB-KEY-INTEL-SW-PRODUCTS.PUB \
    | gpg --dearmor | tee /usr/share/keyrings/oneapi-archive-keyring.gpg > /dev/null && echo "deb [signed-by=/usr/share/keyrings/oneapi-archive-keyring.gpg] https://apt.repos.intel.com/oneapi all main" | tee /etc/apt/sources.list.d/oneAPI.list

RUN apt-get update && apt install -y intel-basekit xpu-smi cmake python3-dev ninja-build

# Text Generation Inference base env
ENV HUGGINGFACE_HUB_CACHE=/data \
    HF_HUB_ENABLE_HF_TRANSFER=1 \
    PORT=80

WORKDIR /usr/src
# Build pytorch and ipex
RUN git clone https://github.com/intel/intel-extension-for-pytorch && cd intel-extension-for-pytorch && git checkout -b xpu_main origin/xpu-main
RUN git clone https://github.com/pytorch/pytorch.git && cd pytorch && git checkout 209f2fa8ff86652f67d75c2f19bf9cb9942fd018 && git apply /usr/src/intel-extension-for-pytorch/torch_patches/00*.patch

# Install server
COPY proto proto
COPY server server
COPY server/Makefile server/Makefile
RUN cd server && \
    make gen-server && \
    pip install -r requirements_cuda.txt && \
    pip install ".[accelerate, peft, outlines]" --no-cache-dir

ENV CCL_ROOT=/opt/intel/oneapi/ccl/latest
ENV I_MPI_ROOT=/opt/intel/oneapi/mpi/latest
ENV FI_PROVIDER_PATH=/opt/intel/oneapi/mpi/latest/opt/mpi/libfabric/lib/prov:/usr/lib/x86_64-linux-gnu/libfabric
ENV DIAGUTIL_PATH=/opt/intel/oneapi/compiler/latest/etc/compiler/sys_check/sys_check.sh
ENV CCL_CONFIGURATION=cpu_gpu_dpcpp
ENV MANPATH=/opt/intel/oneapi/mpi/latest/share/man:/opt/intel/oneapi/mpi/latest/share/man:/opt/intel/oneapi/compiler/latest/share/man
ENV CMAKE_PREFIX_PATH=/opt/intel/oneapi/mkl/latest/lib/cmake:/opt/intel/oneapi/compiler/latest
ENV CMPLR_ROOT=/opt/intel/oneapi/compiler/latest
ENV LIBRARY_PATH=/opt/intel/oneapi/mpi/latest/lib:/opt/intel/oneapi/ccl/latest/lib/:/opt/intel/oneapi/mkl/latest/lib/:/opt/intel/oneapi/compiler/latest/lib
ENV OCL_ICD_FILENAMES=libintelocl_emu.so:libalteracl.so:/opt/intel/oneapi/compiler/latest/lib/libintelocl.so
ENV CLASSPATH=/opt/intel/oneapi/mpi/latest/share/java/mpi.jar:/opt/intel/oneapi/mpi/latest/share/java/mpi.jar
ENV LD_LIBRARY_PATH=/opt/intel/oneapi/ccl/latest/lib/:/opt/intel/oneapi/mpi/latest/opt/mpi/libfabric/lib:/opt/intel/oneapi/mpi/latest/lib:/opt/intel/oneapi/mkl/latest/lib:/opt/intel/oneapi/compiler/latest/opt/compiler/lib:/opt/intel/oneapi/compiler/latest/lib:/opt/intel/oneapi/lib:/opt/intel/oneapi/lib/intel64:
ENV MKLROOT=/opt/intel/oneapi/mkl/latest
ENV NLSPATH=/opt/intel/oneapi/mkl/latest/share/locale/%l_%t/%N:/opt/intel/oneapi/compiler/latest/lib/locale/%l_%t/%N
ENV PATH=/opt/intel/oneapi/mpi/latest/opt/mpi/libfabric/bin:/opt/intel/oneapi/mpi/latest/bin:/opt/intel/oneapi/mpi/latest/opt/mpi/libfabric/bin:/opt/intel/oneapi/mkl/latest/bin/:/opt/intel/oneapi/compiler/latest/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ENV CPATH=/opt/intel/oneapi/mpi/latest/include:/opt/intel/oneapi/ccl/latest/include:/opt/intel/oneapi/mkl/latest/include
ENV CCL_ZE_IPC_EXCHANGE=sockets

RUN pip uninstall -y torch && cd pytorch && git submodule update --init --recursive && python setup.py install
RUN pip uninstall -y intel-extension-for-pytorch && cd intel-extension-for-pytorch && git submodule update --init --recursive && USE_AOT_DEVLIST='pvc' BUILD_SEPARATE_OPS=ON BUILD_WITH_CPU=ON USE_XETLA=ON python setup.py install

# Install benchmarker
COPY --from=builder /usr/src/target/release/text-generation-benchmark /usr/local/bin/text-generation-benchmark
# Install router
COPY --from=builder /usr/src/target/release/text-generation-router /usr/local/bin/text-generation-router
# Install launcher
COPY --from=builder /usr/src/target/release/text-generation-launcher /usr/local/bin/text-generation-launcher

# Final image
FROM base

ENTRYPOINT ["text-generation-launcher"]
CMD ["--json-output"]
```
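Once built and pushed, the image would typically be launched along these lines. This is a sketch: the tag matches the workflow's `latest-intel` tag, but the model id, port mapping, and volume path are illustrative examples, not part of the PR. Intel GPUs reach the container through the `/dev/dri` device nodes.

```shell
# Sketch: running the Intel XPU image produced by this workflow.
# The model id below is an example; substitute any model TGI supports.
IMAGE=ghcr.io/huggingface/text-generation-inference:latest-intel
MODEL=teknium/OpenHermes-2.5-Mistral-7B

# echo keeps this sketch side-effect free; drop it to actually run.
echo docker run --rm --device /dev/dri \
    -v "$PWD/data:/data" -p 8080:80 \
    "$IMAGE" --model-id "$MODEL"
```

The `-v .../data:/data` mount lines up with the `HUGGINGFACE_HUB_CACHE=/data` environment variable set in the Dockerfile, so downloaded model weights persist across container restarts.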