getModelJSON fails with a bundler (see reproduction) #366

Closed
@maudnals

Description

Describe the bug
Hi there, I'm trying to use remote models in a basic web app that relies on a bundler (parcel).getModelJSON doesn't seem to load models correctly. I put together a minimal reproduction, see below.
(Note: no issue observed when not using a bundler, and e.g. using CDN transformers instead)
This may be a duplicate of, or related to, #364.
Thanks!

How to reproduce
Bug repro repo: https://github.com/maudnals/bug-repro-transformersjs/tree/main
Steps to reproduce:

  • Clone this repo and navigate to its root
  • Run yarn install
  • Run yarn parcel index.html
  • Observe the code in script.js. We're simply calling await pipeline('sentiment-analysis')
  • Open http://localhost:1234 in your browser, or whichever URL parcel gives you
  • Open DevTools
  • Observe the error logged in the DevTools console upon model loading:
Uncaught (in promise) SyntaxError: Unexpected token '<', "<!DOCTYPE "... is not valid JSON
    at JSON.parse (<anonymous>)
    at getModelJSON (hub.js:551:17)
    at async Promise.all (:61216/index 0)
    at async loadTokenizer (tokenizers.js:52:16)
    at async AutoTokenizer.from_pretrained (tokenizers.js:3826:48)
    at async Promise.all (:61216/index 0)
    at async loadItems (pipelines.js:2193:5)
    at async pipeline (pipelines.js:2139:19)
    at async app.js:9:14

Expected behavior
The model should load properly.

Environment

  • Transformers.js version: "@xenova/transformers": "^2.6.2"
  • Browser (if applicable): Chrome 118
  • Operating system (if applicable): macOS

Additional context
When adding a breakpoint in getModelJSON, the model path seems fine (openai/...).
I wonder if the issue lies in path rewriting somewhere in the bundling steps?
