
Support response_format: {"type": "json_object"} without any constrained schema #2899

Open

Description

@lhoestq

In other inference APIs, response_format={"type": "json_object"} restricts the model output to be a valid JSON object without enforcing a schema.

Right now this is not supported:

Failed to deserialize the JSON body into the target type: response_format: missing field `value` at line 1 column 168

I ended up with this error while using lotus-ai, which uses the litellm library with response_format={"type": "json_object"}.
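For context, here is a minimal sketch of how this parameter reaches the server through litellm. The exact lotus-ai call site and the model string are assumptions; only the response_format shape comes from the error above:

import litellm

# Sketch of the kind of call lotus-ai makes under the hood (assumed).
# litellm presumably forwards response_format to the Hugging Face
# provider unchanged, where the server rejects it because TGI's
# deserializer expects a `value` field.
response = litellm.completion(
    model="huggingface/meta-llama/Llama-3.3-70B-Instruct",  # assumed model string
    messages=[{"role": "user", "content": "Give me a dummy json of a person"}],
    response_format={"type": "json_object"},
)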

To reproduce:

from huggingface_hub import InferenceClient

c = InferenceClient("meta-llama/Llama-3.3-70B-Instruct")

# Fails with: missing field `value`
c.chat_completion(
    [{"role": "user", "content": "Give me a dummy json of a person"}],
    response_format={"type": "json_object"},
)
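Until unconstrained JSON mode is supported, a minimal sketch of a workaround, assuming TGI's existing schema-constrained form (type "json" plus the `value` field the deserializer above is asking for). The permissive schema is an assumption; the grammar compiler may require more concrete properties:

from huggingface_hub import InferenceClient

c = InferenceClient("meta-llama/Llama-3.3-70B-Instruct")

# Workaround sketch: pass an explicit, deliberately loose JSON schema via
# the `value` field the server expects. The schema below is an assumption;
# TGI's grammar backend may need concrete `properties` to compile it.
c.chat_completion(
    [{"role": "user", "content": "Give me a dummy json of a person"}],
    response_format={
        "type": "json",
        "value": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "age": {"type": "integer"},
            },
        },
    },
)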

Original issue: huggingface/huggingface_hub#2744
