
[Bug]: Prompt Optimizer using invalid model #2076


Open
YgorCastor opened this issue May 22, 2025 · 5 comments

@YgorCastor

File Name

https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/prompts/prompt_optimizer/vertex_ai_prompt_optimizer_ui.ipynb

What happened?

It looks like somewhere in the code the notebook is trying to use Gemini 1.5 Pro and is referencing it through an invalid URL.

Relevant log output

Detailed error: Error: The job failed: 404 Publisher Model `projects/XXXXX/locations/us-central1/publishers/google/models/gemini-1.5-pro-002` was not found or your project does not have access to it.
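For reference, a minimal sketch (not from the original report; it assumes the vertexai SDK is installed and uses a placeholder PROJECT_ID) that should reproduce the same 404 outside the optimizer by calling the retired model directly:

```python
# Minimal sketch: calling the retired model directly with the vertexai SDK
# should surface the same 404 NotFound error the optimizer job reports.
# PROJECT_ID is a placeholder; the model name comes from the error above.
import vertexai
from vertexai.generative_models import GenerativeModel
from google.api_core import exceptions

vertexai.init(project="PROJECT_ID", location="us-central1")

try:
    GenerativeModel("gemini-1.5-pro-002").generate_content("ping")
except exceptions.NotFound as err:
    # e.g. "Publisher Model ... was not found or your project does not have access to it."
    print(err.message)
```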

Code of Conduct

  • I agree to follow this project's Code of Conduct
@GeeekyBoy

Is there any timeline for the fix? Unfortunately, Gemini 1.5 Pro is still used in the optimizer's pipeline despite the model's removal. I couldn't fix this myself since the usage of the model appears to be hard-coded in the optimizer's Docker image. Are there any workarounds?

@raybell-md

raybell-md commented May 27, 2025

I also came across this. I thought it might be because the model is retired (https://cloud.google.com/vertex-ai/generative-ai/docs/learn/model-versions#legacy-stable), but it isn't.

I also threw the stack trace into Gemini:

Stack Trace Analysis:

The provided stack trace indicates the error's origin and propagation through the application:

  • Origin: The error surfaces within a Python application, seemingly part of an optimization process (optimize_main.par/..., instruction_optimizer.py).
  • Libraries involved: The trace shows calls progressing through:
      • Google API core libraries (e.g., google.api_core.grpc_helpers, google.api_core.gapic_v1.method).
      • Google Cloud AI Platform client libraries (e.g., google.cloud.aiplatform.vertexai.generative_models._generative_models.py, google.cloud.aiplatform.aiplatform_v1beta1.services.prediction_service.client.py).
      • A framework apparently named onetwo (various modules under onetwo.core, onetwo.builtins, etc.), which appears to handle the model interaction logic.
  • Point of failure: The exception is raised when the application attempts a generate_content call to the Vertex AI service:
      • The call sequence includes self._text_generation_model.generate_content(...) within google/cloud/aiplatform/vertexai/generative_models/_generative_models.py.
      • This, in turn, calls self._prediction_client.generate_content(request=request) within the same module.
  • Exception type: The immediate exception caught and re-raised by the Google API client is google.api_core.exceptions.NotFound: 404 .... This wraps a lower-level _open_source_grpc._channel._InactiveRpcError with status = StatusCode.NOT_FOUND, which confirms that the "Not Found" status was returned by the remote gRPC service.
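As a rough illustration of that last point (my own sketch, not part of the trace; the message string is made up): google-api-core maps the gRPC NOT_FOUND status onto its NotFound exception class, which is why the 404 surfaces as google.api_core.exceptions.NotFound:

```python
# Sketch of the mapping described above: google-api-core converts the gRPC
# NOT_FOUND status into its NotFound exception type (message is illustrative).
import grpc
from google.api_core import exceptions

err = exceptions.from_grpc_status(
    grpc.StatusCode.NOT_FOUND,
    "Publisher Model `.../gemini-1.5-pro-002` was not found",
)
print(type(err))                             # <class 'google.api_core.exceptions.NotFound'>
print(isinstance(err, exceptions.NotFound))  # True
print(err.code)                              # 404
```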

@aycaecemgul

I have the same problem; setting the target_model to another Gemini version did not help...

@holtskinner
Collaborator

holtskinner commented May 27, 2025

@want-to-be-relaxed or @inardini can you look into this?

@hootan-na
Contributor

Sorry for the inconvenience. The prompt optimizer was using Gemini 1.5 Pro as the optimizer model to rewrite the prompts. Since Gemini 1.5 Pro is deprecated, we have just released a fix that uses a more recent model.
