Description
Hey all! I am using Jupyter Notebook in a containerized environment where I can mount secrets as environment variables and read their key/value pairs in my notebook. When I want to run the same code as a pipeline, I have to define the same environment variables again in the Elyra configuration. I fully understand that a pipeline takes on a life of its own once I submit it: it becomes another Kubernetes object (Argo Workflows or Tekton picks it up and runs it), which makes sense, because it lets me run my pipelines without having to keep my notebook running. But I was wondering whether there is a way to reuse the same environment variables without redefining them in the Elyra configuration, e.g. by fetching the notebook's variables, or whether such a feature would even make sense.