Description
This is probably more of a "how-to" question than an actual issue, but I am not sure where else it falls.
I have Elyra installed on Kubernetes with Enterprise Gateway, configured so that every notebook execution in interactive mode starts a new pod in Kubernetes.
I want to know how I can share common variables across multiple notebooks using Elyra, both during development and when scheduling those notebooks in a pipeline. I am talking about Python kernels here.
I will take this standard pipeline from the Elyra website as an example.
In this case, let's assume there is a database table named "t_save_data_here" that is to be used in all the notebooks and scripts.
Currently, I am declaring a variable in every notebook to connect to it, e.g. `table_name = "t_save_data_here"`.
But how can I do this by declaring it just once, e.g. in a `parameters.py` that is imported as the first line of every notebook, so that all the shared variables are defined? That way, if the table name ever changes, I only need to change it in one place.
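To make the idea concrete, here is a minimal sketch of what I mean (the file name `parameters.py` and the variable name are just placeholders for illustration):

```python
# parameters.py -- lives next to the notebooks; holds values shared by all of them
table_name = "t_save_data_here"
```

Then, as the first cell of every notebook or script:

```python
# Import the shared constants so the table name is defined in one place only
from parameters import table_name

print(f"Using table: {table_name}")
```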
I read through the documentation and some threads, but they mostly talk about pipeline parameters, environment variables, or exporting "output" from a previous stage to the next.
These are all specific to the pipeline configuration and won't be available while I am developing a given notebook or script...
Is there documentation on this that I could not find?
Or is this just the wrong way of solving the problem? Is there a better way?