Bidirectional GRU and SimpleRNN support #413
Conversation
This pull request introduces 11 alerts and fixes 1 when merging 7c9aecb into ee2945c - view on LGTM.com.
All of the Bidirectional tests now pass. @wenbingl, can you review this PR?
Looks like there is a bug in Python 3.5. I'll circle back on this later.
Thanks a lot for the PR.
The Python 3.5 failure is related to a code branch for opset < 9. Please push a fix and I'll merge the PR.
Thanks @wenbingl. In 82035fc I left the setting of initial states for Bidirectional layers as a future task. There is currently no explicit test, but I can add the test and functionality in a separate PR. The test was failing because of the same issue that we discussed in #400. When I updated the
Yes, I noticed it as well. I remember the test was that opset < 9 would only run on the batch_size == 1 case. Can we do the same here?
I'm happy to keep backward compatibility. Is there a way we can simulate the environment (e.g. a specific version of onnxruntime that we can test against)? Should that be addressed in the follow-up PR?
Hi @cjermain,
Tried a higher target_opset but I'm getting:

ValueError: Unsupported class for Bidirectional layer: <class 'tensorflow.python.keras.layers.recurrent_v2.GRU'>
@DronistB, I started a new issue (#607) to track this -- please see my comments there. |
This PR introduces support for Bidirectional GRU and SimpleRNN layers, and resolves #175. It implements a major refactor of the RNN code to simplify the composability of the pieces, so that code can be effectively reused. The bidirectional converter now leverages the code within the `lstm`, `gru`, and `simplernn` modules. This code uses a `bidirectional` flag to determine support for this feature.

This diff is probably best viewed in split mode. :) Happy to try to split it up if that will help with the review.
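For reviewers less familiar with the pattern: the composition a bidirectional converter has to reproduce is a forward pass plus a time-reversed backward pass whose outputs are merged. Below is a minimal NumPy sketch of that idea for a SimpleRNN cell with `merge_mode='concat'` semantics. All names and weights here are hypothetical illustrations for intuition only, not keras2onnx or ONNX APIs.

```python
import numpy as np

def simple_rnn(x, W, U, b, h0):
    """Plain SimpleRNN over a (timesteps, features) input:
    h_t = tanh(x_t @ W + h_{t-1} @ U + b)."""
    h = h0
    outputs = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W + h @ U + b)
        outputs.append(h)
    return np.stack(outputs)

def bidirectional(x, fwd_weights, bwd_weights, units):
    """Run the cell forward and backward, then concatenate per timestep
    (the 'concat' merge mode)."""
    h0 = np.zeros(units)
    out_f = simple_rnn(x, *fwd_weights, h0)
    # Feed the reversed sequence to the backward cell, then re-reverse
    # its outputs so both halves align on the same timestep axis.
    out_b = simple_rnn(x[::-1], *bwd_weights, h0)[::-1]
    return np.concatenate([out_f, out_b], axis=-1)

rng = np.random.default_rng(0)
timesteps, features, units = 4, 3, 2
x = rng.normal(size=(timesteps, features))
make_weights = lambda: (rng.normal(size=(features, units)),
                        rng.normal(size=(units, units)),
                        np.zeros(units))
y = bidirectional(x, make_weights(), make_weights(), units)
print(y.shape)  # (4, 4): timesteps x (2 * units)
```

The same skeleton applies to GRU and LSTM cells; only the per-step recurrence changes, which is why sharing the cell code behind a `bidirectional` flag avoids duplicating the reverse-and-merge logic per layer type.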