Conv2DKAN giving graph execution error for non-sequential models. #19
I will check this error 😊 Could you please provide more information about this error (the full traceback), and explain what the `TimeDistributed` layer is doing in your model?
Please, find below the additional details pertaining to the error:
The model fitting function:
The model plotting and compilation occur without raising any error; however, the model fitting cell raises the following error:
Furthermore, execution proceeds smoothly when the Conv2DKAN layers are replaced by Conv2D layers. Also, using the DenseKAN layer does not raise an error in either sequential or non-sequential models. The TimeDistributed layer in Keras is a wrapper that applies a specified layer independently to each time step in a sequence of inputs; in my case it lets me apply the same processing to each frame in the video sequence. Once again, thank you for your support 😊
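Conceptually, TimeDistributed merges the batch and time axes, applies the wrapped layer once, then splits the axes back apart. A minimal NumPy sketch of that idea (an illustration, not the actual Keras implementation — `time_distributed` and the stand-in pooling "layer" are hypothetical helpers):

```python
import numpy as np

def time_distributed(layer_fn, x):
    """Apply layer_fn independently to each time step.

    Mimics the idea behind Keras' TimeDistributed wrapper (a simplified
    sketch, not the real implementation): the (batch, time) axes are
    merged, the layer is applied once, and the axes are split back apart.
    """
    batch, time = x.shape[:2]
    merged = x.reshape((batch * time,) + x.shape[2:])
    out = layer_fn(merged)
    return out.reshape((batch, time) + out.shape[1:])

# a stand-in "layer": global average over the spatial axes
frames = np.random.rand(2, 10, 32, 32, 3)            # (batch, time, H, W, C)
pooled = time_distributed(lambda t: t.mean(axis=(1, 2)), frames)
print(pooled.shape)                                   # (2, 10, 3)
```

Because the layer sees only the merged `(batch * time, ...)` tensor, it is applied with the same weights to every frame.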
Let's check the input shape together. From my understanding of your description, your input should be a 5-dimensional tensor:

```python
import tensorflow as tf
from tfkan.layers import Conv2DKAN
from keras.layers import TimeDistributed

layer = TimeDistributed(Conv2DKAN(64, (3, 3), padding='same'))
x = tf.random.normal((2, 10, 32, 32, 3))  # x is a 5D tensor
y = layer(x)
print(y.shape)  # (2, 10, 32, 32, 64)
```

Knowing the exact input tensor you use will help us pinpoint the problem!
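For reference, with `'same'` padding and stride 1 a convolution preserves the spatial dimensions, which is why only the channel count changes in the shape check above. The standard output-size formula can be sketched as follows (`conv_out_size` is an illustrative helper, not part of tfkan):

```python
import math

def conv_out_size(n, kernel, stride=1, padding="same"):
    """Spatial output size of a 2-D convolution (standard formula)."""
    if padding == "same":
        # zero-padding is chosen so the output covers every stride position
        return math.ceil(n / stride)
    # "valid" padding: no implicit zero-padding
    return (n - kernel) // stride + 1

# 32x32 input, 3x3 kernel, stride 1, 'same' padding -> size unchanged
print(conv_out_size(32, 3))                    # 32
# the same kernel with 'valid' padding shrinks the map
print(conv_out_size(32, 3, padding="valid"))   # 30
```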
The input shape for the example code is as follows:
The following is the condensed representation of the model:
I tested your code and it works fine in my environment, as below:

```python
import tensorflow as tf
from tfkan.layers import Conv2DKAN, DenseKAN
from keras.layers import TimeDistributed, Input, MaxPooling2D, Dropout, Flatten, LSTM, Dense
from tensorflow.keras.models import Model

def kan_single_branch_non_sequential(do=0.005, rl1=0, rl2=0):
    # input for RGB frames
    input_rgb = Input(shape=(10, 224, 224, 3), dtype=tf.float32, name='rgb_input')
    x_rgb = TimeDistributed(Conv2DKAN(16, (3, 3), padding='same'))(input_rgb)
    x_rgb = TimeDistributed(MaxPooling2D((4, 4)))(x_rgb)
    x_rgb = TimeDistributed(Dropout(do))(x_rgb)
    # ...
    x_rgb = TimeDistributed(Flatten())(x_rgb)

    x = x_rgb
    x = LSTM(32)(x)
    x = DenseKAN(32)(x)
    # x = ActivityRegularization(l1=rl1, l2=rl2)(x)
    output = Dense(6, activation='softmax')(x)

    # model definition
    model = Model(inputs=[input_rgb], outputs=output)
    return model

# build and compile
model = kan_single_branch_non_sequential()
model.build(input_shape=(None, 10, 224, 224, 3))
model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)

# generate mock data
bz = 1
x = tf.random.normal((bz, 10, 224, 224, 3))
y = tf.random.uniform(shape=(bz,), maxval=6, dtype=tf.int32)

# fit the model
model.fit(x, y, batch_size=bz, epochs=1)
```

The model shows the training progress successfully:

```
1/1 [==============================] - 1s 1s/step - loss: 1.8257 - accuracy: 0.0000e+00
```

Maybe the graph execution error is caused by OOM, or by the version of `tensorflow` you are using? You might try reducing the batch size or input resolution first.
Thank you very much for your valuable input! The graph execution error was indeed due to OOM. I was able to rectify it by reducing the frame size from 224×224 to 64×64. I had been struggling to pinpoint the issue for weeks. Thank you very much for your time, effort and interest 😊.
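For intuition about why shrinking the frames resolved the OOM: the activation memory of each time-distributed feature map grows with time_steps × H × W × filters, so going from 224×224 to 64×64 cuts it by (224/64)² ≈ 12×. A rough back-of-envelope estimate for the first Conv2DKAN feature map alone (float32; `activation_mb` is an illustrative helper using the layer sizes from the example model above, and ignores KAN-specific intermediate buffers):

```python
def activation_mb(time_steps, h, w, filters, bytes_per_val=4):
    """Approximate size of one time-distributed feature map in MiB."""
    return time_steps * h * w * filters * bytes_per_val / 2**20

big = activation_mb(10, 224, 224, 16)    # original 224x224 frames
small = activation_mb(10, 64, 64, 16)    # reduced 64x64 frames
print(f"{big:.1f} MiB vs {small:.1f} MiB")  # 30.6 MiB vs 2.5 MiB
print(f"reduction: {big / small:.2f}x")     # reduction: 12.25x
```

This counts a single layer for a single sample; with gradients, multiple layers, and KAN spline coefficients, the actual footprint is several times larger, which is how a 224×224 video model can exhaust GPU memory even at batch size 1.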
You are welcome, best 😄
Sir/Ma'am,
When the user fits a non-sequential model (as in the case of multi-branch models), a graph execution error is encountered.
In particular, no error is encountered with this model declaration:

```python
model = tf.keras.Sequential([
    ...
    TimeDistributed(Conv2DKAN(16, (3, 3), padding='same')),
    ...
])
```
However, using the functional form

```python
x = TimeDistributed(Conv2DKAN(32, (3, 3), padding='same'))(x)
```

raises the said error.
Please look into the matter at your earliest convenience.
Thank you for the help in advance.