I noticed that the feature matrix of all nodes is fed into the model (encoder, aggregator) at initialization, which consumes a lot of memory when the feature matrix is large.
The corresponding code is here:
```python
agg = MeanAggregator(features, cuda=True)
enc = Encoder(features, n_feats_dim, args.n_hidden, refined_adj_lists, agg, gcn=args.gcn, cuda=args.cuda)
```
I think this method could support large feature matrices if it were implemented differently: instead of handing the full matrix to the model at construction time, we could pass only the current batch's feature matrix to the model on each forward call.
That would make the code applicable to much larger graphs.
Any advice, guys?
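One possible shape for this idea, as a sketch rather than a patch to the actual `MeanAggregator`/`Encoder` classes: pass a feature *lookup function* into the model instead of the full matrix, so only the rows for the current batch's node ids are ever materialized as a tensor. The `make_feature_lookup` helper and the in-memory `store` below are hypothetical stand-ins; in practice the store could be a memory-mapped array or sparse matrix on disk.

```python
import numpy as np
import torch

def make_feature_lookup(feature_store):
    """Wrap a row-indexable feature store (e.g. a np.memmap or sparse
    matrix) in a function the model can call per batch.

    Hypothetical helper: only the rows for the requested node ids are
    converted to a tensor, so the full matrix never lives on the GPU.
    """
    def lookup(node_ids):
        rows = feature_store[node_ids]  # fetch only the needed rows
        return torch.as_tensor(rows, dtype=torch.float32)
    return lookup

# Small in-memory array standing in for a large on-disk feature store:
# 5 nodes, 4 features each.
store = np.arange(20, dtype=np.float32).reshape(5, 4)
features = make_feature_lookup(store)

# Inside a training step, the aggregator/encoder would call this with
# just the batch's node ids instead of indexing a stored full matrix.
batch_nodes = [0, 3]
batch_feats = features(batch_nodes)  # shape (2, 4)
```

This mirrors how some GraphSAGE-style implementations already treat `features` as a callable; the change would be to make the aggregator and encoder accept such a callable (or the batch tensor itself) at call time rather than storing the whole matrix at `__init__`.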