GraphNeuralNetworks v0.6.21
Merged pull requests:
- Add `DCGRU` temporal layer (#448) (@aurorarossi)
- move mldatasets2gnngraph (#458) (@CarloLucibello)
- create GNNLux.jl package (#460) (@CarloLucibello)
- [GNNLux] GCNConv, ChebConv, GNNChain (#462) (@CarloLucibello)
- [GNNLux] more layers (#463) (@CarloLucibello)
- use GNNlib in GNN.jl (#464) (@CarloLucibello)
- [GNNlib] fix cuda ext (#465) (@CarloLucibello)
- tests for GNNlib (#466) (@CarloLucibello)
- fix docs (#467) (@CarloLucibello)
- [GNNLux] fix tests (#468) (@CarloLucibello)
- [GNNLux] more layers (#469) (@CarloLucibello)
- [GNNLux] TGCN temporal layer (#470) (@aurorarossi)
- [GNNLux] more layers pt. 3 (#471) (@CarloLucibello)
- [GNNGraphs] implement `remove_edges(g, p)` (#474) (@rbSparky)
- [GNNLux] Added SGConv (#475) (@rbSparky)
- fix dense test (#479) (@CarloLucibello)
- [GNNLux] Adding MegNetConv Layer (#480) (@rbSparky)
- rng instead of seed for rand_graph (#482) (@CarloLucibello)
- `@functor` -> `@layer` (#484) (@CarloLucibello)
- [GNNLux] Add A3TGCN temporal layer (#485) (@aurorarossi)
- [GNNLux] Add GConvLSTM, GConvGRU and DCGRU temporal layers (#487) (@aurorarossi)
- fix NNConv docs (#488) (@CarloLucibello)
- Add `EvolveGCNO` temporal layer (#489) (@aurorarossi)
- [GNNLux] updates for Lux v1.0 (#490) (@CarloLucibello)
- [GNNLux] Adding NNConv Layer (#491) (@rbSparky)
- [GNNLux] add GMMConv, ResGatedGraphConv (#494) (@CarloLucibello)
Closed issues:
- Question about temporal graph neural networks (#112)
- add show methods for `WithGraph` and `GNNChain` (#178)
- Dropout inside GATConv layer (#258)
- Graph classification: multiple graphs associated with a common label (#282)
- convenience feature setter (#284)
- `@functor` default for `GNNLayer`s (#288)
- documentation proposal (#357)
- support Lux (#372)
- turn this into a monorepo (#433)
- use Flux.@layer instead of Flux.@functor (#452)
- random graph generators should take an `rng` instead of a `seed` (#459)
- Cannot create GNNGraph with unconnected nodes (#472)
- Implementation of recommender system based on GNN (#473)
- GNNs.jl's CI is failing for `GRAPH_T = :dense` (#476)
- `GCNConv` layer fails when the `GNNGraph` comes from an adjacency matrix (#486)