Reconstruction algorithm - SbRP #1

Merged
merged 9 commits on Jul 4, 2024
Changes from 5 commits
2 changes: 1 addition & 1 deletion .github/CODE_OF_CONDUCT.md
@@ -1,4 +1,4 @@
# Code of Conduct for the Network Centrality Library Project
# Code of Conduct for the Network Source Detection Library Project

## Our Pledge

14 changes: 7 additions & 7 deletions .github/CONTRIBUTING.md
@@ -22,19 +22,19 @@ We warmly welcome contributions to NSDLib! This document provides guidelines for

### Implementation Requirements

- **Centrality Measures Implementation**:
- Each centrality measure must be implemented in a separate file within the `nsdlib/algorithms` directory.
- The file name should match the centrality measure's name.
- Each file must contain a single function, named after the centrality measure, that calculates this measure. This function should accept a NetworkX graph as input and return a dictionary mapping nodes to their centrality values.
- Each centrality measure function must be exposed in the `nsdlib/algorithms` package to be accessible for external use.
- Add an entry for the new centrality measure in the `Centrality` enum to ensure it's recognized and accessible through a standardized interface.
- **Source Detection Method Implementation**:
- Each new method must be implemented in a separate file within the `nsdlib/algorithms` directory, in the appropriate package according to its intended purpose, e.g. a reconstruction algorithm should be placed in the `reconstruction` package.
- The file name should match the method's name.
- Each file must contain a single function, named after the new method.
- Each algorithm function must be exposed in the `nsdlib/algorithms` package to be accessible for external use.
- Add an entry for the new algorithm in the appropriate taxonomy class, e.g. for a reconstruction algorithm the new entry should be added to the `PropagationReconstructionAlgorithm` enum to ensure it's recognized and accessible through a standardized interface (see the sketch after this list).

- **Testing**:
- Contributions must include tests covering the new functionality. We require at least 80% test coverage for changes.
- Use the `pytest` framework for writing tests.

- **Documentation**:
- Update the project documentation to reflect the addition of new centrality measures or any other significant changes.
- Update the project documentation to reflect the addition of new methods or any other significant changes.
- Ensure that examples, usage guides, and API documentation are clear and updated.
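
To make these requirements concrete, here is a minimal sketch of what a new reconstruction-method contribution might look like. The method name `my_recon`, its signature, and the test are hypothetical illustrations; only the package layout, the `PropagationReconstructionAlgorithm` enum, and the pytest requirement come from the guidelines above. A matching `MY_RECON` entry in the enum and a re-export in `nsdlib/algorithms/__init__.py` would complete the contribution.

```python
# File: nsdlib/algorithms/reconstruction/my_recon.py  (hypothetical example)
import networkx as nx


def my_recon(G: nx.Graph) -> nx.Graph:
    """Reconstruct missing parts of the observed propagation graph G."""
    # Placeholder logic so the sketch stays self-contained; a real method
    # would infer and add the missing nodes and edges here.
    return G.copy()


# File: tests/test_my_recon.py  (hypothetical pytest for the new method)
def test_my_recon_keeps_observed_nodes():
    G = nx.karate_club_graph()
    RG = my_recon(G)
    assert set(G.nodes()) <= set(RG.nodes())
```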

### Making Changes
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -1,6 +1,6 @@
---
name: Bug Report
about: Create a report to help us improve the network centrality library
about: Create a report to help us improve NSDLib
title: "[BUG]"
labels: bug
assignees: ''
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/feature_request.md
@@ -1,6 +1,6 @@
---
name: Feature Request
about: Suggest an idea for the network centrality library
about: Suggest an idea for NSDLib
title: "[FEATURE]"
labels: enhancement
assignees: ''
40 changes: 1 addition & 39 deletions .gitignore
@@ -1,45 +1,7 @@
### JetBrains template
# Covers JetBrains IDEs: IntelliJ, RubyMine, PhpStorm, AppCode, PyCharm, CLion, Android Studio, WebStorm and Rider
# Reference: https://intellij-support.jetbrains.com/hc/en-us/articles/206544839

# User-specific stuff
.idea/**/workspace.xml
.idea/**/tasks.xml
.idea/**/usage.statistics.xml
.idea/**/dictionaries
.idea/**/shelf

# AWS User-specific
.idea/**/aws.xml

# Generated files
.idea/**/contentModel.xml

# Sensitive or high-churn files
.idea/**/dataSources/
.idea/**/dataSources.ids
.idea/**/dataSources.local.xml
.idea/**/sqlDataSources.xml
.idea/**/dynamic.xml
.idea/**/uiDesigner.xml
.idea/**/dbnavigator.xml

# Gradle
.idea/**/gradle.xml
.idea/**/libraries

# Gradle and Maven with auto-import
# When using Gradle or Maven with auto-import, you should exclude module files,
# since they will be recreated, and may cause churn. Uncomment if using
# auto-import.
# .idea/artifacts
# .idea/compiler.xml
# .idea/jarRepositories.xml
# .idea/modules.xml
# .idea/*.iml
# .idea/modules
# *.iml
# *.ipr
.idea

# CMake
cmake-build-*/
8 changes: 0 additions & 8 deletions .idea/.gitignore

This file was deleted.

6 changes: 0 additions & 6 deletions .idea/inspectionProfiles/profiles_settings.xml

This file was deleted.

7 changes: 0 additions & 7 deletions .idea/misc.xml

This file was deleted.

8 changes: 0 additions & 8 deletions .idea/modules.xml

This file was deleted.

18 changes: 0 additions & 18 deletions .idea/nsdlib.iml

This file was deleted.

6 changes: 0 additions & 6 deletions .idea/vcs.xml

This file was deleted.

14 changes: 2 additions & 12 deletions CHANGELOG.md
@@ -3,16 +3,6 @@
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.2.1] - 2024-02-23
## [0.1.0] - 2024-07-08
### Added
- All common modules exported in `__init__.py` file

## [0.2.0] - 2024-02-22
### Added
- new centrality measure - hubbell centrality has been added
- updated maintenance related files
- extended documentation

## [0.1.1] - 2024-01-09
### Added
- nsdlib version 0.1.1 release
- NSDlib version 0.1.0 release
54 changes: 30 additions & 24 deletions README.md
@@ -1,22 +1,21 @@
# NSDlib

NSDlib (Network source detection library) is a tool to compute a wide range of centrality measures for a given network. The
library is designed to work with Python Networkx library.
NSDlib (Network source detection library) is a comprehensive library designed for detecting sources of propagation in networks. It offers a variety of algorithms that help researchers and developers analyze and identify the origins of spread (information, epidemics, etc.) within networks.

## Overview

The goal of NSDlib is to offer a comprehensive repository for implementing a broad spectrum of centrality measures. Each
year, new measures are introduced through scientific papers, often with only pseudo-code descriptions, making it
difficult for researchers to evaluate and compare them with existing methods. While implementations of well-known
centrality measures exist, recent innovations are frequently absent. NSDlib strives to bridge this gap. It references the
renowned CentiServer portal for well-known centrality measures and their originating papers, aiming to encompass all
these measures in the future.
NSDLib is designed for easy integration into existing projects. It aims to be a comprehensive repository
of source detection methods, outbreak detection techniques, and propagation graph reconstruction tools. Researchers worldwide are encouraged to contribute to and utilize this library,
facilitating the development of new techniques to combat misinformation and improve propagation analysis.
Each year, new techniques are introduced through scientific papers, often with only pseudo-code descriptions, making it
difficult for researchers to evaluate and compare them with existing methods. NSDlib tries to bridge this gap and encourages researchers to contribute their implementations here.

## Code structure

All custom implementations are provided under `nsdlib/algorithms` package. Each centrality measure is implemented in a separate file, named after the measure itself. Correspondingly, each file contains a function, named identically to the file, which calculates the centrality measure. This function accepts a NetworkX graph as input (and other params if applicable) and returns a dictionary, mapping nodes to their centrality values. Ultimately, every custom implementation is made available through the `nsdlib/algorithms` package.
## Implemented centrality measures:
All custom implementations are provided under the `nsdlib/algorithms` package. Each method is implemented in a separate file, named after the method itself, in the appropriate package according to its intended purpose, e.g. a reconstruction algorithm should be placed in the `reconstruction` package. Correspondingly, each file contains a function, named identically to the file, which implements the method's logic. Ultimately, every custom implementation is made available through the `nsdlib/algorithms` package.
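
To make this convention concrete, the SbRP algorithm added in this PR maps as in the short sketch below; the prefixed alias comes from the `nsdlib/algorithms/__init__.py` changes shown later in this diff, and the snippet illustrates the layout rather than a documented API.

```python
# The module, the function, and the package-level re-export all share the
# method's name; the re-export is prefixed with the package's purpose.
from nsdlib.algorithms.reconstruction import sbrp
from nsdlib.algorithms import reconstruction_sbrp

# Both names resolve to the same object.
assert reconstruction_sbrp is sbrp
```
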
## Implemented features:

### Node evaluation algorithms
- [Algebraic](https://www.centiserver.org/centrality/Algebraic_Centrality/)
- [Average Distance](https://www.centiserver.org/centrality/Average_Distance/)
- [Barycenter](https://www.centiserver.org/centrality/Barycenter_Centrality/)
@@ -58,6 +57,12 @@ All custom implementations are provided under `nsdlib/algorithms` package. Each
- [Topological](https://www.centiserver.org/centrality/Topological_Coefficient/)
- [Trophic Levels](https://networkx.org/documentation/stable/reference/algorithms/generated/networkx.algorithms.centrality.trophic_levels.html)

### Outbreak detection algorithms
- test

### Graph reconstruction algorithms
- SbRP

## How to use
The library can be installed using pip:

@@ -69,31 +74,32 @@ pip install nsdlib

Provided algorithms can be executed in the following ways:

- by invoking a specific function from `nsdlib.algorithms` package, which computes a given centrality measure for a
given graph.
- by utilizing the `SourceDetector` class and configuring it with a `SourceDetectionConfig` object. This approach allows for seamless source detection and result evaluation.

```python
import networkx as nx
import nsdlib as ncl

# Create a graph
from nsdlib.common.models import SourceDetectionConfig
from nsdlib.source_detection import SourceDetector
from nsdlib.taxonomies import NodeEvaluationAlgorithm


G = nx.karate_club_graph()

# Compute degree centrality
degree_centrality = ncl.degree_centrality(G)
config = SourceDetectionConfig(
    node_evaluation_algorithm=NodeEvaluationAlgorithm.NETSLEUTH,
)

source_detector = SourceDetector(config)

# Compute betweenness centrality
betweenness_centrality = ncl.betweenness_centrality(G)
result, evaluation = source_detector.detect_sources_and_evaluate(
    G=G, IG=G, real_sources=[0, 33]
)
print(evaluation)

# Compute closeness centrality
closeness_centrality = ncl.closeness_centrality(G)

# Compute eigenvector centrality
eigenvector_centrality = ncl.eigenvector_centrality(G)
```

- invoking `compute_centrality` method of `CentralityService` class, which allows to compute centrality for a given
centrality measure.
- by importing and using a specific method:

```python
from typing import Any
16 changes: 6 additions & 10 deletions src/aa.py
@@ -1,22 +1,18 @@
import networkx as nx
import netcenlib as ncl

from nsdlib.common.models import SourceDetectionConfig
from nsdlib.source_detection import SourceDetector
from nsdlib.taxonomies import OutbreaksDetectionAlgorithm
from nsdlib.taxonomies import NodeEvaluationAlgorithm

# Create a graph
G = nx.karate_club_graph()

config = SourceDetectionConfig(
    selection_threshold=None,
    outbreaks_detection_algorithm=OutbreaksDetectionAlgorithm.LEIDEN,
    node_evaluation_algorithm=NodeEvaluationAlgorithm.NETSLEUTH,
)

source_detector = SourceDetector(config)

result, evaluation = source_detector.detect_sources_and_evaluate(G=G,
IG=G, real_sources=[0,33])
result, evaluation = source_detector.detect_sources_and_evaluate(
    G=G, IG=G, real_sources=[0, 33]
)
print(result.global_scores)
print(ncl.degree_centrality(G))

print(evaluation)
91 changes: 91 additions & 0 deletions src/nsdlib/algorithms/__init__.py
@@ -0,0 +1,91 @@
# flake8: noqa

from nsdlib.algorithms.outbreaks_detection import (
CPM_Bipartite as outbreaks_detection_CPM_Bipartite,
agdl as outbreaks_detection_agdl,
angel as outbreaks_detection_angel,
aslpaw as outbreaks_detection_aslpaw,
async_fluid as outbreaks_detection_async_fluid,
attribute_clustering as outbreaks_detection_attribute_clustering,
bayan as outbreaks_detection_bayan,
belief as outbreaks_detection_belief,
bimlpa as outbreaks_detection_bimlpa,
bipartite_clustering as outbreaks_detection_bipartite_clustering,
coach as outbreaks_detection_coach,
condor as outbreaks_detection_condor,
conga as outbreaks_detection_conga,
congo as outbreaks_detection_congo,
core_expansion as outbreaks_detection_core_expansion,
cpm as outbreaks_detection_cpm,
crisp_partition as outbreaks_detection_crisp_partition,
dcs as outbreaks_detection_dcs,
demon as outbreaks_detection_demon,
der as outbreaks_detection_der,
dpclus as outbreaks_detection_dpclus,
ebgc as outbreaks_detection_ebgc,
edge_clustering as outbreaks_detection_edge_clustering,
ego_networks as outbreaks_detection_ego_networks,
eigenvector as outbreaks_detection_eigenvector,
em as outbreaks_detection_em,
endntm as outbreaks_detection_endntm,
eva as outbreaks_detection_eva,
frc_fgsn as outbreaks_detection_frc_fgsn,
ga as outbreaks_detection_ga,
gdmp2 as outbreaks_detection_gdmp2,
girvan_newman as outbreaks_detection_girvan_newman,
graph_entropy as outbreaks_detection_graph_entropy,
greedy_modularity as outbreaks_detection_greedy_modularity,
head_tail as outbreaks_detection_head_tail,
hierarchical_link_community as outbreaks_detection_hierarchical_link_community,
ilouvain as outbreaks_detection_ilouvain,
infomap as outbreaks_detection_infomap,
infomap_bipartite as outbreaks_detection_infomap_bipartite,
internal as outbreaks_detection_internal,
internal_dcd as outbreaks_detection_internal_dcd,
ipca as outbreaks_detection_ipca,
kclique as outbreaks_detection_kclique,
kcut as outbreaks_detection_kcut,
label_propagation as outbreaks_detection_label_propagation,
lais2 as outbreaks_detection_lais2,
leiden as outbreaks_detection_leiden,
lemon as outbreaks_detection_lemon,
lfm as outbreaks_detection_lfm,
louvain as outbreaks_detection_louvain,
lpam as outbreaks_detection_lpam,
lpanni as outbreaks_detection_lpanni,
lswl as outbreaks_detection_lswl,
lswl_plus as outbreaks_detection_lswl_plus,
markov_clustering as outbreaks_detection_markov_clustering,
mcode as outbreaks_detection_mcode,
mod_m as outbreaks_detection_mod_m,
mod_r as outbreaks_detection_mod_r,
multicom as outbreaks_detection_multicom,
node_perception as outbreaks_detection_node_perception,
overlapping_partition as outbreaks_detection_overlapping_partition,
overlapping_seed_set_expansion as outbreaks_detection_overlapping_seed_set_expansion,
paris as outbreaks_detection_paris,
percomvc as outbreaks_detection_percomvc,
principled_clustering as outbreaks_detection_principled_clustering,
pycombo as outbreaks_detection_pycombo,
r_spectral_clustering as outbreaks_detection_r_spectral_clustering,
rb_pots as outbreaks_detection_rb_pots,
rber_pots as outbreaks_detection_rber_pots,
ricci_community as outbreaks_detection_ricci_community,
sbm_dl as outbreaks_detection_sbm_dl,
sbm_dl_nested as outbreaks_detection_sbm_dl_nested,
scan as outbreaks_detection_scan,
siblinarity_antichain as outbreaks_detection_siblinarity_antichain,
significance_communities as outbreaks_detection_significance_communities,
slpa as outbreaks_detection_slpa,
spectral as outbreaks_detection_spectral,
spinglass as outbreaks_detection_spinglass,
surprise_communities as outbreaks_detection_surprise_communities,
temporal_partition as outbreaks_detection_temporal_partition,
threshold_clustering as outbreaks_detection_threshold_clustering,
tiles as outbreaks_detection_tiles,
umstmo as outbreaks_detection_umstmo,
walkscan as outbreaks_detection_walkscan,
walktrap as outbreaks_detection_walktrap,
wCommunity as outbreaks_detection_wCommunity,
)
from nsdlib.algorithms.reconstruction import sbrp as reconstruction_sbrp
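
A minimal usage sketch for the prefixed re-exports above. The call signatures are assumptions (a single NetworkX graph as input) and the return values are algorithm-specific, so treat this as an illustration rather than documented API.

```python
import networkx as nx

from nsdlib.algorithms import (
    outbreaks_detection_louvain,  # community-based outbreak detection
    reconstruction_sbrp,          # SbRP propagation-graph reconstruction
)

G = nx.karate_club_graph()

# Assumed usage: each algorithm is called on the observed graph; consult the
# corresponding module for the actual parameters and return type.
outbreaks = outbreaks_detection_louvain(G)
reconstructed = reconstruction_sbrp(G)
```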