The study of network robustness is critical to understanding complex interconnected systems. Consider, for example, a power grid network that is susceptible to both natural failures and targeted attacks. A natural failure occurs when a single power substation fails due to worn parts or a natural disaster. When one substation fails, however, its load is rerouted to alternative substations, potentially causing a series of cascading failures. Not all failures originate from natural causes; some come from targeted attacks, such as enemy states hacking into the grid to sabotage key equipment and maximally damage its operations. A natural counterpart to network robustness is vulnerability, defined as a measure of a network's susceptibility to the dissemination of entities across it, such as how quickly a virus spreads across a computer network.
In this lab, we show how to use cuGraph and TIGER to conduct state-of-the-art GPU-accelerated graph vulnerability and robustness analysis. Specifically, we will look at how to measure graph robustness, simulate cascading failures on a power grid, and model virus diffusion across a computer network.
pip install graph-tiger
Collecting graph-tiger
...
Successfully built graph-tiger stopit datashape
Installing collected packages: onnx, stopit, multipledispatch, datashape, datashader, bezier, graph-tiger
Successfully installed bezier-2020.5.19 datashader-0.12.1 datashape-0.5.2 graph-tiger-0.1.5 multipledispatch-0.6.0 onnx-1.9.0 stopit-1.1.2
# Install RAPIDS
!git clone https://github.com/rapidsai/rapidsai-csp-utils.git
!bash rapidsai-csp-utils/colab/rapids-colab.sh stable
import sys, os
dist_package_index = sys.path.index('/usr/local/lib/python3.7/dist-packages')
sys.path = sys.path[:dist_package_index] + ['/usr/local/lib/python3.7/site-packages'] + sys.path[dist_package_index:]
sys.path
exec(open('rapidsai-csp-utils/colab/update_modules.py').read(), globals())
Cloning into 'rapidsai-csp-utils'...
Starting to prep Colab for install RAPIDS Version 0.18 stable
Checking for GPU type:
Woo! Your instance has the right kind of GPU, a Tesla T4!
Removing conflicting packages, will replace with RAPIDS compatible versions
Installing conda
Installing RAPIDS 0.18 packages from the stable release channel
Please standby, this will take a few minutes...
...
That's it! Now we can run a variety of GPU-accelerated graph mining algorithms.
While CPU calculations work well for sparse graphs, GPU acceleration significantly speeds up analysis for dense graphs. To see this, let's run the code below, which measures the robustness of a Barabási–Albert (BA) graph at varying levels of density (i.e., number of edges per node).
import time
from tqdm import tqdm
from graph_tiger.measures import run_measure
from graph_tiger.graphs import graph_loader
# controls graph density by varying the number of non-zeroes per row (i.e., number of edges per node in graph)
nnz_per_row = list(range(50, 501, 50))
cpu_times = []
gpu_times = []
for nnz in tqdm(nnz_per_row):
    graph = graph_loader(graph_type='BA', n=1000, m=nnz, seed=1)

    start_cpu = time.time()
    robustness_index = run_measure(graph, measure='average_vertex_betweenness', k=int(0.05 * len(graph)))
    end_cpu = time.time()

    start_gpu = time.time()
    # GPU version uses cuGraph's betweenness centrality: https://docs.rapids.ai/api/cugraph/stable/api.html#module-cugraph.centrality.betweenness_centrality
    robustness_index = run_measure(graph, measure='average_vertex_betweenness', k=int(0.05 * len(graph)), use_gpu=True)
    end_gpu = time.time()

    cpu_times.append(round(end_cpu - start_cpu, 2))
    gpu_times.append(round(end_gpu - start_gpu, 2))
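As a rough CPU-only sketch of what the 'average_vertex_betweenness' measure computes (the helper function and sampling scheme here are illustrative, not TIGER's exact implementation), we can average sampled betweenness centrality over all nodes with NetworkX:

```python
import networkx as nx

def average_vertex_betweenness(graph, k=None, seed=1):
    # betweenness centrality, approximated by sampling k source nodes
    # (exact when k is None)
    bc = nx.betweenness_centrality(graph, k=k, seed=seed)
    return sum(bc.values()) / len(bc)

g = nx.barabasi_albert_graph(100, 4, seed=1)
score = average_vertex_betweenness(g, k=5)
```

Sampling fewer source nodes (smaller k) trades accuracy for speed, which is the same trade-off TIGER exposes through its k parameter.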
Now let's plot the results (lower is better).
%matplotlib inline
import matplotlib.pyplot as plt
plt.plot(nnz_per_row, cpu_times, label='CPU')
plt.plot(nnz_per_row, gpu_times, label='GPU')
plt.xlabel('Number of edges per node (NNZ)')
plt.ylabel('Time (seconds)')
plt.title('Measuring Graph Robustness Runtime (CPU vs. GPU)')
plt.legend()
plt.show()
Cascading failures often arise as a result of natural failures or targeted attacks in a network. Three main quantities govern the network simulation:
the capacity of each node (c) in the network, e.g., power substation capacity;
the load of each node (l) in the network, e.g., power substation load level;
the network redundancy (r), representing the amount of reserve capacity present in the network, i.e., auxiliary support systems.
When a node is attacked it becomes "overloaded", causing it to fail and requiring its load to be redistributed to its neighbors. When defending a node, we increase its capacity to protect against attacks. With just these three parameters, we can set up a cascading failure simulation. Below, we show how to load a graph representing the U.S. electrical grid and set up the simulation parameters.
from graph_tiger.graphs import graph_loader
graph = graph_loader('electrical')
params = {
'runs': 1, # number of times to run the simulation
'steps': 100, # number of time steps to run each simulation
'seed': 1, # for reproducibility
'l': 0.8, # network load [0, 1]
'r': 0.2, # network redundancy [0, 1]
'c': int(0.1 * len(graph)), # load capacity approximation
'robust_measure': 'largest_connected_component', # measure of network health
}
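To build intuition for the mechanics described above, here is a toy load-redistribution sketch. The `cascade` function is illustrative and far simpler than TIGER's simulation: every node starts with the same load, attacked nodes fail, and a failed node's load is split evenly among its surviving neighbors, possibly overloading them in turn.

```python
import networkx as nx

def cascade(graph, attacked, load=0.8, capacity=1.0):
    # every node starts with the same load; attacked nodes fail immediately
    loads = {n: load for n in graph.nodes()}
    failed = set(attacked)
    frontier = list(attacked)
    while frontier:
        node = frontier.pop()
        neighbors = [n for n in graph.neighbors(node) if n not in failed]
        if not neighbors:
            continue
        # redistribute the failed node's load evenly across surviving neighbors
        share = loads[node] / len(neighbors)
        for n in neighbors:
            loads[n] += share
            if loads[n] > capacity and n not in failed:
                failed.add(n)       # neighbor is now overloaded: it fails too
                frontier.append(n)
    return failed

g = nx.barabasi_albert_graph(50, 2, seed=1)
failed = cascade(g, attacked=[0])
```

Attacking a single well-connected node can push several neighbors over capacity, which is exactly the cascade the full simulation measures via the largest connected component.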
To run the attack, we just have to modify a few simulation parameters. We set the attack to remove the 30 nodes in the graph (e.g., power grid substations) with the highest degree centrality ("id_node"). As you can imagine, there are many different strategies that can be used to attack the grid; by selecting degree centrality, we target "central" nodes in the network, i.e., substations (nodes) with many power lines (edges) connected to them.
params.update({
'k_a': 30, # number of nodes to attack
'attack': 'id_node', # attack strategy
})
Now let's run the simulation and plot the results!
from graph_tiger.cascading import Cascading
cascading = Cascading(graph, **params)
results = cascading.run_simulation()
cascading.plot_results(results)
Running simulation 1 times
Now try modifying the code to find the minimal attack necessary to collapse the electrical grid. To do this, plot the "health" of the network as measured by the robustness measure (i.e., "largest_connected_component") at the end of each simulation, against the attack strength ("k_a").
Hint: electrical networks are fragile to targeted attacks, try removing just a few nodes.
sim_results = []
params['attack'] = 'id_node'
attack_strengths = list(range(0, 6, 1))

for k_a in tqdm(attack_strengths):
    params['k_a'] = k_a

    cascading = Cascading(graph, **params)
    results = cascading.run_simulation()

    lcc_at_end = results[-1]
    sim_results.append(lcc_at_end)

%matplotlib inline
import matplotlib.pyplot as plt

plt.plot(attack_strengths, sim_results)
plt.xlabel('Attack strength')
plt.ylabel('Largest connected component')
plt.title('Effect of Attack Strength in Cascading Failures')
Running simulation 1 times
100%|██████████| 5/5 [01:50<00:00, 22.17s/it]
The flu-like susceptible-infected-susceptible (SIS) model is a popular model that allows us to simulate the spread of viruses across networks (graphs). Each node in the SIS model can be in one of two states, infected (I) or susceptible (S). At each time step t, an infected node has a probability β of infecting each of its susceptible neighbors; after this, each infected node has a probability δ of healing and becoming susceptible again.
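The update rule just described can be sketched in a few lines (a minimal illustration with a toy graph, not TIGER's Diffusion class):

```python
import random
import networkx as nx

def sis_step(graph, infected, beta, delta, rng):
    # each infected node infects each susceptible neighbor with probability beta
    newly_infected = set()
    for node in infected:
        for nbr in graph.neighbors(node):
            if nbr not in infected and rng.random() < beta:
                newly_infected.add(nbr)
    # then each infected node heals with probability delta
    healed = {n for n in infected if rng.random() < delta}
    return (infected | newly_infected) - healed

rng = random.Random(1)
g = nx.erdos_renyi_graph(100, 0.05, seed=1)
infected = set(range(10))
for _ in range(50):
    infected = sis_step(g, infected, beta=0.05, delta=0.01, rng=rng)
```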
It has been shown that there is a direct relationship between the graph's topology, as measured through the spectral radius (largest eigenvalue, λ1) of the graph's adjacency matrix, and whether the virus remains endemic. The exact relationship between a virus's strength s, birth rate β, death rate δ, and spectral radius λ1 is s = λ1 · β/δ, where a larger s means a stronger virus. With just these parameters, we can set up a computer virus simulation.
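We can evaluate this strength formula numerically on a small synthetic graph (the graph here is illustrative; the actual AS-733 network is loaded below):

```python
import networkx as nx
import numpy as np

# spectral radius = largest eigenvalue of the adjacency matrix
g = nx.barabasi_albert_graph(200, 3, seed=1)
lambda_1 = np.linalg.eigvalsh(nx.to_numpy_array(g)).max()

b, d = 0.00208, 0.01  # virus birth and death rates (as in the simulation below)
s = lambda_1 * b / d  # virus strength: larger s means a stronger virus
```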
Below, we (1) load the Autonomous Systems AS-733 network, a graph of routers comprising part of the internet, and (2) set up the simulation parameters.
from graph_tiger.graphs import graph_loader
graph = graph_loader('as_733')
sis_params = {
'runs': 1, # number of simulations to run
'steps': 5000, # number of time steps to run simulation
'model': 'SIS',
'b': 0.00208, # virus birth rate
'd': 0.01, # virus death rate
'c': 0.3, # fraction of the network that starts infected
}
Now let's run the simulation and plot the results! In the figure below, we see that without intervention, the virus remains endemic on the router network.
from graph_tiger.diffusion import Diffusion
diffusion = Diffusion(graph, **sis_params)
results = diffusion.run_simulation()
diffusion.plot_results(results)
Running simulation 1 times
While we do not have control over the virus strength (s), we can manipulate the underlying topology of the router network to make it harder for the virus to spread. The question is: how do we optimally modify the network to reduce the spread of the virus? While a naive solution would be to disconnect the whole network, this isn't very practical since everyone would lose internet access! Instead, we need a strategy that carefully vaccinates a few nodes (routers) against the virus.
Now let's compare the efficacy of four vaccination strategies ('ns_node', 'id_node', 'rd_node', and 'pr_node') when vaccinating only 15 nodes in the network.
To implement a defense strategy, you just have to modify a few simulation parameters.
sis_params.update({
    'diffusion': 'min',  # we want to minimize the ability of the virus to propagate
    'method': 'ns_node', # use the Netshield technique
    'k': 15              # vaccinate 15 nodes according to the selected strategy
})
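As a sanity check on why vaccinating well-connected routers helps: removing high-degree nodes lowers the spectral radius λ1, and with it the virus strength s. This is a rough stand-in for a vaccination heuristic, not the actual Netshield algorithm:

```python
import networkx as nx
import numpy as np

def spectral_radius(graph):
    # largest eigenvalue of the adjacency matrix
    return np.linalg.eigvalsh(nx.to_numpy_array(graph)).max()

g = nx.barabasi_albert_graph(200, 3, seed=1)
before = spectral_radius(g)

# "vaccinate" the 15 highest-degree nodes by removing them from the graph
top = sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:15]
g.remove_nodes_from([n for n, _ in top])
after = spectral_radius(g)
# after < before, so the virus needs a larger beta/delta ratio to stay endemic
```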
Does each strategy manage to contain the virus (i.e., less than 1% infected population)? Which strategy has the lowest infected population at the end of the simulation? Set up and run each simulation, and compare the results to the unvaccinated network.
methods = ['ns_node', 'id_node', 'rd_node', 'pr_node']

for method in methods:
    sis_params['method'] = method

    diffusion = Diffusion(graph, **sis_params)
    results = diffusion.run_simulation()

    print('Percent of network that remains infected at end of simulation using {} vaccination technique is {}%'.format(method, round((results[-1] / len(graph)) * 100, 2)))
    diffusion.plot_results(results)
Running simulation 1 times
Percent of network that remains infected at end of simulation using ns_node vaccination technique is 3.85%
Running simulation 1 times
Percent of network that remains infected at end of simulation using id_node vaccination technique is 2.09%
Running simulation 1 times
Percent of network that remains infected at end of simulation using rd_node vaccination technique is 1.69%
Running simulation 1 times
Percent of network that remains infected at end of simulation using pr_node vaccination technique is 4.18%