update user interface/circle ci fingerprint #129

Merged: 1 commit, Sep 2, 2022
184 changes: 92 additions & 92 deletions .circleci/config.yml
@@ -1,93 +1,93 @@
# Use the latest 2.1 version of CircleCI pipeline process engine.
# See: https://circleci.com/docs/2.0/configuration-reference
version: 2.1
# Orbs are reusable packages of CircleCI configuration that you may share across projects, enabling you to create encapsulated, parameterized commands, jobs, and executors that can be used across multiple projects.
# See: https://circleci.com/docs/2.0/orb-intro/
orbs:
# The python orb contains a set of prepackaged CircleCI configuration you can use repeatedly in your configuration files
# Orb commands and jobs help you with common scripting around a language/tool
# so you don't have to copy and paste it everywhere.
# See the orb documentation here: https://circleci.com/developer/orbs/orb/circleci/python
python: circleci/[email protected]
# Define a job to be invoked later in a workflow.
# See: https://circleci.com/docs/2.0/configuration-reference/#jobs
jobs:
build-and-test: # This is the name of the job; the steps below will be executed in the Python container defined below
# Change the version below to your required version of python
docker:
- image: cimg/python:3.8
# Checkout the code as the first step. This is a dedicated CircleCI step.
# The python orb's install-packages step will install the dependencies from a Pipfile via Pipenv by default.
# Here we're making sure we just use the system-wide pip. By default it uses the project root's requirements.txt.
# Then run your tests!
# CircleCI will report the results back to your VCS provider.
steps:
- checkout
- run:
name: install openblas
command: |
sudo apt-get update
sudo apt-get -y install libblas-dev liblapack-dev
- python/install-packages:
pkg-manager: pip
# app-dir: ~/project/package-directory/ # If your requirements.txt isn't in the root directory.
# pip-dependency-file: test-requirements.txt # if you have a different name for your requirements file, maybe one that combines your runtime and test requirements.
- run:
name: Run tests
# This assumes pytest is installed via the install-package step above
command: |
pytest --durations=0
pip install primme==3.2.*
pytest --durations=0 renormalizer/mps/tests/test_gs.py::test_multistate
- run:
name: Run examples
command: |
cd example; bash run.sh
cd ..
- run:
name: Build docs
command: cd doc; make html
- persist_to_workspace:
root: doc/
paths: html
# ref: https://circleci.com/blog/deploying-documentation-to-github-pages-with-continuous-integration/
docs-deploy:
docker:
- image: node:8.10.0
steps:
- checkout
- attach_workspace:
at: doc/
- run:
name: Disable jekyll builds
command: touch doc/html/.nojekyll
- run:
name: Install and configure dependencies
command: |
npm install -g --silent [email protected]
git config user.email "[email protected]"
git config user.name "liwt31"
- add_ssh_keys:
fingerprints:
- "cc:a3:29:5d:14:bb:bd:1b:e4:52:e2:c2:ac:e6:41:fa"
- run:
name: Deploy docs to gh-pages branch
command: gh-pages --dotfiles --message "[skip ci] Updates" --dist doc/html
# Invoke jobs via workflows
# See: https://circleci.com/docs/2.0/configuration-reference/#workflows
workflows:
test: # This is the name of the workflow, feel free to change it to better match your workflow.
# Inside the workflow, you define the jobs you want to run.
jobs:
- build-and-test
- docs-deploy:
requires:
- build-and-test
filters:
branches:
# Use the latest 2.1 version of CircleCI pipeline process engine.
# See: https://circleci.com/docs/2.0/configuration-reference
version: 2.1

# Orbs are reusable packages of CircleCI configuration that you may share across projects, enabling you to create encapsulated, parameterized commands, jobs, and executors that can be used across multiple projects.
# See: https://circleci.com/docs/2.0/orb-intro/
orbs:
# The python orb contains a set of prepackaged CircleCI configuration you can use repeatedly in your configuration files
# Orb commands and jobs help you with common scripting around a language/tool
# so you don't have to copy and paste it everywhere.
# See the orb documentation here: https://circleci.com/developer/orbs/orb/circleci/python
python: circleci/[email protected]

# Define a job to be invoked later in a workflow.
# See: https://circleci.com/docs/2.0/configuration-reference/#jobs
jobs:
build-and-test: # This is the name of the job; the steps below will be executed in the Python container defined below
# Change the version below to your required version of python
docker:
- image: cimg/python:3.8
# Checkout the code as the first step. This is a dedicated CircleCI step.
# The python orb's install-packages step will install the dependencies from a Pipfile via Pipenv by default.
# Here we're making sure we just use the system-wide pip. By default it uses the project root's requirements.txt.
# Then run your tests!
# CircleCI will report the results back to your VCS provider.
steps:
- checkout
- run:
name: install openblas
command: |
sudo apt-get update
sudo apt-get -y install libblas-dev liblapack-dev
- python/install-packages:
pkg-manager: pip
# app-dir: ~/project/package-directory/ # If your requirements.txt isn't in the root directory.
# pip-dependency-file: test-requirements.txt # if you have a different name for your requirements file, maybe one that combines your runtime and test requirements.
- run:
name: Run tests
# This assumes pytest is installed via the install-package step above
command: |
pytest --durations=0
pip install primme==3.2.*
pytest --durations=0 renormalizer/mps/tests/test_gs.py::test_multistate
- run:
name: Run examples
command: |
cd example; bash run.sh
cd ..
- run:
name: Build docs
command: cd doc; make html

- persist_to_workspace:
root: doc/
paths: html

# ref: https://circleci.com/blog/deploying-documentation-to-github-pages-with-continuous-integration/
docs-deploy:
docker:
- image: node:8.10.0
steps:
- checkout
- attach_workspace:
at: doc/
- run:
name: Disable jekyll builds
command: touch doc/html/.nojekyll
- run:
name: Install and configure dependencies
command: |
npm install -g --silent [email protected]
git config user.email "[email protected]"
git config user.name "liwt31"
- add_ssh_keys:
fingerprints:
- "e6:24:b4:ab:d5:82:2b:6a:b5:a9:17:71:99:6d:0f:53"
- run:
name: Deploy docs to gh-pages branch
command: gh-pages --dotfiles --message "[skip ci] Updates" --dist doc/html

# Invoke jobs via workflows
# See: https://circleci.com/docs/2.0/configuration-reference/#workflows
workflows:
test: # This is the name of the workflow, feel free to change it to better match your workflow.
# Inside the workflow, you define the jobs you want to run.
jobs:
- build-and-test
- docs-deploy:
requires:
- build-and-test
filters:
branches:
only: master
39 changes: 22 additions & 17 deletions renormalizer/model/h_qc.py
@@ -39,28 +39,33 @@ def read_fcidump(fname, norb):
else:
nuc = integral

nsorb = norb*2
sh, aseri = int_to_h(h, eri)

logger.info(f"nuclear repulsion: {nuc}")

return sh, aseri, nuc


def int_to_h(h, eri):
nsorb = len(h) * 2
seri = np.zeros((nsorb, nsorb, nsorb, nsorb))
sh = np.zeros((nsorb,nsorb))
for p, q, r, s in itertools.product(range(nsorb),repeat=4):
# a_p^\dagger a_q^\dagger a_r a_s
sh = np.zeros((nsorb, nsorb))
for p, q, r, s in itertools.product(range(nsorb), repeat=4):
# a_p^\dagger a_q^\dagger a_r a_s
if p % 2 == s % 2 and q % 2 == r % 2:
seri[p,q,r,s] = eri[p//2,s//2,q//2,r//2]
for q, s in itertools.product(range(nsorb),repeat=2):
seri[p, q, r, s] = eri[p // 2, s // 2, q // 2, r // 2]

for q, s in itertools.product(range(nsorb), repeat=2):
if q % 2 == s % 2:
sh[q,s] = h[q//2,s//2]
sh[q, s] = h[q // 2, s // 2]

aseri = np.zeros((nsorb, nsorb, nsorb, nsorb))
for q, s in itertools.product(range(nsorb),repeat=2):
for q, s in itertools.product(range(nsorb), repeat=2):
for p, r in itertools.product(range(q), range(s)):
#aseri[p,q,r,s] = seri[p,q,r,s] - seri[q,p,r,s]
aseri[p,q,r,s] = seri[p,q,r,s] - seri[p,q,s,r]

logger.info(f"nuclear repulsion: {nuc}")

return sh, aseri, nuc

# aseri[p,q,r,s] = seri[p,q,r,s] - seri[q,p,r,s]
aseri[p, q, r, s] = seri[p, q, r, s] - seri[p, q, s, r]

return sh, aseri

def qc_model(h1e, h2e):
"""
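With this refactor the spatial-to-spin-orbital transformation is available on its own: int_to_h(h, eri) returns only the spin-orbital one-electron matrix and the antisymmetrized two-electron tensor, while read_fcidump keeps reporting the nuclear repulsion. A minimal usage sketch with toy integrals (array shapes are inferred from the indexing above, not taken from the project's documentation):

import numpy as np
from renormalizer.model.h_qc import int_to_h

norb = 2                                   # two spatial orbitals -> four spin orbitals
h = np.random.rand(norb, norb)
h = (h + h.T) / 2                          # symmetric one-electron integrals
eri = np.zeros((norb, norb, norb, norb))   # two-electron integrals over spatial orbitals

sh, aseri = int_to_h(h, eri)               # spin-orbital h and antisymmetrized eri
assert sh.shape == (2 * norb, 2 * norb)
assert aseri.shape == (2 * norb,) * 4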
10 changes: 5 additions & 5 deletions renormalizer/mps/gs.py
@@ -86,11 +86,11 @@ def optimize_mps(mps: Mps, mpo: Mpo, omega: float = None) -> Tuple[List, Mps]:

# ensure that mps is left or right-canonical
# TODO: start from a mix-canonical MPS
if mps.is_left_canon:
mps.ensure_right_canon()
if mps.is_left_canonical:
mps.ensure_right_canonical()
env = "R"
else:
mps.ensure_left_canon()
mps.ensure_left_canonical()
env = "L"

# construct the environment matrix
@@ -135,10 +135,10 @@ def optimize_mps(mps: Mps, mpo: Mpo, omega: float = None) -> Tuple[List, Mps]:
assert res_mps is not None
# remove the redundant basis near the edge
if mps.optimize_config.nroots == 1:
res_mps = res_mps.normalize("mps_only").ensure_left_canon().canonicalise()
res_mps = res_mps.normalize("mps_only").ensure_left_canonical().canonicalise()
logger.info(f"{res_mps}")
else:
res_mps = [mp.normalize("mps_only").ensure_left_canon().canonicalise() for mp in res_mps]
res_mps = [mp.normalize("mps_only").ensure_left_canonical().canonicalise() for mp in res_mps]
logger.info(f"{res_mps[0]}")
return macro_iteration_result, res_mps

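For context, a minimal call sketch of optimize_mps under the renamed API. Here mps and mpo are assumed to be an already-constructed initial-guess Mps and Hamiltonian Mpo (model setup omitted), and the single-root case is assumed so the second return value is a single Mps:

from renormalizer.mps.gs import optimize_mps

# `mps` and `mpo` are assumed to exist already; building the model is out of scope here
macro_iteration_result, res_mps = optimize_mps(mps.copy(), mpo)
print(macro_iteration_result[-1])   # energy of the final macro iteration (nroots == 1)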
22 changes: 12 additions & 10 deletions renormalizer/mps/mp.py
@@ -110,14 +110,19 @@ def bond_dims(self) -> List:
# return a list so that the logging result is more pretty
return bond_dims


vbond_list = vbond_dims = bond_list = bond_dims

@property
def bond_dims_mean(self) -> int:
return int(round(np.mean(self.bond_dims)))

@property
def pbond_list(self):
def pbond_dims(self):
return self.model.pbond_list

pbond_list = pbond_dims

def build_empty_qn(self):
self.qntot = 0
# set qnidx to the right to be consistent with most MPS/MPO setups
@@ -171,20 +176,20 @@ def check_right_canonical(self, rtol=1e-5, atol=1e-8):
return True

@property
def is_left_canon(self):
def is_left_canonical(self):
"""
check the qn center in the L-canonical structure
"""
return self.qnidx == self.site_num - 1

@property
def is_right_canon(self):
def is_right_canonical(self):
"""
check the qn center in the R-canonical structure
"""
return self.qnidx == 0

def ensure_left_canon(self, rtol=1e-5, atol=1e-8):
def ensure_left_canonical(self, rtol=1e-5, atol=1e-8):
if self.to_right or self.qnidx != self.site_num-1 or \
(not self.check_left_canonical(rtol, atol)):
self.move_qnidx(0)
@@ -193,7 +198,7 @@ def ensure_left_canon(self, rtol=1e-5, atol=1e-8):
else:
return self

def ensure_right_canon(self, rtol=1e-5, atol=1e-8):
def ensure_right_canonical(self, rtol=1e-5, atol=1e-8):
if (not self.to_right) or self.qnidx != 0 or \
(not self.check_right_canonical(rtol, atol)):
self.move_qnidx(self.site_num - 1)
@@ -437,7 +442,7 @@ def compress(self, temp_m_trunc=None, ret_s=False):
if not self.is_mpo:
# ensure mps is canonicalised. This is time consuming.
# to disable this, run python as `python -O`
if self.is_left_canon:
if self.is_left_canonical:
assert self.check_left_canonical()
else:
assert self.check_right_canonical()
@@ -513,7 +518,7 @@ def variational_compress(self, mpo=None, guess=None):
# the attributes of guess would be the same as self
guess = compressed_mpo.apply(compressed_mps)
mps = guess
mps.ensure_left_canon()
mps.ensure_left_canonical()
logger.info(f"initial guess bond dims: {mps.bond_dims}")

procedure = mps.compress_config.vprocedure
@@ -1022,9 +1027,6 @@ def total_bytes(self):
def _get_sigmaqn(self, idx):
raise NotImplementedError

def set_threshold(self, val):
self.compress_config.threshold = val

def __eq__(self, other):
for m1, m2 in zip(self, other):
if not allclose(m1, m2):
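The renamed bond_dims/pbond_dims properties keep their old spellings working by binding the same property object to additional class attributes. A standalone sketch of that pattern (an illustrative class, not the actual MatrixProduct):

class MatrixProductSketch:
    def __init__(self, dims):
        self._dims = dims

    @property
    def bond_dims(self):
        # virtual bond dimensions between neighbouring sites
        return list(self._dims)

    # old names stay valid because they refer to the same property object
    vbond_list = vbond_dims = bond_list = bond_dims


mp = MatrixProductSketch([1, 4, 4, 1])
assert mp.bond_dims == mp.bond_list == mp.vbond_dims == [1, 4, 4, 1]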
2 changes: 0 additions & 2 deletions renormalizer/mps/mpo.py
@@ -482,5 +482,3 @@ def from_mp(cls, model, mp):
mpo.append(mt)
mpo.build_empty_qn()
return mpo


6 changes: 3 additions & 3 deletions renormalizer/mps/mps.py
@@ -914,7 +914,7 @@ def mpo_t(t, *args, **kwargs):

# only not canonicalise when force_ovlp=True and to_right=False
if not (self.evolve_config.force_ovlp and not self.to_right):
self.ensure_left_canon()
self.ensure_left_canonical()

# `self` should not be modified during the evolution
if imag_time:
@@ -1119,7 +1119,7 @@ def _evolve_tdvp_mu_cmf(self, mpo, evolve_dt) -> "Mps":
else:
coef = 1j

self.ensure_left_canon()
self.ensure_left_canonical()

# `self` should not be modified during the evolution
# mps: the mps to return
@@ -1683,7 +1683,7 @@ def calc_bond_entropy(self) -> np.ndarray:
# Make sure that the bond entropy is from the left to the right and not
# destroy the original mps
mps = self.copy()
mps.ensure_right_canon()
mps.ensure_right_canonical()
_, s_list = mps.compress(temp_m_trunc=np.inf, ret_s=True)
return np.array([calc_vn_entropy(sigma ** 2) for sigma in s_list])

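calc_bond_entropy right-canonicalizes a copy of the state and converts the singular values at each bond into a von Neumann entropy via calc_vn_entropy(sigma ** 2). A minimal sketch of what that helper is assumed to compute, namely S = -sum_i p_i ln p_i with p_i the squared singular values:

import numpy as np

def calc_vn_entropy_sketch(p: np.ndarray) -> float:
    # Von Neumann entropy from squared singular values p = sigma**2 (assumed semantics)
    p = p[p > 0]          # drop exact zeros to avoid log(0)
    p = p / p.sum()       # renormalize in case of truncation
    return float(-(p * np.log(p)).sum())

sigmas = np.array([0.9, 0.4, 0.1])
print(calc_vn_entropy_sketch(sigmas ** 2))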
2 changes: 1 addition & 1 deletion renormalizer/mps/tda.py
@@ -88,7 +88,7 @@ def kernel(self, restart=False, include_psi0=False):

if not restart:
# make sure that M is not redundant near the edge
mps = self.mps.ensure_right_canon().canonicalise().normalize("mps_and_coeff").canonicalise()
mps = self.mps.ensure_right_canonical().canonicalise().normalize("mps_and_coeff").canonicalise()
logger.debug(f"reference mps shape, {mps}")
mps_r_cano = mps.copy()
assert mps.to_right