Add AiiDA infrastructure #11
Merged
51 commits
18ee6f2  Add universal_simple_to_aiida (GeigerJ2)
df41384  Add universal_qe_to_aiida (GeigerJ2)
ae7a1de  Add CI setup to include AiiDA (GeigerJ2)
ca37531  Changes to quantum_espresso_workflow.py (GeigerJ2)
b7fe9e2  Update .github/workflows/aiida.yml (jan-janssen)
375e15e  Merge branch 'main' into aiida-pythonjob (jan-janssen)
324a408  Merge branch 'main' into aiida-pythonjob (jan-janssen)
a5b5098  Update quantum_espresso_workflow.py (jan-janssen)
e3a22b7  Fix `aiida_to_jobflow_simple` (GeigerJ2)
568c7cc  Fix `aiida_to_pyiron_base_simple` (GeigerJ2)
437a661  Add gitignore (GeigerJ2)
02b60ff  Make AiiDA WG construction for QE WF a helper function. (GeigerJ2)
a6e9045  Add aiida_to_jobflow_qe.ipynb. Still some KeyError occurring. (GeigerJ2)
275e9f2  Merge remote-tracking branch 'upstream/main' into aiida-pythonjob (GeigerJ2)
2ecc1f9  Introduce `workflow_json_filename` variable (GeigerJ2)
cc52f6e  Add `jobflow_to_aiida_simple.ipynb` (GeigerJ2)
c1a30dc  Set WG version to latest commit on main. (GeigerJ2)
707694f  Add `verdi presto` profile setup to `aiida.yml` (GeigerJ2)
d14048b  Fix aiida.yml (GeigerJ2)
1d911ae  Add AiiDA profile creation to pyiron and jobflow CI YAML. (GeigerJ2)
5941621  Update jobflow.yml (jan-janssen)
ae86016  Update pyiron.yml (jan-janssen)
533825e  Fix `nodes` in JSON file created by AiiDA QE WF. Next debug jobflow run. (GeigerJ2)
7b6b3cf  Found bug in WorkGraph. Continue now with task order. (GeigerJ2)
5a8a66c  Tasks are now in same order. Just need to add links. (GeigerJ2)
a55574d  Manual AiiDA WG creation now seems fixed. (GeigerJ2)
9a5585b  Current WG working version (GeigerJ2)
0295262  Adding python files used for being able to enter debugger. Remove onc… (GeigerJ2)
37a6279  Add intermediate JSON files of universal workflow reprs for debugging. (GeigerJ2)
6e33244  Bump WorkGraph to latest and fix a couple of issues (superstar54)
587235f  All AiiDA notebooks should be there. (GeigerJ2)
6be97b8  Set aiida-workgraph version in `environment.yaml` (GeigerJ2)
c144821  Fix load_profile (GeigerJ2)
9c05728  Fix typos in CI YAML files (GeigerJ2)
a1d6e4b  Clean up quantum_espresso_workflow.py (GeigerJ2)
2c250a8  Add upterm to connect to Runner to debug AiiDA (GeigerJ2)
df5177c  Merge branch 'main' into aiida-pythonjob (jan-janssen)
0d31b77  Merge branch 'main' into aiida-pythonjob (jan-janssen)
f4443b4  Update .github/workflows/aiida.yml (jan-janssen)
d8bf13d  Update .github/workflows/aiida.yml (jan-janssen)
34eb703  Fix load_profile and remove upterm (GeigerJ2)
e82e7e1  Add pyiron_base_to_aiida_simple.ipynb (GeigerJ2)
af7ea44  Start cleaning up repo. (GeigerJ2)
e21cdf0  Only need to fix pyiron_base_to_aiida_qe.ipynb (GeigerJ2)
76cd86a  Adapt `workflow_json_filename`s (GeigerJ2)
701efb7  Latest file versions. Should we use nbstripout? (GeigerJ2)
f0d3faf  Fix universal_qe_to_aiida.ipynb (GeigerJ2)
abfba34  Comment out last cell of pyiron_base_to_aiida_qe.ipynb (GeigerJ2)
56b98a1  Uncomment last cell so that CI doesn't seem successful even though t… (GeigerJ2)
a5faaaa  Update pyiron_base.py (jan-janssen)
09844b3  Update pyiron_base.py (jan-janssen)
New CI workflow file (36 additions):

```yaml
name: pyiron

on:
  push:
    branches: [ main ]
  pull_request:

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v4
    - uses: conda-incubator/setup-miniconda@v3
      with:
        auto-update-conda: true
        python-version: "3.12"
        environment-file: environment.yml
        auto-activate-base: false
    - name: Tests
      shell: bash -l {0}
      run: |
        pip install -e adis_tools
        pip install -e python_workflow_definition
        conda install -c conda-forge jupyter papermill
        export ESPRESSO_PSEUDO=$(pwd)/espresso/pseudo
        papermill universal_simple_to_aiida.ipynb universal_simple_to_aiida_out.ipynb -k "python3"
        papermill universal_qe_to_aiida.ipynb universal_qe_to_aiida_out.ipynb -k "python3"
        papermill aiida_to_pyiron_base_simple.ipynb aiida_to_pyiron_base_simple_out.ipynb -k "python3"
        papermill aiida_to_pyiron_base_qe.ipynb aiida_to_pyiron_base_qe_out.ipynb -k "python3"
        papermill aiida_to_jobflow_simple.ipynb aiida_to_jobflow_simple_out.ipynb -k "python3"
        papermill aiida_to_jobflow_qe.ipynb aiida_to_jobflow_qe_out.ipynb -k "python3"
```
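Every papermill invocation in this CI step follows the same `<name>.ipynb` → `<name>_out.ipynb` naming convention. A small sketch of that convention as a helper (`papermill_command` is a hypothetical function for illustration, not part of the repository):

```python
# Hypothetical helper: derive the papermill commands used in the CI step
# from the notebook names, following the "<name>_out.ipynb" convention.
notebooks = [
    "universal_simple_to_aiida.ipynb",
    "universal_qe_to_aiida.ipynb",
    "aiida_to_pyiron_base_simple.ipynb",
    "aiida_to_pyiron_base_qe.ipynb",
    "aiida_to_jobflow_simple.ipynb",
    "aiida_to_jobflow_qe.ipynb",
]


def papermill_command(nb: str) -> str:
    # Insert "_out" before the extension for the executed copy.
    stem = nb.removesuffix(".ipynb")
    return f'papermill {nb} {stem}_out.ipynb -k "python3"'


for nb in notebooks:
    print(papermill_command(nb))
```

Keeping the output name derived from the input name makes it easy to spot mismatched pairs in the workflow file.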
python_workflow_definition/src/python_workflow_definition/aiida.py (133 additions):
```python
from importlib import import_module
import traceback
import json

from aiida_workgraph import WorkGraph, task


def pickle_node(value):
    """Handle data nodes."""
    return value


def get_item(data: dict, key: str):
    """Get an item from the outputs."""
    return data[key]


# A function mapping could be defined here, but in principle it is not needed,
# because the function can be imported from its module path:
# func_mapping = {
#     "simple_workflow.add_x_and_y": task.pythonjob()(add_x_and_y),
#     "simple_workflow.add_x_and_y_and_z": task.pythonjob()(add_x_and_y_and_z),
#     "quantum_espresso_workflow.get_bulk_structure": task.pythonjob()(
#         get_bulk_structure
#     ),
#     "quantum_espresso_workflow.calculate_qe": task.pythonjob()(calculate_qe),
#     "quantum_espresso_workflow.plot_energy_volume_curve": task.pythonjob()(
#         plot_energy_volume_curve
#     ),
#     "python_workflow_definition.pyiron_base.get_dict": task.pythonjob()(get_dict),
#     "python_workflow_definition.pyiron_base.get_list": task.pythonjob()(get_list),
#     "quantum_espresso_workflow.generate_structures": task.pythonjob()(
#         generate_structures
#     ),
# }


def write_workflow_json(wg):
    wgdata = wg.to_dict()
    data = {"nodes": {}, "edges": []}
    node_name_mapping = {}
    i = 0
    for name, node in wgdata["tasks"].items():
        node_name_mapping[name] = i
        callable_name = node["executor"]["callable_name"]
        data["nodes"][i] = callable_name
        if callable_name == "pickle_node":
            data["nodes"][i] = node["inputs"]["value"]["property"]["value"].value
        i += 1

    for link in wgdata["links"]:
        if wgdata["tasks"][link["from_node"]]["executor"]["callable_name"] == "pickle_node":
            link["from_socket"] = None
        link["from_node"] = node_name_mapping[link["from_node"]]
        link["to_node"] = node_name_mapping[link["to_node"]]
        data["edges"].append(link)

    return data


def load_workflow_json(filename):
    with open(filename) as f:
        data = json.load(f)

    wg = WorkGraph()
    task_name_mapping = {}
    name_counter = {}

    # add tasks
    for name, identifier in data["nodes"].items():
        if isinstance(identifier, str) and "." in identifier:
            p, m = identifier.rsplit(".", 1)
            mod = import_module(p)
            _func = getattr(mod, m)
            func = task.pythonjob()(_func)
            # Use register_pickle_by_value because the function is defined
            # in a local file.
            try:
                wg.add_task(func, register_pickle_by_value=True, name=m)
            except ValueError:
                # Duplicate task name: append a counter to make it unique.
                if m in name_counter:
                    name_counter[m] += 1
                else:
                    name_counter[m] = 0
                name_ = f"{m}_{name_counter[m]}"
                wg.add_task(func, register_pickle_by_value=True, name=name_)

            # Remove the default "result" output, because the outputs are
            # added later from the data in the links.
            del wg.tasks[-1].outputs["result"]
        else:
            # data task
            wg.add_task(pickle_node, value=identifier, name=name)

        task_name_mapping[name] = wg.tasks[-1].name

    # add links
    for link in data["edges"]:
        if link["sourceHandle"] is None:
            link["sourceHandle"] = "result"
        try:
            from_task = wg.tasks[task_name_mapping[str(link["source"])]]
            # The outputs are not defined explicitly during the pythonjob
            # creation, so add the output socket here, assuming it exists.
            if link["sourceHandle"] not in from_task.outputs:
                from_socket = from_task.add_output(
                    "workgraph.any",
                    name=link["sourceHandle"],
                    metadata={"is_function_output": True},
                )
            else:
                from_socket = from_task.outputs[link["sourceHandle"]]
            to_task = wg.tasks[task_name_mapping[str(link["target"])]]
            # If the input does not exist, the data is passed via kwargs;
            # in this case, add the input socket.
            if link["targetHandle"] not in to_task.inputs:
                to_socket = to_task.add_input(
                    "workgraph.any",
                    name=link["targetHandle"],
                    metadata={"is_function_input": True},
                )
            else:
                to_socket = to_task.inputs[link["targetHandle"]]
            wg.add_link(from_socket, to_socket)
        except Exception as e:
            traceback.print_exc()
            print("Failed to link", link, "with error:", e)
    return wg


def get_list(**kwargs):
    return list(kwargs.values())


def get_dict(**kwargs):
    return {k: v for k, v in kwargs.items()}
```
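The JSON consumed by `load_workflow_json` is a flat node/edge structure: `nodes` maps node ids to either a dotted import path (a task node) or a plain value (a data node handled by `pickle_node`), and `edges` wires output sockets to input sockets. A minimal toy sketch of that structure (the node ids, values, and the `simple_workflow.add_x_and_y` path are illustrative; only the key names follow the code above):

```python
import json

# Toy example of the workflow JSON that load_workflow_json expects:
# task nodes are dotted import paths, data nodes are plain values.
workflow = {
    "nodes": {
        "0": "simple_workflow.add_x_and_y",  # task node (import path)
        "1": 1,                              # data node (pickled value)
        "2": 2,                              # data node
    },
    "edges": [
        {"source": "1", "sourceHandle": None, "target": "0", "targetHandle": "x"},
        {"source": "2", "sourceHandle": None, "target": "0", "targetHandle": "y"},
    ],
}

# Round-trip through JSON, then separate task nodes from data nodes the same
# way load_workflow_json does: a string containing "." is treated as a task.
restored = json.loads(json.dumps(workflow))
task_ids = [k for k, v in restored["nodes"].items()
            if isinstance(v, str) and "." in v]
print(task_ids)  # → ['0']
```

A `sourceHandle` of `None` is replaced with the default `"result"` socket by the loader, which is why data nodes can leave it unset.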
New notebook (106 additions):

```json
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "\n",
    "from python_workflow_definition.aiida import load_workflow_json"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "outputs [{'identifier': 'workgraph.any', 'name': 'result'}]\n",
      "outputs [{'identifier': 'workgraph.any', 'name': 'result'}]\n",
      "outputs [{'identifier': 'workgraph.any', 'name': 'result'}]\n",
      "outputs [{'identifier': 'workgraph.any', 'name': 'result'}]\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs [{'identifier': 'workgraph.any', 'name': 'result'}]\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs None\n",
      "outputs [{'identifier': 'workgraph.any', 'name': 'result'}]\n"
     ]
    }
   ],
   "source": [
    "workgraph = load_workflow_json(filename='workflow_qe.json')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "81c4d0a6e21042c7b82ccca0fdef201f",
       "version_major": 2,
       "version_minor": 1
      },
      "text/plain": [
       "NodeGraphWidget(settings={'minimap': True}, style={'width': '90%', 'height': '600px'}, value={'name': 'WorkGra…"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "\n",
    "# TODO: Create inputs rather than tasks out of data nodes\n",
    "workgraph"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "ADIS",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
```
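The notebook above loads a QE workflow in which the same function can appear many times as separate tasks; `load_workflow_json` handles the resulting name collisions with its `name_counter` fallback, appending `_0`, `_1`, … when `add_task` rejects a duplicate name. That logic can be sketched in isolation (a standalone illustration, not the WorkGraph API):

```python
def unique_names(requested):
    """Assign a unique name per request, appending _0, _1, ... on collision,
    mirroring the name_counter fallback in load_workflow_json."""
    taken = set()
    counter = {}
    result = []
    for name in requested:
        if name not in taken:
            final = name
        else:
            # Collision: bump the per-name counter and suffix it.
            counter[name] = counter.get(name, -1) + 1
            final = f"{name}_{counter[name]}"
        taken.add(final)
        result.append(final)
    return result


print(unique_names(["calculate_qe", "calculate_qe", "calculate_qe"]))
# → ['calculate_qe', 'calculate_qe_0', 'calculate_qe_1']
```

This keeps task names stable and human-readable while guaranteeing uniqueness within one WorkGraph.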