Merge pull request #49 from legend-exp/dev
Support YAML files
gipert authored Dec 30, 2023
2 parents 703abea + 82e1d02 commit a0de1dc
Showing 32 changed files with 441 additions and 410 deletions.
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
Original file line number Diff line number Diff line change
@@ -37,7 +37,7 @@ repos:
rev: "v4.0.0-alpha.8"
hooks:
- id: prettier
-      types_or: [yaml, markdown, html, css, scss, javascript, json]
+      types_or: [yaml, markdown, json]
args: [--prose-wrap=always]

- repo: https://github.com/astral-sh/ruff-pre-commit
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -1,7 +1,7 @@
Welcome to pylegendmeta's documentation!
========================================

-*pylegendmeta* is a Python package to access arbitrary JSON file databases,
+*pylegendmeta* is a Python package to access arbitrary text file databases,
specialized to the legend-metadata_ repository, which stores `LEGEND metadata
<https://legend-exp.github.io/legend-data-format-specs/dev/metadata>`_.

33 changes: 19 additions & 14 deletions docs/source/tutorial.rst
@@ -17,9 +17,14 @@ temporary (i.e. not preserved across system reboots) directory.
it or, alternatively, as an argument to the :class:`~.core.LegendMetadata`
constructor. Recommended if a custom legend-metadata_ is needed.

-:class:`~.core.LegendMetadata` is a :class:`~.jsondb.JsonDB` object, which
-implements an interface to a database of JSON files arbitrary scattered in a
-filesystem. ``JsonDB`` does not assume any directory structure or file naming.
+:class:`~.core.LegendMetadata` is a :class:`~.textdb.TextDB` object, which
+implements an interface to a database of text files arbitrarily scattered in a
+filesystem. ``TextDB`` does not assume any directory structure or file naming.

.. note::

Currently supported file formats are `JSON <https://json.org>`_ and `YAML
<https://yaml.org>`_.
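The extension-based dispatch this note implies can be sketched as follows. This is a hypothetical illustration of what a loader like the ``utils.load_dict`` helper appearing later in this diff could do, not the package's actual code:

```python
import json
from pathlib import Path


def load_dict(fname):
    """Load a JSON or YAML file into a dict, dispatching on file extension.

    Hypothetical sketch; the real loader lives in ``legendmeta.utils``.
    """
    fname = Path(fname)
    if fname.suffix == ".json":
        with fname.open() as f:
            return json.load(f)
    if fname.suffix in (".yaml", ".yml"):
        import yaml  # PyYAML, the new dependency added to pyproject.toml below

        with fname.open() as f:
            return yaml.safe_load(f)
    msg = f"unsupported file format: {fname.suffix}"
    raise NotImplementedError(msg)
```

JSON needs only the standard library; YAML support is what pulls in the new ``pyyaml`` dependency.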

Access
------
@@ -32,7 +37,7 @@ Let's consider the following database:
├── dir1
│   └── file1.json
├── file2.json
-├── file3.json
+├── file3.yaml
└── validity.jsonl
With:
@@ -45,10 +50,10 @@ With:
"value": 1
}
-and similarly ``file2.json`` and ``file3.json``.
+and similarly ``file2.json`` and ``file3.yaml``.

-``JsonDB`` treats directories, files and JSON keys at the same semantic level.
-Internally, the database is represented as a :class:`dict`, and can be
+``TextDB`` treats directories, files and JSON/YAML keys at the same semantic
+level. Internally, the database is represented as a :class:`dict`, and can be
therefore accessed with the same syntax:

>>> lmeta["dir1"] # a dict
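The ``AttrsDict`` objects the package returns additionally expose keys as attributes (as in ``lmeta.hardware.configuration`` further down). The core idea can be sketched as follows — a minimal illustration, not the real implementation:

```python
class AttrsDict(dict):
    """A dict whose string keys are also reachable as attributes (sketch)."""

    def __getattr__(self, name):
        try:
            value = self[name]
        except KeyError as err:
            raise AttributeError(name) from err
        # wrap nested mappings so chained attribute access keeps working
        return AttrsDict(value) if isinstance(value, dict) else value


db = AttrsDict({"dir1": {"file1": {"value": 1}}})
assert db.dir1.file1.value == db["dir1"]["file1"]["value"] == 1
```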
@@ -77,8 +82,8 @@ Metadata validity
Mappings of metadata to time periods, data taking systems etc. are specified
through JSONL files (`specification
<https://legend-exp.github.io/legend-data-format-specs/dev/metadata>`_).
-If a ``.jsonl`` file is present in a directory, ``JsonDB``
-exposes the :meth:`~.jsondb.JsonDB.on` interface to perform a query.
+If a ``.jsonl`` file is present in a directory, ``TextDB``
+exposes the :meth:`~.textdb.TextDB.on` interface to perform a query.

Let's assume the ``legend-metadata`` directory from the example above contains
the following file:
@@ -88,7 +93,7 @@ the following file:
:caption: ``validity.jsonl``
{"valid_from": "20220628T000000Z", "select": "all", "apply": ["file2.json"]}
-{"valid_from": "20220629T000000Z", "select": "all", "apply": ["file3.json"]}
+{"valid_from": "20220629T000000Z", "select": "all", "apply": ["file3.yaml"]}
From code, it's possible to obtain the metadata valid for a certain time point:

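A minimal sketch of such a time-validity lookup follows. This is not the package's actual ``TextDB.on`` implementation (the real JSONL specification also supports modes for amending file lists); it only illustrates picking the entry valid at a given time:

```python
from datetime import datetime


def resolve_validity(entries, on, system="all"):
    """Return the file list valid at time ``on`` (illustrative sketch only)."""
    applied = []
    for entry in sorted(entries, key=lambda e: e["valid_from"]):
        # timestamps look like "20220628T000000Z"; drop the trailing "Z"
        valid_from = datetime.strptime(entry["valid_from"].rstrip("Z"), "%Y%m%dT%H%M%S")
        if entry["select"] in ("all", system) and valid_from <= on:
            applied = entry["apply"]  # in this sketch, later entries supersede earlier ones
    return applied
```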
@@ -119,7 +124,7 @@ channel map:
Remapping and grouping metadata
-------------------------------

-A second important method of ``JsonDB`` is :meth:`.JsonDB.map`, which allows to
+A second important method of ``TextDB`` is :meth:`.TextDB.map`, which allows one to
query ``(key, value)`` dictionaries with an alternative unique key defined in
``value``. A typical application is querying parameters in a channel map
corresponding to a certain DAQ channel:
@@ -135,10 +140,10 @@ corresponding to a certain DAQ channel:
...

If the requested key is not unique, an exception will be raised.
-:meth:`.JsonDB.map` can, however, handle non-unique keys too and return a
+:meth:`.TextDB.map` can, however, handle non-unique keys too and return a
dictionary of matching entries instead, keyed by an arbitrary integer to allow
-further :meth:`.JsonDB.map` calls. The behavior is achieved by using
-:meth:`.JsonDB.group` or by setting the ``unique`` argument flag. A typical
+further :meth:`.TextDB.map` calls. The behavior is achieved by using
+:meth:`.TextDB.group` or by setting the ``unique`` argument flag. A typical
application is retrieving all channels attached to the same CC4:

>>> chmap = lmeta.hardware.configuration.channelmaps.on(datetime.now())
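The map/group semantics described in the tutorial above can be sketched roughly as follows. ``map_by`` is a hypothetical stand-in for the real method, and unlike the real method it keys group-mode results by the shared value rather than by an arbitrary integer:

```python
def map_by(entries, key, unique=True):
    """Re-key a {name: attrs} mapping by the value found at dotted ``key``.

    Illustrative sketch of map/group semantics, not the real TextDB.map.
    """
    out = {}
    for name, attrs in entries.items():
        value = attrs
        for part in key.split("."):  # walk a dotted path such as "daq.rawid"
            value = value[part]
        out.setdefault(value, {})[name] = attrs
    if not unique:
        return out  # group mode: each key maps to the dict of all matches
    if any(len(group) > 1 for group in out.values()):
        raise RuntimeError(f"values of '{key}' are not unique")
    return {k: next(iter(group.values())) for k, group in out.items()}
```

With a toy channel map, ``map_by(chmap, "daq.rawid")`` re-keys by DAQ channel, while ``map_by(chmap, "cc4", unique=False)`` collects all channels sharing a CC4.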
1 change: 1 addition & 0 deletions pyproject.toml
@@ -34,6 +34,7 @@ requires-python = ">=3.9"
dependencies = [
"GitPython",
"pandas",
+    "pyyaml",
"psycopg2-binary",
"sqlalchemy>=2",
]
11 changes: 6 additions & 5 deletions src/legendmeta/__init__.py
@@ -16,17 +16,18 @@
"""A package to access `legend-metadata <https://github.com/legend-exp/legend-metadata>`_ in Python."""
from __future__ import annotations

-from legendmeta._version import version as __version__
-from legendmeta.catalog import to_datetime
-from legendmeta.core import LegendMetadata
-from legendmeta.jsondb import AttrsDict, JsonDB
-from legendmeta.slowcontrol import LegendSlowControlDB
+from ._version import version as __version__
+from .catalog import to_datetime
+from .core import LegendMetadata
+from .slowcontrol import LegendSlowControlDB
+from .textdb import AttrsDict, JsonDB, TextDB

__all__ = [
"__version__",
"LegendMetadata",
"LegendSlowControlDB",
"JsonDB",
+    "TextDB",
"AttrsDict",
"to_datetime",
]
29 changes: 15 additions & 14 deletions src/legendmeta/catalog.py
@@ -24,6 +24,8 @@
from pathlib import Path
from string import Template

+from . import utils


def to_datetime(value):
"""Convert a LEGEND timestamp (or key) to :class:`datetime.datetime`."""
@@ -134,24 +136,23 @@ def read_from(sources, subst_pathvar=False, trim_null=False):
    def read_impl(sources):
        if isinstance(sources, str):
            file_name = sources
-            with Path(file_name).open() as file:
-                result = json.load(file)
-            if subst_pathvar:
-                Props.subst_vars(
-                    result,
-                    var_values={"_": Path(file_name).parent},
-                    ignore_missing=True,
-                )
-            return result
-
-        elif isinstance(sources, list):
+            result = utils.load_dict(file_name)
+            if subst_pathvar:
+                Props.subst_vars(
+                    result,
+                    var_values={"_": Path(file_name).parent},
+                    ignore_missing=True,
+                )
+            return result
+
+        if isinstance(sources, list):
            result = {}
            for p in map(read_impl, sources):
                Props.add_to(result, p)
            return result
-        else:
-            msg = f"Can't run Props.read_from on sources-value of type {type(sources)}"
-            raise ValueError(msg)
+
+        msg = f"Can't run Props.read_from on sources-value of type {type(sources)}"
+        raise ValueError(msg)

result = read_impl(sources)
if trim_null:
8 changes: 4 additions & 4 deletions src/legendmeta/core.py
@@ -24,12 +24,12 @@

from git import GitCommandError, InvalidGitRepositoryError, Repo

-from .jsondb import AttrsDict, JsonDB
+from .textdb import AttrsDict, TextDB

log = logging.getLogger(__name__)


-class LegendMetadata(JsonDB):
+class LegendMetadata(TextDB):
"""LEGEND metadata.
Class representing the LEGEND metadata repository with utilities for fast
@@ -42,7 +42,7 @@ class LegendMetadata(JsonDB):
git-clone through SSH. If ``None``, legend-metadata will be cloned
in a temporary directory (see :func:`tempfile.gettempdir`).
**kwargs
-        further keyword arguments forwarded to :math:`JsonDB.__init__`.
+        further keyword arguments forwarded to :meth:`TextDB.__init__`.
"""

def __init__(self, path: str | None = None, **kwargs) -> None:
@@ -121,7 +121,7 @@ def channelmap(self, on: str | datetime | None = None) -> AttrsDict:
See Also
--------
-        .jsondb.JsonDB.on
+        .textdb.TextDB.on
"""
if on is None:
on = datetime.now()
56 changes: 27 additions & 29 deletions src/legendmeta/police.py
@@ -22,7 +22,8 @@
from importlib import resources
from pathlib import Path

-from .jsondb import JsonDB
+from . import utils
+from .textdb import TextDB

templates = resources.files("legendmeta") / "templates"

@@ -36,41 +37,39 @@ def validate_legend_detector_db() -> bool:
prog="validate-legend-detdb", description="Validate LEGEND detector database"
)

-    parser.add_argument("files", nargs="+", help="JSON files")
+    parser.add_argument("files", nargs="+", help="files")

args = parser.parse_args()

dict_temp = {}
for typ in ("bege", "ppc", "coax", "icpc"):
-        with (templates / f"{typ}-detector.json").open() as f:
-            dict_temp[typ] = json.load(f)
+        dict_temp[typ] = utils.load_dict(templates / f"{typ}-detector.yaml")

for file in args.files:
valid = True

-        with Path(file).open() as f:
-            entry = json.load(f)
+        entry = utils.load_dict(file)

-            if "type" not in entry:
-                print(  # noqa: T201
-                    f"ERROR: '{file}' entry does not contain 'type' key"
-                )
-                valid *= False
-                continue
-
-            if entry["type"] not in dict_temp:
-                print(  # noqa: T201
-                    f"WARNING: '{file}': no template for type '{entry['type']}' detector"
-                )
-                continue
-
-            valid *= validate_dict_schema(
-                entry,
-                dict_temp[entry["type"]],
-                greedy=False,
-                typecheck=True,
-                root_obj=file,
-            )
+        if "type" not in entry:
+            print(  # noqa: T201
+                f"ERROR: '{file}' entry does not contain 'type' key"
+            )
+            valid *= False
+            continue
+
+        if entry["type"] not in dict_temp:
+            print(  # noqa: T201
+                f"WARNING: '{file}': no template for type '{entry['type']}' detector"
+            )
+            continue
+
+        valid *= validate_dict_schema(
+            entry,
+            dict_temp[entry["type"]],
+            greedy=False,
+            typecheck=True,
+            root_obj=file,
+        )

if not valid:
sys.exit(1)
@@ -85,23 +84,22 @@ def validate_legend_channel_map() -> bool:
prog="validate-legend-chmaps", description="Validate LEGEND channel map files"
)

-    parser.add_argument("files", nargs="+", help="JSON channel maps files")
+    parser.add_argument("files", nargs="+", help="channel maps files")

args = parser.parse_args()

dict_temp = {}
for typ in ("geds", "spms"):
-        with (templates / f"{typ}-channel.json").open() as f:
-            dict_temp[typ] = json.load(f)
+        dict_temp[typ] = utils.load_dict(templates / f"{typ}-channel.yaml")

for d in {Path(f).parent for f in args.files}:
-        db = TextDB(d)
-        db = JsonDB(d)
+        db = TextDB(d)
valid = True

with Path(f"{d}/validity.jsonl").open() as f:
for line in f.readlines():
ts = json.loads(line)["valid_from"]
-                sy = json.loads(line)["category"]
+                sy = json.loads(line)["select"]
chmap = db.on(ts, system=sy)

for k, v in chmap.items():
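Both validators delegate to ``validate_dict_schema``, which recursively compares an entry against a template dict. A rough, hypothetical sketch of such a check follows; the real function also honors the ``greedy`` flag, which is not modeled here:

```python
def validate_dict_schema(entry, template, typecheck=True, root_obj=""):
    """Check that ``entry`` has the template's keys (and value types).

    Hedged sketch of the kind of check performed in legendmeta.police;
    the actual signature and semantics may differ.
    """
    valid = True
    for key, tval in template.items():
        if key not in entry:
            print(f"ERROR: '{root_obj}': missing key '{key}'")
            valid = False
        elif isinstance(tval, dict):
            # recurse into nested sections such as "production" or "geometry"
            valid &= validate_dict_schema(entry[key], tval, typecheck, root_obj)
        elif typecheck and not isinstance(entry[key], type(tval)):
            print(f"ERROR: '{root_obj}': wrong type for key '{key}'")
            valid = False
    return valid
```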
2 changes: 1 addition & 1 deletion src/legendmeta/scdb_tables.py
@@ -23,7 +23,7 @@
import sqlalchemy as db
from sqlalchemy.orm import DeclarativeBase, Mapped

-from .jsondb import AttrsDict
+from .textdb import AttrsDict


class Base(DeclarativeBase):
40 changes: 0 additions & 40 deletions src/legendmeta/templates/bege-detector.json

This file was deleted.

30 changes: 30 additions & 0 deletions src/legendmeta/templates/bege-detector.yaml
@@ -0,0 +1,30 @@
name: ^B\w{6}$
type: bege
production:
  manufacturer: ""
  order: 0
  crystal: ""
  slice: ""
  enrichment: 0.0
  passivation: true
  reprocessing: false
  mass_in_g: 0.0
  delivered:
geometry:
  height_in_mm: 0.0
  radius_in_mm: 0.0
  groove:
    depth_in_mm: 0.0
    radius_in_mm:
      outer: 0.0
      inner: 0.0
  pp_contact:
    radius_in_mm: 0.0
    depth_in_mm: 0.0
  taper:
    top:
      angle_in_deg: 0.0
      height_in_mm: 0.0
    bottom:
      angle_in_deg: 0.0
      height_in_mm: 0.0
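The template's ``name`` field is a regular expression that concrete detector names are presumably matched against during validation; for example (the detector names below are illustrative only):

```python
import re

# BEGe names per the template: a literal "B" followed by six word characters
pattern = re.compile(r"^B\w{6}$")

assert pattern.match("B00000A")          # hypothetical BEGe-style name: matches
assert pattern.match("V02160A") is None  # a differently prefixed name: no match
```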