
Commit 6f60afa: Resolve merge conflicts

camposandro committed Apr 11, 2024
2 parents: 25ed8a2 + 524f793

Showing 8 changed files with 13 additions and 15 deletions.
.gitattributes (2 changes: 1 addition, 1 deletion)

@@ -1,7 +1,7 @@
 # For explanation of this file and uses see
 # https://git-scm.com/docs/gitattributes
 # https://developer.lsst.io/git/git-lfs.html#using-git-lfs-enabled-repositories
-# https://lincc-ppt.readthedocs.io/en/latest/practices/git-lfs.html
+# https://lincc-ppt.readthedocs.io/en/stable/practices/git-lfs.html
 #
 # Used by https://github.com/lsst/afwdata.git
 # *.boost filter=lfs diff=lfs merge=lfs -text
.github/workflows/pre-commit-ci.yml (2 changes: 1 addition, 1 deletion)

@@ -1,6 +1,6 @@
 # This workflow runs pre-commit hooks on pushes and pull requests to main
 # to enforce coding style. To ensure correct configuration, please refer to:
-# https://lincc-ppt.readthedocs.io/en/latest/practices/ci_precommit.html
+# https://lincc-ppt.readthedocs.io/en/stable/practices/ci_precommit.html
 name: Run pre-commit hooks

 on:
.github/workflows/smoke-test.yml (2 changes: 1 addition, 1 deletion)

@@ -1,7 +1,7 @@
 # This workflow will run daily at 06:45.
 # It will install Python dependencies and run tests with a variety of Python versions.
 # See documentation for help debugging smoke test issues:
-# https://lincc-ppt.readthedocs.io/en/latest/practices/ci_testing.html#version-culprit
+# https://lincc-ppt.readthedocs.io/en/stable/practices/ci_testing.html#version-culprit

 name: Unit test smoke test

README.md (10 changes: 5 additions, 5 deletions)

@@ -3,7 +3,7 @@

 # LSDB

-[![Template](https://img.shields.io/badge/Template-LINCC%20Frameworks%20Python%20Project%20Template-brightgreen)](https://lincc-ppt.readthedocs.io/en/latest/)
+[![Template](https://img.shields.io/badge/Template-LINCC%20Frameworks%20Python%20Project%20Template-brightgreen)](https://lincc-ppt.readthedocs.io/en/stable/)

 [![PyPI](https://img.shields.io/pypi/v/lsdb?color=blue&logo=pypi&logoColor=white)](https://pypi.org/project/lsdb/)
 [![Conda](https://img.shields.io/conda/vn/conda-forge/lsdb.svg?color=blue&logo=condaforge&logoColor=white)](https://anaconda.org/conda-forge/lsdb)
@@ -19,21 +19,21 @@ A framework to facilitate and enable spatial analysis for extremely large astron
 (i.e. querying and crossmatching O(1B) sources). This package uses dask to parallelize operations across
 multiple HiPSCat partitioned surveys.

-Check out our [ReadTheDocs site](https://lsdb.readthedocs.io/en/latest/)
+Check out our [ReadTheDocs site](https://lsdb.readthedocs.io/en/stable/)
 for more information on partitioning, installation, and contributing.

 See related projects:

 * HiPSCat ([on GitHub](https://github.com/astronomy-commons/hipscat))
-  ([on ReadTheDocs](https://hipscat.readthedocs.io/en/latest/))
+  ([on ReadTheDocs](https://hipscat.readthedocs.io/en/stable/))
 * HiPSCat Import ([on GitHub](https://github.com/astronomy-commons/hipscat-import))
-  ([on ReadTheDocs](https://hipscat-import.readthedocs.io/en/latest/))
+  ([on ReadTheDocs](https://hipscat-import.readthedocs.io/en/stable/))

 ## Contributing

 [![GitHub issue custom search in repo](https://img.shields.io/github/issues-search/astronomy-commons/lsdb?color=purple&label=Good%20first%20issues&query=is%3Aopen%20label%3A%22good%20first%20issue%22)](https://github.com/astronomy-commons/lsdb/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22)

-See the [contribution guide](https://lsdb.readthedocs.io/en/latest/developer/contributing.html)
+See the [contribution guide](https://lsdb.readthedocs.io/en/stable/developer/contributing.html)
 for complete installation instructions and contribution best practices.

 ## Acknowledgements
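The README excerpt above describes LSDB's core workflow: loading HiPSCat-partitioned catalogs and crossmatching them lazily with Dask. Below is a minimal sketch of that workflow, assuming the lsdb.read_hipscat and Catalog.crossmatch entry points from the LSDB documentation; the catalog paths are placeholders, and crossmatch tuning parameters are omitted because they vary by version.

import lsdb

# Lazily load two HiPSCat-partitioned catalogs (paths are placeholders).
gaia = lsdb.read_hipscat("path/to/gaia_catalog")
ztf = lsdb.read_hipscat("path/to/ztf_catalog")

# Plan a spatial crossmatch; Dask distributes the per-partition work and
# nothing is materialized until compute() is called.
matched = gaia.crossmatch(ztf)
result = matched.compute()  # concrete pandas DataFrame of matched rows
print(len(result))
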
docs/developer/contributing.rst (4 changes: 2 additions, 2 deletions)

@@ -31,8 +31,8 @@ Notes:
 2) ``pre-commit install`` will initialize pre-commit for this local repository, so
    that a set of tests will be run prior to completing a local commit. For more
    information, see the Python Project Template documentation on
-   `pre-commit <https://lincc-ppt.readthedocs.io/en/latest/practices/precommit.html>`_.
+   `pre-commit <https://lincc-ppt.readthedocs.io/en/stable/practices/precommit.html>`_.
 3) Installing ``pandoc`` allows you to verify that automatic rendering of Jupyter notebooks
    into documentation for ReadTheDocs works as expected. For more information, see
    the Python Project Template documentation on
-   `Sphinx and Python Notebooks <https://lincc-ppt.readthedocs.io/en/latest/practices/sphinx.html#python-notebooks>`_.
+   `Sphinx and Python Notebooks <https://lincc-ppt.readthedocs.io/en/stable/practices/sphinx.html#python-notebooks>`_.
docs/index.rst (2 changes: 1 addition, 1 deletion)

@@ -10,7 +10,7 @@ LSDB is a framework that facilitates and enables fast spatial analysis for extre
 particular those brought up by `LSST <https://www.lsst.org/about>`_.

 Built on top of Dask to efficiently scale and parallelize operations across multiple workers, it leverages
-the `HiPSCat <https://hipscat.readthedocs.io/en/latest/>`_ data format for surveys in a partitioned HEALPix
+the `HiPSCat <https://hipscat.readthedocs.io/en/stable/>`_ data format for surveys in a partitioned HEALPix
 (Hierarchical Equal Area isoLatitude Pixelization) structure.

 .. figure:: _static/gaia.png
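The docs/index.rst excerpt above mentions the partitioned HEALPix structure underlying HiPSCat. As a brief illustration (not part of this diff), here is how a sky position maps to a HEALPix pixel index with the healpy package; the order and coordinates are arbitrary example values.

import healpy as hp

order = 5            # example partitioning order
nside = 2 ** order   # HEALPix resolution parameter for that order

# Map an (RA, Dec) position in degrees to its nested-scheme pixel index.
ra, dec = 282.5, -58.3
pixel = hp.ang2pix(nside, ra, dec, nest=True, lonlat=True)
print(f"order {order}, pixel {pixel}")
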
tests/lsdb/catalog/test_polygon_search.py (2 changes: 1 addition, 1 deletion)

@@ -64,7 +64,7 @@ def test_polygon_search_empty(small_sky_order1_catalog):
     vertices = [(0, 0), (1, 1), (0, 2)]
     polygon_search_catalog = small_sky_order1_catalog.polygon_search(vertices)
     assert len(polygon_search_catalog.get_healpix_pixels()) == 0
-    assert len(polygon_search_catalog.hc_structure.pixel_tree) == 1
+    assert len(polygon_search_catalog.hc_structure.pixel_tree) == 0


 def test_polygon_search_coarse_versus_fine(small_sky_order1_catalog):
tests/lsdb/loaders/dataframe/test_from_dataframe.py (4 changes: 1 addition, 3 deletions)

@@ -6,7 +6,6 @@
 from hipscat.catalog import CatalogType
 from hipscat.pixel_math.healpix_pixel_function import get_pixel_argsort
 from hipscat.pixel_math.hipscat_id import HIPSCAT_ID_COLUMN
-from hipscat.pixel_tree.pixel_node_type import PixelNodeType

 import lsdb
 from lsdb.catalog.margin_catalog import MarginCatalog
@@ -99,8 +98,7 @@ def test_partitions_on_map_match_pixel_tree(small_sky_order1_df, small_sky_order
     kwargs = get_catalog_kwargs(small_sky_order1_catalog)
     catalog = lsdb.from_dataframe(small_sky_order1_df, margin_threshold=None, **kwargs)
     for hp_pixel, _ in catalog._ddf_pixel_map.items():
-        if hp_pixel in catalog.hc_structure.pixel_tree:
-            assert catalog.hc_structure.pixel_tree[hp_pixel].node_type == PixelNodeType.LEAF
+        assert hp_pixel in catalog.hc_structure.pixel_tree


 def test_from_dataframe_with_non_default_ra_dec_columns(small_sky_order1_df, small_sky_order1_catalog):
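The two test changes above track an upstream change in the hipscat pixel tree: instead of inspecting PixelNodeType on individual nodes, code now checks pixel membership directly with a plain `in` test. A rough sketch of that idiom, assuming lsdb.from_dataframe accepts the catalog_name/catalog_type keywords shown in the LSDB documentation; the input table is a placeholder and exact signatures may differ by version.

import pandas as pd
import lsdb

# Hypothetical table of positions in degrees.
df = pd.DataFrame({"ra": [10.0, 20.0, 30.0], "dec": [-30.0, -40.0, -50.0]})

# Build an in-memory catalog from the dataframe (no margin cache).
catalog = lsdb.from_dataframe(
    df,
    catalog_name="demo",
    catalog_type="object",
    margin_threshold=None,
)

# Every partition in the Dask pixel map should also appear in the HiPSCat pixel tree.
for hp_pixel in catalog._ddf_pixel_map:
    assert hp_pixel in catalog.hc_structure.pixel_tree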
