Issue/540/docs updates #542

Open
wants to merge 5 commits into base: main
10 changes: 5 additions & 5 deletions docs/tutorials/pre_executed/des-gaia.ipynb
@@ -9,7 +9,7 @@
"source": [
"# Cross-matching of large catalogs: DES to Gaia\n",
"\n",
"In this tutorial we cross-match nain tables of Dark Energy Survey (DES) DR2 and Gaia DR3 catalogs. The outline of the tutorial is as follows:\n",
"In this tutorial we cross-match tables of Dark Energy Survey (DES) DR2 and Gaia DR3 catalogs. The outline of the tutorial is as follows:\n",
"\n",
"1. Get original data files\n",
"2. Convert the data to HATS format using [hats-import](https://github.com/astronomy-commons/hats-import/)\n",
@@ -135,11 +135,11 @@
"source": [
"### DES DR2\n",
"\n",
"The Dark Eenrgy Survey DR2 catalog is hosted by NCSA, see the [official website](https://des.ncsa.illinois.edu/releases/dr2) for more information. Data files, in [FITS](https://fits.gsfc.nasa.gov) format, are located at <https://desdr-server.ncsa.illinois.edu/despublic/dr2_tiles/>. You may also prefer to get the data with S3 client as described by <https://desdr-server.ncsa.illinois.edu>\n",
"The Dark Energy Survey DR2 catalog is hosted by NCSA, see the [official website](https://des.ncsa.illinois.edu/releases/dr2) for more information. Data files, in [FITS](https://fits.gsfc.nasa.gov) format, are located at <https://desdr-server.ncsa.illinois.edu/despublic/dr2_tiles/>. You may also prefer to get the data with S3 client as described by <https://desdr-server.ncsa.illinois.edu>.\n",
"\n",
"We use `*dr2_main.fits` files for the main catalog table, see [the schema here](https://des.ncsa.illinois.edu/releases/dr2/dr2-products/dr2-schema).\n",
"We use `*dr2_main.fits` files for the main catalog table; see [the schema here](https://des.ncsa.illinois.edu/releases/dr2/dr2-products/dr2-schema).\n",
"\n",
"Here we download a few first files to demonstrate the pipeline. The full catalog is about 1.1TB, feel free to download it if you have enough storage."
"Here we download a few first files to demonstrate the pipeline. The full catalog is about 1.1TB--feel free to download it if you have enough storage."
]
},
{
@@ -183,7 +183,7 @@
"\n",
"We use `gaia_source` table, see its [schema here](https://gea.esac.esa.int/archive/documentation/GDR3/Gaia_archive/chap_datamodel/sec_dm_main_source_catalogue/ssec_dm_gaia_source.html).\n",
"\n",
"Here we donwload a few files which barely correspond to the same area of the sky as the DES DR2 files above. The full catalog is much larger, feel free to download it all if you have enough storage."
"Here we download a few files which barely correspond to the same area of the sky as the DES DR2 files above. The full catalog is much larger, feel free to download it all if you have enough storage."
]
},
{
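For readers following the tutorial text in this diff, the download step it describes can be sketched in a few lines of standard-library Python. The sketch below is an illustration rather than the notebook's own code: the tile names and the per-tile directory layout are placeholders to be checked against the server's index, and the Gaia files mentioned in the next hunk can be fetched in the same way from the Gaia archive.

```python
# Minimal sketch (not the notebook's code): fetch a couple of DES DR2
# main-table FITS files with the standard library. The tile names and the
# per-tile directory layout are assumptions; check the index at
# https://desdr-server.ncsa.illinois.edu/despublic/dr2_tiles/ for real paths.
import pathlib
import urllib.request

BASE_URL = "https://desdr-server.ncsa.illinois.edu/despublic/dr2_tiles"
TILES = ["DES0000+0209", "DES0001+0209"]  # placeholder tile names

out_dir = pathlib.Path("des_dr2")
out_dir.mkdir(exist_ok=True)

for tile in TILES:
    filename = f"{tile}_dr2_main.fits"
    url = f"{BASE_URL}/{tile}/{filename}"
    target = out_dir / filename
    if not target.exists():
        print(f"downloading {url}")
        urllib.request.urlretrieve(url, str(target))
```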
10 changes: 1 addition & 9 deletions docs/tutorials/pre_executed/ztf-alerts-sne.ipynb
@@ -1085,7 +1085,7 @@
"source": [
"### Extract features and filter\n",
"\n",
"Here we use [`light-curve`](https://github.com/light-curve/light-curve-python) package to fit each r-band light-curve with Bazin function (Bazin+2009).\n",
"Here we use the [light-curve](https://github.com/light-curve/light-curve-python) package to fit each r-band light-curve with Bazin function (Bazin+2009).\n",
"\n",
"<div>\n",
"<img src=\"attachment:8cf4f6f1-b501-4ff8-970c-a1645c8009ce.png\" width=\"250\"/>\n",
@@ -2155,14 +2155,6 @@
" plot_lc(lc, nondet, title=oid)\n",
" print(f\"https://alerce.online/object/{oid}\")"
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "ce797b850c4c2dd9",
- "metadata": {},
- "outputs": [],
- "source": []
}
],
"metadata": {
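The Bazin function referenced in the hunk above (Bazin et al. 2009) has the form f(t) = A exp(-(t - t0)/tau_fall) / (1 + exp(-(t - t0)/tau_rise)) + B. The notebook fits it with the Rust-backed light-curve package; the sketch below instead uses plain SciPy on synthetic data, purely to illustrate the parametric form, and is not the notebook's code.

```python
# Sketch: fit a single r-band light curve with the Bazin (2009) parametric
# form using scipy.optimize.curve_fit on synthetic data (illustration only;
# the notebook itself uses the light-curve package's fitter).
import numpy as np
from scipy.optimize import curve_fit

def bazin(t, a, b, t0, tau_rise, tau_fall):
    """A * exp(-(t - t0)/tau_fall) / (1 + exp(-(t - t0)/tau_rise)) + B."""
    return a * np.exp(-(t - t0) / tau_fall) / (1.0 + np.exp(-(t - t0) / tau_rise)) + b

# Synthetic light curve (time in days, arbitrary flux units) for demonstration.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 100.0, 50)
true_params = (10.0, 1.0, 30.0, 5.0, 20.0)
flux = bazin(t, *true_params) + rng.normal(scale=0.3, size=t.size)

# Rough initial guess: amplitude, baseline, peak epoch, rise/fall timescales.
p0 = (flux.max(), flux.min(), t[np.argmax(flux)], 5.0, 20.0)
params, cov = curve_fit(bazin, t, flux, p0=p0, maxfev=10_000)
print(dict(zip(["A", "B", "t0", "tau_rise", "tau_fall"], params)))
```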
12 changes: 12 additions & 0 deletions src/lsdb/catalog/catalog.py
@@ -123,6 +123,18 @@ def assign(self, **kwargs) -> Catalog:

Returns:
The catalog containing both the old columns and the newly created columns

+ Examples:
+ Create a new column using a function::

+ catalog = Catalog(...)
+ catalog = catalog.assign(new_col=lambda df: df['existing_col'] * 2)

+ Add a column from a 1-D Dask array:

[Review comment from a Contributor]

Suggested change:
- Add a column from a 1-D Dask array:
+ Add a column from a 1-D Dask array::

Needed here as well.


+ import dask.array as da
+ new_data = da.arange(...)
+ catalog = catalog.assign(new_col=new_data)
"""
ddf = self._ddf.assign(**kwargs)
return Catalog(ddf, self._ddf_pixel_map, self.hc_structure)
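For reference, the new docstring examples can be exercised end to end roughly as below. This is a sketch, not part of the PR: it assumes lsdb.from_dataframe accepts a small pandas DataFrame with default "ra"/"dec" column names (check the LSDB docs for the exact signature), and the magnitude columns are invented for illustration.

```python
# Sketch of using Catalog.assign as described in the new docstring examples.
# Assumes lsdb.from_dataframe works on a small pandas DataFrame whose sky
# coordinates are in columns named "ra" and "dec" (library defaults).
import pandas as pd
import lsdb

df = pd.DataFrame(
    {
        "ra": [10.0, 20.0, 30.0],
        "dec": [-45.0, -50.0, -55.0],
        "mag_g": [18.2, 19.1, 20.3],  # hypothetical columns for illustration
        "mag_r": [17.9, 18.8, 20.0],
    }
)
catalog = lsdb.from_dataframe(df)

# New column computed from existing columns, mirroring the lambda example;
# the Dask-array variant shown in the docstring works analogously.
catalog = catalog.assign(g_minus_r=lambda d: d["mag_g"] - d["mag_r"])
print(catalog.compute())
```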