
Fix unstructured logging variable formatting #2029

Merged
48 commits, merged Sep 18, 2024

Commits
9d7d4c5
first implementation
aysim319 Aug 20, 2024
2c6623c
found missing and making msg consistent
aysim319 Aug 20, 2024
b10f0e5
suggested changes
aysim319 Aug 22, 2024
a2339ae
suggested change and making params more consistent
aysim319 Aug 22, 2024
114feb0
more suggested change
aysim319 Aug 22, 2024
43eff03
test and fix tests
aysim319 Aug 22, 2024
0c60b41
lint
aysim319 Aug 22, 2024
a53c2f1
lint
aysim319 Aug 27, 2024
39f26a2
Update doctor_visits/delphi_doctor_visits/run.py
aysim319 Aug 27, 2024
7262d29
suggested change
aysim319 Aug 29, 2024
8033bd6
suggested change
aysim319 Aug 29, 2024
43e1985
suggested change
aysim319 Aug 29, 2024
68a56d8
suggested change
aysim319 Aug 29, 2024
49af0b8
suggested changes
aysim319 Aug 29, 2024
539f0de
fixing order
aysim319 Aug 30, 2024
3bb80a9
lint
aysim319 Aug 30, 2024
1ae613c
make change to test
aysim319 Aug 30, 2024
4d7a6d0
rename argument
aysim319 Aug 30, 2024
bd89a30
Update _delphi_utils_python/README.md
aysim319 Aug 30, 2024
d085954
Update _delphi_utils_python/README.md
aysim319 Aug 30, 2024
b35a91e
Update _delphi_utils_python/delphi_utils/flash_eval/eval_day.py
aysim319 Aug 30, 2024
3ddd35d
Update _delphi_utils_python/delphi_utils/runner.py
aysim319 Aug 30, 2024
18f40af
suggested changes
aysim319 Sep 3, 2024
29c927b
more suggeseted changes
aysim319 Sep 3, 2024
fc2fa46
lint
aysim319 Sep 4, 2024
d0ef8aa
lint
aysim319 Sep 6, 2024
dd9e5fc
suggested change
aysim319 Sep 6, 2024
5a27f33
Update _delphi_utils_python/README.md
aysim319 Sep 6, 2024
6db7d61
suggesed changes
aysim319 Sep 6, 2024
5e04164
Merge remote-tracking branch 'origin/2025-fix-unstruc-logging' into 2…
aysim319 Sep 6, 2024
89e7245
lint
aysim319 Sep 6, 2024
e050a51
suggested changes
aysim319 Sep 6, 2024
37122f4
suggested changes
aysim319 Sep 9, 2024
e8631f8
more changes
aysim319 Sep 10, 2024
7a31f8e
more changes
aysim319 Sep 10, 2024
79b1ede
lint
aysim319 Sep 10, 2024
f8220a5
lint
aysim319 Sep 10, 2024
3d44e67
lint conflict
aysim319 Sep 10, 2024
a3d0aab
lint
aysim319 Sep 10, 2024
0b8d527
lint
aysim319 Sep 10, 2024
10df87a
lint
aysim319 Sep 10, 2024
9947332
suggested changes
aysim319 Sep 11, 2024
dcdbed8
lint
aysim319 Sep 11, 2024
b0c31ae
suggested changes
aysim319 Sep 17, 2024
0fb33fe
Delete fluview/tests/test_data/flu_metadata.json
melange396 Sep 18, 2024
bc6a1dc
Delete fluview/delphi_fluview/pull.py
melange396 Sep 18, 2024
54d183f
Delete fluview/tests/conftest.py
melange396 Sep 18, 2024
7c7f15a
Update runner.py
melange396 Sep 18, 2024
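Every diff below applies the same pattern: interpolated log messages (f-strings, %-formatting, .format()) become a constant, capitalized message plus keyword-argument fields. A minimal sketch of the pattern, assuming a structlog-style logger such as the structured logger delphi_utils provides (illustrative setup, not code from this PR):

import structlog

logger = structlog.get_logger()
geo = "state"

# Before: the variable is baked into the message, so every log line is
# unique and the geo value can only be found by substring search.
logger.info(f"starting {geo}, weekday adj")

# After: a constant message plus a structured field, so log tooling can
# group identical events and filter on geo exactly.
logger.info("Starting weekday adj", geo=geo)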
34 changes: 18 additions & 16 deletions changehc/delphi_changehc/run.py
@@ -25,7 +25,7 @@ def retrieve_files(params, filedate, logger):
if files["denom"] is None:

## download recent files from FTP server
logger.info("downloading recent files through SFTP")
logger.info("Downloading recent files through SFTP")
download_counts(filedate, params["indicator"]["input_cache_dir"], params["indicator"]["ftp_conn"])

denom_file = "%s/%s_Counts_Products_Denom.dat.gz" % (params["indicator"]["input_cache_dir"],filedate)
@@ -157,28 +157,30 @@ def run_module(params: Dict[str, Dict[str, Any]]):

startdate, enddate = process_dates(params, startdate_dt, enddate_dt)

logger.info("generating signal and exporting to CSV",
first_sensor_date = startdate,
last_sensor_date = enddate,
drop_date = dropdate,
n_backfill_days = n_backfill_days,
n_waiting_days = n_waiting_days,
geos = params["indicator"]["geos"],
export_dir = params["common"]["export_dir"],
parallel = params["indicator"]["parallel"],
weekday = params["indicator"]["weekday"],
types = params["indicator"]["types"],
se = params["indicator"]["se"])
logger.info(
"Generating signal and exporting to CSV",
first_sensor_date=startdate,
last_sensor_date=enddate,
drop_date=dropdate,
n_backfill_days=n_backfill_days,
n_waiting_days=n_waiting_days,
geos=params["indicator"]["geos"],
export_dir=params["common"]["export_dir"],
parallel=params["indicator"]["parallel"],
weekday=params["indicator"]["weekday"],
types=params["indicator"]["types"],
se=params["indicator"]["se"],
)

## start generating
stats = []
for geo in params["indicator"]["geos"]:
for numtype in params["indicator"]["types"]:
for weekday in params["indicator"]["weekday"]:
if weekday:
logger.info("starting weekday adj", geo = geo, numtype = numtype)
logger.info("Starting weekday adj", geo=geo, numtype=numtype)
else:
logger.info("starting no adj", geo = geo, numtype = numtype)
logger.info("Starting no adj", geo=geo, numtype=numtype)
su_inst = CHCSensorUpdater(
startdate,
enddate,
@@ -211,7 +213,7 @@ def run_module(params: Dict[str, Dict[str, Any]]):
)
stats.extend(more_stats)

logger.info("finished processing", geo = geo)
logger.info("Finished processing", geo=geo)

elapsed_time_in_seconds = round(time.time() - start_time, 2)
min_max_date = stats and min(s[0] for s in stats)
2 changes: 1 addition & 1 deletion changehc/delphi_changehc/update_sensor.py
@@ -41,7 +41,7 @@ def write_to_csv(df, geo_level, write_se, day_shift, out_name, logger, output_pa
assert df[suspicious_se_mask].empty, " se contains suspiciously large values"
assert not df["se"].isna().any(), " se contains nan values"
if write_se:
logger.info("========= WARNING: WRITING SEs TO {0} =========".format(out_name))
logger.info("WARNING: WRITING SEs", filename=out_name)
else:
df["se"] = np.nan

@@ -57,7 +57,7 @@ def download(ftp_credentials, out_path, logger):
"""Pull the latest raw files."""
current_time = datetime.datetime.now()
seconds_in_day = 24 * 60 * 60
logger.info("starting download", time=current_time)
logger.info("Starting download")

# open client
client = paramiko.SSHClient()
2 changes: 1 addition & 1 deletion claims_hosp/delphi_claims_hosp/modify_claims_drops.py
@@ -57,5 +57,5 @@ def modify_and_write(data_path, logger, test_mode=False):
dfs_list.append(dfs)
else:
dfs.to_csv(out_path, index=False)
logger.info(f"Wrote {out_path}")
logger.info("Wrote modified csv", filename=out_path)
return files, dfs_list
10 changes: 5 additions & 5 deletions claims_hosp/delphi_claims_hosp/run.py
@@ -116,9 +116,9 @@ def run_module(params):
for geo in params["indicator"]["geos"]:
for weekday in params["indicator"]["weekday"]:
if weekday:
logger.info("starting weekday adj", geo = geo)
logger.info("Starting weekday adj", geo=geo)
else:
logger.info("starting no weekday adj", geo = geo)
logger.info("Starting no weekday adj", geo=geo)

signal_name = Config.signal_weekday_name if weekday else Config.signal_name
if params["indicator"]["write_se"]:
@@ -135,16 +135,16 @@
params["indicator"]["parallel"],
weekday,
params["indicator"]["write_se"],
signal_name
signal_name,
logger,
)
updater.update_indicator(
claims_file,
params["common"]["export_dir"],
logger,
)
max_dates.append(updater.output_dates[-1])
n_csv_export.append(len(updater.output_dates))
logger.info("finished updating", geo = geo)
logger.info("Finished updating", geo=geo)

# Remove all the raw files
for fn in os.listdir(params["indicator"]["input_dir"]):
26 changes: 12 additions & 14 deletions claims_hosp/delphi_claims_hosp/update_indicator.py
@@ -7,7 +7,6 @@
"""

# standard packages
import logging
from multiprocessing import Pool, cpu_count

# third party
@@ -28,8 +27,7 @@ class ClaimsHospIndicatorUpdater:
# pylint: disable=too-many-instance-attributes, too-many-arguments
# all variables are used

def __init__(self, startdate, enddate, dropdate, geo, parallel, weekday,
write_se, signal_name):
def __init__(self, startdate, enddate, dropdate, geo, parallel, weekday, write_se, signal_name, logger):
"""
Initialize updater for the claims-based hospitalization indicator.

@@ -53,6 +51,7 @@ def __init__(self, startdate, enddate, dropdate, geo, parallel, weekday,
# init in shift_dates, declared here for pylint
self.burnindate, self.fit_dates, self.burn_in_dates, self.output_dates = \
[None] * 4
self.logger = logger

assert (
self.startdate > (Config.FIRST_DATA_DATE + Config.BURN_IN_PERIOD)
@@ -114,9 +113,9 @@ def geo_reindex(self, data):
elif self.geo == "hrr":
data_frame = data # data is already adjusted in aggregation step above
else:
logging.error(
"%s is invalid, pick one of 'county', 'state', 'msa', 'hrr', 'hhs', nation'",
self.geo)
self.logger.error(
"geo is invalid, pick one of 'county', 'state', 'msa', 'hrr', 'hhs', nation'", geo=self.geo
)
return False

unique_geo_ids = pd.unique(data_frame[self.geo])
@@ -133,7 +132,7 @@
data_frame.fillna(0, inplace=True)
return data_frame

def update_indicator(self, input_filepath, outpath, logger):
def update_indicator(self, input_filepath, outpath):
"""
Generate and output indicator values.

@@ -159,7 +158,7 @@
["num"],
Config.DATE_COL,
[1, 1e5],
logger,
self.logger,
)
if self.weekday
else None
@@ -182,7 +181,7 @@
valid_inds[geo_id] = np.array(res.loc[final_output_inds, "incl"])
else:
n_cpu = min(Config.MAX_CPU_POOL, cpu_count())
logging.debug("starting pool with %d workers", n_cpu)
self.logger.debug("Starting pool", n_workers=n_cpu)
with Pool(n_cpu) as pool:
pool_results = []
for geo_id, sub_data in data_frame.groupby(level=0, as_index=False):
@@ -217,7 +216,7 @@
}

self.write_to_csv(output_dict, outpath)
logging.debug("wrote files to %s", outpath)
self.logger.debug("Wrote files", directory=outpath)

def write_to_csv(self, output_dict, output_path="./receiving"):
"""
@@ -229,8 +228,7 @@

"""
if self.write_se:
logging.info("========= WARNING: WRITING SEs TO %s =========",
self.signal_name)
self.logger.info("WARNING: WRITING SEs", signal=self.signal_name)

geo_level = output_dict["geo_level"]
dates = output_dict["dates"]
Expand All @@ -255,7 +253,7 @@ def write_to_csv(self, output_dict, output_path="./receiving"):
assert not np.isnan(val), "value for included value is nan"
assert not np.isnan(se), "se for included rate is nan"
if val > 90:
logging.warning("value suspicious, %s: %d", geo_id, val)
self.logger.warning("value suspicious", geo=geo_id, value=val)
assert se < 5, f"se suspicious, {geo_id}: {se}"
if self.write_se:
assert val > 0 and se > 0, "p=0, std_err=0 invalid"
Expand All @@ -267,4 +265,4 @@ def write_to_csv(self, output_dict, output_path="./receiving"):
"%s,%f,%s,%s,%s\n" % (geo_id, val, "NA", "NA", "NA"))
out_n += 1

logging.debug("wrote %d rows for %d %s", out_n, len(geo_ids), geo_level)
self.logger.debug("Wrote rows", num_rows=out_n, num_geo_ids=len(geo_ids), geo_level=geo_level)
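The update_indicator.py changes above also switch from the module-level logging functions to a logger injected through the constructor. A rough sketch of that shape, with illustrative names rather than the full ClaimsHospIndicatorUpdater class:

import structlog

class IndicatorUpdater:
    """Illustrative stand-in for ClaimsHospIndicatorUpdater, not the real class."""

    def __init__(self, signal_name, write_se, logger):
        self.signal_name = signal_name
        self.write_se = write_se
        # The logger is injected once at construction; methods no longer
        # reach for the module-level `logging` functions, so callers and
        # tests control where log output goes.
        self.logger = logger

    def write_to_csv(self, output_path="./receiving"):
        if self.write_se:
            self.logger.info("WARNING: WRITING SEs", signal=self.signal_name)
        self.logger.debug("Wrote files", directory=output_path)

updater = IndicatorUpdater("example_signal", True, structlog.get_logger())
updater.write_to_csv()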
@@ -59,7 +59,7 @@ def download(ftp_credentials, out_path, logger, issue_date=None):
else:
current_time = datetime.datetime.strptime(issue_date, "%Y-%m-%d").replace(hour=23, minute=59, second=59)

logger.info("starting download", time=current_time)
logger.info("Starting download")
seconds_in_day = 24 * 60 * 60

# open client
2 changes: 1 addition & 1 deletion doctor_visits/delphi_doctor_visits/modify_claims_drops.py
@@ -48,5 +48,5 @@ def modify_and_write(f, logger, test_mode=False):

if not test_mode:
dfs.to_csv(out_path, index=False)
logger.info(f"Wrote {out_path}")
logger.info("Wrote modified csv", filename=out_path)
return dfs
11 changes: 7 additions & 4 deletions doctor_visits/delphi_doctor_visits/patch.py
@@ -45,16 +45,19 @@ def patch():
start_issue = datetime.strptime(params["patch"]["start_issue"], "%Y-%m-%d")
end_issue = datetime.strptime(params["patch"]["end_issue"], "%Y-%m-%d")

logger.info(f"""Start patching {params["patch"]["patch_dir"]}""")
logger.info(f"""Start issue: {start_issue.strftime("%Y-%m-%d")}""")
logger.info(f"""End issue: {end_issue.strftime("%Y-%m-%d")}""")
logger.info(
"Starting patching",
patch_directory=params["patch"]["patch_dir"],
start_issue=start_issue.strftime("%Y-%m-%d"),
end_issue=end_issue.strftime("%Y-%m-%d"),
)

makedirs(params["patch"]["patch_dir"], exist_ok=True)

current_issue = start_issue

while current_issue <= end_issue:
logger.info(f"""Running issue {current_issue.strftime("%Y-%m-%d")}""")
logger.info("Running issue", issue_date=current_issue.strftime("%Y-%m-%d"))

params["patch"]["current_issue"] = current_issue.strftime("%Y-%m-%d")

35 changes: 18 additions & 17 deletions doctor_visits/delphi_doctor_visits/run.py
@@ -88,32 +88,33 @@ def run_module(params, logger=None): # pylint: disable=too-many-statements
startdate_dt = enddate_dt - timedelta(days=n_backfill_days)
enddate = str(enddate_dt.date())
startdate = str(startdate_dt.date())
logger.info("drop date:\t\t%s", dropdate)
logger.info("first sensor date:\t%s", startdate)
logger.info("last sensor date:\t%s", enddate)
logger.info("n_backfill_days:\t%s", n_backfill_days)
logger.info("n_waiting_days:\t%s", n_waiting_days)

logger.info(
"Using params",
startdate=startdate,
enddate=enddate,
dropdate=dropdate,
n_backfill_days=n_backfill_days,
n_waiting_days=n_waiting_days,
outpath=export_dir,
parallel=params["indicator"]["parallel"],
weekday=params["indicator"]["weekday"],
write_se=se,
prefix=prefix,
)

## geographies
geos = ["state", "msa", "hrr", "county", "hhs", "nation"]


## print out other vars
logger.info("outpath:\t\t%s", export_dir)
logger.info("parallel:\t\t%s", params["indicator"]["parallel"])
logger.info("weekday:\t\t%s", params["indicator"]["weekday"])
logger.info("write se:\t\t%s", se)
logger.info("obfuscated prefix:\t%s", prefix)

max_dates = []
n_csv_export = []
## start generating
for geo in geos:
for weekday in params["indicator"]["weekday"]:
if weekday:
logger.info("starting %s, weekday adj", geo)
logger.info("Starting with weekday adj", geo=geo)
else:
logger.info("starting %s, no adj", geo)
logger.info("Starting with no adj", geo=geo)
sensor = update_sensor(
filepath=claims_file,
startdate=startdate,
Expand All @@ -137,8 +138,8 @@ def run_module(params, logger=None): # pylint: disable=too-many-statements
write_to_csv(sensor, geo, se, out_name, logger, export_dir)
max_dates.append(sensor.date.max())
n_csv_export.append(sensor.date.unique().shape[0])
logger.debug(f"wrote files to {export_dir}")
logger.info("finished updating", geo = geo)
logger.debug("Wrote files", directory=export_dir)
logger.info("Finished updating", geo=geo)

# Remove all the raw files
for fn in os.listdir(params["indicator"]["input_dir"]):
2 changes: 1 addition & 1 deletion doctor_visits/delphi_doctor_visits/update_sensor.py
@@ -34,7 +34,7 @@ def write_to_csv(output_df: pd.DataFrame, geo_level, se, out_name, logger, outpu
output_path: outfile path to write the csv (default is current directory)
"""
if se:
logger.info(f"========= WARNING: WRITING SEs TO {out_name} =========")
logger.info("WARNING: WRITING SEs", filename=out_name)

out_n = 0
for d in set(output_df["date"]):
2 changes: 1 addition & 1 deletion google_symptoms/delphi_google_symptoms/date_utils.py
@@ -98,7 +98,7 @@ def generate_num_export_days(params: Dict, logger) -> [int]:
expected_date_diff += global_max_expected_lag

if latest_date_diff > expected_date_diff:
logger.info(f"Missing dates from: {to_datetime(min(gs_metadata.max_time)).date()}")
logger.info("Missing date", date=to_datetime(min(gs_metadata.max_time)).date())

num_export_days = expected_date_diff

11 changes: 7 additions & 4 deletions google_symptoms/delphi_google_symptoms/patch.py
@@ -58,16 +58,19 @@ def patch(params):
issue_date = datetime.strptime(params["patch"]["start_issue"], "%Y-%m-%d")
end_issue = datetime.strptime(params["patch"]["end_issue"], "%Y-%m-%d")

logger.info(f"""Start patching {params["patch"]["patch_dir"]}""")
logger.info(f"""Start issue: {issue_date.strftime("%Y-%m-%d")}""")
logger.info(f"""End issue: {end_issue.strftime("%Y-%m-%d")}""")
logger.info(
"Starting patching",
patch_directory=params["patch"]["patch_dir"],
start_issue=issue_date.strftime("%Y-%m-%d"),
end_issue=end_issue.strftime("%Y-%m-%d"),
)

makedirs(params["patch"]["patch_dir"], exist_ok=True)

patch_dates = generate_patch_dates(params)

while issue_date <= end_issue:
logger.info(f"""Running issue {issue_date.strftime("%Y-%m-%d")}""")
logger.info("Running issue", issue_date=issue_date.strftime("%Y-%m-%d"))

# Output dir setup
current_issue_yyyymmdd = issue_date.strftime("%Y%m%d")
7 changes: 2 additions & 5 deletions google_symptoms/delphi_google_symptoms/run.py
@@ -80,10 +80,7 @@ def run_module(params, logger=None):
if len(df_pull) == 0:
continue
for metric, smoother in product(COMBINED_METRIC, SMOOTHERS):
logger.info("generating signal and exporting to CSV",
geo_res=geo_res,
metric=metric,
smoother=smoother)
logger.info("Generating signal and exporting to CSV", geo_res=geo_res, metric=metric, smoother=smoother)
df = df_pull
df["val"] = df[metric].astype(float)
df["val"] = df[["geo_id", "val"]].groupby(
Expand All @@ -96,7 +93,7 @@ def run_module(params, logger=None):
df = df.reset_index()
sensor_name = "_".join([smoother, "search"])
if len(df) == 0:
logger.info("No data for %s_%s_%s", geo_res, metric.lower(), sensor_name)
logger.info("No data for signal", signal=f"{geo_res}_{metric.lower()}_{sensor_name}")
Contributor:
Match the above?

Suggested change
logger.info("No data for signal", signal=f"{geo_res}_{metric.lower()}_{sensor_name}")
logger.info("No data for signal", geo_res=geo_res, metric=metric.lower(), sensor_name=sensor_name)

Contributor Author:
I wasn't sure, since the original log wanted to capture it as a signal, not as separate metric, sensor, and geo fields.

Contributor:
Yeah, these old pipelines like to mix geo resolution, metric, and sensor name into a single name and call it "signal", though that doesn't match our definition of signal anywhere else. Seems better to me to separate these out, since then you can do granular search in the logs without doing substring search.

Contributor:
Two lines below this is a call to create_export_csv(), which combines metric and sensor in the output file name, in what will ultimately be the signal name. How about:

Suggested change
logger.info("No data for signal", signal=f"{geo_res}_{metric.lower()}_{sensor_name}")
logger.info("No data for signal", geo_type=geo_res, signal=f"{metric}_{sensor_name}")

Should we use this opportunity to add better logging to create_export_csv()? @aysim319, you have something planned to change doctor visits and hospital admissions to use this method, right? Perhaps we can add the logging then.

Contributor Author:
Yeah, I think it's better to revisit this once all the indicators are using create_export_csv(), because hospital-admissions doesn't quite use the metric and sensor combination, if I recall.

continue
exported_csv_dates = create_export_csv(
df,
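The thread above captures the naming tradeoff: one concatenated signal field versus separate fields. A small editorial illustration of why separate fields query better (assumed structlog-style logger and example values; not code from this PR):

import structlog

logger = structlog.get_logger()

# Combined: the dimensions are recoverable only by substring-matching
# the rendered line.
logger.info("No data for signal", signal="county_anosmia_smoothed_search")

# Separated: each dimension is its own key, so a log processor can
# filter on, say, geo_res == "county" exactly.
logger.info("No data for signal", geo_res="county", metric="anosmia",
            sensor_name="smoothed_search")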
2 changes: 1 addition & 1 deletion nssp/delphi_nssp/run.py
@@ -90,7 +90,7 @@ def run_module(params):
for geo in GEOS:
df = df_pull.copy()
df["val"] = df[signal]
logger.info("Generating signal and exporting to CSV", metric=signal)
logger.info("Generating signal and exporting to CSV", geo=geo, signal=signal)
if geo == "nation":
df = df[df["geography"] == "United States"]
df["geo_id"] = "us"