Hi @hteumeuleu, thanks for maintaining this great resource. I'm trying to build on top of this dataset and want to make sure that I'm understanding the nuances.
Some pretty common feature types like css-box-size were last tested in 2019, others like font-kerning in 2022, and yet others far more recently, like css-border-spacing last December. My initial theory was that once a feature reaches some cutoff of client support, you stop testing it to save on resources and assume clients aren't going to introduce regressions. But when I run the breakdowns, I'm not sure I see this correlation. Is there something I'm missing?
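For context, here's roughly the breakdown I'm running (a rough sketch; the `data` / `stats` / `last_test_date` field names are my reading of the published data.json and may be slightly off):

```python
import json
from urllib.request import urlopen

# Rough sketch of the breakdown: for each feature, compare last_test_date
# against the share of "y" (supported) results across all clients.
# The "data" / "stats" / "last_test_date" field names are assumptions
# based on https://www.caniemail.com/api/data.json as published.
with urlopen("https://www.caniemail.com/api/data.json") as response:
    dataset = json.load(response)

rows = []
for feature in dataset["data"]:
    # Flatten family -> platform -> {version or date: support code}.
    codes = [
        code
        for platforms in feature.get("stats", {}).values()
        for versions in platforms.values()
        for code in versions.values()
    ]
    if not codes:
        continue
    supported = sum(1 for code in codes if str(code).startswith("y"))
    rows.append((feature["slug"], feature.get("last_test_date"), supported / len(codes)))

# Sort by last test date to eyeball whether well-supported features stop being retested.
for slug, last_test, ratio in sorted(rows, key=lambda row: str(row[1])):
    print(f"{last_test}  {ratio:.0%}  {slug}")
```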
And on a related note: is there a way to retrieve a mapping of (mail client, version number, feature) to test date? Looking at all of the combined stats, it seems that at some point you transitioned from version numbers to test dates themselves. For legacy versions, do you happen to have an index of the most recent time each particular version number was tested? Would the git history be the best place to check?
Hey there. One important thing to know is that all tests are run and updated manually. Usually, when we introduce a new page (for an HTML or CSS feature) on the site, we set the last_test_date for that page. After that, we update it whenever we happen to run new tests for that feature in one or more specific email clients. But I admit this hasn't been done consistently since launch, which is why some pages still have September 2019 as their last_test_date even though they have been updated for one or more clients since then. Checking the git history would indeed be a great way to know when a feature page was last updated.
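For anyone who wants to script that lookup, a minimal sketch (assuming the feature pages live as one file per slug in the repository, e.g. under `_features/`; the path and example slug below are guesses, so adjust them to the actual layout):

```python
import subprocess

# Minimal sketch: ask git for the date of the last commit that touched a
# feature's page. The _features/<slug>.md path is an assumption about the
# repository layout; adjust it to wherever the feature pages actually live.
def last_touched(slug, repo_path="."):
    result = subprocess.run(
        ["git", "log", "-1", "--format=%ad", "--date=short", "--", f"_features/{slug}.md"],
        cwd=repo_path,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()

print(last_touched("css-border-spacing"))  # example slug, assuming slugs match page filenames
```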
Regarding client version details, there's nothing more available than what's in data.json. Ideally, we prefer to use version numbers for email clients, but webmails usually don't have those, so we use dates instead. We haven't been entirely consistent with that either. (For example, most Outlook on Windows versions use their commercial names: 2016, 2019, …)
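To illustrate that mix of key formats, a rough sketch that groups the stats keys per client and platform into date-shaped keys versus everything else (dotted version numbers or commercial names like "2016"); the nested stats layout is assumed from data.json, and a bare four-digit key is ambiguous, so this is only approximate:

```python
import json
import re
from urllib.request import urlopen

# Rough sketch: classify each stats key as a date (e.g. "2023-11") or a
# version/commercial-name string, per client family and platform.
with urlopen("https://www.caniemail.com/api/data.json") as response:
    dataset = json.load(response)

def key_kind(key):
    return "date" if re.fullmatch(r"\d{4}-\d{2}(-\d{2})?", key) else "version-or-name"

kinds = {}
for feature in dataset["data"]:
    for family, platforms in feature.get("stats", {}).items():
        for platform, versions in platforms.items():
            kinds.setdefault((family, platform), set()).update(
                key_kind(key) for key in versions
            )

for (family, platform), seen in sorted(kinds.items()):
    print(f"{family}/{platform}: {', '.join(sorted(seen))}")
```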