Replies: 3 comments 3 replies
-
This is a good idea. We could get even greater gains[0] by doing something similar at the edge (without necessarily modifying the markup) using 103 Early Hints (see the MDN article and the Cloudflare docs), though we'd have to do some minimal parsing of the HTML if we were to send hints based on the markup. [0] Since we won't have to wait for the JS to be downloaded, parsed, JITed, and executed before we can add the preload.
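As a concrete sketch, a 103 Early Hints exchange could look like the following (the asset path is an illustrative assumption, not an actual Milo file):

```http
HTTP/1.1 103 Early Hints
Link: </libs/utils/utils.js>; rel=preload; as=script

HTTP/1.1 200 OK
Content-Type: text/html

<!doctype html>
...
```

The browser can start fetching the hinted assets while the server (or edge) is still producing the HTML, which is where the extra gain over markup-based preloads comes from.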
-
I'd be curious to see figures from a trial of this. It could be a big "loss" in terms of flexibility and content-driven code; it goes back toward what we had in the old days of loading huge chunks of unused CSS/JS.
-
I think this is great. It adds overhead, but it makes a ton of sense. You cannot know that marquee.js needs decorate.js until it's parsed, and that creates a chain. I do think this masks some root problems we have that should be explored more in depth:
But the above would also create different kinds of complexity, so I see the initial proposal as really good. The way I see this working, it would still be content dependent: you only preload when a particular block is called and we know its parallel dependencies are involved. We cannot do this for fragment content, but you can still parallelize once the HTML is parsed and you know which blocks to load. Very cool.
-
While investigating the performance issue on this page, I noticed that the page's marquee has a fragment, which contains a video modal. All of the related blocks and features have to be initialized before LCP, and the performance chart shows scattered network calls on the timeline.
Milo uses a lot of dynamic imports and fetches. When there is a chain of imports, it becomes a series of back-to-back network calls, and in this case that delays LCP.
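To see why a chain hurts, here is a toy latency model (an assumption for illustration, not Milo code): each link in an import chain costs a full round trip, because a file is discovered only after its importer has been fetched and parsed, while preloaded files are all fetched concurrently.

```javascript
// Toy model: every network fetch costs `rtt` milliseconds.
// A chain of dynamic imports (a.js -> b.js -> c.js) serializes,
// since each file is discovered only after its importer is parsed.
function chainLatency(depCount, rtt) {
  return depCount * rtt; // back-to-back round trips
}

// Preloading declares the whole chain up front, so fetches overlap.
function preloadLatency(depCount, rtt) {
  return depCount > 0 ? rtt : 0; // roughly one round trip, in parallel
}

// A 3-deep chain at 100 ms RTT: 300 ms serialized vs ~100 ms preloaded.
```

The model ignores bandwidth and parse time, but it captures why depth of the import chain, not just total bytes, drives the delay before LCP.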
We can use preload to pack network calls together. For example, `icons.svg` can be preloaded. What should be preloaded may depend on the content. As a simple solution, we can keep a map of preloads for each block; scripts that are highly likely to be loaded, e.g. `/libs/utils/utils.js`, can be added to the map.
The preload map could alternatively live in the consumer app. We could also build a smarter preload decider that derives preloads from the content. With a simple preload map, I saw a more compact timeline and LCP improved.
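A minimal sketch of such a per-block preload map (the block name, file paths, and `getPreloads` helper are hypothetical, not Milo's actual structure; the DOM insertion is shown in comments):

```javascript
// Hypothetical map: for each block, the assets it is highly likely to load.
const preloadMap = {
  marquee: [
    { href: '/libs/blocks/marquee/marquee.js', as: 'script' },
    { href: '/libs/utils/decorate.js', as: 'script' },
  ],
};

// Resolve deduplicated preload descriptors for the blocks found on a page.
function getPreloads(blockNames, map = preloadMap) {
  const seen = new Set();
  const preloads = [];
  for (const name of blockNames) {
    for (const entry of map[name] ?? []) {
      if (!seen.has(entry.href)) {
        seen.add(entry.href);
        preloads.push(entry);
      }
    }
  }
  return preloads;
}

// In the browser, each descriptor would become a <link> in <head>:
//   const link = document.createElement('link');
//   link.rel = 'preload'; // or 'modulepreload' for ES modules
//   link.href = entry.href;
//   link.as = entry.as;
//   document.head.append(link);
```

Because the map is keyed by block, this stays content dependent: scanning the parsed HTML for block names yields exactly the preloads that page needs, instead of a fixed global list.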