Talk:Dynamic Script Execution Order

From WHATWG Wiki

Please discuss the proposal here. All informal feedback is welcome, and when appropriate, please add content to the main page.


What about CommonJS / RequireJS?

I'm not sure we need to complicate HTML or browser engines in order to satisfy the advanced use cases. To me it seems that the JavaScript language itself could abstract the requirements and let the browser vendors implement whatever solution they see fit. Specifically, if JavaScript provided the define() and require() calls proposed in the CommonJS "specifications" (and implemented by RequireJS, dojo (trunk), and others -- Yabble?), then browsers could just load and evaluate scripts as fast as possible without regard to script execution order.

This would solve all of the problems except for the current inability to use loaders like LABjs in FF4 or webkit. To solve that problem, we should consider keeping the existing functionality or triggering the deprecated functionality when needed.

--Unscriptable 01:29, 2 November 2010 (UTC)

Response

This is a valid point, except that it assumes that all code that needs to be loaded is "modular"... in fact, most code on the internet that needs to be loaded is NOT modular, and instead requires particular ordering when being loaded... for instance, jQuery and jQuery-UI: jQuery must be run before jQuery-UI is loaded and run.


The spirit of this proposal is to add this facility to native browser (rendering engine, specifically) behavior, such that any and all existing content can be loaded (even with dependencies and ordering requirements) without modification.

Are we trying to codify how one would implement this markup dynamically, so that parallel loading can be achieved while retaining the dependency that is implied by the order?

<script src="jquery.js"></script>
<script src="jquery-ui.js"></script>


If so, we must consider the case in which the dependency itself loads additional code, which the discussions around `async=false` or even text/cache do not account for. Serverherder 19:23, 15 December 2010 (UTC)
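For concreteness, here is a minimal sketch of what a dynamic loader would do under the `async=false` proposal: insert script elements in document order, relying on the browser to fetch them in parallel but execute them in insertion order. The `doc` parameter is my own illustration (so the queueing logic is testable), not part of any proposed API.

```javascript
// Sketch only: assumes the proposed behavior where script-inserted
// scripts with async=false execute in insertion order while still
// downloading in parallel.
function loadInOrder(doc, urls) {
  return urls.map(function (url) {
    var s = doc.createElement('script');
    s.src = url;
    s.async = false; // opt back into ordered execution (the proposal's opt-in)
    doc.head.appendChild(s);
    return s;
  });
}
```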

Dependent code should be modular

I completely agree that 99% of JavaScript in use today is NOT modular, but that's arguably because there was no built-in support to make it modular (IMHO, YMMV, RTFM, BBQ, ROFL). But that doesn't mean there aren't dependencies that span script files / script nodes. I don't know how many sites would be broken (or if there is any way to determine how many would be) if/when the latest browsers started loading and evaluating script-injected scripts out of order. It's not just LABjs-based sites. I'll bet many dojo and YUI sites would break, too (although I don't know this for a fact, because I don't know exactly how their dynamic script loaders work).

In any case, breaking existing sites is a BAD IDEA. :) There should be some way to allow LABjs and other dynamic script loaders to continue to work.

The part I don't agree with in the proposals is that it should be the job of HTML and/or the browser engine to manage dependencies. Specifically, I am talking about the scriptGroup element and the "waitFor" attribute. The "async" attribute may be the right level: it hands the browser only general heuristics (load these scripts whenever you like, but load these other scripts in this order).

Detailed dependency management belongs in the modules, though. This is how it's done in most environments / languages: import statements, include statements, etc. It's no different in JavaScript: if developers want to take advantage of some code in another module, they should import it into their current module. This should be the model moving forward. It's not hard, it's not un-web-like, and it's already being done.

Existing code can be made to work modularly, too:

1. Identify modules and wrap them with a define() method.
2. Identify dependencies using either the Asynchronous Module Definition format or the CommonJS synchronous format.

async format:

define(['dependency1', 'dependency2'], function (dependency1, dependency2) {
	/* this is the factory function, return the module inside here */
	return { /* some object, but could also be a constructor or a function */ };
});

sync format:

define(function (require, exports, module) {
	/* this is the factory function, require dependencies and return the module inside here */
	var dependency1 = require('dependency1');
	var dependency2 = require('dependency2');
	return { /* some object, but could also be a constructor or a function */ };
});

In the async case, the browser may employ whatever method it sees fit (e.g. parallel) to download the dependencies since the code inside the factory function won't execute until all of the dependencies are loaded and evaluated. In the sync case, the browser has no choice but to download and evaluate the dependency before continuing code execution. It's conceivable that the browser could find and asynchronously pre-fetch the inline require()'ed dependencies in the sync case. (RequireJS doesn't quite do this. Instead I believe it finds the inline require() calls and converts the module to an async module.)

So, in short: don't break LABjs (or any other script loaders), but don't extend the spec to manage dependency management from HTML.

--Unscriptable 01:29, 2 November 2010 (UTC)

Response #2

The cat's already out of the bag... we've got a dozen-plus years' worth of JavaScript on the web, hundreds of millions of files, which are NOT modular and will not be modified to be modular.

If we're talking about either trying to change all the JS on the web to be modular, or trying to extend the HTML markup spec to include a better facility for accommodating the existing script content, I think the former is idealistic and "correct" but the latter is realistic and "right".

A modular loading system will only work for modular code, but a general loading mechanism in HTML (like the <script> tag) can load both modular and non-modular code, which is why that more general mechanism is what's in HTML right now and not some more specialized loading API that only loads a small percentage of JavaScript code out there.

The <script> tag has worked fine forever, and there's been no real big push to do away with it in favor of having all loading handled by JavaScript -- in fact, that would be impossible because if you had no way to load JS onto the page with a <script> tag, then no JavaScript could run on a page to load more code. So I don't agree that we should stop adding functionality to the HTML loading mechanism and only focus on JavaScript.

If we agree that the <script> tag itself is not only required but normative, then I'm not sure why decorating the <script> tag with additional behavior to make it more powerful is a bad pattern. We've added "defer" and "async" already. I'm not sure I understand why adding a wrapper tag called "scriptGroup" would be much different from adding attributes on the <script> tag.

I also don't think that "scriptGroup" constitutes "dependency management". "scriptGroup" is a semantic way for a web author to express that they want to group a set of scripts together ONLY so far as execution order behavior is concerned. It implies nothing more to it (that "dependency management" would imply) than that.
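For concreteness, here is one hypothetical rendering of that idea. The element name comes from the proposal under discussion, but the exact markup shape below is my own illustration, not normative syntax:

```html
<!-- Hypothetical illustration: the group expresses only that these
     scripts, however they are fetched, must execute in this order. -->
<scriptGroup>
  <script src="jquery.js"></script>
  <script src="jquery-ui.js"></script>
</scriptGroup>
```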

For instance, consider this: if JavaScript had been built with non-global context and statement level context and sandboxing and such things, we'd have a much more secure JavaScript right now. But just because JavaScript may someday be taught some new tricks in terms of security, that doesn't mean that "now" HTML shouldn't try to help solve the problem. If for instance the <frag> tag can be used to create a sandboxed environment for third-party JavaScript code to run in, this is a much easier way to address the security problem than to suggest we have to wait for ES-Harmony to add extra syntax to the language to accomplish the same thing.

Getify 02:13, 2 November 2010 (UTC)

Text/cache solution

It seems to me that having a real text/cache script type, or a similar element that fires a load/error event and provides a .text property, gets you where you need to be. You could setAttribute("type", "text/javascript") or use the text to create a new script element. Or a new style element, or whatever. That puts you in control of when script or CSS is interpreted. With async as the default, you don't need a new ordered attribute, as that's easily managed in script.
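A sketch of that idea, with hypothetical semantics: a script given a non-executable type is fetched but not run, and (under the proposal) later flipping its `type` to a real script type would trigger exactly one execution. No current engine guarantees this; it is the behavior being discussed.

```javascript
// Assumed (proposed) semantics only: "text/cache" downloads without
// executing; flipping type to "text/javascript" triggers execution.
function prefetch(doc, url) {
  var s = doc.createElement('script');
  s.type = 'text/cache'; // unrecognized type: downloaded, never executed
  s.src = url;
  doc.head.appendChild(s);
  return s;
}
function executeCached(script) {
  script.type = 'text/javascript'; // proposed trigger for execution
  return script;
}
```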

"text/cache" response

On the surface, this idea sounds like it's pretty similar to <link rel=prefetch> -- in other words, we've already standardized a way to prefetch a resource into cache, and probably in a better way than the hack that I did in LABjs of just relying on it fetching unrecognizable content.

Unfortunately, it still suffers some similar problems to <link rel=prefetch>.

Firstly, if the notion is that you'd use <script type="text/cache"> to prefetch, and then you'd make a second request for the resource with a real <script type="text/javascript"> node, it suffers from the faulty assumption that the resource was served with proper cache headers.

If we try to avoid the need for the second request by suggesting that you can access the `text` property of the node (and then eval() or inject that code when you want to execute it), this runs afoul of the same-origin policy problem if you requested the script from a non-local domain.

The part of your suggestion that holds water is this: given a "prefetched" script node that hadn't executed yet, I could simply change that node's `type` value from "text/cache" to "text/javascript", and that would explicitly trigger the browser to execute it immediately.

This is an interesting idea. I can see though that it perhaps has some potential complications.

For instance, would the browser have to keep a "hasExecuted" internal flag, so that changing the `type` value would only at most result in one execution? Also, if the script node in question is loaded but not executed, can the page alter the script code (by setting its `text` property) before making the change to the `type` value? In that case, should the browser execute the original code or the modified code? I can see security implications and confusion abounding in that behavior.

Lastly, as mentioned above, this basically copies (in a similar but not exactly identical way) the <link rel=prefetch> behavior. It may be seen as a negative to be trying to step on the toes of <link rel=prefetch>, and also it may be more confusing to the general public as to why you'd use one technique or the other, given their differences being so nuanced.

That having been said, I can see that it's a viable alternate proposal. It's worth adding a note to that section of the main Wiki page and getting some feedback from browser vendors on whether they feel this is a more realistic/suitable solution to the use case.

The difference between prefetching and this proposal seems pretty clear: prefetching, if not performed now, will eventually be done once the user explicitly requests the page, while the "text/cache" solution indicates that the resource should load immediately, not during idle time.


Serverherder 19:10, 15 December 2010 (UTC)

Getify 20:56, 4 November 2010 (UTC)

RequireJS order support

I am the main developer for RequireJS, and just want to voice my support for a solution that works on the script element itself, either by leveraging the async attribute or by allowing something like an "order" boolean attribute that would specify that a dynamically added script element should be executed in the order in which it appears in the DOM. This capability should be something that can be feature-detected, to avoid browser sniffing.
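One way such a capability could be feature-detected, sketched under an assumed convention: a browser that supports ordered script-inserted scripts defaults `async` to true on a freshly created script element, making `async=false` a detectable opt-in. This is an illustration of the detection pattern, not a guaranteed contract.

```javascript
// Assumed convention: created script elements expose `async`, and it
// defaults to true in engines that honor async=false ordering.
function supportsOrderedAsync(doc) {
  var s = doc.createElement('script');
  return 'async' in s && s.async === true;
}
```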

Why shouldn't dependency management be implemented in high-level code, with the browser simply providing a means by which parallel loading can be achieved? As you know, there is a considerable amount of nuance. I'm of the opinion that implementing dependency management in the engine will eventually lead us right back to where we were 10 years ago with inline script tags, which provide an easy method for dependency management at the expense of unmanaged blocking and an inability to detect or adapt to errors.
Serverherder 19:51, 15 December 2010 (UTC)

There are many scripts on the web today that have implicit dependencies and should be loaded in a specific order, but the application developer will want to use dynamically created script elements for best performance. jQuery plugins that depend on the jQuery script having already been executed in the page are a good example.

Agreed, but I don't believe the proposals allow for proper emulation of the dependency management provided by script tags. Prototype.js, for example, loads its core modules dynamically (using document.write, so maybe not the best example...). When prototype.js is included via script tags, ensuing script blocks are free to "depend" on all of these modules; dynamically creating a dependency on prototype.js by loading both with the `async=true` property will fail, since prototype.js won't be able to "jump the queue" and ensure its core modules load before its extensions.
Serverherder 19:51, 15 December 2010 (UTC)

I have to admit, you raise a valid point with respect to that common usage. The YUI model (where the framework is responsible for all loading) is perhaps the more appropriate model to move toward, though. Or, the framework doesn't auto-load its own dependencies at all. In either case, I think it's cleaner to have one loading mechanism in play rather than two or more competing ones, because as you point out, this is going to get messy. I still am not sure this means that the async=false proposal is lacking, because as I think about how complicated such a split loading would be with your `readyState` proposal, that seems even worse/more complicated to me. Getify 20:06, 15 December 2010 (UTC)

I would not argue against a global queue for synchronized execution, I just think the usefulness of such a construct declines as scripts that "I don't care about" are added. It's my opinion the spec should provide the ability for high-level code to implement similar constructs, regardless of how complex it may be to do so.

That being said, I don't think it's any more complicated than existing models. Here's a demonstration of an IE-only, parallel-loading, synchronously-executing script queue implemented in 40 lines of code. It exercises both cached and un-cached scripts (from the LABjs test suite) and has been verified to work in IE6-8.
Serverherder 01:52, 16 December 2010 (UTC)
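The IE-only trick referenced above is roughly the following (my own reconstruction for illustration, not Serverherder's exact 40-line demo): in old IE, assigning `src` starts the download before the element is in the DOM, and readyState reports 'loaded' once the bytes arrive; execution happens only on insertion, so inserting in queue order yields ordered execution.

```javascript
// Sketch of an IE-style preload queue: scripts download in parallel on
// src assignment, but are inserted (and thus executed) strictly in order.
function queueScripts(doc, urls) {
  var pending = urls.map(function (url) {
    var s = doc.createElement('script');
    s.onreadystatechange = function () {
      if (s.readyState === 'loaded') {
        s._loaded = true; // bytes are in; safe to insert when its turn comes
        drain();
      }
    };
    s.src = url; // IE begins fetching here, before DOM insertion
    return s;
  });
  function drain() {
    // Insert (and thereby execute) scripts strictly in queue order.
    while (pending.length && pending[0]._loaded) {
      doc.head.appendChild(pending.shift());
    }
  }
  return pending;
}
```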

- James Burke

Vendors Should Implement the Existing Suggestion

Since the spec already includes a suggestion to download resources when the SRC attribute is set, parallel loading is an option the vendor chooses to implement, not something a script loader should be able to require. Vendors that don't implement the suggestion run the risk of being less performant in these situations. That, however, is their choice to make, as the spec suggests. If the benefits of parallel loading dependencies are so great, users will opt for this in their choice of agent, or vendors can adopt the existing mechanism.

The Current Suggestion Should be Changed to a Mandate

The spec is pretty clear on why it chose to make this optional. If dependent script loading is so common that browsers that adopt the suggestion enjoy considerably better performance, the user will eventually decide to opt for those implementations. On the other hand, if it is more common that scripts are created and never inserted, leading to wasted bandwidth and clogged fetching queues, the converse will occur. Either way, by making it optional, the spec hedges its bet and is more prone to be future-proof as the habits of script developers evolve.

Modify the specification so parallel-loading is configurable

The specification expresses the reasons parallel loading should not be done. Given that, allowing developers a means to require that resources be downloaded in parallel is in direct opposition to the spec's already stated position. To that end, I see no purpose for async=true: a browser that hopes to be highly performant should implement the spec's suggestion to improve performance. Developers should not be able to circumvent the vendor's attempt to be conservative in its use of bandwidth simply to achieve ordered execution.

response

Nothing about this proposal requires any parallelism that the browser wouldn't already afford. The spirit of this proposal is not about loading behavior, but about execution order.

The spec says nothing about parallel loading, nor do I think it necessarily should. All modern browsers HAVE in fact implemented parallel loading for all scripts, async or not. And I think this is a good thing. But it has nothing to do with `async` or my proposal to make the definition of `async` symmetric/consistent between markup script elements and dynamic script elements.

The spec already defines that `async` is specifically to create unblocking of the page's resource loading AND "as soon as possible" execution of the script. But it only defines this for parser-inserted (markup) script tags. I think this is short-sighted, because I think a script-loader should be able to opt into either ordered or unordered execution in the same way that markup script elements can. That's really all the main proposal is about.

Script loaders can already choose ordered execution: chain the onload events. The issue is that vendors have chosen not to adopt the specification's suggestion. The high priority vendors place on performance suggests this was not done consciously. Instead of advocating to the WHATWG for additions to the spec, I think we may be better off advocating that vendors implement the spec's suggestion. The existing implementation of parallelism for parser-inserted scripts suggests vendors would be willing to do so, and that the existing lack of support is merely an issue of communication.
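The onload-chaining fallback mentioned above looks roughly like this sketch: each script's load event starts the request for the next one, giving ordered execution at the cost of all parallel-fetch benefit. (The `doc` parameter is my own illustration so the chaining logic is testable.)

```javascript
// Serial loading sketch: ordered execution, but each fetch waits for the
// previous script to finish downloading and executing.
function loadSerially(doc, urls, done) {
  (function next(i) {
    if (i === urls.length) {
      if (done) done();
      return;
    }
    var s = doc.createElement('script');
    s.src = urls[i];
    s.onload = function () { next(i + 1); }; // chain to the next script
    doc.head.appendChild(s);
  })(0);
}
```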

At its core, the existing suggestion to retrieve upon src assignment is not altogether different from the eager fetching performed for parser-inserted scripts. The risk still exists that the browser downloads a script that is never actually executed. For example, below a bunch of script tags, the HTML may contain a closing comment tag (-->) while one of the scripts calls document.write("<!--"). Another risk is that the eager fetching clogs the fetch queue, delaying the download of any scripts included via document.write(). Vendors have chosen to assume this risk, though, since its probability is low and the benefit high.

Likewise, vendors can calculate the risk/reward for script-inserted scripts and will likely make the same decision -- they implement this exact behavior for images. Script loaders should respect the fact that a vendor has consciously chosen not to support parallelism for script-inserted scripts. Providing a means by which this decision can be circumvented undermines their choice. I believe it was in that spirit the specification leaves eager fetching script resources as a suggestion.


I think you still miss the pragmatic point here. I'm not disagreeing with you in theory, but in practice, the hard-nosed approach you suggest would spell death for any script loader. That's not a chance I'm willing to take.

When a developer uses a script loader that claims it is "The Performance Script Loader" (as LABjs does), and they test in some latest-version browser (whether that be FF, Safari, Chrome, IE, or Opera) and the performance is terrible (worse than before they installed LABjs), they're not going to say, "Well, I'll just keep LABjs in there and hope those browsers fix this quickly".

This is tail-wagging-the-dog mentality, and it ignores the reality: people will almost always choose the path of least resistance. If they try a script loader and it gives them worse performance than before, they'll simply take it out. They may complain about it once in a tweet, but they won't take a stand against the browsers to the detriment of their own site and their own users, and they won't tell all their users to switch to another browser in protest.

And worse, for LABjs, the path of least resistance may very well mean that if performance is important to them, they'll just pick some other loader, which has more hacks in it. As long as they get what they want, which is reliable and pretty good performance, they'll be happy. And it will totally have been lost on them that the battle for web standards was just dealt a blow by their actions. General web page authors do not care to fight that battle on behalf of LABjs. So, LABjs must fight that battle under the covers.

I understand your perspective, but I can say, as the maintainer of a tool like LABjs that cares about keeping its relevance for "survival", that I won't take an action I know is likely to cause a decline in the usefulness of my tool. Instead, I will take a more cautious, slow-to-change approach, while at the same time feature-testing for the new standards and using them if available. Only after the new feature-testable behavior has been present for a long time will I consider removing the legacy stuff. This is a strategy I think almost every framework/lib maintainer would echo support for.

Getify 00:50, 24 December 2010 (UTC)

Getify 21:39, 23 December 2010 (UTC)

A phased approach toward adopting this is reasonable, and you clearly state the risks of both positions. My comments have derailed the true purpose of this discussion: does the specification require modification? Since the spec already provides a mechanism by which this functionality can be achieved, I believe it is best to reword the suggestion so it more appropriately conveys its importance, but I don't think the behavior should be imposed on vendors. Serverherder