v3 proposal #2982
Conversation
Perf is somewhat improved in spots, but generally the same otherwise.
Cuts only about 2% of the bundle, but removes a pretty bad pain point in the code. Plus, inserting arbitrary HTML outside a container is usually a recipe for disaster in terms of styling.
It's simpler that way and I don't have to condition it.
It's redundant and has been for ages. A v3 increment gives me the ability to minimize surprise.
…t from the router
People can just use equality and regexps as needed.
Also, rename it `m` to make dev stack traces a little more readable, and remove a useless layer of indirection.
This makes that class fully monomorphic and also a bit safer.
That's not been there for years
The long comment at the top of it explains the motivation.
It's now been made redundant with `m.tracked`.
…e `m.use`

`m.use` is specially designed to be used a fair amount, and simulating it is relatively slow, so it's worth making a built-in. `m.key` -> `m.keyed` provides for a simpler interface, one that you don't have to type `key` out explicitly for anymore. (Most frameworks require you to explicitly add a "key" attribute, while this approaches it more like a glorified `Map`.)
It's very common to want context and not care about attributes. What's *not* common is wanting the old attributes and not the new. So that's what this optimizes for.
This is the best early Christmas present 🥲
It mostly delegates to `fetch`. Also, kill `m.withProgress` as that only existed because there was no `m.request`.
Updated this PR with some changes and a much better explainer of the new API.
Thanks! 🙏
Updated the initial comment to explain these better, with detailed examples that took about a day or two to work out. In the process, I had to modify
Re-added that, just with a different name (to help people more easily notice there's a difference).
Unabbreviated
If you're wondering about why I optimized it so far, check the initial comment of
Fixed with some explainers in the initial comment. It's pretty easy to squint and see the similarities.
To be fair, this isn't our first rodeo. And we did go through a similar transition from v0.2 to v1.
More academic APIs have proven to be exactly the right type of abstraction needed for many problems. While the API in my PR is more academic, I also tried to make it simpler. This feedback did lead me to move from

I will call out that I'm pulling a page out of the lessons learned from all the ECMAScript design discussions, including max-min classes. There are three major lessons that stick out the most to me:
Academic APIs can sometimes produce more gotchas than they get rid of, but pragmatic use of academic styles of APIs can often get you 90% of the way there with 10% of the effort. And it's that pragmatic use that was my ultimate goal.
Ironically, it's those that you named (mod
I did restore
Oh, this feedback was immensely helpful.
There's none currently, but I'm open to adding such a story for that. I'd rather see it deferred until after this PR, though.
I want a hard guarantee that it doesn't unnecessarily allocate here. To do otherwise could introduce uncertainty into tests.
@Claudia Meadows I just had a proper read through of the proposal. Thank you for all the hard work you've put into this! There's a lot here to discuss, and even if I don't agree with all of the proposals, it is so nice to have a proposal to discuss!

First main point: I personally think v3 should just be a cleanup release of v2 with extremely minor changes, tightening up the v2 API, e.g. dropping support for classes/POJOs. No new abstractions/concepts. I think if anything this should be a v4 proposal. That would give us a little more breathing room to consider the API surface without holding off a v3 release that removes some footguns.

I think there's a lot of things here that don't need to be in Mithril core, at least not in an initial release. I think for example the debounce/throttle stuff is way more elegant with streams, and it's kind of streams' wheelhouse. If we added those methods to the stream API, I think it would be a lot cleaner than the proposal example. I also think as a high-level goal we should strive to not have too many top-level methods/properties.

I really like the direction of removal of magic properties from vnodes. But it feels a little strange having effects in the view: they get re-evaluated every render, and it adds noise to the markup (in my opinion at least). If something isn't returning/producing vtree, I don't see why it should sit in the view. Could they instead sit on the vnode:

```js
const Example = v => {
  v.create(() => console.log('hello'))
  v.remove(() => console.log('goodbye'))
  return () => m('div')
}
```

I think this would also establish a precedent for having more component-scoped APIs that auto cleanup/cancel. E.g. I like v.init, not sure about the naming. I think having a built-in util that provides an abort signal when the component is unmounted is an extremely useful util to add to core, and it is easy to understand too. But I think it should work even if called after

There is a bit of API overlap with

I don't think we should support the functional components variation. I think that will be a huge footgun and is one of the worst API design mistakes React ever made. Writing those extra

Love the simplified closure component interface though (

I think create/update should be separate. I don't see how having it in one hook simplifies things, as any persistent binding needs to sit in a separate scope anyway.

I'm not sure about

I like offering those route matching methods and ditching the route resolver. I really strongly believe we shouldn't offer "just use a regex" as a solution. I don't like the second proposal as the default happy path, it's too imperative. But I think offering those utils would be extremely useful. Additionally, can we skip m.lazy and just support async functions that return

```js
m.route({
  '/': Home,
  '/settings': () => import('./pages/settings.js')
})
```

Also, I didn't see if there was a nested routes example, any thoughts there? I think if we're going to do a route API refresh it should support nesting. Perhaps

I think we can drop m.route as an entry point and just always use m.mount, keep m.route as just a view-level utility. I think we should really strive for simplifying call signatures/usages.

I like dropping XHR in favour of fetch internally for m.request, but I don't see a huge win in changing the m.request API. If we're not offering any sugar on top of fetch, may as well just document using native fetch and use

I think async event handlers are great (didn't we already land that in v2). I don't think we need

I'm not sure about the

I see where you're going with

In my codebases I pretend

Can we rename

I think

Can we keep it as is but use the new lifecycle API?
Slightly boosts performance (by around 5-10%) and slightly saves on bundle size
@JAForbes I'll address the rest of your comment if/when I find time, but I do want to address a couple parts now:
What are your thoughts on injecting them in the context alongside

I do have an
Unfortunately, we don't really have a simpler or cheaper abstraction than a function for a component. Keep in mind, there isn't anything like React-style hooks here. Also, I will point out that there are unique optimization benefits beyond just memory consumption for doing it this way. V8 in particular optimizes functions by reference, not by code body.
@JAForbes I'll go ahead and address most of the rest:
I get where you're coming from on adding it to streams, but pragmatically speaking, in most web applications, debouncing and throttling are applied very early in the pipeline. So early, people commonly just do

For one concrete example, https://www.npmjs.com/package/redux-debounced gets less than 10k weekly downloads, while this SO answer on how to debounce Redux actions simply uses a debounce helper. In the world of React Hooks, debouncing also similarly occurs very early, like immediately after the variable state is defined. You use the debounced variable instead of the raw state variable. But even that answer also recommends people just do something like

Debouncing streams really only makes sense if you're passing events directly into a stream, and in the front end, this is a very rare thing to see if you're not procedurally subscribing to events on elements.

As for why in core? Debouncing and throttling are extremely common operations. It's so common, there's a current WHATWG proposal just to add it to event emitters as an option as well as to WHATWG's (very WIP) observable spec as a method. People have been wanting a native debouncing and throttling API for years. The basic concept itself is about 15 years old, and the name itself is borrowed from a similar mechanical issue switches have faced for many decades.
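For reference, this is the kind of small stand-alone helper being talked about - a generic sketch, not part of Mithril or this PR, and the handler and delay are made up:

```js
// Generic debounce helper: collapse a burst of calls into one trailing call.
function debounce(fn, ms = 500) {
    let timer
    return (...args) => {
        clearTimeout(timer)
        timer = setTimeout(() => fn(...args), ms)
    }
}

// Applied "early in the pipeline", right where the state/action lives.
const search = debounce((text) => console.log("searching for", text), 300)
```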
Unfortunately, I found it very hard to defend porting
I'm okay with changing the name of the factory if there's sufficient support for it. However, the choice of

When I first opened this PR, I used an
So, to minimize bug risk, both arguments need to be in some way optionally accessible. There are two ways to do that: pass it all in an object (say,
And I chose to have
I'll admit I don't like

I will point out that there are two performance constraints that I'm trying to meet:
I've been torn on this. There are arguments both ways on this one. Both mechanisms simplify one thing but make the other more complicated. I wonder if it'd be best to just provide both create/update and layout hooks - it'd all just be going into the same queue anyways.
To be clear, this isn't a direct analogue to
If you have a better alternative that preserves the flexibility, I'm all ears.
I screwed up that example, but it's also a bit simplistic. The correct API is
I glossed over that, but the intent is that you could do something like this for easy subroutes:

```js
let match
if (match = this.route.match("/prefix/:route...")) {
    return doSubroute(match.route)
}
```

Basically, subroutes are just a matter of abstraction.
The problem is the need to redraw whenever someone clicks the back button. That's why
The v3 async event handlers can auto-redraw when the handler resolves. v2 does not have this functionality. And
Keep in mind this is intended for multi-child fragments. I am interested in learning more about how this could impact you, though.
@dead-claudia thanks for this. Have you tried your builds with the JS benchmarks and seen improvements?
@PierBover I have not, though I'm open to looking into it.
Hi, I have a question about

I want to set a CSS variable through

After some digging, I found that there's a

You mentioned
@YUCLing That's actually just a bug in this line, so thanks for the catch! That condition should've been inverted. (It also should be checking the first two characters, and I just haven't gotten around to fixing that.) I'll try to have a fix pushed in the next week or so, but I have a ton on my plate right now.
I want your feedback. Please use it, abuse it, and tell me what issues you find. Bugs, ergonomic issues, weird slowdowns, I want to know it all. Feel free to drop a comment here on whatever's on your mind, so I can easily track it.
Also, @MithrilJS/collaborators please don't merge this yet. I want feedback first. I also need to create a docs PR before this can get merged.
Miscellaneous TODOs
Don't count on these happening anytime soon. Not all of these are even guaranteed to happen. They're mostly reminders for me.
- `parent.append(child)` is equivalent to `parent.insertBefore(child, null)`. Exploit this to simplify some code (and remove that insertion helper).
- `mithril/stream`, based on https://github.com/cujojs/most/tree/master/test/perf

Description
This is my v3 proposal. It's pretty detailed and represents a comprehensive overhaul of the API. It synthesizes much of my API research. And yes, all tests pass. 🙂
Preserved the commit history if you're curious how I got here. (Reviewers: you can ignore that. Focus on the file diff - you'll save yourself a lot of time.)
If you want to play around, you can find artifacts here: https://github.com/dead-claudia/mithril.js/releases/tag/v3.0.0-alpha.2
This is not published to npm, so you'll have to use a GitHub tag/commit link. If you just want to mess around in a code playground, here's a link for you.
Quick line-by-line summary
If you're short on time and just want a very high level summary, here's a comprehensive list, grouped by type.

Highlights:
Additions:
- `m.layout((dom) => ...)`, scheduled to be invoked after render
- `m.remove((dom) => ...)`, scheduled to be invoked after the render in which it's removed from the tree
- `m.retain()`, retains the current vnode at that given position
- `m.set(contextKeys, ...children)`, sets context keys you can get via the third parameter in components
- `m.use([...deps], ...children)`, works similar to React's `useEffect`
- `m.init(async (signal) => ...)`, callback return works just like event listeners
- `tracked = m.tracked()` to manage delayed removal
- `m.throttler()`
- `m.debouncer()`
Changes:
- `key:` attribute → `m.keyed(view, (item) => [key, vnode])`
- `m.buildPathname(template, query)` → `m.p(template, query)`
- `m.buildQueryString(query)` → `m.query(query)`
- Return/resolve with `false` from event handlers to prevent redraws.
- `removeOnThrow: false` is passed to `m.render`.
- `m.route(elem, defaultRoute, {...routes})` -> `m.route(prefix, view)`
- `m.mount(...)` now returns a `redraw` function that works like `m.redraw`, but only for that one root. `redraw` is also set in the context via the `redraw` key.
- `m.render(...)` now accepts an options object for its third parameter. `redraw` sets the redraw function (and is made available in the context).
- Components are now `(attrs, old, context) => vnode` functions or an `(attrs, null, context) => ...` function returning that.
- `m.request(url, opts)` → `m.fetch(url, opts)`, accepts all `window.fetch` options, plus `onprogress`, `extract`, and `responseType`.

Removals:
- `mithril/stream` - the main library no longer needs the DOM to load
- `m.trust` - use `innerHTML` instead
- `m.censor`
- `m.vnode(...)` - use `m.normalize` if it's truly necessary
- `m.mount(root, null)` - use `m.render(root, null)` instead
- `m.redraw()` -> either the `redraw` function returned from `m.mount` or the `redraw` context key
- `oninit`
- `oncreate`/`onupdate` - use `m.layout(...)` instead
- `onremove` - use `m.remove(...)` instead
- `onbeforeupdate` - use `m.retain()` instead
- `onbeforeremove` - use the `m.tracked()` utility to manage parents instead
- `m.parsePathname`
- `m.parseQueryString`
- `ev.redraw = false` - return/resolve with `false` or reject/throw from event handlers
- `style` attribute objects - those now throw. (This won't likely impact many.)

Miscellaneous:
- `mithril/stream` was reworked to require a lot less allocation - I don't have precise numbers, but it should both be faster and require less memory.

Benchmarks
For the raw benchmark data and specifics on how the memory benchmark was performed, see this gist. I initially posted the full data here, but apparently it's gotten too ~~powerful~~ big for GitHub Issues - I got the error message "There was an error posting your comment: Body is too long" when I tried.

Memory benchmarks
- v2.2.8: `window` final retained size: 1001416 bytes
- This PR: `window` final retained size: 862976 bytes
- Raw DOM: `window` final retained size: 611780 bytes
- Raw empty: `window` size: 287292 bytes

CPU benchmarks
Chromium (Blink): simpleTree, nestedTree, mutateStylesPropertiesTree, repeatedTree, shuffledKeyedTree

Firefox (Gecko): simpleTree, nestedTree, mutateStylesPropertiesTree, repeatedTree, shuffledKeyedTree

(The per-benchmark numbers are in the gist linked above.)
Motivation and Context
This resolves a number of ergonomic issues around the API, crossing out many long-standing feature requests and eliminating a number of gotchas.
Related issues
- Fixes https://github.com//issues/1937 by enabling it to be done in userland, mostly via `m.render(elem, vnode, {removeOnThrow: true})`
- Resolves https://github.com//issues/2310 by dropping `m.trust`
- Resolves https://github.com//issues/2505 by not resolving routes
- Resolves https://github.com//issues/2531 by not resolving routes
- Fixes https://github.com//issues/2555
- Resolves https://github.com//issues/2592 by dropping `onbeforeremove`
- Fixes https://github.com//issues/2621 by making each mount independent of other mount points
- Fixes https://github.com//issues/2645
- Fixes https://github.com//issues/2778
- Fixes https://github.com//issues/2794 by dropping internal `ospec`
- Fixes https://github.com//issues/2799
- Resolves https://github.com//issues/2802 by dropping `m.request`

Related discussions
- Resolves https://github.com//discussions/2754
- Resolves https://github.com//discussions/2775 by not resolving routes
- Implements https://github.com//discussions/2912
- Implements https://github.com//discussions/2915 by throwing
- Resolves https://github.com//discussions/2916 by dropping `m.request`
- Implements https://github.com//discussions/2917 with some minor changes
- Implements https://github.com//discussions/2918
- Implements https://github.com//discussions/2919
- Implements https://github.com//discussions/2920
- Resolves https://github.com//discussions/2922 by dropping `m.request`
- Resolves https://github.com//discussions/2924 by dropping `m.request`
- Implements https://github.com//discussions/2925
- Resolves https://github.com//discussions/2926 by dropping `m.request`
- Resolves https://github.com//discussions/2929 by not doing it (doesn't fit with the model)
- Resolves https://github.com//discussions/2931 by not resolving routes
- Resolves https://github.com//discussions/2934 by dropping `m.request`
- Resolves https://github.com//discussions/2935 by not resolving routes
- Partially implements https://github.com//discussions/2936
- Partially implements https://github.com//discussions/2937, intentionally skips rest
- Resolves https://github.com//discussions/2941 by not resolving routes
- Implements https://github.com//discussions/2942 by offering a configurable `route` context key
- Implements https://github.com//discussions/2943
- Implements https://github.com//discussions/2945
- Resolves https://github.com//discussions/2946 by making the router a component instead (thus making it implicit)
- Implements https://github.com//discussions/2948 by throwing
- Resolves https://github.com//discussions/2950 by dropping `m.request`

Things this does *not* resolve
- https://github.com//issues/2256 (This needs some further digging to lock down precisely what needs done)
- https://github.com//issues/2315
- https://github.com//issues/2359
- https://github.com//issues/2612
- https://github.com//issues/2623
- https://github.com//issues/2643
- https://github.com//issues/2809
- https://github.com//issues/2886

New API
This is all in the style of https://mithril.js.org/api.html. I also include comparisons to v2. Collapsed so you can skip past it without scrolling for days.
New API overview
Vnodes
Vnodes have changed under the hood massively. Now, everything is packed into a much smaller object (down from 10 fields to 6) when normalized. This means that in large apps, you can get away with way more components before you see perf issues. (Significantly reducing memory overhead is a partial goal of this change.)
Each type below also contains a quick explainer of the underlying normalized vnode object representing them. If a field isn't documented below, it's either not used or an implementation detail.
`onbeforeupdate` → `m.retain()`

Return `m.retain()` if you want to retain the previous vnode. Also, components always receive the previous attributes via their second argument (`null` on first render).

Here's an overoptimized, overabstracted counter to show the difference:
This PR:
v2:
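A minimal sketch of what the v3 ("This PR") side could look like, using only what's described above - the `(attrs, old) => vnode` component shape and `m.retain()`. This is illustrative only, not the PR's original example, and the `set` attribute is made up:

```js
// Hypothetical counter: skip re-rendering unless the count actually changed.
const Counter = (attrs, old) => {
    // `old` is null on first render, so always render then.
    if (old && old.count === attrs.count) return m.retain()
    return m("div.counter",
        m("button", {onclick: () => attrs.set(attrs.count + 1)}, "+"),
        " Count: ", attrs.count,
    )
}

// Used as: m(Counter, {count, set})
```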
Normalized vnode fields:

- `vnode.m` is specially set to -1.
- Other fields are taken from the vnode being retained (or are `null` or `undefined`). For instance, if the vnode it was retaining is an `m("div")`, it gets its `vnode.d` field set to the rendered DOM element.

`oncreate`/`onupdate`/`onremove` → `m.layout`/`m.remove`
DOM-related lifecycle methods have been moved to vnodes. Here's an example using the FullCalendar Bootstrap 4 plugin (an old version of this is currently used in the docs) to show the difference:
This PR, option 1: nest the `m.layout` and `m.remove`

This PR, option 2: use `vnode.d` to get the element

v2:
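A rough sketch of option 1 for the new ("This PR") side, under the rules stated above (`m.layout` runs after render, `m.remove` around removal). The widget calls (`createCalendar`, `calendar.destroy`) are hypothetical stand-ins rather than FullCalendar's real API:

```js
const CalendarHost = () => {
    let calendar
    return (attrs) => m("div.calendar-host",
        m.layout((dom) => {
            // Assumed to run after each render of this subtree; initialize once.
            if (!calendar) calendar = createCalendar(dom, attrs.options)
        }),
        m.remove(() => {
            // Runs when this subtree is removed.
            if (calendar) calendar.destroy()
        }),
    )
}
```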
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_LAYOUT` for `m.layout(...)`, `m.TYPE_REMOVE` for `m.remove(...)`.
- `vnode.a` holds the callback for both types.

`m(selector, attrs, children)`, JSX `<selector {...attrs}>{children}</selector>`
Same as before, but with a few important differences:
- (For similar reasons, `m.censor` is removed.)
- Return `false` to prevent redraw rather than to "capture" the event (use `m.capture(ev)` for that).

This PR:
v2:
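A small sketch of the new ("This PR") event-handler conventions described above - the component and its callback attributes are hypothetical, not taken from the PR:

```js
const SearchBox = () => (attrs) => m("form.search",
    m("input[type=text]", {
        oninput: (ev) => {
            attrs.onquery(ev.target.value) // hypothetical callback attribute
            return false                   // returning false skips the auto-redraw
        },
    }),
    m("button[type=submit]", {
        onclick: async (ev) => {
            m.capture(ev)                  // replaces "return false to capture"
            await attrs.onsubmit()         // async handlers redraw on resolve
        },
    }, "Search"),
)
```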
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_ELEMENT`.
- `vnode.t` holds the selector, `"div.class#id"` in this case.
- `vnode.a` holds the attributes. Note that if both the selector and attributes have a `class`/`className` attribute, they're now normalized to `class`, not `className`.
- `vnode.c` holds the (normalized) children.
- `vnode.d` holds the DOM element, once rendered.

`m(Component, attrs, children)`, JSX `<Component {...attrs}>{children}</Component>`
Usage is the same as before, but with one difference: no magic lifecycle methods. You'll need to use special attributes to expose inner DOM nodes, lifecycle, and state, and you'll need to use lifecycle vnodes outside the component for everything else. (For similar reasons, `m.censor` is removed.)

The component definition differs a lot, though, and that'll be covered later.
This PR:
v2:
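A sketch of the new component shape for the "This PR" side - a plain `(attrs, old, context) => vnode` function, with children folded into `attrs.children`. This is a hypothetical example, not the PR's original:

```js
const Greeter = (attrs) => m("div.greeter",
    m("strong", "Hello, ", attrs.name, "!"),
    attrs.children, // whatever was passed as children to m(Greeter, ...)
)

// Usage: m(Greeter, {name: "world"}, m("p", "Welcome back."))
```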
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_COMPONENT`.
- `vnode.t` holds the component, `Greeter` in this case.
- `vnode.a` holds the attributes. Any children passed in are merged in as `{...attrs, children}`, but they are not normalized.
- `vnode.c` holds the instance vnode, once rendered.

Holes: `null`, `undefined`, `false`, and `true`
Holes work exactly the same as before, with all the same rules. And like before, they're normalized to `null`.

`"text"`
Text vnodes work exactly the same as before, with all the same rules. Anything neither a hole, a fragment array, nor a normalized vnode is stringified. Symbols can even be stringified like before.
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_TEXT`.
- `vnode.a` holds the (stringified) text.

`[...children]`, `m(m.Fragment, ...children)`, JSX `<>{children}</>`
Unkeyed fragments work the same as before, but with three differences:
- `m.fragment(...)`. Use `m.normalize([...])` or `m(m.Fragment, ...)` instead - there's no longer a dedicated normalized fragment vnode factory.
- `m.Fragment` actually looks like a component, rather than looking like an element selector (and being faked as a component in TypeScript types).
- `m(m.Fragment, ...)`.

Skipping examples, since it's the same across both aside from the above.
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_FRAGMENT`.
- `vnode.c` holds the (normalized) children.

`m.keyed(list, view)`, `m.keyed([...entries])`
Keyed fragments have changed. Instead of returning an array of vnodes with `key` properties, you use `m.keyed(...)`. In this new model, keys are more implicit, and while there's a few more brackets and parentheses, it's a bit easier to follow due to the added context of "this is a keyed list, not a normal one" given by the `m.keyed` call.

This PR:
v2:
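A sketch of the new keyed form for the "This PR" side, following the `m.keyed(list, (item) => [key, vnode])` signature above; the todo list itself is hypothetical:

```js
const TodoList = (attrs) => m("ul.todos",
    m.keyed(attrs.todos, (todo) => [
        todo.id,                 // the key
        m("li", todo.title),     // the vnode rendered for that key
    ]),
)
```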
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_KEYED`.
- `vnode.c` holds the key to (normalized) child map.

`m.use([...deps], ...children)`
For cases where you want to reset something based on state, this makes that way easier for you, and it's much more explicit than a single-item keyed list. (In fact, it was such a common question during v0.2 and v1 days that we had to eventually document it.) This new factory provides a better story, one that should also hopefully facilitate hooks-like use cases as well.
This PR:
v2:
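A sketch of how `m.use` might be applied on the "This PR" side, assuming only the documented behavior (children are reset when a dependency changes); `UserProfile` is hypothetical:

```js
const UserPage = (attrs) => m("section.user-page",
    // When attrs.userId changes, everything inside is recreated from scratch,
    // resetting any state held by UserProfile.
    m.use([attrs.userId],
        m(UserProfile, {id: attrs.userId}),
    ),
)
```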
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_USE`.
- `vnode.a` holds the collected dependency array (`deps` can be any iterable, not just an array).
- `vnode.c` holds the (normalized) children.

`m.set({key: value, ...}, ...children)`
This type is entirely new, without a v2 equivalent. You've probably noticed a bunch of `this.route` and `this.redraw()` spattered everywhere. Well, `this.redraw()` is provided implicitly by the runtime, but the new `m.route(...)` directly uses `m.set({route}, view())` internally.
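A sketch of `m.set` in action, with hypothetical keys and components: a parent sets a context key, and a descendant reads it through its third (context) parameter.

```js
const ThemedLabel = (attrs, old, context) => m("span", {class: context.theme}, attrs.text)

const App = () => m.set({theme: "dark"},
    m(ThemedLabel, {text: "Hello from a dark-themed subtree"}),
)
```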
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_USE`.
- `vnode.a` holds the collected dependency array (`deps` can be any iterable, not just an array).
- `vnode.c` holds the (normalized) children.

`m.inline(view)`
This type is entirely new, without a v2 equivalent. It's equivalent to the following component, but is optimized for internally (and is its own vnode type).
It's used internally by the new `m.route(prefix, view)`. See that helper for more details on how this plays out in practice.
Normalized vnode fields:

- `vnode.m & m.TYPE_MASK` is `m.TYPE_INLINE`.
- `vnode.a` holds the view function.
- `vnode.c` holds the instance vnode, once rendered.

`m.mount(element, component)` → `m.mount(element, view)`
This PR:
v2:
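A sketch of the new ("This PR") signature - `m.mount` takes a view function and hands back a per-root redraw function. The element id and `App` component are made up:

```js
const redraw = m.mount(document.getElementById("app"), () => m(App))

// Later, e.g. after external data arrives, re-render just this root:
redraw()
```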
`m.redraw()` → `context.redraw()`
This PR, option 1: use from a component
This PR, option 2: use from a mount view callback
v2:
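A sketch of option 1, assuming the context-provided `redraw` key plus the `m.init` signal described elsewhere in this proposal; the clock itself and the init timing are assumptions:

```js
const Clock = () => {
    let now = new Date()
    return (attrs, old, context) => [
        m.init((signal) => {
            // Assumed to run once; the signal aborts when the component is removed.
            const id = setInterval(() => {
                now = new Date()
                context.redraw()       // per-root redraw from the context
            }, 1000)
            signal.addEventListener("abort", () => clearInterval(id))
        }),
        m("time", now.toLocaleTimeString()),
    ]
}
```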
Routing:
`m.route(root, defaultRoute, {...routes})` → `m.route(prefix, view)`
It's more verbose, but there are unique benefits to doing it this way.
- `if` statements to guard your routes.

The main goal here is to just get out of your way as much as possible: give you the helpers needed to easily solve it yourself with only basic abstractions, instead of trying to magically solve routing for you and ultimately failing because the abstraction didn't quite work as you needed it to.
This PR, option 1: define using static strings and/or regexps
This PR, option 2: use `route.match(...)` (equivalent to `m.match(route, path)`) to match route templates

v2:
Note that while the above uses the first argument, the `view` is actually passed into an `m.inline`, so it receives all the context, not just the `route` key, and it also receives it as both `this` and the first parameter. So this is equivalent to the second option:
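Pulling those pieces together, a hedged sketch of the second option - the prefix, components, and exact matching behavior are assumptions based only on the description above:

```js
m.mount(document.getElementById("app"), () =>
    m.route("#!", function () {
        // The view gets the context as both `this` and its first parameter,
        // so `this.route` is the current route.
        let match
        if (this.route.match("/")) return m(Home)
        if (match = this.route.match("/user/:id")) return m(ShowUser, {id: match.id})
        return m(NotFound)
    })
)
```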
Current route
m(m.route.Link, ...)
→m.link(href, opts?)
This PR:
v2:
What happened to route resolvers?
Instead of accepting components, everything is essentially a `render` function from v2. But for `onmatch`, the only two legitimate uses for it are auth and lazy route loading. There are better ways to handle both:

- Lazy route loading: use the `Comp = m.lazy({fetch: () => import("./path/to/Comp.js")})` helper.
- Auth doesn't belong in `onmatch`. Instead, the view for it should be rendered right away, just with loading placeholders for parts that aren't yet ready to show. Trying to force that into a lazy loading system would just cause blank screen flickering and other stuff that at best doesn't look pretty and at worst causes accessibility issues.

`m.buildPathname(template, params)` → `m.p(template, params)`
It's the same API, just with a shorter name as it's expected to be used a lot more. It also has an entirely new implementation that's almost twice as fast when parameters are involved, to match that same expected jump in usage.
Here's another example of how it could end up used.
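For instance, assuming the same interpolation rules as v2's `m.buildPathname` (the API is described as unchanged):

```js
m.p("/api/user/:id/posts", {id: 42, page: 2})
// → "/api/user/42/posts?page=2" (leftover params become the query string)
```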
`m.buildQueryString(object)` → `m.query(object)`

Same as before, just with a shorter name to complement `m.p`. It also reuses some of the optimizations created for `m.p`.

`m.request(url, opts?)` → `m.fetch(url, opts)`
The new `m.fetch` accepts all the same arguments as `window.fetch`, plus a few other options:

- `opts.responseType`: Selects the type of value to return (default: `"json"`)
  - `"json"`: Parse the result as JSON and return the parsed result
  - `"formdata"`: Parse the result as `multipart/form-data` and return the parsed `FormData` object
  - `"arraybuffer"`: Collect the result into an `ArrayBuffer` object and return it
  - `"blob"`: Collect the result into a `Blob` object and return it
  - `"text"`: Collect the result into a UTF-8 string and return it
  - `"document"`: Parse the result as HTML/XML and return the parsed `Document` object
- `opts.extract`: Instead of using `responseType` to determine how to extract it, you can provide a `(response) => result` function to extract the result. Only called on 2xx and takes precedence over `opts.responseType`.
- `opts.onprogress`: Pass a `(current, total) => ...` function to get called on download progress.

`m.fetch` returns a promise that resolves according to `opts.responseType` and `opts.extract` on 2xx, and rejects with an error on anything else. Errors returned through rejections have the following properties:

- `message`: The returned UTF-8 text, or the status text if empty. If an error was thrown during the `m.fetch` call, its message is duplicated here instead.
- `status`: The status code, or 0 if it failed before receiving a response.
- `response`: The response from the inner `fetch` call, or `undefined` if the inner `fetch` call itself failed to return a response.
- `cause`: If an error was thrown during the `m.fetch` call, this is set to that error.
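A sketch using only the options documented above; the URL and handling are made up:

```js
async function loadUser(id) {
    try {
        // Resolves with parsed JSON on 2xx (the default responseType).
        return await m.fetch(`/api/user/${id}`, {
            method: "GET",
            onprogress: (current, total) => console.log(`${current}/${total} bytes`),
        })
    } catch (e) {
        // Non-2xx responses (and network errors) reject.
        console.error(e.status, e.message)
        throw e
    }
}
```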
`m.throttler()`

This is a general-purpose bi-edge throttler, with a dynamically configurable limit. It's much better than your typical `throttle(f, ms)` because it lets you easily separate the trigger and reaction using a single shared, encapsulated state object. That same separation is also used to make the rate limit dynamically reconfigurable on hit.

Create as `throttled = m.throttler(ms)` and do `if (await throttled()) return` to rate-limit the code that follows. The result is one of three values, to allow you to identify edges:

- `undefined`
- `false`, returned only if a second call was made
- `true`

Call `throttled.update(ms)` to update the interval. This not only impacts future delays, but also any current one.

To dispose, like on component removal, call `throttled.dispose()`.

If you don't specify a delay, it defaults to 500ms on creation, which works well enough for most needs. There is no default for `throttled.update(...)` - you must specify one explicitly.

Example usage:
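A sketch of the intended pattern; the component and its `attrs.save` callback are hypothetical:

```js
const DraftEditor = () => {
    const throttled = m.throttler(1000)
    let draft = ""
    return (attrs) => [
        m.remove(() => throttled.dispose()),
        m("textarea", {
            oninput: async (ev) => {
                draft = ev.target.value
                // Documented pattern: bail out when the throttler says to skip this call.
                if (await throttled()) return
                await attrs.save(draft)
            },
        }, draft),
    ]
}
```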
Here's the v2 equivalent of the above code, to show how this (and async event listeners) help reduce how much you need to write.
`m.debouncer()`

A general-purpose bi-edge debouncer, with a dynamically configurable limit. It's much better than your typical `debounce(f, ms)` because it lets you easily separate the trigger and reaction using a single shared, encapsulated state object. That same separation is also used to make the rate limit dynamically reconfigurable on hit.

Create as `debounced = m.debouncer(ms)` and do `if (await debounced()) return` to rate-limit the code that follows. The result is one of three values, to allow you to identify edges:

- `undefined`
- `false`, returned only if a second call was made
- `true`

Call `debounced.update(ms)` to update the interval. This not only impacts future delays, but also any current one.

To dispose, like on component removal, call `debounced.dispose()`.

If you don't specify a delay, it defaults to 500ms on creation, which works well enough for most needs. There is no default for `debounced.update(...)` - you must specify one explicitly.

Example usage:
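The debouncer follows the same pattern; a sketch with a hypothetical search callback:

```js
const LiveSearch = () => {
    const debounced = m.debouncer(300)
    return (attrs) => [
        m.remove(() => debounced.dispose()),
        m("input[type=search]", {
            oninput: async (ev) => {
                const text = ev.target.value
                // Documented pattern: bail out until the input settles.
                if (await debounced()) return
                attrs.onsearch(text) // hypothetical callback attribute
            },
        }),
    ]
}
```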
Here's the v2 equivalent of the above code, to show how this (and async event listeners in general) help reduce how much you need to write.
`onbeforeremove` → `m.tracked(redraw, initial?)`, `m.trackedList(redraw, initial?)`
In v2, for managing animations, you'd do something like this:
In this PR, things are done a bit differently as it's much lower-level: you don't remove right away, but instead you first trigger the animation, and then you remove. Tracking the virtual list needed for this is complicated, but that's where `m.tracked()` and `m.trackedList()` come in to help - those help track it for you. You just need to make sure to render what they say is live in the given moment.

Here's the equivalent for this PR:
Separating this from the framework also helps bring better structure. For page transitions, you could do something like this:
This replacement also comes with a unique advantage: you get a pre-made signal that you can use to immediately abort stuff like fetches before the node is removed.
`m.tracked()` API

- `tracked = m.tracked(redraw, initial?)`: Create a tracked value. Default initial value is `undefined`, but you should
- `handles = tracked(state)`: Track an incoming state value. Returns a list of handles

`m.trackedList()` API

- `tracked = m.trackedList(redraw, initial?)`: Create a tracked value.
- `handles = tracked(state)`: Track an incoming state value. Returns a list of handles

`oninit` → `m.init(callback)`
In general, `oninit` is redundant, even in v2. However, `m.init` has some use in it that makes it not entirely redundant.

To take the `ShowUser` example from before:

Let's assume there wasn't a `this.pageHandle` in the context:

This could save a couple lines by using `m.init`.
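As a hedged sketch of that `m.init` variant - the endpoint, `UserCard`, and the redraw-on-resolve behavior are assumptions based on the descriptions above:

```js
const ShowUser = () => {
    let user
    return (attrs) => [
        m.init(async (signal) => {
            // `signal` is assumed to abort if the component is removed first.
            user = await m.fetch(`/api/user/${attrs.id}`, {signal})
            // Resolving (like an event handler) triggers the usual redraw.
        }),
        user ? m(UserCard, {user}) : m("p", "Loading user…"),
    ]
}
```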
The closest v2 equivalent would be this, similar to the above variant without `m.init`:

It's a bit niche, but it can save code in certain situations. It also has the added benefit of scheduling the block to not block frame rendering. (This was done before with `queueMicrotask`, but common idioms aren't usually aware of that.)

How Has This Been Tested?
I've written a bunch of new tests, and every current test (to the extent I've kept them) passes. At the time of writing, there are about 219k tests, but all but about 8k of that is for the near-exhaustive `m.fetch` tests.

Types of changes
Checklist