Daguerreo will use OffscreenCanvas and will be able to send the rendering context between effects to be processed.

Here's an example of a render process for a frame:

1. The initial image is taken from the source effect.
2. The rendering context is sent to the first effect.
3. The rendering context is sent to the second effect.
4. The image is sent to the screen.
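The steps above could be sketched roughly like this (a minimal sketch; the `Effect` and `RenderContext` shapes are my assumptions, not Daguerreo's actual API, and a plain value stands in for the OffscreenCanvas so the chain can be shown without browser APIs):

```typescript
// Minimal sketch of the per-frame pipeline. The Effect/RenderContext shapes
// are assumptions; the image is a generic value so the chain can be shown
// without browser APIs like OffscreenCanvas.
interface RenderContext<T> {
    frame: number;
    image: T; // stand-in for the OffscreenCanvas / 2D rendering context
}

interface Effect<T> {
    name: string;
    // Receives the rendering context and returns it (possibly modified)
    // for the next effect in the chain.
    apply(ctx: RenderContext<T>): RenderContext<T>;
}

function renderFrame<T>(source: () => T, effects: Effect<T>[], frame: number): RenderContext<T> {
    // 1. The initial image is taken from the source effect.
    let ctx: RenderContext<T> = { frame, image: source() };
    // 2./3. The rendering context is sent through each effect in order.
    for (const effect of effects) {
        ctx = effect.apply(ctx);
    }
    // 4. The caller then presents ctx.image on screen.
    return ctx;
}

// Toy usage: the "image" is a string so the effect order is observable.
const rendered = renderFrame(
    () => "source",
    [
        { name: "blur", apply: (c) => ({ ...c, image: c.image + "+blur" }) },
        { name: "tint", apply: (c) => ({ ...c, image: c.image + "+tint" }) },
    ],
    0,
);
// rendered.image is "source+blur+tint"
```

The point of the sketch is that each effect only ever sees the context handed to it by the previous one, which is what makes the chain easy to reason about per frame.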
Possible pain points:

- Effects that affect how the resulting image is composited could be difficult to support; blend modes, for example.
- Effects could scale the source in a way that clips out parts of the image.
- Web workers could be used to render multiple frames at a time, but effects would have to manage their state using transferables.
- Switching to a different rendering engine in the future could be difficult, but that is not really relevant now.
- Since VideoEncoder will be used, encoding will be limited to the codecs the browser supports.
Remedies:

- Blend modes could be built into Daguerreo and always be available; at the moment I can't think of any other effects that affect compositing.
- Effects can modify a DOMMatrix, which will transform the end result.
- The entire effect object would have to be executed inside workers, including its local state outside of properties.
- Maybe WebAssembly could be used to port certain encoders to the browser, but I have zero experience with that.
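The DOMMatrix remedy could look something like this (a sketch of an assumed design, not a decided one; a minimal 2D affine type stands in for DOMMatrix so the example runs outside the browser, where Daguerreo itself would use `new DOMMatrix()` and apply the result via `ctx.setTransform(...)`):

```typescript
// Sketch of the DOMMatrix remedy: instead of scaling/clipping the image
// mid-chain, each effect contributes a transform, and Daguerreo composes
// them once and applies the total on the final draw. The Mat type mirrors
// a 2D DOMMatrix (a, b, c, d, e, f) so this runs without browser APIs.
type Mat = { a: number; b: number; c: number; d: number; e: number; f: number };

const identity: Mat = { a: 1, b: 0, c: 0, d: 1, e: 0, f: 0 };

// Standard 2D affine multiplication, matching DOMMatrix.multiply for 2D matrices.
function multiply(m: Mat, n: Mat): Mat {
    return {
        a: m.a * n.a + m.c * n.b,
        b: m.b * n.a + m.d * n.b,
        c: m.a * n.c + m.c * n.d,
        d: m.b * n.c + m.d * n.d,
        e: m.a * n.e + m.c * n.f + m.e,
        f: m.b * n.e + m.d * n.f + m.f,
    };
}

// Transforms contributed by two hypothetical effects in the chain.
const effectTransforms: Mat[] = [
    { ...identity, a: 2, d: 2 },  // scale by 2x
    { ...identity, e: 10, f: 5 }, // translate by (10, 5)
];

// Daguerreo composes them once and applies the total to the final draw call.
const total = effectTransforms.reduce(multiply, identity);
// total maps a point (x, y) to (2x + 20, 2y + 10)
```

Composing at the end means no intermediate effect ever clips the image; the full-resolution source survives until the single transformed draw.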
Advantages:

- By using a regular canvas, it will be possible to switch to WebGL (or WebGPU) if performance turns out to be an issue.
- Using web workers is possible, and probably required.
Requirements:

- A source effect that can interpret the data from Media file #287 and render it.
- A way to distinguish between rendering for export and rendering for preview.
- When rendering to export, a file needs to be produced, either by using the File System API and writing directly, or by writing chunks to IndexedDB. (This is for the future.)
- Effects need to be able to set data for the next effect.
- The current rendering state needs to be serializable. This will have to be done externally to properly integrate things like property keyframes.
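Externally serialized rendering state could look roughly like this (a sketch under assumptions: the snapshot shape and the linear keyframe interpolation are mine, not a decided design):

```typescript
// Sketch of externally serialized rendering state: the host resolves
// keyframed property values for a given frame and emits a plain JSON
// snapshot, so the renderer never has to serialize live objects itself.
// (Shape and linear interpolation are assumptions, not a decided design.)
interface Keyframe {
    frame: number;
    value: number;
}

// Linearly interpolate between the keyframes surrounding `frame`.
function sampleKeyframes(kfs: Keyframe[], frame: number): number {
    if (frame <= kfs[0].frame) return kfs[0].value;
    const last = kfs[kfs.length - 1];
    if (frame >= last.frame) return last.value;
    for (let i = 0; i < kfs.length - 1; i++) {
        const a = kfs[i];
        const b = kfs[i + 1];
        if (frame >= a.frame && frame <= b.frame) {
            const t = (frame - a.frame) / (b.frame - a.frame);
            return a.value + (b.value - a.value) * t;
        }
    }
    return last.value;
}

// One entry per effect in the chain, with all properties resolved to
// plain numbers for the frame being rendered.
interface EffectState {
    effect: string;
    properties: Record<string, number>;
}

const opacityKeyframes: Keyframe[] = [
    { frame: 0, value: 0 },
    { frame: 10, value: 1 },
];

const state: EffectState[] = [
    { effect: "fade", properties: { opacity: sampleKeyframes(opacityKeyframes, 5) } },
];

// JSON-safe: can be stored, diffed, or handed to a worker as-is.
const serialized = JSON.stringify(state);
// opacity at frame 5 resolves to 0.5
```

Because keyframe resolution happens before the snapshot is built, the renderer only ever sees plain values, which keeps the serialization external as described above.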
Notes:

- For now, web workers shouldn't be used; I need more experience with them.
- Unless performance becomes an issue, the 2D context should be used for now. (Note: Firefox struggles a bit with it, so this decision should be reconsidered in the future.)
Daguerreo is the name of the first renderer for Safelight.
Name
The name originates from Daguerreotype, the first publicly available photographic process. Since this is the first renderer for Safelight, I think the name fits.