Create TextureStream.md #2743 (open, wants to merge 19 commits into main)

specs/APIReview_TextureStream.md: 353 additions, 0 deletions

# Background
Many native apps use a native engine for real-time communication scenarios, including video
capture, networking, and video rendering. Often, however, these apps still use WebView or
Electron for UI rendering. This separation between real-time video rendering and UI rendering
prevents apps from rendering real-time video inside the web contents, forcing them to
render the video on top of the web contents instead. Rendering video on top constrains the
user experience and may also cause performance problems.
We could ask native apps to move video handling into the web renderer, since the web platform
already provides these features through the WebRTC APIs. End developers, however, prefer to
keep their existing engines for capture and composition while using web APIs only for rendering.

# Description
The proposed APIs allow end developers to stream captured or composed video frames to
the WebView2 renderer, where JavaScript can insert the frames into the page through the
W3C-standard MediaStream and Video element APIs for display.
The API uses a shared GPU texture buffer to minimize the overall cost of frame copies.

# Examples

## JavaScript

```js
// The user clicks the video capture button.
document.querySelector('#showVideo').addEventListener('click',
    e => getStreamFromTheHost(e));
async function getStreamFromTheHost(e) {
  try {
    // Request a stream from the host with a unique stream id.
    const stream = await window.chrome.webview.getTextureStream('webview2-abcd1234');
```

> **Reviewer comment:** Calling this a texture stream seems confusing from the web side of things. There is already something called a texture in WebGL in the web world, and this is not related to that. Since this is actually returning a MediaStream, can we call this getMediaStream instead, and also update the native API to be called something similar, like CoreWebView2MediaStream?

> **Reviewer comment:** A bunch of my comments are applicable to the API surface, not the sample code, and are left unresolved. Can you look through them and reply to them, please?

```js
    // A MediaStream object is returned; get the video MediaStreamTrack from it.
    const video_tracks = stream.getVideoTracks();
    const videoTrack = video_tracks[0];
```

> **Reviewer comment:** video_tracks and videoTrack are unused?

```js
    // Show the video via the Video element.
    document.getElementById(video_id).srcObject = stream;
```

> **Reviewer comment:** Is it easy to show where the video_id value is defined and that the element is a video element?

```js
  } catch (error) {
    console.log(error);
  }
}
```
## Win32 C++

```cpp
LUID luid;
// Get the LUID of the graphics adapter that the WebView renderer uses.
g_webviewStaging3->GetRenderAdapterLUID(&luid);
```

> **Reviewer comment:** Many calls throughout here are missing error handling; they aren't checking the HRESULT return value. In our sample code we have a CHECK_HRESULT (or something like that) macro we use.

```cpp

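// A minimal sketch of the kind of macro the reviewer mentions; CHECK_HRESULT
// here is a hypothetical helper, not an existing WebView2 or sample API. It
// returns early from the enclosing function when an HRESULT indicates failure.
#define CHECK_HRESULT(expr)                       \
    do {                                          \
        HRESULT check_hr_ = (expr);               \
        if (FAILED(check_hr_)) return check_hr_;  \
    } while (0)
// Usage: CHECK_HRESULT(g_webviewStaging3->GetRenderAdapterLUID(&luid));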
// Create a D3D device based on the WebView's adapter LUID.
ComPtr<ID3D11Device> d3d_device = MyCreateD3DDevice(luid);
// Register a unique texture stream that the host can provide.
ComPtr<ICoreWebView2StagingTextureStream> webviewTextureStream;
g_webviewStaging3->CreateTextureStream(L"webview2-abcd1234", d3d_device.Get(), &webviewTextureStream);
```

> **Reviewer comment:** Usually in WebView2, object creation happens on the CoreWebView2Environment as it acts sort of like a factory. Then the object may have additional initialization (like setting up event handlers), and then you add or use the object with a CoreWebView2.
>
> Here it looks like the Create method is on the CoreWebView2, and creating it also adds it (by the name parameter) to the available texture streams for that CoreWebView2. Is the TextureStream tied to that CoreWebView2 in particular? Or can we move the TextureStream creation to the CoreWebView2Environment and have a separate method for 'adding' it to a CoreWebView2? If so, this would have the benefits of:
>
> - Clearer from API calls when the TextureStream has been added to a WebView2.
> - No concern over races where the TextureStream has been created with a particular name, but the event handlers and such haven't been set up yet.
> - Able to use the same TextureStream object with different CoreWebView2s, or the same CoreWebView2 but with different names.

```cpp

// Register the origin URLs that are allowed to request the registered stream id.
// Requests from unregistered origins will fail to stream.
webviewTextureStream->AddRequestedFilter(L"https://edge-webscratch");
// Listen for the Start request.
EventRegistrationToken start_token;
webviewTextureStream->add_StartRequested(Callback<ICoreWebView2StagingTextureStreamStartRequestedEventHandler>(
    [hWnd](ICoreWebView2StagingTextureStream* webview, IUnknown* eventArgs) -> HRESULT {
```

> **Reviewer comment (suggested change):** name the lambda parameter for what it is; the sender is the texture stream, not the WebView:
>
> ```cpp
> [hWnd](ICoreWebView2StagingTextureStream* textureStream, IUnknown* eventArgs) -> HRESULT {
> ```

```cpp
        // Capture the video stream by using a native API, for example,
        // Media Foundation on Windows.
        StartMediaFoundationCapture(hWnd);
```

> **Reviewer comment:** Is this method, whose body we don't show, going to interact with the TextureStream by writing frames to it or something? If so, we should show that method, or the relevant parts of it. The sample should show you how to interact with the TextureStream.

```cpp
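        // Sketch (an assumption, not part of the original sample): a capture
        // callback inside StartMediaFoundationCapture would typically hand
        // each new frame to the texture stream along these lines:
        //   ComPtr<ICoreWebView2StagingTexture> buffer;
        //   if (FAILED(webviewTextureStream->GetAvailableBuffer(&buffer)))
        //       webviewTextureStream->CreateBuffer(width, height, &buffer);
        //   /* copy the captured frame into the buffer's shared texture */
        //   webviewTextureStream->SetBuffer(buffer.Get(), timestamp);
        //   webviewTextureStream->Present();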

        return S_OK;
    }).Get(), &start_token);
EventRegistrationToken stop_token;
webviewTextureStream->add_StopRequested(Callback<ICoreWebView2StagingTextureStreamStopRequestedEventHandler>(
    [hWnd](ICoreWebView2StagingTextureStream* webview, IUnknown* eventArgs) -> HRESULT {
        StopMediaFoundationCapture();
```

> **Reviewer comment:** Similarly, we should show what's happening in here. As is, I don't really know what I'm supposed to do with this event.

```cpp
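        // Sketch (an assumption): besides stopping capture, the host would
        // typically drop its local buffer bookkeeping here, since the
        // renderer clears all registered buffers before raising this event
        // (see the StopRequested notes in the API Details below).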

        return S_OK;
    }).Get(), &stop_token);
EventRegistrationToken texture_token;
webviewTextureStream->add_TextureError(Callback<ICoreWebView2StagingTextureStreamTextureErrorEventHandler>(
    [hWnd](ICoreWebView2StagingTextureStream* sender, ICoreWebView2StagingTextureStreamTextureErrorEventArgs* args) {
        COREWEBVIEW2_TEXTURE_STREAM_ERROR_KIND kind;
        HRESULT hr = args->get_Kind(&kind);
        assert(SUCCEEDED(hr));
        switch (kind)
        {
        case COREWEBVIEW2_TEXTURE_STREAM_ERROR_NO_VIDEO_TRACK_STARTED:
        case COREWEBVIEW2_TEXTURE_STREAM_ERROR_BUFFER_NOT_FOUND:
        case COREWEBVIEW2_TEXTURE_STREAM_ERROR_BUFFER_IN_USE:
            // assert(false);
```

> **Reviewer comment:** The goal is for this to be code that shows how to use this API end to end. We should have working error handling. How would we expect errors like this to be handled? Displayed to the user in native UI or web UI, or something else?

```cpp
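            // Sketch (an assumption): instead of asserting, a real app might
            // log the error kind and recover, for example by re-registering
            // buffers or surfacing the failure in its own UI:
            //   NotifyAppOfStreamError(kind);  // hypothetical app helper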

            break;
        default:
            break;
        }
        return S_OK;
    }).Get(), &texture_token);

// TextureStream APIs are called on the UI thread of the WebView2 process,
// while video capture and composition can happen on a worker thread or
// out of process.
LRESULT CALLBACK WndProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam)
{
    static ICoreWebView2Staging3* webview2_17 = nullptr;
    ComPtr<ID3D11Device> d3d_device;
    HANDLE slimCoreHandle;
    HRESULT hr;
    ComPtr<ICoreWebView2StagingTexture> texture_buffer;
    int64_t bufferId = -1;
    switch (message)
    {
    case IDC_TEST_SEND_TEXTURE:
        if (webviewTextureStream) {
            // The Present API should be called on the same thread where the
            // main WebView object was created.
            bufferId = (int)wParam;
            texture_buffer = texture_address_to_buffer_ids[(HANDLE)bufferId];
            // assert(texture_buffer != nullptr);
            if (texture_buffer) {
                // Notify the renderer of the updated texture in the shared buffer.
                webviewTextureStream->SetBuffer(texture_buffer.Get(), texture_buffer_info_->timestamp);
                webviewTextureStream->Present();
            }
        }
        break;
    case IDC_TEST_REQUEST_BUFFER:
        // Retrieve an available shared buffer.
        hr = webviewTextureStream->GetAvailableBuffer(&texture_buffer);
        if (SUCCEEDED(hr)) {
            texture_buffer->get_Handle((HANDLE*)&bufferId);
        }
        SendBufferIdToOOFCaptureEngine(false, nullptr, bufferId);
        break;
    case IDC_TEST_CREATE_NEW_BUFFER:
        if (webviewTextureStream) {
            ComPtr<ICoreWebView2StagingTexture> texture_buffer;
            UINT32 width = (UINT32)wParam;
            UINT32 height = (UINT32)lParam;
            // Create a shared buffer.
            webviewTextureStream->CreateBuffer(width, height, &texture_buffer);
            texture_buffer->get_Handle(&slimCoreHandle);
            texture_address_to_buffer_ids[slimCoreHandle] = texture_buffer;
            SendBufferIdToOOFCaptureEngine(true, slimCoreHandle, (int)slimCoreHandle);
        }
        break;
    default:
        return DefWindowProc(hWnd, message, wParam, lParam);
    }
    return 0;
}
```

# API Details

## Win32 C++

```c# (but really IDL)
[v1_enum]

typedef enum COREWEBVIEW2_TEXTURE_STREAM_ERROR_KIND {
/// The host can't create a TextureStream instance more than once
/// for a specific stream id.
COREWEBVIEW2_TEXTURE_STREAM_ERROR_STREAM_ID_ALREADY_REGISTERED,
/// Occurs when the host calls the CreateBuffer or Present APIs before the
/// Start event has been raised, or when more than 10 seconds have passed
/// since the Start event without these APIs being called.
COREWEBVIEW2_TEXTURE_STREAM_ERROR_NO_VIDEO_TRACK_STARTED,
/// The buffer has been removed using RemoveBuffer.
COREWEBVIEW2_TEXTURE_STREAM_ERROR_BUFFER_NOT_FOUND,
/// The texture to be presented is already in use for rendering.
/// Call GetAvailableBuffer to determine an available buffer to present.
/// The developer can technically call SetBuffer multiple times.
/// But once they call Present, the buffer becomes "in use" until
/// they call SetBuffer and Present on a different buffer and wait a bit
/// for the original buffer to stop being used.
COREWEBVIEW2_TEXTURE_STREAM_ERROR_BUFFER_IN_USE,
} COREWEBVIEW2_TEXTURE_STREAM_ERROR_KIND;
/// This is the ICoreWebView2Staging3 interface, which returns the texture stream interface.
[uuid(96c27a45-f142-4873-80ad-9d0cd899b2b9), object, pointer_default(unique)]
interface ICoreWebView2Staging3 : IUnknown {
/// Registers a stream id that the host can handle, providing a
/// texture stream when requested from the WebView2's JavaScript code.
/// The host can register multiple unique stream instances, each with
/// a unique stream id, enabling the host to stream from different sources
/// concurrently.
/// The host should call this only once per unique streamId. A second
/// call with an already-created streamId, without first destroying the
/// ICoreWebView2StagingTextureStream object, will return an error.
/// 'd3dDevice' is used for creating the shared IDXGI resource and its NT
/// shared handle. The host should create the D3D device on the adapter
/// matching the LUID returned by GetRenderAdapterLUID.
HRESULT CreateTextureStream(
[in] LPCWSTR streamId,
[in] IUnknown* d3dDevice,
[out, retval ] ICoreWebView2StagingTextureStream** value);
/// Gets the graphics adapter LUID of the renderer. The host should use this
/// adapter LUID when creating the D3D device to use with CreateTextureStream().
HRESULT GetRenderAdapterLUID([out, retval] LUID* luid);
/// Listens for changes to the browser's graphics adapter LUID.
/// The host can get the updated LUID via GetRenderAdapterLUID. It is expected
/// that the host updates the texture's D3D device with UpdateD3DDevice,
/// removes the existing buffers, and creates new buffers.
HRESULT add_RenderAdapterLUIDUpdated(
[in] ICoreWebView2StagingRenderAdapterLUIDUpdatedEventHandler* eventHandler,
[out] EventRegistrationToken* token);
/// Removes the listener for the RenderAdapterLUIDUpdated event.
HRESULT remove_RenderAdapterLUIDUpdated(
[in] EventRegistrationToken token);
}
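```

The adapter-update flow described above (RenderAdapterLUIDUpdated, then GetRenderAdapterLUID, UpdateD3DDevice, and buffer recreation) is not covered by the earlier sample. A minimal sketch, reusing the sample's g_webviewStaging3 and webviewTextureStream; MyCreateD3DDevice and RecreateAllBuffers are hypothetical app helpers:

```cpp
EventRegistrationToken luid_token;
g_webviewStaging3->add_RenderAdapterLUIDUpdated(
    Callback<ICoreWebView2StagingRenderAdapterLUIDUpdatedEventHandler>(
        [](ICoreWebView2Staging3* sender, IUnknown* args) -> HRESULT {
            // Query the new adapter LUID and rebuild the D3D device on it.
            LUID luid;
            sender->GetRenderAdapterLUID(&luid);
            ComPtr<ID3D11Device> new_device = MyCreateD3DDevice(luid);
            // Hand the new device to the stream, then drop the old buffers
            // and create new ones, as the event documentation requires.
            webviewTextureStream->UpdateD3DDevice(new_device.Get());
            RecreateAllBuffers();
            return S_OK;
        }).Get(), &luid_token);
```

```c# (but really IDL)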
/// This is the interface that handles texture streaming.
/// Most of its APIs must be called on the UI thread.
[uuid(afca8431-633f-4528-abfe-7fc3bedd8962), object, pointer_default(unique)]
interface ICoreWebView2StagingTextureStream : IUnknown {
/// Gets the stream id of the object, which is used when calling CreateTextureStream.
/// The caller must free the returned string with CoTaskMemFree. See
/// [API Conventions](/microsoft-edge/webview2/concepts/win32-api-conventions#strings).
// MSOWNERS: TBD ([email protected])
[propget] HRESULT StreamId([out, retval] LPWSTR* id);
/// Adds an allowed URL origin for the given stream id. Stream requests
/// can be made from any frame, including iframes, but the origins
/// must be registered first in order for a request to succeed.
/// An added filter persists until the
/// ICoreWebView2StagingTextureStream is destroyed or
/// RemoveRequestedFilter is called.
/// The renderer does not support wildcards; it compares the
/// literal input string to the requesting frame's origin, so the input
/// string should include a scheme such as https://.
/// For example, https://www.valid-host.com and http://www.valid-host.com
/// are valid origins, but www.valid-host.com and *.valid-host.com are not.
/// getTextureStream() will fail unless the requesting frame's origin has
/// been added to the request filter.
HRESULT AddRequestedFilter([in] LPCWSTR origin);
/// Removes an added origin, which was previously added by AddRequestedFilter.
HRESULT RemoveRequestedFilter([in] LPCWSTR origin);
```

> **Reviewer comment:** Is that only frames within this CoreWebView2? Or does it apply across any CoreWebView2 associated with that browser process?

> **Reviewer comment:** We've had issues in the past with APIs intending to work with iframes that are a different origin than the top-level document's origin not working in that case. Please make sure to test that.

> **Reviewer comment:** Please document the manner in which getTextureStream() will fail.

> **Reviewer comment:** How about naming this AddAllowedOrigin instead? (And the corresponding Remove method.)

```c# (but really IDL)
/// Listens for stream requests from JavaScript's getTextureStream call
/// for the given stream id. The event is raised for the first request only;
/// subsequent requests for the same stream id will not raise it.
/// It is expected that the host provides the stream within 10 seconds of it
/// being requested. The first call to Present() fulfills the stream request.
```

> **Reviewer comment:** What happens if it takes longer than 10s? Is this a requirement of our code or just a suggested goal for end developers?

> **Reviewer comment:** What does it mean to fulfill the stream request? Is the JS getTextureStream call waiting until the first Present call before it returns the MediaStream?

```c# (but really IDL)
HRESULT add_StartRequested(
[in] ICoreWebView2StagingTextureStreamStartRequestedEventHandler* eventHandler,
[out] EventRegistrationToken* token);
/// Removes the listener for the start stream request.
HRESULT remove_StartRequested(
[in] EventRegistrationToken token);
/// Listens for a stop stream request once the stream has started.
/// It is raised when the user stops all streaming requests from
/// the renderers (JavaScript), or when the host calls the Stop API. The
/// renderer can stream again by calling the streaming request API.
/// The renderer clears all registered buffers before sending
/// the stop request event, so the callback of the next start request
/// should register the textures again.
/// The event is raised when all requests for the given stream id are closed
/// by the JavaScript, or by the host's Stop API call.
HRESULT add_StopRequested(
[in] ICoreWebView2StagingTextureStreamStopRequestedEventHandler* eventHandler,
[out] EventRegistrationToken* token);
/// Removes the listener for the stop stream request.
HRESULT remove_StopRequested(
[in] EventRegistrationToken token);
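```

The StopRequested semantics above (the renderer clears all registered buffers before raising the event, so the next StartRequested must register textures again) imply the host should also drop its own buffer bookkeeping. A minimal sketch, reusing the sample's texture_address_to_buffer_ids map; it mirrors the earlier add_StopRequested handler:

```cpp
EventRegistrationToken stopToken;
webviewTextureStream->add_StopRequested(
    Callback<ICoreWebView2StagingTextureStreamStopRequestedEventHandler>(
        [](ICoreWebView2StagingTextureStream* sender, IUnknown* args) -> HRESULT {
            // The renderer has already dropped its buffer registrations;
            // discard ours too so a stale handle is never presented.
            texture_address_to_buffer_ids.clear();
            return S_OK;
        }).Get(), &stopToken);
```

```c# (but really IDL)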
/// Creates a shared buffer that is referenced by both the host and the browser.
/// With the shared buffer mechanism, the host does not have to
/// send each texture to the renderer; instead it notifies the renderer
/// with an internal buffer id, which identifies the shared buffer.
/// The shared buffer is a 2D texture (IDXGIResource) and is exposed
/// through a shared HANDLE or IUnknown type via ICoreWebView2StagingTexture.
/// Whenever the host has a new texture to write, it should request a
/// reusable ICoreWebView2StagingTexture from GetAvailableBuffer.
/// If GetAvailableBuffer returns an error, the host calls
/// CreateBuffer to allocate a new shared buffer.
/// The API also registers the created shared handle with the browser once
/// the resource is created.
HRESULT CreateBuffer(
[in] UINT32 width,
[in] UINT32 height,
[out, retval] ICoreWebView2StagingTexture** buffer);
```

> **Reviewer comment:** Similarly to above, Create* methods usually go on the CoreWebView2Environment. We should consider separating Create from the Add/Register as mentioned above. If we keep them together, at the very least it should be named CreateAndAddBuffer or something like that; just naming something Create doesn't suggest that there will be registration as well.

> **Reviewer comment:** You call it a buffer in the parameter name and in the Create method name, but the type name is Texture. How about making the type name and the Create name match:
>
> - TextureStreamBuffer?
> - TextureStreamTexture?
> - TextureStreamFrame?

```c# (but really IDL)
/// Returns a reusable ICoreWebView2StagingTexture that the host can write
/// the next frame to (see CreateBuffer above). GetAvailableBuffer can be
/// called from any thread, like SetBuffer.
HRESULT GetAvailableBuffer([out, retval] ICoreWebView2StagingTexture** buffer);
```

> **Reviewer comment:** Great note that this can be called on any thread. Please make sure all methods and properties that work on other threads are noted as working on other threads. It's stated generally that WebView2 methods only work on the WebView2 UI thread, so we need to explicitly call out anything that doesn't work like that.

> **Reviewer comment:** What does GetAvailableBuffer do? Its documentation above needs to explain.

```c# (but really IDL)
/// Removes a texture buffer when the host removes the backing 2D texture.
/// The host can conserve resources by deleting 2D textures
/// when the frame size changes.
HRESULT RemoveBuffer([in] ICoreWebView2StagingTexture* buffer);
/// Indicates that the buffer is ready to present.
/// The buffer must be retrieved from GetAvailableBuffer.
/// The host writes the new texture to the local shared 2D texture of
/// the buffer id, which is created via CreateBuffer.
/// The SetBuffer API can be called from any thread.
HRESULT SetBuffer([in] ICoreWebView2StagingTexture* buffer,
[in] ULONGLONG timestamp);
/// Renders the texture that is currently set via SetBuffer.
HRESULT Present();
```

> **Reviewer comment:** Why are SetBuffer and Present different methods? Why doesn't Present take the parameters of SetBuffer and present them?

```c# (but really IDL)
/// Stops streaming of the current stream id.
/// Calls to Present or CreateBuffer will fail after this
/// with the error COREWEBVIEW2_TEXTURE_STREAM_ERROR_NO_VIDEO_TRACK_STARTED.
/// The JavaScript can restart the stream with getTextureStream.
HRESULT Stop();
```

> **Reviewer comment:** What is the point of the Stop method? Isn't the native code in charge of calling Present and CreateBuffer? Why does it call Stop versus just no longer streaming data to the TextureStream? Does this do something to the corresponding JavaScript objects?

```c# (but really IDL)
/// Event handler for errors that occur on the renderer side, for example
/// from CreateBuffer, Present, or Stop.
HRESULT add_TextureError(
[in] ICoreWebView2StagingTextureStreamTextureErrorEventHandler* eventHandler,
[out] EventRegistrationToken* token);
```

> **Reviewer comment:** This comment doesn't seem to apply to add_TextureError, does it?

> **Reviewer comment:** Event names should be verb phrases, like ErrorOccurred or ErrorDetected.

> **Reviewer comment:** If these are general TextureStream errors then we don't need the 'Texture' name prefix.

```c# (but really IDL)
/// Removes the listener for the texture error event.
HRESULT remove_TextureError([in] EventRegistrationToken token);
/// Updates the D3D device when it has changed, as signaled by the
/// RenderAdapterLUIDUpdated event.
HRESULT UpdateD3DDevice([in] IUnknown* d3dDevice);
```

> **Reviewer comment:** What does update mean in this context? Is it obvious to someone familiar with D3D what Update would mean here? Can you be more specific in this comment about what Update means?

```c# (but really IDL)
}
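```

The buffer lifecycle above (GetAvailableBuffer, with CreateBuffer as a fallback, then SetBuffer and Present) can be summarized in one frame-submission routine. A minimal sketch, assuming the sample's webviewTextureStream; SubmitFrame and CopyFrameToTexture are hypothetical app helpers:

```cpp
HRESULT SubmitFrame(UINT32 width, UINT32 height, ULONGLONG timestamp)
{
    ComPtr<ICoreWebView2StagingTexture> buffer;
    // Prefer reusing a buffer the renderer has released back to us.
    HRESULT hr = webviewTextureStream->GetAvailableBuffer(&buffer);
    if (FAILED(hr)) {
        // No reusable buffer; allocate and register a new shared texture.
        hr = webviewTextureStream->CreateBuffer(width, height, &buffer);
        if (FAILED(hr)) return hr;
    }
    CopyFrameToTexture(buffer.Get());  // hypothetical: writes pixels via Handle/Resource
    // Mark the buffer ready, then ask the renderer to pick it up.
    webviewTextureStream->SetBuffer(buffer.Get(), timestamp);
    return webviewTextureStream->Present();
}
```

```c# (but really IDL)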
/// Texture stream buffer that the host writes to so that the renderer
/// can render from it.
[uuid(0836f09c-34bd-47bf-914a-99fb56ae2d07), object, pointer_default(unique)]
interface ICoreWebView2StagingTexture : IUnknown {
/// Returns a shared Windows NT handle. The caller is expected to open it with
/// ID3D11Device1::OpenSharedResource1 and write the incoming texture to it.
[propget] HRESULT Handle([out, retval] HANDLE* handle);
/// Returns an IUnknown type that can be queried for IDXGIResource.
/// The caller can write the incoming texture to it.
[propget] HRESULT Resource([out, retval] IUnknown** resource);
}
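```

The Handle property's documentation above expects the caller to open the NT handle with ID3D11Device1::OpenSharedResource1. A minimal sketch of that step, reusing the sample's d3d_device and texture_buffer; error handling elided:

```cpp
ComPtr<ID3D11Device1> device1;
d3d_device.As(&device1);
HANDLE shared_handle;
texture_buffer->get_Handle(&shared_handle);
// Open the shared NT handle as a 2D texture this device can write to.
ComPtr<ID3D11Texture2D> texture;
device1->OpenSharedResource1(shared_handle, IID_PPV_ARGS(&texture));
```

```c# (but really IDL)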
/// This is the callback for a new texture stream request.
[uuid(62d09330-00a9-41bf-a9ae-55aaef8b3c44), object, pointer_default(unique)]
interface ICoreWebView2StagingTextureStreamStartRequestedEventHandler : IUnknown {
/// Called to provide the implementer with the event args for the
/// corresponding event. There are no event args and the args
/// parameter will be null.
HRESULT Invoke(
[in] ICoreWebView2StagingTextureStream* sender,
[in] IUnknown* args);
}
/// This is the callback for a texture stream stop request.
[uuid(4111102a-d19f-4438-af46-efc563b2b9cf), object, pointer_default(unique)]
interface ICoreWebView2StagingTextureStreamStopRequestedEventHandler : IUnknown {
/// Called to provide the implementer with the event args for the
/// corresponding event. There are no event args and the args
/// parameter will be null.
HRESULT Invoke(
[in] ICoreWebView2StagingTextureStream* sender,
[in] IUnknown* args);
}
/// This is the callback for a texture stream rendering error.
[uuid(52cb8898-c711-401a-8f97-3646831ba72d), object, pointer_default(unique)]
interface ICoreWebView2StagingTextureStreamTextureErrorEventHandler : IUnknown {
/// Called to provide the implementer with the event args for the
/// corresponding event.
HRESULT Invoke(
[in] ICoreWebView2StagingTextureStream* sender,
[in] ICoreWebView2StagingTextureStreamTextureErrorEventArgs* args);
}
/// This is the event args interface for texture stream error callback.
[uuid(0e1730c1-03df-4ad2-b847-be4d63adf700), object, pointer_default(unique)]
interface ICoreWebView2StagingTextureStreamTextureErrorEventArgs : IUnknown {
/// Error kind.
[propget] HRESULT Kind([out, retval]
COREWEBVIEW2_TEXTURE_STREAM_ERROR_KIND* value);
/// Texture buffer that the error is associated with.
HRESULT GetBuffer([out, retval] ICoreWebView2StagingTexture** buffer);
```

> **Reviewer comment:** This can be a property instead, right? [propget] HRESULT Buffer([out, retval] ICoreWebView2StagingTexture** value);

```c# (but really IDL)
}
[uuid(431721e0-0f18-4d7b-bd4d-e5b1522bb110), object, pointer_default(unique)]
interface ICoreWebView2StagingRenderAdapterLUIDUpdatedEventHandler : IUnknown {
/// Called to provide the implementer with the event args for the
/// corresponding event.
HRESULT Invoke(
[in] ICoreWebView2Staging3* sender,
[in] IUnknown* args);
}

```

> **Reviewer comment:** This document is missing the MIDL3 API definition, and it's missing the WinRT/.NET C# sample code.