Releases: mendix/openai-connector
release-pgvectorknowledgebase-2.2.0
We have made some small UX improvements to the admin page for managing vector database configurations.
release-genaicommons-1.5.0
We have improved the documentation text of the entities in the GenAI Commons domain model.
release-conversationalui-1.5.0
We have made a number of small changes to facilitate easier implementation in custom projects.
- The exposed names of the operations in Conversational UI have been made clearer.
- Custom action microflows can now receive specializations of ChatContext as well.
- The text below the chat input box can now be customized easily using a constant (@UserInputInstructions).
- Additional CSS classes have been added to allow styling customization of the chat conversation UI. For a full overview, see https://docs.mendix.com/appstore/modules/genai/conversational-ui/.
- For end users, a button to clear the current chat has been added.
release-connector-3.4.1
Added a migration file that was missing in release 3.4.0.
release-connector-3.4.0
We replaced many actions inside the chat completions operations with a new GenAI Commons action that processes the request. This requires the latest version of the GenAI Commons module. In addition, we improved the log messages for failed operations.
release-genaicommons-1.4.0
We added a new Java action for connector builders that can be used for chat completions: it executes the model-calling microflow and also takes care of processing any functions the model called. Additionally, it stores the Usage object for a given deployment identifier.
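The action described above combines model invocation with function processing. Its general control flow resembles the following loop — a minimal plain-Java sketch, where all names (`ModelReply`, `Model`, `runWithFunctions`, the stub model) are hypothetical illustrations and not the actual GenAI Commons API:

```java
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch (not the GenAI Commons API): a chat-completion loop
// that re-invokes the model until it stops requesting function calls,
// executing each requested function in between.
public class FunctionCallLoopSketch {
    // A model reply carries either final text or the name/argument of a function to call.
    record ModelReply(String text, String functionName, String functionArg) {
        boolean wantsFunction() { return functionName != null; }
    }

    interface Model { ModelReply call(String input); }

    static String runWithFunctions(Model model, String userInput,
                                   Map<String, Function<String, String>> functions) {
        ModelReply reply = model.call(userInput);
        while (reply.wantsFunction()) {
            // Execute the function the model asked for and feed the result back.
            String result = functions.get(reply.functionName()).apply(reply.functionArg());
            reply = model.call(result);
        }
        return reply.text();
    }

    public static void main(String[] args) {
        // Stub model: first asks for the "lookup" function, then answers with its result.
        Model stub = new Model() {
            boolean asked = false;
            public ModelReply call(String input) {
                if (!asked) { asked = true; return new ModelReply(null, "lookup", "42"); }
                return new ModelReply("The answer is " + input, null, null);
            }
        };
        String out = runWithFunctions(stub, "question", Map.of("lookup", arg -> arg));
        System.out.println(out); // prints "The answer is 42"
    }
}
```

In the real module, the model call is a microflow and the functions are microflows registered on the request; the loop shape is the same.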
release-showcase-5.1.0
A page has been added for usage monitoring. It shows how to use the page, snippets, and logic from the Conversational UI module to gain insight into how many tokens were consumed during successful GenAI operations in a specified time period. Currently, the token monitor is only supported by the OpenAI Connector.
release-genaicommons-1.3.0
We have added operations that GenAI connector developers can use to store token usage data for monitoring purposes. This is a new requirement for GenAI connectors that adhere to the principles of GenAI Commons. For now, this only applies to chat completions (text generation) and embeddings operations.
If you are working on a customer app and want to show charts in your app at runtime for this usage data, you can use the page and logic from the Conversational UI module. To control whether usage data is stored, refer to the constant StoreUsageMetrics.
Additionally, we added a cleanup process that can be controlled using the constant Usage_CleanUpAfterDays (see in-model documentation for more details).
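Conceptually, the usage storage and cleanup described above work as in the following plain-Java illustration. `UsageMetricsSketch` and its fields are hypothetical stand-ins (mirroring the StoreUsageMetrics and Usage_CleanUpAfterDays constants), not the module's actual implementation:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration (not the GenAI Commons API): one usage record per
// successful call, gated by a StoreUsageMetrics-style flag, with a cleanup
// that drops records older than a configurable number of days.
public class UsageMetricsSketch {
    record UsageRecord(String deploymentId, int promptTokens, int completionTokens, Instant at) {}

    static final boolean STORE_USAGE_METRICS = true; // mirrors the StoreUsageMetrics constant
    static final long CLEAN_UP_AFTER_DAYS = 30;      // mirrors Usage_CleanUpAfterDays

    final List<UsageRecord> store = new ArrayList<>();

    void recordUsage(String deploymentId, int promptTokens, int completionTokens) {
        if (!STORE_USAGE_METRICS) return; // storing usage data is opt-in via the constant
        store.add(new UsageRecord(deploymentId, promptTokens, completionTokens, Instant.now()));
    }

    void cleanUp() {
        Instant cutoff = Instant.now().minus(CLEAN_UP_AFTER_DAYS, ChronoUnit.DAYS);
        store.removeIf(r -> r.at().isBefore(cutoff)); // drop records past retention
    }

    int totalTokens() {
        return store.stream().mapToInt(r -> r.promptTokens() + r.completionTokens()).sum();
    }

    public static void main(String[] args) {
        UsageMetricsSketch s = new UsageMetricsSketch();
        s.recordUsage("my-deployment", 120, 80);
        s.recordUsage("my-deployment", 50, 30);
        s.cleanUp();
        System.out.println(s.totalTokens()); // prints 280
    }
}
```

In the module itself the records are persisted entities and the cleanup runs as a scheduled process; this sketch only shows the gating and retention logic.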
release-conversationalui-1.4.0
We added UI components to visualize GenAI token consumption for monitoring purposes. Pages, snippets, and logic can be found in USE_ME under “Usage Monitoring”. The latest version of the GenAI Commons module is required; the latest version of the OpenAI Connector supports this feature.
An example can be found in the newest version of the GenAI Showcase App.
Finally, a styling bug for conversation starters in a horizontal gallery was fixed.
release-connector-3.3.0
The operations for chat completions and embeddings now store token usage data for every successful call, if enabled in GenAI Commons. This can be used for usage monitoring purposes. Updating the GenAI Commons module is required for this connector version to compile.
To display usage data, pages and logic were made available in the Conversational UI module. See the GenAI Showcase App for an example implementation.