feat: Support langchain transformer on fabric #2036
Conversation
Hey @lhrotk 👋! We use semantic commit messages to streamline the release process. Examples of commit messages with semantic prefixes:
To test your commit locally, please follow our guide on building from source.
Summary by GPT-4
This code change adds support for running the LangchainTransform on the Synapse internal platform. It checks whether the code is running on the Synapse internal platform and sets the default URL accordingly. If the URL is not set and the code is running on Synapse internal, it initializes a personalized session using OpenAIPrerun. Otherwise, it uses the provided subscription key, URL, and API version to configure the OpenAI API.
Here's a summary of the changes:
- Import `running_on_synapse_internal` from `synapse.ml.core.platform`.
- Check if running on the Synapse internal platform using `running_on_synapse_internal()`.
- If running on the Synapse internal platform, set the default URL using `get_fabric_env_config().fabric_env_config.ml_workload_endpoint + "cognitive/openai"`.
- In the `_transform` method, check whether the code is running on the Synapse internal platform and the URL is not set.
- If true, initialize a personalized session using `OpenAIPrerun(api_base=self.getUrl()).init_personalized_session(None)`.
- Otherwise, configure the OpenAI API using the provided subscription key, URL, and API version.
This change allows LangchainTransform to work seamlessly both on Synapse internal platform and other platforms by automatically configuring the appropriate settings based on the environment it's running in.
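The branching described above can be sketched as follows. This is a hypothetical, simplified stand-in for the PR's actual code: the real helpers (`running_on_synapse_internal`, `get_fabric_env_config`, `OpenAIPrerun`) live in the `synapse.ml` packages, and `configure_openai` / `on_fabric` here are made-up names for illustration.

```python
def configure_openai(url, subscription_key, api_version, on_fabric):
    """Sketch of the environment-based configuration described above.

    `configure_openai` and `on_fabric` are hypothetical names; the real
    code checks running_on_synapse_internal() and reads the Fabric env
    config instead of taking a flag.
    """
    if on_fabric and url is None:
        # On Fabric with no URL set: fall back to the internal workload
        # endpoint, mirroring get_fabric_env_config().fabric_env_config
        # .ml_workload_endpoint + "cognitive/openai", and use a
        # personalized session instead of a user-supplied key.
        return {"mode": "fabric-internal-session"}
    if url is None or subscription_key is None:
        # Off Fabric, the caller must supply its own OpenAI settings.
        raise ValueError("subscription key and url are required off Fabric")
    return {"mode": "user-config", "url": url,
            "key": subscription_key, "api_version": api_version}
```

The key design point is that the environment check happens first, so users on Fabric get a working default without configuring anything.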
Suggestions
The changes in this PR seem to be well-implemented and do not require any suggestions for improvement.
Quick question for ya
self.useFabricInternalEndpoints = Param(self, "useFabricInternalEndpoints", "use internal openai endpoints when on fabric")
self.running_on_synapse_internal = running_on_synapse_internal()
if running_on_synapse_internal():
    from synapse.ml.fabric.service_discovery import get_fabric_env_config
    self.fabric_env_config = get_fabric_env_config()
Is there any way to do this more implicitly, without an explicit new param that users need to set? I'm thinking that if a user doesn't set the url and it's on Fabric, it defaults to the Fabric one if it can find one; otherwise it throws an error.
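The reviewer's suggestion amounts to a small resolver: honor an explicit URL, otherwise fall back to the Fabric endpoint when one can be discovered, else fail fast. A minimal sketch, with `resolve_url` and its parameters as hypothetical names (the real change lives inside the transformer's Python code):

```python
def resolve_url(user_url, on_fabric, fabric_endpoint):
    """Implicit default for the OpenAI URL, per the review suggestion.

    All names here are illustrative; `fabric_endpoint` stands in for the
    ml_workload_endpoint discovered via get_fabric_env_config().
    """
    if user_url is not None:
        # An explicitly configured URL always wins.
        return user_url
    if on_fabric and fabric_endpoint is not None:
        # No URL set, but we are on Fabric and found an endpoint:
        # default to the internal cognitive/openai route.
        return fabric_endpoint + "cognitive/openai"
    # Neither a user URL nor a discoverable Fabric endpoint: fail fast.
    raise ValueError("no OpenAI url set and no Fabric endpoint discovered")
```

This removes the need for a user-facing `useFabricInternalEndpoints` param, since the fallback is driven entirely by the environment.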
I get your point; it is possible to pass only the OpenAI `api_base`. I will make the change.
Thanks boss
Force-pushed from ae0e66f to c3cf48e
/azp run

Azure Pipelines successfully started running 1 pipeline(s).
Codecov Report
@@            Coverage Diff             @@
##           master    #2036      +/-   ##
==========================================
- Coverage   87.03%   87.02%   -0.01%
==========================================
  Files         306      306
  Lines       16063    16063
  Branches      852      852
==========================================
- Hits        13980    13979       -1
- Misses       2083     2084       +1
/azp run

Azure Pipelines successfully started running 1 pipeline(s).
* support langchain transformer on fabric
* avoid additional param
* format code

Co-authored-by: cruise <[email protected]>
Related Issues/PRs
N/A
What changes are proposed in this pull request?
Make Langchain transformer use fabric internal endpoints by default
How is this patch tested?
Manually tested on fabric
Does this PR change any dependencies?
Does this PR add a new feature? If so, have you added samples on website?