Add Text Tokenizer #47
Comments
Is there anything preventing you from using GPT-3-Encoder-Sharp with this package?
Honestly, I would rather OpenAI add an endpoint specifically for this. They have their own tokenizer utility page that gives you an idea of how many tokens you'll use, but the encoder differs per model. I may not pick up this issue, only because it's a moving target and there are other NuGet packages that can handle this task.
Even OpenAI recommends a third-party package called gpt-3-encoder.
We don't have to worry about changing/evolving encoder logic, because the original encoder released by OpenAI 4 years ago has had no changes to its logic since. The encoding logic of GPT-2 and GPT-3 is the same.
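The shared GPT-2/GPT-3 scheme mentioned above is byte-pair encoding (BPE): repeatedly merge the adjacent pair of symbols with the best merge rank until no learned merge applies. Here is a minimal illustrative sketch; the merge table below is a toy example, not the real GPT-2 vocabulary.

```python
# Toy merge-rank table (hypothetical; real tables have ~50k entries).
# Lower rank = merged earlier during training = applied first here.
TOY_MERGE_RANKS = {
    ("l", "o"): 0,
    ("lo", "w"): 1,
    ("e", "r"): 2,
}

def bpe_encode(word):
    """Greedily apply the lowest-ranked merge until none applies."""
    parts = list(word)
    while len(parts) > 1:
        # Rank every adjacent pair; pairs not in the table get infinity.
        pairs = [(TOY_MERGE_RANKS.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(parts, parts[1:]))]
        rank, i = min(pairs)
        if rank == float("inf"):
            break  # no learned merge applies anymore
        parts = parts[:i] + [parts[i] + parts[i + 1]] + parts[i + 2:]
    return parts

print(bpe_encode("lower"))  # ['low', 'er']
```

Because the merge table is frozen with the model, the encoding logic itself is stable: a new model means a new table (and possibly new pre-tokenization rules), not an evolving algorithm.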
But the encoder for GPT-4 is different.
Not sure about GPT-4 (but I don't think so); my point is that it won't change or evolve. If GPT-4 has a different encoding, we can write one-time encoding logic for GPT-4, and it will never change.
That's not what I heard.
In either case, like I said, I won't be picking up this task, but PRs are always welcome.
Sure, I will do it over the weekend.
I still don't understand why the package you referenced before isn't a sufficient substitute?
I just want one OpenAI package to do everything related to OpenAI, that's all. It's up to you, feel free to close the issue. 🙂
I'll leave it open if you plan to open a PR, I was just curious more than anything.
Here's another good reference: https://github.com/aiqinxuancai/TiktokenSharp. I like that they're also pulling from tiktoken.
@StephenHodgson I referred to OpenAI's implementation; they also pull tokens from the blob. In their code, I found one interesting comment: "# TODO: these will likely be replaced by an API endpoint". Now my question is: are you still open to having our own custom implementation, or should we wait for the API endpoint?
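For context on what "pulling tokens from the blob" involves: the tiktoken vocabulary files (e.g. `cl100k_base.tiktoken`) are, to my understanding, plain text where each line holds a base64-encoded token byte sequence followed by its integer rank. A minimal sketch of parsing that format, using fabricated sample data rather than a real download:

```python
import base64

def parse_tiktoken_blob(text):
    """Parse 'base64-bytes rank' lines into a bytes -> rank mapping."""
    ranks = {}
    for line in text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        token_b64, rank = line.split()
        ranks[base64.b64decode(token_b64)] = int(rank)
    return ranks

# Fabricated two-line sample: "hello" -> 0, "world" -> 1.
sample = "aGVsbG8= 0\nd29ybGQ= 1\n"
print(parse_tiktoken_blob(sample))  # {b'hello': 0, b'world': 1}
```

A .NET port would do the same: fetch the blob once, cache it locally, and build the rank table from it, which is presumably what TiktokenSharp does under the hood.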
Nice, looks like they took my suggestion seriously |
I guess it doesn't hurt to do it. And then when the API becomes available replace it. |
@logankilpatrick any internal support on adding API for tokenizer? |
I think it's best to use either the Microsoft version or the slightly faster Tiktoken for the time being if optimization is needed. |
I agree, I think the msft package should be easily integrable. I may consider adding it as a dependency. |
I recommend using SharpToken because it is the fastest, with the lowest memory consumption, thanks to my latest PR to that repository. Benchmark results:
Feature Request
Add a way to tokenize text so that it can be passed as an input (like logit_bias) for models
Is your feature request related to a problem? Please describe.
I am trying to use OpenAI APIs like completion. There is an option to pass "logit_bias", but currently there is no way to generate the proper tokens of a text in order to pass it.
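To make the use case concrete: `logit_bias` is a JSON object mapping token-id strings to a bias value in [-100, 100], where -100 effectively bans a token and 100 effectively forces it. A tokenizer is needed to turn the target text into those ids. A minimal sketch of building such a request payload (the token ids below are hypothetical placeholders, not real encodings):

```python
import json

# Hypothetical ids; in practice a tokenizer (e.g. GPT-3-Encoder-Sharp
# or tiktoken) would produce these from the text you want to suppress.
banned_token_ids = [464, 262]

payload = {
    "model": "text-davinci-003",
    "prompt": "Say something",
    # Keys must be token ids as strings; -100 effectively bans each token.
    "logit_bias": {str(tid): -100 for tid in banned_token_ids},
}
print(json.dumps(payload["logit_bias"]))  # {"464": -100, "262": -100}
```

Without a tokenizer in the package, there is no way to compute the keys of that `logit_bias` map, which is the gap this feature request is about.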
Describe the solution you'd like
.Net implementation of OpenAI's Tokenizer
Describe alternatives you've considered
There is an existing MIT-licensed NuGet package called GPT-3-Encoder-Sharp that does it.