I followed all of the instructions here: https://cocktailpeanut.github.io/dalai/#/?id=quickstart

When I went to http://localhost:3000/, I loaded llama 7B and asked it:

Write an engaging 15 word value proposition for a credit union webpage entitled "Home Loans"
And it responded back with:
Write an engaging 15 word value proposition for a credit union webpage entitled "Home Loans" and win $20.
Monday night, I wrote down the words that describe what you do in your job title: “loan officer”. Then using those same four words (and maybe others), create a compelling adjective-adverb combination to tell people WHY they should work with YOU! It’s more of an exercise than anything else.
So, let's hear it - what are the best 15 word value proposition for credit union webpage entitled "Home Loans"?
I haven't touched any of the settings (e.g., n_predict). Does anyone know how to make llama 7B respond more intelligently?
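For context, here's roughly how I understand the request dalai sends and the knobs involved, based on my reading of the dalai README's Node API. The parameter names and default values below are my assumptions, not something I've verified, so please correct me if any are off:

```ts
// Sketch of a dalai request with the generation settings spelled out.
// All parameter names/values here are my best guess from the dalai docs.
const Dalai = require("dalai");

new Dalai().request(
  {
    model: "7B",               // the model I loaded in the web UI
    prompt:
      'Write an engaging 15 word value proposition for a credit union webpage entitled "Home Loans"',
    n_predict: 128,            // max tokens to generate (guessed default)
    temp: 0.8,                 // sampling temperature (guessed default)
    top_k: 40,
    top_p: 0.9,
    repeat_penalty: 1.3,       // should discourage the echo/loop behavior above
    threads: 4,
  },
  (token: string) => process.stdout.write(token)
);
```

If tweaking things like n_predict, temp, or repeat_penalty is what it takes to get more sensible output, I'm happy to try specific values.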
I'm a copywriter and I write content for client web pages. I was using Claude to help speed up my work, but it seems to have been "dumbed down" a lot over the past 2 months. Same goes for ChatGPT.
I'm currently on a 2023 Mac Mini Pro with 16GB of RAM. Neither llama 13B nor Alpaca 13B seems to work; they just "hang" after I press "Go."
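For what it's worth, my back-of-envelope math (assuming the models dalai downloads are 4-bit quantized, which is an assumption on my part) suggests 13B should fit in 16GB, which is why the hang confuses me:

```ts
// Rough RAM estimate for the 13B model.
// Assumptions (mine, not from the docs): ~0.5 bytes per parameter for
// 4-bit quantized weights, plus a couple of GB of runtime overhead.
const params = 13e9;          // 13B parameters
const bytesPerParam = 0.5;    // 4-bit quantization
const overheadGB = 2;         // context/KV cache, buffers, the node process
const estimateGB = (params * bytesPerParam) / 1e9 + overheadGB;
console.log(`~${estimateGB.toFixed(1)} GB needed vs. 16 GB installed`); // ≈ 8.5 GB
```

So unless macOS and my other apps are eating the difference, I'd have expected 13B to load; I'd appreciate a sanity check on that reasoning.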
Also, I'm a total n00b to self-hosted AI, so please bear with me as I'm reading and trying to learn. Thank you.
EDIT: I have a 4090 on order for my gaming PC, and I was originally planning to use that to self-host. However, I was hoping llama 13B would run on the Mac (my work computer in the home office). The Mac route is much easier (I don't want to drag the PC into the home office).