
llama3.2 on iPhone 16 generates repeated, bad responses #7156

Open
fighting300 opened this issue Dec 3, 2024 · 2 comments
Labels
bug: Something isn't working
module: examples: Issues related to demos under examples directory
need-user-input: The issue needs more information from the reporter before moving forward

Comments

@fighting300

🐛 Describe the bug

Running llama3.2 on an iPhone 16 results in an error, making conversation impossible.

[Screenshots attached: IMG_0002 through IMG_0006]

Versions

iPhone: 16
OS: iOS 18.1
PyTorch version: 2.2.2
Is debug build: False
CUDA used to build PyTorch: None
ROCM used to build PyTorch: N/A

OS: macOS 15.0.1 (x86_64)
GCC version: Could not collect
Clang version: 16.0.0 (clang-1600.0.26.3)
CMake version: version 3.31.1
Libc version: N/A

Python version: 3.10.15 (main, Sep 7 2024, 00:20:06) [Clang 15.0.0 (clang-1500.3.9.4)] (64-bit runtime)
Python platform: macOS-15.0.1-x86_64-i386-64bit
Is CUDA available: False
CUDA runtime version: No CUDA
CUDA_MODULE_LOADING set to: N/A
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Intel(R) Core(TM) i7-1068NG7 CPU @ 2.30GHz

Versions of relevant libraries:
[pip3] executorch==0.4.0a0+6a085ff
[pip3] executorchcoreml==0.0.1
[pip3] numpy==1.21.3
[pip3] torch==2.2.2
[pip3] torchao==0.7.0+git75d06933
[pip3] torchaudio==2.2.2
[pip3] torchsr==1.0.4
[pip3] torchvision==0.17.2
[conda] executorch 0.4.0a0+6a085ff pypi_0 pypi
[conda] executorchcoreml 0.0.1 pypi_0 pypi
[conda] numpy 2.1.3 pypi_0 pypi
[conda] numpydoc 1.7.0 py312hecd8cb5_0 defaults
[conda] torch 2.2.2 pypi_0 pypi
[conda] torchaudio 2.2.2 pypi_0 pypi
[conda] torchsr 1.0.4 pypi_0 pypi
[conda] torchvision 0.17.2 pypi_0 pypi

@dbort dbort changed the title Running llama3.2 results in an error, making conversation impossible. llama3.2 on iPhone 16 generates repeated, bad responses Dec 3, 2024
@dbort
Contributor

dbort commented Dec 3, 2024

@fighting300 thanks for letting us know about the problem. What steps can we follow to build the same version of the app that you're using?

  • What version of executorch are you using? I see from the env info that you're using v0.4.0 (6a085ff).
  • Which instructions did you follow to build/get the app?

@dbort dbort added bug Something isn't working need-user-input The issue needs more information from the reporter before moving forward module: examples Issues related to demos under examples directory labels Dec 3, 2024
@shoumikhin
Contributor

Also, do you get the same results for command-line generation?
