Issues: ollama/ollama

ollama save feature [feature request]
#4785 opened Jun 2, 2024 by CorollaD
Weird output with ordinary setting [bug]
#4784 opened Jun 2, 2024 by JoonSumisu
ollama not show my model. [bug]
#4781 opened Jun 2, 2024 by tuantupharma
Pls add Radeon VII [feature request]
#4780 opened Jun 2, 2024 by MrSteelRat
Support for jina-embeddings-v2-base-zh [model request]
#4778 opened Jun 2, 2024 by wwjCMP
Error: llama runner process has terminated: exit status 1 [bug]
#4775 opened Jun 2, 2024 by BAK-HOME
Ignoring env, being weird with env [bug]
#4771 opened Jun 1, 2024 by RealMrCactus
server.log grows indefinitely on windows [bug]
#4770 opened Jun 1, 2024 by dhiltgen
Model response corruption and leaking data between session. [bug]
#4767 opened Jun 1, 2024 by MarkWard0110
ollama stop [id of running model] [feature request]
#4764 opened Jun 1, 2024 by mrdev023
Add this web app to the list of apps in the README [feature request]
#4758 opened May 31, 2024 by greenido
FROM is not recognized [bug]
#4753 opened May 31, 2024 by EugeoSynthesisThirtyTwo
Multi-GPU and batch management [feature request]
#4752 opened May 31, 2024 by LaetLanf
Garbage output running llama3 GGUF model [bug]
#4750 opened May 31, 2024 by DiptenduIDEAS
Custom-llama issue [bug]
#4748 opened May 31, 2024 by Ascariota
sensitivity to slow or unstable internet [bug]
#4739 opened May 31, 2024 by logiota
Unable to Change Ollama Models Directory on Linux (Rocky9) [bug]
#4732 opened May 30, 2024 by pykeras
llama3:8b-instruct performs much worse than llama3-8b-8192 on groq [bug]
#4730 opened May 30, 2024 by mitar
dolphin-2.9.2-mixtral-8x22b [model request]
#4729 opened May 30, 2024 by psyv282j9d