Describe the bug
Sometimes Stop does not actually stop the generation, and I end up with multiple overlapping answers.
I'm using eca-emacs, so I'm not sure whether this bug belongs here or in that repo.
Example output:

```
Ervixplorce, e
Per cl'orompchestletratorare le spe agecgiiuficnge l'he, ho eventancora o alcuniall chiara coimenti da connec setassartoi `da_c:ategor
**i1zza. Tripi die`
eventi2. nFellase di caategorizz caziodonea:
- Me**ssa:ggi L'LLMut viene inenterte (da Terogalto per cegram, ompMletare atrix,inform azioEmnail) i ma→ dopo vencrificaanti (prfiirma?
- Torità, deriggera evedline, tnt pier proatmetiout) vie atggià orn(es. "mare il daeetitab
```
(Two instances are writing their own output simultaneously. If I press Stop, one of the two stops but the other continues.)
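This is just a hypothesis, but the behavior looks like two streaming requests ending up active at once, with the stop handler only tracking (and cancelling) the most recent one. A minimal Python/asyncio sketch of that failure mode (all names are hypothetical, this is not eca's actual code):

```python
import asyncio

async def stream_answer(name, buffer):
    # Hypothetical stand-in for one streaming LLM response
    # appending chunks into a shared chat buffer.
    for i in range(5):
        buffer.append(f"{name}:{i}")
        await asyncio.sleep(0.01)

async def main():
    buffer = []
    # Hypothesis: a duplicate/retried request leaves TWO streams running,
    # but only the latest task is tracked as "the current generation".
    first = asyncio.create_task(stream_answer("A", buffer))
    second = asyncio.create_task(stream_answer("B", buffer))
    current = second

    await asyncio.sleep(0.025)
    current.cancel()  # "Stop" pressed: cancels B only; A keeps writing
    await asyncio.gather(first, second, return_exceptions=True)
    return buffer

print(asyncio.run(main()))
```

Until the cancel, the two streams interleave their chunks (which would explain the scrambled text above); after "Stop", the untracked stream keeps going.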
To Reproduce
I don't have a simple sequence of steps to reproduce. It just happens randomly if I interrupt the generation.
Expected behavior
Stop should always stop the agent.
Additional context
I'm using a local llama.cpp server with this provider config:

```json
"providers": {
  "llama": {
    "api": "openai-chat",
    "url": "http://localhost:8080/v1",
    "fetchModels": true
  }
}
```