User/assistant message pairing broken
#3 · opened by tyler94
$ goose
starting session | provider: ollama model: devstral2:latest
session id: 20251212_2
working directory: /x/src/functest2
goose is running! Enter your instructions, or try asking what goose can do.
Context: ░░░░░░░░░░ 0% (0/128000 tokens)
( O)> Look at crates/functest2/. We need to make sure that we don't call await in loops. We should always use join_all instead for concurrency.
─── analyze | developer ──────────────────────────
path: crates/functest2
focus: await
follow_depth: 0
max_depth: 0
─── analyze | developer ──────────────────────────
path: crates/functest2
focus: async
follow_depth: 0
max_depth: 0
force: true
Ran into this error: Server error: HTTP 500: No response body received from server.
Please retry if you think this is a transient or recoverable error.
The problem was in the logic that checks that user/assistant messages are paired (i.e. that they alternate). Here's my ChatGPT chat that produced a working solution:
https://chatgpt.com/share/693bade4-6004-8007-8077-91482e9a4dad
This also affects opencode. If the model is not natively capable of handling non-alternating messages, could the template be updated to merge adjacent messages from the same participant? If the model can handle them, then the template is simply incorrect.
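To illustrate the merging idea: a pre-processing pass (or template logic) could fold adjacent same-role messages into one turn before rendering, so the strict-alternation check never fires. A minimal sketch, assuming a simple `{"role", "content"}` message shape — this is not goose's or opencode's actual data model:

```python
def merge_adjacent_messages(messages):
    """Normalize a chat history so no two adjacent messages share a role.

    Assumes each message is a dict with "role" and "content" keys
    (a hypothetical shape for illustration, not goose's real types).
    """
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Same participant twice in a row: fold into the previous turn.
            merged[-1]["content"] += "\n\n" + msg["content"]
        else:
            merged.append({"role": msg["role"], "content": msg["content"]})
    return merged


history = [
    {"role": "user", "content": "Look at crates/functest2/."},
    {"role": "user", "content": "Avoid await in loops; use join_all."},
    {"role": "assistant", "content": "Analyzing the crate now."},
]
print(merge_adjacent_messages(history))
```

The same merge could alternatively live inside the model's chat template, but doing it once in the client keeps the template simple and works for every model that requires alternating roles.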