Q: “Is it really reasoning or with each prompt…”
It doesn’t remember just the prompt but the combination of prompts and outputs during the chat (its context window).
So you’re building up a chain of context that gradually narrows the statistical distribution of outputs (relevance) until it runs out of context memory or gets sidetracked. That’s why you must write prompts that not only increase the contextual precision but also correct its past inferences.
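The mechanics above can be sketched in a few lines. This is a toy illustration, not any real chat API: the “context window” is modeled as a running list of prompt/output turns, trimmed oldest-first once a (hypothetical) token budget is exceeded, and token counting is a crude word-count stand-in for a real subword tokenizer.

```python
MAX_TOKENS = 20  # hypothetical context limit, for illustration only


def count_tokens(text: str) -> int:
    # Real models use subword tokenizers; word count is a rough proxy.
    return len(text.split())


def trim_context(history: list[str], budget: int) -> list[str]:
    """Drop the oldest turns until the whole history fits the budget."""
    while history and sum(count_tokens(t) for t in history) > budget:
        history.pop(0)  # the oldest turn falls out of the window
    return history


# Each new turn (prompt or output) joins the history, and once the
# budget overflows the earliest turns are forgotten.
history: list[str] = []
for turn in [
    "user: define context window",
    "model: the text the model can attend to",
    "user: so it forgets earlier turns eventually",
    "model: yes once the window overflows",
]:
    history.append(turn)
    history = trim_context(history, MAX_TOKENS)

print(history)  # only the most recent turns that fit the budget remain
```

This is the “runs out of context memory” behavior in miniature: the first two turns have been evicted by the end, which is why later prompts must restate or correct anything important from early in the conversation.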
It’s no different from speaking with someone unfamiliar and trying to get them to understand something that may be unfamiliar to them. You’re trying to narrow the parameters (context) enough to remove ambiguity, so that agreement (understanding), deduction, induction, and even abduction (guessing) can occur.
Same thing. 😉
Reply addressees: @Nunnie3001 @xriskology
Source date (UTC): 2024-08-17 23:11:46 UTC
Original post: https://twitter.com/i/web/status/1824947257005023232
Replying to: https://twitter.com/i/web/status/1824940236540944877