* Switches to ellmer for all integration with LLMs. This effectively removes the direct integrations previously used for OpenAI, Databricks, and LlamaGPT-chat; the package now integrates only with whatever backends ellmer supports.
* The Shiny app now uses the streaming functionality from ellmer instead of the more complex, and error-prone, background process.
* Fixes how errors from the model endpoint are displayed when used in a notebook or in the Shiny app.
* Fixes how errors from OpenAI are parsed and processed. This should make it easier for users to determine the source of a downstream issue.
* Adds model to defaults.
* Improves token discovery.