Prompting up your backend (APIs)
The initial prompt is key when building any backend. A backend in Databutton consists of Python APIs.
While building an API, we have noticed that defining a well-defined argument model works best. It is therefore essential to clearly specify the inputs and outputs of the API.
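As a hypothetical illustration of such an argument model, the sketch below defines request and response schemas with Pydantic, which is what FastAPI uses for input/output validation. The endpoint name and fields are made up for this example, not part of Databutton itself.

```python
from pydantic import BaseModel

# Illustrative request model for a text-summarization endpoint:
# a required input plus an optional one with a default.
class SummarizeRequest(BaseModel):
    text: str
    max_words: int = 100

# Illustrative response model: the shape of what the API returns.
class SummarizeResponse(BaseModel):
    summary: str
```

Spelling out both models in the prompt (field names, types, defaults) gives the agent an unambiguous contract to build against.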
The Databutton agent will install the necessary packages it needs to do its job, and will ask you for any required API keys (such as an OpenAI key). Your API keys are stored as secrets in Databutton, leveraging Google's secret store behind the scenes.
The initial prompt, with a short description of the LLM app, gives the agent enough to plan further; in this case it asks for an API key (an OpenAI API key).
On receiving the API key, the agent proceeds to write and, when necessary, debug the code, ultimately building the FastAPI endpoint.
Databutton is trained on the most common AI stacks, for instance OpenAI, LangChain, and Cohere. If you have specific suggestions, please let us know; we can easily include them.
It uses Databutton's own SDK to fetch the API key from storage.
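For example, inside a Databutton app the generated code can read the key via the SDK's secrets helper. The `db.secrets.get` call below is assumed from Databutton's SDK; the environment-variable fallback is only there so the sketch also runs outside Databutton.

```python
import os

def get_openai_key() -> str:
    # Inside a Databutton app, secrets are fetched via the SDK
    # (db.secrets.get is assumed from Databutton's SDK docs).
    # Outside Databutton, fall back to an environment variable.
    try:
        import databutton as db  # only available inside Databutton apps
        return db.secrets.get("OPENAI_API_KEY")
    except ImportError:
        return os.environ.get("OPENAI_API_KEY", "")
```

Keeping the key out of the source code this way means it never ends up in your app's code or version history.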
Databutton has access to the internet and can perform real-time web searches and conduct research on the relevant results!
You can trigger this functionality by writing a prompt with "Research about it ...".
Passing URLs of docs works well for helping Databutton gather relevant information.
Databutton also tests the generated API. If any bugs are found, Databutton's "Debugging Tool" analyses the error logs to fix them.
The console is the best place to monitor any information related to the API.
Using print statements can help dump output as well; for example, print(llm_response).
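For instance, a quick way to inspect what an LLM call returned is to print the raw response before extracting the field you need. The response dict below is made up purely to illustrate the technique.

```python
import json

# A trimmed, made-up LLM response used only to illustrate print-debugging.
llm_response = {"choices": [{"message": {"content": "Hello!"}}]}

# Printing the raw response makes it visible in the Databutton console.
print(json.dumps(llm_response, indent=2))

# Then drill into the part you actually need.
content = llm_response["choices"][0]["message"]["content"]
print("content:", content)
```

Once the shape of the response is clear in the console, the print statements can be removed.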
If an error persists and is hard to debug, always feel free to reach us via the Intercom bubble!