PromptCraft is now RAG-ready! Oh, and it also works with Llama2!
We've been hard at work building some of our most-requested features for PromptCraft and are excited to announce a powerful upgrade!
⢠š¤
PromptCraft is now RAG-ready! You can inject context by pointing PromptCraft to your RESTful #RAG endpoint. We'll retrieve your context using the input and help you execute, track, and evaluate your #LLM prompts!
⢠š¦
Want to iterate on your self-hosted prompts? You can now choose from multiple OpenAI models and even point your PromptCraft UI to a #Llama2 model that you host with Hugging Face TGI.
⢠š§
We've also upgraded our test suite - you don't need to code anymore!! You can simply write a .csv of your test suite containing expectations and one .csv with your test cases, and PromptCraft will run the tests and generate a report showing improvement. #automation
⢠āļø
Moar settings! - You can also now change more settings for your completion and chat completions models - top_k, temperature, etc. and also run tests on chat completions
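
For anyone wondering what "RAG-ready" looks like in practice, here is a minimal sketch of the kind of RESTful retrieval endpoint you could point PromptCraft at. The /retrieve route, the field names, and the use of FastAPI are all assumptions for illustration, not PromptCraft's required contract; any HTTP endpoint that accepts the input and returns context should fit.

```python
# Hypothetical RAG endpoint sketch (FastAPI).
# Route name and field names are illustrative, not PromptCraft's required schema.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class RetrieveRequest(BaseModel):
    input: str          # the prompt input sent by the caller

class RetrieveResponse(BaseModel):
    context: list[str]  # retrieved passages to inject into the prompt

@app.post("/retrieve", response_model=RetrieveResponse)
def retrieve(req: RetrieveRequest) -> RetrieveResponse:
    # Replace this stub with a real vector-store or keyword lookup.
    passages = [f"Example passage related to: {req.input}"]
    return RetrieveResponse(context=passages)
```

Run it with uvicorn main:app (if the file is main.py) and give PromptCraft the resulting URL as the RAG endpoint.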
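
If you go the Llama2 route, here is a rough sketch of calling a Hugging Face TGI server directly over its /generate API. The port, model checkpoint, and sampling values are assumptions, and it presumes TGI is already running with a Llama2 model loaded; PromptCraft's UI just needs that same base URL.

```python
# Query a Hugging Face TGI server hosting a Llama2 model.
# Assumes TGI is already running at http://localhost:8080 (port is an assumption).
import requests

payload = {
    "inputs": "Explain retrieval-augmented generation in one sentence.",
    "parameters": {
        "max_new_tokens": 64,
        "temperature": 0.7,
        "top_k": 50,
    },
}

resp = requests.post("http://localhost:8080/generate", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["generated_text"])
```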
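
On the no-code test suite: the sketch below shows the general idea of a CSV-driven run, with one file of test cases and one of expectations. The column names (case_id, input, expected_substring), the substring check, and the stubbed model call are assumptions for illustration; PromptCraft's actual schema may differ.

```python
# Illustrative sketch of a CSV-driven prompt test run.
# Column names and the substring check are assumptions, not PromptCraft's real format.
import csv

def load_rows(path: str) -> dict[str, dict]:
    with open(path, newline="") as f:
        return {row["case_id"]: row for row in csv.DictReader(f)}

cases = load_rows("test_cases.csv")           # columns: case_id, input
expectations = load_rows("expectations.csv")  # columns: case_id, expected_substring

def run_prompt(text: str) -> str:
    # Stand-in for the actual model call made by the test runner.
    return f"model output for: {text}"

passed = 0
for case_id, case in cases.items():
    output = run_prompt(case["input"])
    ok = expectations[case_id]["expected_substring"] in output
    passed += ok
    print(f"{case_id}: {'PASS' if ok else 'FAIL'}")

print(f"{passed}/{len(cases)} tests passed")
```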
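
Finally, on the extra settings: sampling parameters like temperature and top_k are passed through to whichever backend you use. As a point of reference, here is a minimal sketch of the same kind of settings on an OpenAI chat completion; note that the OpenAI API exposes top_p rather than top_k (top_k applies to TGI-hosted models), and the model name and values below are just examples.

```python
# Sketch: sampling settings applied to an OpenAI chat completion.
# Requires OPENAI_API_KEY in the environment; model and values are examples only.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize RAG in one sentence."}],
    temperature=0.2,
    top_p=0.9,
    max_tokens=64,
)
print(response.choices[0].message.content)
```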
We'd love to talk with anyone here who's interested and find out whether this is a viable tool for improving your PromptOps productivity.
Try out v0.3.1 now!