In this video, we are going to test out the AutoGen + Ollama + LiteLLM combination.
We will load up two LLMs using LiteLLM and Ollama and power up our AutoGen framework with them.
Ollama: ollama.ai/
LiteLLM: litellm.ai/
AutoGen: microsoft.github.io/autogen/
Github Code: github.com/PromptEngineer48/A...
Let’s do this!
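The idea in the video is to point AutoGen at two local models served through LiteLLM proxies instead of the OpenAI API. A minimal sketch of what those endpoint entries look like is below; the ports, model names, and the `make_config` helper are illustrative assumptions, not the exact code from the video (see the GitHub link above for that).

```python
# Hypothetical sketch: AutoGen-style config_list entries pointing at two
# local LiteLLM proxies, one per Ollama model. Ports are assumptions --
# use whatever `litellm --model ollama/...` prints when it starts.

def make_config(model: str, port: int) -> dict:
    """Build one OpenAI-compatible endpoint entry for AutoGen's config_list."""
    return {
        "model": model,
        "base_url": f"http://0.0.0.0:{port}",  # LiteLLM proxy address
        "api_key": "NULL",  # local Ollama models need no real key
    }

# Two local LLMs: Mistral for one agent, Neural-Chat for another.
config_list_mistral = [make_config("ollama/mistral", 8000)]
config_list_neural = [make_config("ollama/neural-chat", 8001)]
```

Each list would then be passed to an agent via `llm_config={"config_list": config_list_mistral}`. Note that newer AutoGen releases use the `base_url` key, while older ones used `api_base`; check the version you installed.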
Join the AI Revolution!
#ai #autogen #mistral #neural-chat #ollama #litellm #assistantAPIs #GPTs #llm_selector #auto_llm_selector #localllms #github #streamlit #langchain #qstar #openai #webui #python #llm #largelanguagemodels
CHANNEL LINKS:
☕ Buy me a coffee: ko-fi.com/promptengineer
🧛♂️ Join my Patreon: / promptengineer975
❤️ Subscribe: / @promptengineer48
💀 GitHub Profile: github.com/PromptEngineer48
🔖 Twitter Profile: / prompt48
🤠 Join this channel to get access to perks:
/ @promptengineer48
TIME STAMPS:
0:00 - Intro
3:58 - Conda Environment Creation
5:38 - Install LiteLLM
5:52 - Install AutoGEN
6:07 - Start Mistral
6:43 - Start Neural-Chat
8:06 - Explanation of the Code
10:42 - Running the Code
11:42 - Future Prospects
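The timestamped steps above boil down to a short terminal session. This is a sketch under assumptions: the environment name, Python version, and ports are illustrative, and the two `litellm` commands each need their own terminal since they run as foreground servers.

```shell
# Sketch of the setup walked through in the timestamps (names/ports assumed).

# 3:58 - Conda environment creation
conda create -n autogen python=3.11 -y
conda activate autogen

# 5:38 / 5:52 - Install LiteLLM and AutoGen (AutoGen's PyPI package is pyautogen)
pip install litellm pyautogen

# 6:07 - Start Mistral behind a LiteLLM proxy (terminal 1)
ollama pull mistral
litellm --model ollama/mistral --port 8000

# 6:43 - Start Neural-Chat behind a second proxy (terminal 2)
ollama pull neural-chat
litellm --model ollama/neural-chat --port 8001
```

Once both proxies are up, the AutoGen script from the GitHub repo can talk to each model through its local OpenAI-compatible endpoint.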
🎁Subscribe to my channel: / @promptengineer48
If you have any questions, comments or suggestions, feel free to comment below.
🔔 Don't forget to hit the bell icon to stay updated on our latest innovations and exciting developments in the world of AI!