Hey! By working through this course from beginning to end, you will become familiar with AutoGen and be able to create your own AI agent workflows. Like, subscribe, and comment 😀 Have a good day coding!
@Imrankhan-jw1so
4 months ago
Oh no coding again?
@StynerDevHub
3 months ago
I love that your profile picture features both you and your wife. I'm subscribing not only because you're a great teacher but also because you proudly display the love of your life. 🥳❤🥳❤🥳❤🥳❤🥳❤
@TylerReedAI
3 months ago
@@StynerDevHub Thank you! I really appreciate your words! Yeah, it wouldn't be possible without her...that's the truth lol
@nestorcolt
4 months ago
You earned a new subscriber and loyal follower, gentleman. Great speech modulation and clarity.
@TylerReedAI
4 months ago
Thank you so much, appreciate this 🙌
@abhaymishra7991
4 months ago
I usually do not comment, but I am commenting because you are just awesome. I was confused for the last 3 days about LangGraph vs AutoGen, but now you have cleared up all my doubts with this video. Thanks.
@TylerReedAI
4 months ago
Hey thank you so much! I'm so glad to help clear things up 👍
@padhuLP
4 months ago
A wonderful beginner's tutorial. Thanks for providing the code so we can copy, paste, and test it quickly. Appreciate it.
@AndyPandy-ni1io
4 months ago
How did you get it working? I get so many issues with it running locally.
@TylerReedAI
4 months ago
@AndyPandy-ni1io what issues are you having?
@AndyPandy-ni1io
4 months ago
@@TylerReedAI The main thing is, when I build from scratch but want it to run a local LLM, do I still need the config.json, or do I just put the equivalent API settings in main.py?
@BenjaminK123
3 months ago
This is truly a fantastic video. I have been trying to learn CrewAI and I just get poor results every time, plus it's a lot more code. I think I am settling on AutoGen, and your video has been a huge help. Thank you, you just earned another sub :)
@jimg8296
5 months ago
Great summary. I've been looking for more examples of AutoGen. I'd love to see a comparison of CrewAI vs AutoGen and the code behind the test.
@sirishkumar-m5z
A month ago
Great beginner's course on AutoGen! A solid starting point for those new to AI development, and an excellent way to build foundational knowledge.
@TylerReedAI
A month ago
Thank you for this, I really appreciate it!
@brianvansteen4422
A month ago
Only 11 minutes in, and this is great! Have subscribed! Haha! When I executed the first script, it pip-installed the yfinance library!
@TylerReedAI
A month ago
Awesome! Good job 👍
@brianvansteen4422
A month ago
@@TylerReedAI Typo in the functions section: Euro to USD should be 1 * 1.1.
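For reference, the fix this comment describes can be sketched as a plain Python tool function. The function name and the fixed rate are illustrative, not the repo's exact code:

```python
# Hypothetical currency tool; assumes a fixed rate of 1 EUR = 1.1 USD
# as in the video, not a live exchange rate.
def currency_calculator(amount: float, base: str = "EUR", quote: str = "USD") -> str:
    rate = 1.1  # USD per EUR (illustrative)
    if (base, quote) == ("EUR", "USD"):
        value = amount * rate   # so 1 EUR -> 1 * 1.1 = 1.10 USD
    elif (base, quote) == ("USD", "EUR"):
        value = amount / rate
    else:
        raise ValueError("only EUR/USD supported in this sketch")
    return f"{value:.2f} {quote}"
```

A function like this is what would then be registered with an agent as a tool.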
@ahmedadly
5 months ago
Thank you Tyler, awesome as usual!
@TylerReedAI
4 months ago
Thank you!!
@robgruhl3439
4 months ago
This course is fantastic, thank you!!
@AndyPandy-ni1io
3 months ago
OK, sorry for the caps, I've mastered it now, THANKS :) The best thing you can do is talk a little slower though haha, it makes it hard to follow when it's all new. EDIT: Just want to say, anyone not getting this at first, do it a few times and suddenly the penny will drop. Just focus on why he's writing it and get the structure understood, then it's not so hard anymore :)
@TylerReedAI
3 months ago
Okay noted!
@bacloud
3 months ago
Great explanation and summary of AutoGen!
@TylerReedAI
3 months ago
Thank you 👍
@DancingFireStudios-yw7yv
3 months ago
Thank you for these videos. I would like to ask that a YouTuber do a video like this that goes straight into using a local LLM, so that those of us without extensive knowledge don't have to piece it together later and hope we have it all right. Because quite frankly, anyone looking to set this stuff up is more likely headed to the free version of most, if not all, of this.
@TylerReedAI
3 months ago
You're welcome! And I hear ya and understand. It seemed it might be easier to get something going with OpenAI, but I see your point!
@RetiredVet1
5 months ago
Sounds fantastic. I will take it as soon as I can.
@TylerReedAI
5 months ago
Awesome, thank you 👍
@ravishmahajan9314
A month ago
Thank you :) you are awesome
@TylerReedAI
A month ago
Thank you :)
@EigenA
A month ago
Great!
@BerkGoknil
4 months ago
Tyler, excellent video. I learned a lot. God bless you.
@TylerReedAI
4 months ago
Thank you, glad it was helpful!
@autonom8
A month ago
Quick note: when I copied the code from your GitHub repo, it had human_input_mode="ALWAYS". I didn't notice that, and the code completed and asked me for input. I then noticed in the tutorial that you used human_input_mode="NEVER". I made this change and was able to get the agents to "auto run". Perhaps others might run into this issue as well. Thanks for the tutorial!
@TylerReedAI
A month ago
Thanks for this! I hope this helps somebody else as well
@kks2105
3 months ago
Thanks for this awesome tutorial. Looks like Llama 70B has some weird issue when trying the sequential chat!
@TylerReedAI
2 months ago
You are welcome! Oh really? That's good to know, thank you for the testing! Hopefully they get better lol
@GES1985
3 months ago
I saw a video that had little game-dev agents working together. I want to make a game in Unreal Engine using little AI agents as helpers, but I'm not entirely convinced that AI agents are all the way there yet(?). What can be done in this regard, to your knowledge? Like, I need one agent to interact with me as my liaison to the other agents, to help prioritize which agents operate and do tasks; one to give screen-reading/keyboard/mouse control to, to operate some specific programs (Unreal, browser, etc.); one to scrape websites for data; one to compile that data into tables; one that can learn Unreal Engine; one that codes in multiple languages; one for front end; one for back end; one to operate local AI image generation on my workstation to make 2D pictures for inventory item sprites and UI design iterations for me to pick through; etc.
@ravishmahajan9314
A month ago
I don't have OpenAI keys as they are costly, but we have Cohere & Gemini APIs offered free of cost. So it would be great if we could get the same configurations via Cohere & Gemini so that more people can practice freely.
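For anyone wanting to try this: a config_list can point at an OpenAI-compatible endpoint instead of OpenAI itself. This is a hedged sketch; the Gemini model name and base_url are assumptions current at the time of writing, so verify them against Google's documentation:

```python
# Hypothetical config for Gemini's OpenAI-compatible endpoint.
# Model name, base_url, and the placeholder key are assumptions.
config_list = [{
    "model": "gemini-1.5-flash",
    "api_key": "YOUR_GEMINI_API_KEY",  # free-tier key from Google AI Studio
    "base_url": "https://generativelanguage.googleapis.com/v1beta/openai/",
}]
llm_config = {"config_list": config_list, "cache_seed": None}
```

The same shape should work for any provider that exposes an OpenAI-compatible API, by swapping model, api_key, and base_url.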
@georgewestbrook4512
4 months ago
Amazing tutorial, very clear and packed full!
@dheerajsachdeva5735
A month ago
Another observation: if I simply state the user query as "who are you and what is your purpose?", it starts initiate_chat() with a couple of agents and they talk about life and AI, etc., whereas it should be intelligent enough to know that the question doesn't really require a swarm of agents to respond to Hi / Hello / Who-are-you type questions (which users invariably ask when starting a chat). How can we address these trivial queries within this framework and not spawn unnecessary agents and a group chat?
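One way to handle this outside the framework is a small gate in front of initiate_chat: answer trivial greetings directly and only spawn the group chat for real tasks. A minimal sketch; the keyword list and function name are illustrative, not AutoGen API, and the naive substring match would need hardening for production:

```python
# Hypothetical pre-router: decide whether a query needs the agent swarm
# or can be answered with a canned reply.
TRIVIAL_PATTERNS = ("hi", "hello", "who are you", "what is your purpose")

def needs_agents(query: str) -> bool:
    q = query.lower().strip("?!. ")
    # Naive substring check; a real router might use a cheap LLM call instead.
    return not any(p in q for p in TRIVIAL_PATTERNS)
```

Only when needs_agents(...) returns True would you call initiate_chat with the full group.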
@thepresistence5935
19 days ago
This is awesome stuff. But I really recommend you guys check the documentation :)
@TylerReedAI
16 days ago
Yes, documentation always, especially with how often it gets updated.
@gokusaiyan1128
3 months ago
Is function calling the same as adding skills to agents? If not, can you make a video on adding skills to agents in AutoGen?
@TylerReedAI
3 months ago
Yeah, it's the same idea: give them tools to execute some actions. But yes, I plan on making videos on adding skills with AutoGen Studio.
@RetiredVet1
4 months ago
I don't see the Reddit URL you mention at 1:20:28. I tried to see the URL, but I can't make it out when I zoom in on the video.
@TylerReedAI
4 months ago
Fixed it, thank you!
@PrinzMegahertz
4 months ago
Thank you for this excellent introduction! I have one question: I would like to have two agents perform an interview with a human on a certain topic. The first agent should ask the questions, while the second agent should reflect on their understanding of the topic and decide whether additional messages are needed. This seems like a good case for a nested chat. However, the nested chat seems to be bound to the number of turns you define at the beginning. Is there a way to have the nested agent decide when to finish the interaction?
@TylerReedAI
4 months ago
Hey, I'm glad it has helped, and thank you! So yeah, you can determine how many max_turns a chat could have in the nested chat. It is sequential, but for that you may just need to say something in the prompt of each agent. For instance, the AssistantAgent could say, "... When the task is done, reply TERMINATE". Then the UserAgent checks for that in the termination message.

    res = user.initiate_chats([
        {"recipient": assistant_1, "message": tasks[0], "max_turns": 1, "summary_method": "last_msg"},
        {"recipient": assistant_2, "message": tasks[1]},
    ])

Here, in this example, you can increase the max turns where the user initiates a chat with another assistant. I get what you're saying, and I think the answer is... no, not exactly. The closest would be with the prompting. Hope this helps; if it didn't, let me know!
@PrinzMegahertz
4 months ago
@@TylerReedAI Thank you very much, I'll give it a try!
@srikanthpvr2856
2 months ago
@tylerreedAI I have a use case at work where I need to load an xlsx file with dealer data. Then, based on a user question with year and month for a particular dealer, I need to calculate the 12-month rolling average of demand (including the current month) for each part of that dealer, plus the previous month's 12-month rolling average, then find the difference between the current and previous rolling averages and come up with a percentage difference in demand. If the difference is 10% or more, create a new file with those entries. I have it kind of working with a 3-agent group chat for all dealer data, but when I try to add filters like year, month, dealer, or part number, it falls apart. I would love your input to get it fixed, if you are willing to help. This would prove to your subscribers that it can work in a real scenario rather than just hello-worlds. Thanks.
@yumaheymans680
3 months ago
legend
@TylerReedAI
3 months ago
Thank you 🙏
@steveknows6126
4 months ago
Thanks Tyler. I see you suggested going with the OAI config instead of a .env, and they both appear to do the same thing. What's the difference?
@TylerReedAI
4 months ago
Hey, yeah, there really is no functional difference; it's just how they get the properties. You could even just import os and then say os.environ.get("OPENAI_API_KEY"), something like that, and you would have that set in your configuration via the environment. The OAI JSON is just the way I like to do it.
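The os.environ route mentioned here can look like this; the placeholder key is illustrative, and in practice you would export OPENAI_API_KEY in your shell instead of relying on the fallback:

```python
import os

# Fall back to a placeholder so the sketch runs even without a real key set.
api_key = os.environ.get("OPENAI_API_KEY", "sk-placeholder")

# Equivalent to what the OAI_CONFIG_LIST JSON file would provide.
config_list = [{
    "model": "gpt-3.5-turbo",
    "api_key": api_key,
}]
```

Either approach ends up in the same config_list structure that the agents consume.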
@jcdenton1664
5 months ago
This is excellent! Thank you for your efforts in bringing this tutorial out. Can I ask, can we add PDFs to agents? Like, ask agents to digest PDFs at particular points in the workflow and contribute to the discussion based on what they learn there?
@TylerReedAI
4 months ago
Hey, thank you, and absolutely you can. This would be using RAG. I will have a video soon on how to do just this!
@FaithfulStreaming
3 months ago
How can I do it in AutoGen Studio? I am in the conda environment but it doesn't run on my desktop.
@TylerReedAI
3 months ago
So once you have it installed with pip install autogenstudio, run it like this: autogenstudio ui --port 8080. Then it will show a localhost URL in the terminal.
@ravishmahajan9314
A month ago
In order to run Phi-2 locally, what are the hardware requirements? I have 8GB of RAM with an i3 processor.
@TylerReedAI
A month ago
That may be fine; I only have 8GB of RAM as well. The processor may be an issue, but just try it out and see how long a simple response like "what is 2 + 2" takes. Let me know!
@varunsakhuja19
3 months ago
Hey Tyler! Amazing video. I am non-technical; I just wanted to know if I can use Jupyter Notebook instead of PyCharm. If yes, do I need to create a separate JSON file to call the OpenAI API key like you did for the two-way chat? Thanks
@TylerReedAI
2 months ago
Thank you! And yes, you can absolutely use Jupyter Notebook instead of PyCharm. You do not need to create a separate file; you can just add the model and API key to the config list directly. If you need help, email me and I can give you sample code! tylerreedytlearning@gmail.com
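The inline config Tyler describes for a notebook cell might look like this; the key is a placeholder, and the values are illustrative:

```python
# Inline config for Jupyter: no separate OAI_CONFIG_LIST JSON file needed.
llm_config = {
    "config_list": [
        {"model": "gpt-3.5-turbo", "api_key": "sk-your-key-here"},  # placeholder key
    ],
    "cache_seed": None,  # avoid stale cached replies between notebook runs
}
```

You would then pass llm_config to each agent's constructor, the same way the JSON-file version does.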
@松永英行-c5u
4 months ago
Thanks for sharing this video, it helps me a lot. I have one question: is it possible to dynamically change the base prompt (system_message)? By "dynamically", I mean I would like to know how to change system_message during a conversation.
@TylerReedAI
4 months ago
Hey, I'm glad it could help! I will look into that. I can think of adding context in each iteration to shape the output, but you would need to set human_input_mode="ALWAYS".
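For what it's worth, pyautogen's ConversableAgent does expose an update_system_message method; a hedged fragment, assuming the pyautogen package and an agent created earlier in the script:

```python
# Fragment, not runnable on its own: assumes `agent` is an existing
# autogen.ConversableAgent built earlier with an llm_config.
agent.update_system_message(
    "You are now a strict reviewer. Critique the last answer."
)
# Subsequent replies from `agent` will use the new system prompt.
```

Calling it mid-conversation (for example, from a custom reply hook) is one way to get the "dynamic" behavior asked about here.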
@artemkhomenko2317
5 months ago
Is there any way to use AutoGen to log in on a website and perform a job? I mean functionality where I can describe in text how to log in on a specific website with my credentials and do specific tasks, without manually specifying CSS or XPath elements and without writing (or generating) code for Selenium or similar tools?
@TylerReedAI
4 months ago
Hey, I don't think you can do that with their native tools just yet; however, I know they are working hard (as of last week) on making things like this happen. They mentioned it in a Discord call they had.
@AndyPandy-ni1io
4 months ago
Hi, so I want to do this with a locally run LLM. How do I change

    [
        {
            "model": "gpt-3.5-turbo",
            "api_key": "sk-proj-1111"
        }
    ]

to run with, say, LM Studio with Llama 3?
@TylerReedAI
4 months ago
model: the actual model from LM Studio you are using
api_key: "lm-studio"
base_url: "the URL found in LM Studio as well"
There is a snippet of Python code shown when you start the local server; both the model and the base URL can be found there.
@AndyPandy-ni1io
4 months ago
import autogen

def main():
    llama3 = {
        "config_list": [
            {
                "model": "Meta-Llama-3-8B-Instruct-GGUF",
                "base_url": "http://localhost:1234/v1",  # LM Studio needs the http:// scheme
                "api_key": "lm-studio",
            },
        ],
        "cache_seed": None,
        "max_tokens": 1024,
    }

    phil = autogen.ConversableAgent(
        "Phil (Phi-2)",
        llm_config=llama3,
        system_message="Your name is Phil and you are a comedian.",
    )

    # Create the agent that represents the user in the conversation.
    user_proxy = autogen.UserProxyAgent(
        "user_proxy",
        code_execution_config=False,
        default_auto_reply="...",
        human_input_mode="NEVER",
    )

    user_proxy.initiate_chat(phil, message="Tell me a joke!")

if __name__ == "__main__":
    main()
@vbtsundari
3 months ago
Hi, while using functions it answers 2+2 = 4, so how is it different from tools? I am using your exact code from your git.
@TylerReedAI
3 months ago
Tools and functions are very similar. At the end of the day, they're just Python functions. A couple of things, though: I think one of the differences is how they interact with OpenAI. Tools are a little more flexible, and tbh, I prefer them over function calling. You can also assign a bunch of tools to an agent and it will decide which is best. A function must be called when assigned to an agent.
@LeviZortman
4 months ago
Hi Tyler, I was following along with your repo and it vanished mid-tutorial. Any ideas? Great work btw
@TylerReedAI
4 months ago
Hey thank you, what do you mean…like the repo doesn’t exist?
@LeviZortman
4 months ago
@@TylerReedAI I was following along with your /autogen_beginnner_course repo and I refreshed at one point and got a 404. It's gone.
@TylerReedAI
4 months ago
I see, I had a different one and migrated because of some issue, I apologize. Try this: github.com/tylerprogramming/autogen-beginner-course
@wpuncensored
4 months ago
Can I request the code written in Next.js (TypeScript) or .NET (C#), or does it strictly work with Python?
@TylerReedAI
4 months ago
You are in luck! They just added .NET support!
@Oliv-B
5 months ago
I'm at 10:45 in your tutorial and my code just popped up a META & TESLA stock price graph! Just one issue: it was a success, with the Assistant sending "TERMINATE", but then "user" wouldn't stop sending empty messages to the Assistant, and the Assistant kept responding "good bye" / "feel free to ask more questions"... in an infinite loop. CTRL-C helped me get out of there! (^_^)
@TylerReedAI
4 months ago
I'm glad you got the graph! Yeah, sorry, it happens, but I will try to update the code with better termination replies and prompts so you don't run into this issue nearly as often. But yeah, ctrl + c gets you out of it :D
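Until the repo is updated, a stricter termination check can cut these goodbye loops off. A sketch; the predicate shape matches AutoGen's is_termination_msg callback, but the exact rules here are illustrative:

```python
# Treat empty replies and messages ending in TERMINATE as conversation enders,
# so the agents stop trading "goodbye" messages forever.
def is_termination_msg(message: dict) -> bool:
    content = (message.get("content") or "").strip()
    return content == "" or content.rstrip(".!? ").endswith("TERMINATE")
```

You would pass this as is_termination_msg=is_termination_msg when constructing the UserProxyAgent.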
@gokusaiyan1128
3 months ago
I am not getting the .py file at 10:54. The files are not saving for me. I am on Mac too.
@TylerReedAI
3 months ago
What model are you using? Are you using OpenAI or a local model?
@gokusaiyan1128
3 months ago
@@TylerReedAI openAI api :)
@TylerReedAI
2 months ago
Try gpt-4o; I just tried it and it created the image and saved the code. Let me know if that worked. Sometimes 3.5-turbo is weird.
@frankkujath3501
5 months ago
Well, I wonder: the first program runs with no error, but it doesn't create the coding folder and also doesn't create or run the files where I'd see the chart, even if I create the folder beforehand, and even in mode "NEVER". So it only produces output in the console but not the resulting scripts. Any idea? I tried with Python 3.10.
@frankkujath3501
5 months ago
I also don't see the 3 dots in the output log...
@frankkujath3501
5 months ago
Found the reason: after creating the project, I have to select conda as the environment in the third tab and use Python 3.10.11. It seems there is a problem with my 3.10.6 installed for Automatic1111.
@TylerReedAI
4 months ago
Sorry for the late reply; I'm glad you got it figured out. Yeah, this is why I'm soon going to be creating Docker images so everybody can have the same workflow with the same settings I have. Then we won't have issues like this.
@yumaheymans680
3 months ago
Where can I find the full code?
@TylerReedAI
3 months ago
Hey, there should be a link to my GitHub in the description! Let me know if it works or not.
@RetiredVet1
5 months ago
When I ran the code, I got the following repeated about 12 or more times. Maybe we need to limit replies?

    Assistant (to user):
    If you have any more questions or need assistance in the future, please feel free to ask. Have a great day! Goodbye! TERMINATE
    --------------------------------------------------------------------------------
    user (to Assistant):

I also did not get the same results you did, but I now think I know why. Since I have Docker running, I set "use_docker" to True. When I set "use_docker" to False, I get results closer to yours. I was thinking I needed to use the Docker executor, but that causes other issues. You might want to try using Docker and see if there are any differences; if so, it might be the subject of another video. I had more consistent results when I set the temperature to 0 and use_docker to False.
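Limiting replies is possible. A hedged sketch of the relevant UserProxyAgent keyword arguments; the parameter names follow pyautogen, but the values are illustrative:

```python
# Keyword arguments you might pass to autogen.UserProxyAgent(...).
# max_consecutive_auto_reply caps the repeated goodbye/TERMINATE loop
# described in the comment above.
user_proxy_kwargs = {
    "human_input_mode": "NEVER",
    "max_consecutive_auto_reply": 5,
    "code_execution_config": {
        "work_dir": "coding",
        "use_docker": False,  # set True to run generated code inside Docker
    },
}
```

Unpacking these with autogen.UserProxyAgent("user", **user_proxy_kwargs) would give a bounded, Docker-free run.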
@TylerReedAI
4 months ago
Gotcha, we talked in Discord, but yeah, it's really interesting to have differences like this.
@Alex29196
2 months ago
9:35 Process finished with exit code 0. Not the same as the tutorial. End of game; it's not printing anything.
@TylerReedAI
2 months ago
So it's not outputting anything? It seems it didn't error out, at least. If you want, either join my Discord and share your code so I can see what's going on, or give me some of it here.
@jsoutter
A month ago
@@TylerReedAI I have the same issue
@jsoutter
A month ago
How does one join your Discord?
@TylerReedAI
A month ago
@jsoutter there should be an invite in the description! Let me know if that doesn’t work for some reason
@TheLombudXa
5 months ago
After like 3 weeks of fiddling around with AI, the way to go is to fine-tune the model itself directly to create agents. There's no need for any tool; the AI itself has it all already.
@b.861
5 months ago
😂😂
@kareldaulatram
5 months ago
How?
@brandonheaton6197
4 months ago
Using Llama 3, that is a viable strategy. However, consider that the AgentOptimizer AutoGen workflow from Zhang and Zhang allows you to get the same effect while still using top-of-the-line models. gpt-4-turbo is currently $30 per million tokens. Until the SLM agent swarm gets traction, this is going to be the best option.
@bertobertoberto3
4 months ago
Nope
@AnsaarSyed
3 months ago
Hi, I commented on your other SaaS customer survey video. I am using that code of yours, and I keep getting "openai.BadRequestError: Error code: 400 - {'error': "'messages' array must only contain objects with a 'content' field that is not empty."}" even though I have default_auto_reply="...". A different question: when I followed the exact path of your SaaS customer survey video, it ran the code once and even generated output, but a couple of things I see are:
- I can't see all those requirement interactions between the agents,
- and once the execution is done with the code generation, the last thing I get is the same error mentioned above.
PLS HELP
@TylerReedAI
3 months ago
Oh, I see, you also have the default auto reply already. Are you using LM Studio or Ollama for local integration? Or something else? To get all the interactions, just set human_input_mode="ALWAYS" so you can be a part of the conversation. And are you still using Mistral-7B for this?
@gokusaiyan1128
3 months ago
When should we use register_for_llm and register_for_execution?
@TylerReedAI
3 months ago
So, these are decorators placed above a Python function. They let the framework know the function is a tool that can be used by an agent!
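As a concrete (hedged) fragment, the usual pattern stacks both decorators on one function. This assumes pyautogen with `assistant` and `user_proxy` agents already created:

```python
# Fragment: assumes `assistant` (AssistantAgent) and `user_proxy`
# (UserProxyAgent) were created earlier with an llm_config.
@user_proxy.register_for_execution()                           # who runs the tool
@assistant.register_for_llm(description="Add two integers.")   # who may call it
def add_numbers(a: int, b: int) -> int:
    return a + b
```

register_for_llm advertises the function to the model, while register_for_execution names the agent that actually executes it.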
@gokusaiyan1128
2 months ago
@@TylerReedAI Is it possible to coordinate the group chat? Like, I want a specific agent to be called.
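To answer partly here: recent pyautogen versions let GroupChat's speaker_selection_method be a callable, so you can force a specific agent's turn. The routing rule below is an illustrative stand-in that operates on agent names only, not the real Agent objects:

```python
# Illustrative speaker-selection rule: after the coder speaks,
# always hand the floor to the reviewer; otherwise go round-robin.
def next_speaker_name(last_speaker: str, agent_names: list) -> str:
    if last_speaker == "coder":
        return "reviewer"
    i = agent_names.index(last_speaker)
    return agent_names[(i + 1) % len(agent_names)]
```

In a real GroupChat, the callable receives the last speaker and the GroupChat instance and returns the next Agent; this sketch just shows the routing logic.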
Comments: 113