Thank you for taking time out to share your expertise. Awesome stuff. I'm about to fine-tune a couple more models for specific functions at my company. Fine-tuning a model is a goldmine. It's amazing how close a fine-tuned local LLM can get you to 100% consistent results in a specific domain when done correctly with validation.
@csepartha
7 months ago
Great explanation. Kindly provide more such diagrammatic representations of complex topics in later videos.
@RemekKinas
7 months ago
Really great work ❤ This is why I bought access to repo 👍
@TrelisResearch
7 months ago
Cheers! I appreciate it
@josepalacios843
7 months ago
Thanks for sharing your content. Your product (videos + monetized repos) is a very classy way of doing it. You're the opposite of a grifter. Thank you, and I hope you go mega viral in the AI sphere.
@TrelisResearch
7 months ago
Appreciate that a lot, cheers Jose
@HistoryIsAbsurd
7 months ago
Thanks for the vid, man. You have a way of explaining this stuff that is unmatched.
@TrelisResearch
6 months ago
Appreciate it
@stephennfernandes
7 months ago
There are a ton of YouTube videos out there, and I refrain from watching most of them, as they just add more noise to the actual information. But your videos are exceptionally great. I love your explaining and teaching style. Yours are by far the clearest explanations I have found of these new emergent topics. You are the signal in the noise I was searching for.
@HistoryIsAbsurd
7 months ago
100% agreed
@professorlich
3 months ago
I had never heard of this channel and I haven't watched this video, and yet... the sincerity of your comment made me instantly subscribe to his channel and like the video. Your words just come off as genuine and authentic. Just thought I'd let you know, that's all.
@loicbaconnier9150
6 months ago
A video on how to create datasets according to the use case and fine-tuning method used (SFT, DPO, DPOP, ...) would be much appreciated 😊
@TrelisResearch
6 months ago
Thanks for the tip, there'll be a vid this week on the SFT front covering data prep.
@RemekKinas
6 months ago
As far as I can see, Unsloth supports Mistral, Gemma, and Llama fine-tuning - not only the Llama architecture :)
@TrelisResearch
6 months ago
Yes, Llama and Mistral (two very similar architectures), and yes, Daniel Han has just released some great work getting Gemma working.
@techbeauty2450
A month ago
Regarding DoRA, wouldn't multiplying M * D create a matrix of size 1 x (number of rows of D), so we would lose the original W matrix? It's unclear what the end result is.
@TrelisResearch
A month ago
DoRA just extracts the magnitude from W, so you have a magnitude times a tensor. So you don't lose W. Make sense?
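A minimal numpy sketch of the shapes involved (the matrix size here is illustrative, not from the video): the magnitude-direction multiply is elementwise with broadcasting, not a matrix product, so nothing collapses to 1 x n and W's full shape is kept.

```python
import numpy as np

# Illustrative size only: W is a 4 x 3 weight matrix.
W = np.random.randn(4, 3)

# DoRA-style split: m holds the column-wise magnitudes of W,
# and V holds the directions (W with unit-norm columns).
m = np.linalg.norm(W, axis=0, keepdims=True)  # shape (1, 3)
V = W / m                                     # shape (4, 3)

# m * V is elementwise (m broadcasts over the rows of V),
# so the product has W's shape and recovers W exactly.
W_rebuilt = m * V
assert W_rebuilt.shape == (4, 3)
assert np.allclose(W_rebuilt, W)
```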
@programwithpradhan
7 months ago
Hey, a great video that really deserves appreciation. Just a quick question: can you please share some platforms where you get the latest updates and research papers like DoRA, LoRA+, etc.?
@TrelisResearch
7 months ago
LocalLLaMA on Reddit is good. I also follow TheKaitchup and Sebastian Raschka on Substack.
@loicbaconnier9150
6 months ago
Great video as usual. What about the LASER method for improving models? You convinced me to buy your repo access to thank you for your work. Thanks
@TrelisResearch
6 months ago
Merci! Do you have a link where I can learn about the LASER method?
@PommelKnight
5 months ago
How can I use it locally?
@TrelisResearch
5 months ago
You can run this locally in the same way if you have an Nvidia GPU, or you can run on CPU by setting the device type to cpu (although it will be slow).
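A minimal PyTorch sketch of that device switch (the tensor here just stands in for a model's computation; the same `.to(device)` call applies to a loaded model):

```python
import torch

# Use the GPU when available, otherwise fall back to CPU
# (inference works the same way on CPU, just more slowly).
device = "cuda" if torch.cuda.is_available() else "cpu"

# The same pattern applies to a model, e.g. model.to(device)
# after loading your fine-tuned checkpoint.
x = torch.randn(2, 3).to(device)
y = x @ x.T  # runs on whichever device was selected
assert y.shape == (2, 2)
```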
@csepartha
7 months ago
Kindly make a tutorial on fine-tuning an open-source LLM on data from many PDFs. The fine-tuned LLM must be able to answer questions from the PDFs accurately.
@TrelisResearch
7 months ago
Actually I've a video that's coming up on that!
@csepartha
7 months ago
@@TrelisResearch I'm looking forward to that. Keep up the good work and keep helping us learn.
@romanweilguny3415
7 months ago
great video!!!
@danielhanchen
7 months ago
Extremely well presented and packed with info! Love the video! Keep up the fabulous work!
@TrelisResearch
6 months ago
Appreciate that Daniel
@RemekKinas
6 months ago
It would be great to have two more videos from you: 1. a Llama or Mistral (because they're the most popular) architecture breakdown, step by step, 2. pre-training. Your videos are pieces of gold.
@TrelisResearch
6 months ago
Check out Umar Jamil's channel for great breakdowns. Probably that's too detailed for this channel. Yeah, potentially doing a pre-training vid could be an idea, although I've a few more vids to get through first.
@RemekKinas
6 months ago
@@TrelisResearch WoW this is super cool! Thank you for recommendation.
@morgancredib-ai2501
3 months ago
@@TrelisResearch Umar Jamil's videos are indeed great for theoretical breakdowns. Pre-training or continued pre-training from bootstrapped weights would be great!
Comments: 33