👍 Please like if you found this video helpful, and subscribe to stay updated with my latest tutorials. 🔔 ❤ You can support this channel by buying me a ☕: buymeacoffee.com/codesfinance
@heathcliffebird7514
5 months ago
Just finished watching another of your videos. Nice work! Subscribed, and leaving a comment for the algorithm. Good luck :-)
@VincentCodesFinance
5 months ago
Thank you very much! Glad you enjoy!
@siamakshams1923
10 days ago
Thank you, very helpful. Most of the currently available information on llama or ollama sticks to "why is the sky blue?" or summarizing a short text, which basically defeats the purpose of the exercise. Your video, in contrast, went into the detail I've been searching for and trying to work out for a few days: working with large files. I also learned about some other Python libraries, which was a bonus. Like other viewers, I've liked and subscribed. Good work.
@VincentCodesFinance
10 days ago
Glad it was helpful!
@irfanshaikh262
5 months ago
Not all heroes wear a cape. Some just start a KZitem channel.
@VincentCodesFinance
5 months ago
Wow. Thanks.
@swapnenduchatterjee8978
6 months ago
Hi, great video with plenty of detail. However, there are lots of dependencies and programs that need to run, and at some points it is very difficult for us non-programmers to execute the overall process. Is there a downloadable free app with a PDF upload facility that offers options to select specific pages and integrates smoothly with a chosen Large Language Model (LLM), effectively catering to all my query and summarization needs?
@VincentCodesFinance
6 months ago
I don’t know of any such program. It got me thinking, though: the app I develop in the video should be easy to package as a Docker container. I’ll look into it; that would make launching the app a single command.
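For anyone curious, here is a minimal sketch of that packaging. Everything in it is an assumption rather than taken from the video: it supposes the app is a Python script named `app.py` served with Streamlit, with its dependencies listed in `requirements.txt`.

```dockerfile
# Hypothetical Dockerfile for packaging the PDF-chat app (names assumed)
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0"]
```

With that in place, `docker build -t pdf-chat . && docker run -p 8501:8501 pdf-chat` would build and launch the app in one line. One caveat: if the app talks to a local Ollama server, the container needs to reach it on the host, e.g. via `http://host.docker.internal:11434` on Docker Desktop.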
@mikedoyle9908
5 months ago
No offence but if you are a non-programmer - you are in the wrong place... use quivr
@mikedoyle9908
5 months ago
Great work - thanks!! Can I ask what spec machine you have to run those models?
@VincentCodesFinance
5 months ago
Thanks! I have a MacBook Pro M3 Max with 64GB of RAM. The Apple chips are great for running these models because the RAM is shared between the CPU and GPU, so the GPU can use all of it when needed. In my most recent video, I test the new models released this month to see which ones will run smoothly on my laptop: kzitem.info/news/bejne/kauguJtmbZh1imk
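A quick way to see how a model fits in your own machine's memory, assuming Ollama is installed locally (the model name below is just an example, not one tested in the video):

```shell
# Download a model and run a one-off prompt against it
ollama pull llama3
ollama run llama3 "Explain unified memory in one sentence."

# While a model is loaded, list running models and their memory footprint
ollama ps
```

If `ollama ps` shows the model spilling past available RAM, a smaller quantization of the same model is usually the next thing to try.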
Comments: 16