Okay, that's some good, appreciable content.
@LinuxTex
2 months ago
Thanks Nehal. 👍
@nothingbutanime4975
2 months ago
Do make a video regarding local models for AI-generated images.
@LinuxTex
2 months ago
Definitely bro👍
@decipher365
1 month ago
Yes, we need many more videos like this.
@MuhammadMuaazAnsari-l1b
2 months ago
0:39 The Holy Trinity 😂😂🤣
@乾淨核能
2 months ago
Hardware requirements?
@m4saurabh
2 months ago
Nvidia H100
@乾淨核能
2 months ago
@m4saurabh Seriously? That's not for everyone T_T
@LinuxTex
2 months ago
For Phi-3 3.8B, 8 GB RAM; no GPU needed. For Llama 3.1 8B, 16 GB RAM; most consumer GPUs will suffice. An H100 is not necessary. 😉
@乾淨核能
2 months ago
@LinuxTex thank you!
@Soth0w76
2 months ago
I already use Ollama on my Galaxy A54 with Kali NetHunter.
@Lex_Invictus
1 month ago
BASED!
@shubhamshandilya6160
1 month ago
How do I make API calls to these offline LLMs, for use in projects?
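Ollama serves a REST API on localhost port 11434 by default, so a project can call a local model over plain HTTP. A minimal sketch in Python, assuming the llama3.1 model has already been pulled; the model name and prompt here are just placeholders:

```python
# Minimal sketch: calling a local Ollama model over its REST API.
# Assumes `ollama serve` is running and the model has been pulled
# (e.g. `ollama pull llama3.1`); model name and prompt are placeholders.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1",   # any model you have pulled locally
        "prompt": "Explain RAG in one sentence.",
        "stream": False,       # one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Ollama also exposes an OpenAI-compatible endpoint under /v1 on the same port, so existing OpenAI client code can usually just be pointed at the local server.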
@Mr_Ravee
2 months ago
Quality content, dude 👍 Got a sub 👆👍
@nageswaraopatha5445
2 months ago
Do more videos like this ❤ Good apps.
@LinuxTex
2 months ago
Will do. Thanks for the comment👍
@janetmartha3261
2 months ago
Useful information 💯 🔥🔥🔥🔥🔥
@LinuxTex
2 months ago
Thank you👍
@MrNorthNJ
2 months ago
I have an old file/media server which I am planning on rebuilding as a future project. Would I be able to run this on that server and still access it from other computers on my network, or would it just be available on the server itself?
@LinuxTex
2 months ago
That's a great idea, actually. You could make it accessible to other devices on your network, as Ollama supports that. (See the sketch below this thread.)
@MrNorthNJ
2 months ago
@LinuxTex Thanks!
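Per Ollama's documentation, the way to do this is to set the OLLAMA_HOST environment variable on the server (e.g. OLLAMA_HOST=0.0.0.0) so the daemon binds to all interfaces instead of just localhost; other machines on the LAN can then reach it on port 11434. A minimal Python sketch, where 192.168.1.50 stands in for the server's actual LAN address:

```python
# Minimal sketch of querying an Ollama server from another machine
# on the LAN. Assumes the server's daemon was started with
# OLLAMA_HOST=0.0.0.0 so it listens on all interfaces, and that
# 192.168.1.50 is a placeholder for the server's actual LAN IP.
import requests

SERVER = "http://192.168.1.50:11434"  # placeholder address

resp = requests.post(
    f"{SERVER}/api/generate",
    json={"model": "llama3.1", "prompt": "Hello from the network!", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```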
@kennethwillis8339
2 months ago
How can I have my local LLM work with my files?
@LinuxTex
2 months ago
Sir, you need to set up RAG (retrieval-augmented generation) for that. In the MSTY app that I've linked in the description below, you can create knowledge bases easily by just dragging and dropping files. Then you can interact with them using your local LLMs.
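MSTY wires the RAG pipeline up for you, but the underlying idea is simple: embed your file chunks once, retrieve the chunks most similar to the question, and paste them into the prompt. A minimal hand-rolled sketch against Ollama's own endpoints, assuming the nomic-embed-text and llama3.1 models have been pulled and with docs.txt as a placeholder for your own files:

```python
# Minimal hand-rolled RAG sketch against a local Ollama instance.
# Assumes `ollama pull nomic-embed-text` and `ollama pull llama3.1`;
# docs.txt is a placeholder for your own files.
import requests

BASE = "http://localhost:11434"

def embed(text):
    # Ollama's embeddings endpoint returns {"embedding": [floats]}
    r = requests.post(f"{BASE}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

# 1. Index: embed each paragraph-sized chunk of your documents once.
chunks = open("docs.txt", encoding="utf-8").read().split("\n\n")
index = [(chunk, embed(chunk)) for chunk in chunks if chunk.strip()]

# 2. Retrieve: embed the question, keep the most similar chunks.
question = "What does the setup guide say about GPU drivers?"
q_vec = embed(question)
top = sorted(index, key=lambda c: cosine(q_vec, c[1]), reverse=True)[:3]

# 3. Generate: stuff the retrieved chunks into the prompt.
context = "\n---\n".join(chunk for chunk, _ in top)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
r = requests.post(f"{BASE}/api/generate",
                  json={"model": "llama3.1", "prompt": prompt, "stream": False})
r.raise_for_status()
print(r.json()["response"])
```

A real setup would persist the embeddings in a vector store instead of recomputing them per run, which is essentially what MSTY's knowledge bases do behind the drag-and-drop.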
@atharavapawar3272
2 months ago
Which Linux distro is best for my Samsung NP300E5Z (4 GB RAM, Intel Core i5-2450M processor)? Please reply.
Comments: 34