For anyone having trouble with 'ollama create': you have to spell the model's name in lowercase, according to the Ollama documentation. So the first line would be 'FROM llama3'
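As a minimal sketch of that fix (assuming Ollama is installed and the base model is already pulled; `mymodel` is just an example name), a Modelfile and the create command might look like:

```shell
# Write a Modelfile -- the base model name after FROM must be lowercase
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant."
EOF

# Create and run a custom model from it (lowercase name here too)
ollama create mymodel -f Modelfile
ollama run mymodel
```

These commands need a running Ollama install, so treat them as a template rather than a copy-paste recipe.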
@sajithj9246
3 months ago
Awesome bro, I was facing the same problem too ❤❤❤
@kskroyaltech
3 months ago
Correct.
@jivtheshm.r1784
2 months ago
Does this work if there is no internet?
@rayrai982
1 month ago
Can you guide us through setting up ComfyUI as a web UI, ComfyUI+Krita, and ComfyUI+Blender? I was able to set up and use ComfyUI as a web UI and ComfyUI+Krita, but when I try to set up ComfyUI in Blender I get an error message. OS: Garuda. System: fully AMD.
@kskroyaltech
1 month ago
Your query is noted.
@rayrai982
28 days ago
@kskroyaltech I figured it out. It was audaspace that was missing. After installing that package, my ComfyUI setup is now fully functional with Blender and Krita on my Arch machine. 🤩 This auto image/video generating setup is amazing.
@drift2399
1 month ago
You installed Llama twice in this video, once from the CLI and once from LM Studio. Can I remove the CLI version to save space? If so, please tell me how.
@kskroyaltech
1 month ago
Yes, you can remove it. Type *ollama list* to see the list of models, then remove the one you want with the command: ollama rm MODEL_NAME
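The full sequence from the reply above, as a sketch (the model name `llama3` is only an example; use whatever name `ollama list` shows):

```shell
# Show installed models and the disk space each one uses
ollama list

# Remove a model by the exact NAME shown in the first column
ollama rm llama3
```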
@drift2399
1 month ago
@kskroyaltech Thanks a lot, brother. Also, please make a video on extending battery life in Arch Linux if possible.
@PTRAARON
3 months ago
How do I uninstall Ollama from my computer? I have no graphics card.
@thewizard5716
3 months ago
You don't need one.
@abhidnyasonawane9608
2 months ago
Is it more powerful than ChatGPT-4o, or not as capable?
@kskroyaltech
2 months ago
Llama 3.1 is more powerful than GPT-4o.
@shrirammadurantakam
3 months ago
The Ollama system configuration is very useful for agentic workflows. I need to learn how to make LLMs talk to each other.
@kskroyaltech
3 months ago
Absolutely!
@AgentX-dh3lf
3 months ago
Any good LLM for low-end hardware?
@kskroyaltech
3 months ago
TinyLlama, Gemma 2B.
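Assuming a recent Ollama, those small models can be pulled by tag (`tinyllama` and `gemma:2b` are the library tags as of this writing; check `ollama list`/the model library if they have changed):

```shell
# Small models (roughly 1-2 GB each), usable on CPU-only or low-VRAM machines
ollama pull tinyllama
ollama pull gemma:2b
ollama run gemma:2b
```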
@Zer0YT
3 months ago
Very nice video 🙏🏼 But is there also a free AI you can host locally for picture generation? Maybe that would be worth a video 😊🙌🏼 I would be interested 💯🙌🏼
@Arador1112
3 months ago
Hey, how does one delete a model from Ollama?
@kskroyaltech
3 months ago
*ollama list* to see all models, then *ollama rm MODEL_NAME*
@Arador1112
3 months ago
@kskroyaltech I mean, how do I completely delete it? It still takes up space even after running this command.
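If space still seems used after `ollama rm`, it may be worth checking the model store directly. As a sketch for Linux (these are the default paths; they can differ if Ollama was installed another way):

```shell
# Default model store when running ollama as your own user
du -sh ~/.ollama/models 2>/dev/null

# Default model store when Ollama is installed as a system service
sudo du -sh /usr/share/ollama/.ollama/models 2>/dev/null
```

If both directories are small after removal, the space is likely being used elsewhere (e.g., by LM Studio's separate model folder).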
@Yowise-q8k
1 month ago
Is there a way to host AI locally but access it remotely from a phone? Preferably something other than TeamViewer; it's pretty uncomfortable.
@kskroyaltech
1 month ago
Yes, you can do that. You would need to create a Next.js application that talks to the AI in the background. It's a fairly complicated task.
@Yowise-q8k
1 month ago
@kskroyaltech What about just making the API public, i.e., exposing it on a public IP, so I can use it from another chat app? What's the point of creating an AI to talk with an AI that will let me talk with an AI? What?
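For what it's worth, Ollama already serves an HTTP API on port 11434, so no extra app is strictly required for LAN access. A minimal sketch (the IP address is a placeholder for your host's LAN address; exposing this port to the open internet without authentication is unsafe):

```shell
# On the host: listen on all interfaces instead of just localhost
OLLAMA_HOST=0.0.0.0 ollama serve

# From a phone on the same network, any HTTP client can call the API:
curl http://192.168.1.10:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```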
@Amit-hb9ex
3 months ago
Which is your main system for your work? Also, what are you doing in your life, education-wise?
@kskroyaltech
3 months ago
I use Linux and macOS as my primary OSes. macOS I use for building iOS apps, but mostly I spend my time with Linux. I love tinkering with open source stuff. Education: I dropped out of B.Tech long back. I do natural farming part time and KZitem full time.
@Arador1112
3 months ago
@kskroyaltech Great, bro.
@Amit-hb9ex
3 months ago
@Arador1112 Nice dp 🙂
@wolfisraging
3 months ago
Alpaca is the best LLM GUI. It's on Flatpak as well. Clean & simple UI.
@kskroyaltech
3 months ago
Thanks for telling me, I will try it.
@chef2654
3 months ago
So what exactly makes Linux superior for AI? You do realise that you can run Ollama & LM Studio just as easily on macOS & Windows. Not to mention, they also work with AMD GPUs, not just Nvidia.
@kskroyaltech
3 months ago
Of course.
@MrRom079
3 months ago
Yeah, but Windows sucks balls 😂😂😂😂
@seventhtenth
3 months ago
Running LLMs locally is cool, but what is the best training set?
Comments: 35