[Academind Pro] Local LLMs via Ollama and LM Studio - A Practical Guide [en]

Course curriculum:

- Welcome To The Course!
- What Exactly Are "Open LLMs"?
- Why Would You Want To Run Open LLMs Locally?
- Popular Open LLMs - Some Examples
- Where To Find Open LLMs?
- Running LLMs Locally - Available Options
- Check The Model Licenses!
- Module Introduction
- LLM Hardware Requirements - First Steps
- Deriving Hardware Requirements From Model Parameters
- Quantization To The Rescue!
- Does It Run On Your Machine?
- Module Introduction
- Running Locally vs Remotely
- Installing & Using LM Studio
- Finding, Downloading & Activating Open LLMs
- Using the LM Studio Chat Interface
- Working with System Prompts & Presets
- Managing Chats
- Power User Features For Managing Models & Chats
- Leveraging Multimodal Models & Extracting Content From Images (OCR)
- Analyzing & Summarizing PDF Documents
- Onwards To More Advanced Settings
- Understanding Temperature, top_k & top_p
- Controlling Temperature, top_k & top_p in LM Studio
- Managing the Underlying Runtime & Hardware Configuration
- Managing Context Length
- Using Flash Attention
- Working With Structured Outputs
- Using Local LLMs For Code Generation
- Content Generation & Few Shot Prompting (Prompt Engineering)
- Onwards To Programmatic Use
- LM Studio & Its OpenAI Compatibility
- More Code Examples!
- Diving Deeper Into The LM Studio APIs
- Module Introduction
- Installing & Starting Ollama
- Finding Usable Open Models
- Running Open LLMs Locally via Ollama
- Adding a GUI with Open WebUI
- Dealing with Multiline...
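To give a flavor of the "Deriving Hardware Requirements From Model Parameters" and "Quantization To The Rescue!" lessons: weight memory scales roughly with parameter count times bytes per parameter. A minimal back-of-the-envelope sketch, assuming weights dominate memory use (KV cache, activations, and runtime overhead are ignored here):

```python
# Rough VRAM/RAM estimate for loading a model's weights.
# Assumption: memory ~= parameter_count * bytes_per_parameter.
# Real usage is higher (context/KV cache and runtime overhead not counted).
def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
    bytes_total = params_billions * 1e9 * (bits_per_param / 8)
    return bytes_total / 1e9  # gigabytes

for bits in (16, 8, 4):  # fp16, int8, 4-bit quantization
    print(f"7B model @ {bits}-bit: ~{weight_memory_gb(7, bits):.1f} GB")
# 7B model @ 16-bit: ~14.0 GB
# 7B model @  8-bit: ~7.0 GB
# 7B model @  4-bit: ~3.5 GB
```

This is why quantization matters: a 7B model that won't fit in 8 GB of memory at fp16 typically will at 4-bit, at some cost in output quality.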
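For the "LM Studio & Its OpenAI Compatibility" lessons: LM Studio can run a local server that speaks the OpenAI API, so standard OpenAI client libraries work against it. A minimal sketch, assuming the server is running on its default port 1234 with a model loaded; the model name and API key below are placeholders (the local server does not validate the key):

```python
# Minimal sketch: talk to LM Studio's local OpenAI-compatible server.
# Assumes the server is started in LM Studio (default: http://localhost:1234/v1).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves the loaded model
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain quantization in one sentence."},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)
```

Because only `base_url` changes, code written this way can be pointed at the hosted OpenAI API or at the local model without other modifications.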
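Similarly, for the Ollama module ("Installing & Starting Ollama", "Running Open LLMs Locally via Ollama"): once a model has been pulled (e.g. `ollama pull llama3`), Ollama exposes a local REST API. A minimal standard-library sketch, assuming the default endpoint at http://localhost:11434 and the `llama3` model name as an example:

```python
# Minimal sketch: call a locally served Ollama model over its REST API.
# Assumes Ollama is running and the model has been pulled beforehand.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",          # example model; substitute any pulled model
    "prompt": "Why run LLMs locally?",
    "stream": False,            # return one JSON object instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```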
