CourseWWWork

    14.4 - Fundamentals of Tokenization in NLP
    8:05
    14.7 - Deep Diving into Vector Embeddings
    9:09
    14.9 - Understanding Multi-Head Attention for Rich Context
    5:19
    16.1 - Prompt Fundamentals: Encoding Instructions for LLMs
    0:56
    16.2 - Prompting Types: Zero-Shot, Few-Shot, One-Shot
    3:53
    16.3 - One-Shot Prompting for Deterministic Inference
    3:23
    16.4 - Few-Shot Prompting for Contextual Generalization
    3:31
    16.5 - Structured Outputs with Few-Shot Prompting
    3:13
    16.6 - Chain-of-Thought (CoT) for Reasoning
    12:49
    16.7 - Auto-CoT: Automated Reasoning Prompt Generation
    8:47
    16.8 - Persona-Based Prompting
    5:22
    17.1 - Introduction to Prompt Serialization Styles
    2:00
    17.2 - Alpaca Prompt Template for Instruction Tuning
    2:49
    17.3 - ChatML Schema: OpenAI's Structured Prompt Format
    1:30
    17.4 - INST Format: LLaMA-2 Instruction Specification
    1:54
    18.1 - Ollama Overview: Local LLM Runtime Engine
    2:24
    18.2 - Dockerized Environment Setup for LLMs
    4:03
    18.3 - Running Ollama Models with Docker Runner
    3:15
    18.5 - FastAPI Environment Setup & Dependencies
    4:01
    19.1 - Hugging Face Model Deployment – Section Intro
    3:01
    19.2 - Configuring and Securing Hugging Face Account
    2:36
    19.3 - Accessing Instruct-Tuned Models (Google Gemma)
    1:58
    19.4 - Installing and Using Hugging Face CLI Tools
    2:37
    20.1 - Agentic AI Fundamentals – Section Intro
    1:01
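
The Section 17 entries above cover prompt serialization styles. As a minimal illustration (not taken from the course material), the sketch below renders one instruction in the three formats named there: the Alpaca instruction template, the LLaMA-2 [INST] specification, and ChatML. The helper names and example strings are hypothetical; the field layouts follow the publicly documented templates.

# Minimal sketch (assumed example, not the course's code) of the three
# prompt-serialization styles listed in Section 17.

def alpaca_prompt(instruction: str, user_input: str = "") -> str:
    # Alpaca instruction-tuning template (cf. 17.2).
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    body = f"### Instruction:\n{instruction}\n\n"
    if user_input:
        body += f"### Input:\n{user_input}\n\n"
    return header + body + "### Response:\n"

def llama2_inst_prompt(system: str, user: str) -> str:
    # LLaMA-2 [INST] chat specification (cf. 17.4).
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

def chatml_prompt(system: str, user: str) -> str:
    # ChatML message framing (cf. 17.3).
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

if __name__ == "__main__":
    system = "You are a concise assistant."
    user = "Explain tokenization in one sentence."
    for rendered in (
        alpaca_prompt(user),
        llama2_inst_prompt(system, user),
        chatml_prompt(system, user),
    ):
        print(rendered, end="\n---\n")

Each function returns a plain string, which is roughly how instruct-tuned models such as those deployed in Sections 18 and 19 expect a prompt to be framed before inference.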