Join the "Get Things Done with AI" Bootcamp: [ Link ]
DeepSeek v3 is an MoE language model with 671B parameters (37B active) trained on 14.8T tokens. That's one giant model, but is it good? Let's test it.
Blog post: [ Link ]
GitHub repo: [ Link ]
Chat: [ Link ]
Inference API: [ Link ]
AI Bootcamp: [ Link ]
LinkedIn: [ Link ]
Follow me on X: [ Link ]
Discord: [ Link ]
Subscribe: [ Link ]
GitHub repository: [ Link ]
👍 Don't Forget to Like, Comment, and Subscribe for More Tutorials!
00:00 - Welcome
00:32 - DeepSeek v3
02:28 - GitHub repo and training methods
05:39 - Training cost
07:10 - Inference API
07:34 - Setup
08:16 - Hip Hop lyrics
09:40 - Coding
11:23 - Data labeling
13:45 - Text summarization
14:47 - LinkedIn post
15:25 - Structured data extraction
17:17 - RAG/Question-answering
18:36 - Table data extraction
19:32 - Conclusion
Join this channel to get access to the perks and support my work:
[ Link ]
#deepseek #llm #artificialintelligence #chatgpt #chatbot #python