Workshop Links:
- Free WhyLabs Signup: [ Link ]
- Notebook: [ Link ]
- whylogs GitHub (give us a star!): [ Link ]
- Join the AI Slack group: [ Link ]
- LLM monitoring: [ Link ]
If you want to build reliable pipelines, trustworthy data, and responsible AI applications, you need to validate and monitor your data & ML models!
In this workshop we’ll cover how to ensure model reliability and performance by implementing your own AI observability solution from start to finish.
Once you complete the workshop, you'll also receive a certificate!
This workshop will cover (see the short whylogs sketch after this list):
Detecting data drift
Measuring model drift
Monitoring model performance
Data quality validation
Measuring Bias & Fairness
Model explainability
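
To give a feel for the tooling used in the workshop, here is a minimal sketch of profiling a batch of data with the open-source whylogs library. It assumes whylogs v1 and pandas are installed (pip install whylogs pandas), and the example data is made up:

import pandas as pd
import whylogs as why

# Hypothetical batch standing in for your production data
df = pd.DataFrame({"age": [25, 32, 47], "income": [40000, 55000, 72000]})

# Profile the batch: whylogs records distributions, counts, and types,
# which is the raw material for drift detection and data quality checks
results = why.log(df)
profile_view = results.view()

# Inspect the profile summary as a pandas DataFrame
print(profile_view.to_pandas())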
What you’ll need:
A modern web browser
A Google account (for saving a Google Colab)
Sign up for a free WhyLabs account ([ Link ])
Who should attend:
Anyone interested in AI observability, model monitoring, MLOps, and DataOps! This workshop is designed to be approachable for most skill levels. Familiarity with machine learning and Python will be useful, but it's not required.
By the end of this workshop, you’ll be able to implement data and AI observability in your own pipelines (Kafka, Airflow, Flyte, etc.) and ML applications to catch deviations and biases in data or ML model behavior.
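
For example, a pipeline task could profile each batch and upload it for monitoring. A minimal sketch, assuming whylogs v1 with the WhyLabs writer and the standard WhyLabs environment variables (the IDs below are placeholders, not real credentials):

import os
import pandas as pd
import whylogs as why

# Placeholder credentials; in a real pipeline, set these in the task's environment
os.environ["WHYLABS_API_KEY"] = "<your-api-key>"
os.environ["WHYLABS_DEFAULT_ORG_ID"] = "<your-org-id>"
os.environ["WHYLABS_DEFAULT_DATASET_ID"] = "<your-model-id>"

def log_batch(df: pd.DataFrame) -> None:
    # Profile the batch and upload it so WhyLabs can track drift and
    # data quality across pipeline runs
    results = why.log(df)
    results.writer("whylabs").write()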
About the instructor:
Sage Elliott enjoys breaking down the barrier to AI observability, talking to amazing people in the Robust & Responsible AI community, and teaching workshops on machine learning. Sage has worked in hardware and software engineering roles at various startups for over a decade.
Connect with Sage on LinkedIn: [ Link ]
About WhyLabs:
WhyLabs.ai is an AI observability platform that prevents data & model performance degradation by letting you monitor your data and machine learning models in production. [ Link ]
Check out our open-source data & ML monitoring project: [ Link ]
Do you want to connect with the community, learn about WhyLabs, or get project support? Join the WhyLabs + Robust & Responsible AI community Slack: [ Link ]