What is Big-O notation, and what are some misconceptions that even advanced engineers have?
Patreon: [ Link ]
Follow me on:
Twitter: [ Link ]
Instagram: [ Link ]
Github: [ Link ]
In this video we'll talk a bit about Big-O notation and analysis, how to understand time complexity, and how it relates to understanding performance. We'll approach it from the mathematical side, going over limiting behaviour and the strict definition of Big-O, then look at how programmers tend to use Big-O informally, what they usually mean, and how that differs from the strict mathematical definition. There will be some easy examples to work through, covering dominant terms and how/why the other terms get dropped. We'll also cover some of the more subtle aspects of Big-O, when and why it's useful, where it falls short, and cap it off with a story.
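For reference, here is a quick sketch of the standard textbook definition and a dropped-terms example along the lines of what the video discusses (my own summary and example numbers, not taken from the video itself):

```latex
% Standard formal definition of Big-O (asymptotic upper bound):
% f(n) is in O(g(n)) if, beyond some threshold n_0, f is bounded
% above by a constant multiple of g.
\[
  f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \in \mathbb{N}
  \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0 .
\]
% Why lower-order terms are dropped: for f(n) = 3n^2 + 5n + 7,
% every term is at most its own coefficient times n^2 once n >= 1.
\[
  3n^2 + 5n + 7 \;\le\; 3n^2 + 5n^2 + 7n^2 \;=\; 15n^2
  \quad \text{for } n \ge 1,
\]
% so taking c = 15 and n_0 = 1 shows 3n^2 + 5n + 7 \in O(n^2):
% only the dominant n^2 term matters asymptotically.
```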