
Introduction

This lesson summarizes the topics we'll be covering in section 21 and why they'll be important to you as a data scientist.

Objectives

You will be able to:

  • Understand and explain what is covered in this section
  • Understand and explain why the section will help you to become a data scientist

Combinatorics Continued and Maximum Likelihood Estimation

In this section, we'll cover a number of additional statistical topics that will be important as we start to apply a wider range of machine learning models in Modules 3 and 4.

Conditional Probability

We start the section off with an introduction to conditional probability. We look at the difference between dependent and independent events, how to calculate dependent probabilities, and then introduce some key theorems related to conditional probability - the product rule, the chain rule, and Bayes' theorem.
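
To make the product rule concrete, here is a minimal Python sketch using a made-up card-drawing scenario (the numbers and example are illustrative, not part of the lesson itself):

```python
# Hypothetical example: drawing two cards from a standard 52-card deck without replacement.
# The events are dependent because the deck changes after the first draw.
p_first_ace = 4 / 52                    # P(A): first card is an ace
p_second_ace_given_first = 3 / 51       # P(B | A): second card is an ace, given the first was

# Product rule: P(A and B) = P(A) * P(B | A)
p_both_aces = p_first_ace * p_second_ace_given_first
print(p_both_aces)  # ~0.0045
```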

Partitioning and the Law of Total Probability

From there, we introduce the concept of partitioning a sample space, explain the law of total probability, and then introduce the idea of conditional independence.
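
As a quick illustration of the law of total probability, here is a small sketch with assumed numbers for a hypothetical factory whose three machines partition the sample space of produced parts:

```python
# Hypothetical example: P(B) = sum over the partition of P(B | A_i) * P(A_i)
p_machine = [0.5, 0.3, 0.2]            # P(A_i): share of parts from each machine (assumed numbers)
p_defect_given = [0.01, 0.02, 0.05]    # P(defect | A_i) for each machine (assumed numbers)

p_defect = sum(p_b_given_a * p_a for p_b_given_a, p_a in zip(p_defect_given, p_machine))
print(p_defect)  # 0.5*0.01 + 0.3*0.02 + 0.2*0.05 = 0.021
```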

Bayes' Theorem

Next up, we introduce Bayes' theorem - an incredibly powerful and widely used approach to determining the probability of something based upon the prior probability and the likelihood. We use the example of the "Monty Hall problem" to provide some practice with applying Bayes' theorem.
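
If you want to sanity-check the Monty Hall result numerically, a rough Monte Carlo sketch like the one below (our own illustration, not the lesson's Bayes derivation) reproduces the 2/3 vs. 1/3 split that Bayes' theorem predicts:

```python
import random

# Simulate the Monty Hall game many times and compare the "switch" and "stay" strategies.
def monty_hall_win_rate(switch, n_trials=100_000):
    wins = 0
    for _ in range(n_trials):
        car = random.randrange(3)                                    # door hiding the car
        choice = random.randrange(3)                                 # contestant's initial pick
        goat_doors = [d for d in range(3) if d != choice and d != car]
        opened = random.choice(goat_doors)                           # host opens a goat door
        if switch:
            choice = next(d for d in range(3) if d not in (choice, opened))
        wins += (choice == car)
    return wins / n_trials

print(monty_hall_win_rate(switch=True))   # ~0.667 - switching wins about 2/3 of the time
print(monty_hall_win_rate(switch=False))  # ~0.333 - staying wins about 1/3 of the time
```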

Maximum Likelihood Estimation (MLE)

From Bayes' theorem we move on to MLE, explaining the idea of parametric inference for identifying optimal values for model parameters, the concept of likelihood (and its difference from probability), and the common "IID" assumption on which MLE is based.
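
As a small illustration of what MLE looks like in code (made-up data, assumed IID Bernoulli trials), the sketch below numerically maximizes the log-likelihood over a grid and recovers the familiar closed-form estimate k/n:

```python
import numpy as np

# MLE for the success probability p of IID Bernoulli trials.
# The likelihood is L(p) = p^k * (1-p)^(n-k); maximizing the log-likelihood gives p_hat = k/n.
data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])  # made-up coin flips (1 = heads)
k, n = data.sum(), len(data)

p_grid = np.linspace(0.001, 0.999, 999)
log_likelihood = k * np.log(p_grid) + (n - k) * np.log(1 - p_grid)

p_mle = p_grid[np.argmax(log_likelihood)]
print(p_mle, data.mean())  # numerical maximum agrees with the closed-form estimate k/n = 0.7
```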

Maximum A Posteriori Estimation

Next up, we look at MAP - another technique for estimating model parameters, this time by maximizing the posterior distribution (likelihood times prior) rather than the likelihood alone.
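
To show how MAP differs from MLE, here is a small sketch that reuses the made-up coin flips above and adds an assumed Beta(2, 2) prior; the estimate is pulled slightly toward the prior's center:

```python
import numpy as np

# MAP estimate for a Bernoulli parameter p with a Beta(a, b) prior:
# maximize the (log) posterior, which is proportional to likelihood * prior.
data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])  # same made-up flips as above
a, b = 2, 2                                       # assumed Beta prior pulling p toward 0.5
k, n = data.sum(), len(data)

p_grid = np.linspace(0.001, 0.999, 999)
log_posterior = (k + a - 1) * np.log(p_grid) + (n - k + b - 1) * np.log(1 - p_grid)

p_map = p_grid[np.argmax(log_posterior)]
print(p_map)  # ~ (k + a - 1) / (n + a + b - 2) = 8/12 ~= 0.667, shrunk toward the prior
```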

Conjugate Prior Distributions

From there, we introduce the idea of a conjugate prior, justify the use of a beta prior distribution for Bernoulli experiments, and get an overview of the conjugate priors for a number of different data distributions.
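
Here is a minimal sketch of why conjugacy is convenient, using an assumed Beta(2, 2) prior and made-up Bernoulli data: the posterior stays in the Beta family, so updating reduces to adding counts.

```python
# The Beta prior is conjugate to the Bernoulli likelihood, so the posterior is again Beta.
a_prior, b_prior = 2, 2        # assumed Beta(2, 2) prior
k, n = 7, 10                   # observed successes and trials (made-up data)

a_post = a_prior + k           # posterior: Beta(a + k, b + n - k)
b_post = b_prior + (n - k)

posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # Beta(9, 5); posterior mean = 9/14 ~= 0.643
```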

Beyond Bayesians and Frequentists

After a lab with some practice implementing Bayesian simulations for simple problems in Python, NumPy, and Matplotlib, we wrap up the section by asking you to read a non-technical essay, "Beyond Bayesians and Frequentists" by Dr. Jacob Steinhardt. The goal is to give you a clearer understanding of the differences between frequentist and Bayesian approaches to statistics.
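
For a flavor of the kind of simulation involved, here is a small rejection-style sketch (an assumed setup, not the lab's actual exercise) that approximates a coin-bias posterior with NumPy and plots it with Matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt

# Draw candidate biases from a uniform prior, simulate flips for each candidate,
# and keep the candidates that reproduce the observed number of heads.
rng = np.random.default_rng(42)
observed_heads, n_flips = 7, 10                                  # made-up observation

candidates = rng.uniform(0, 1, size=200_000)                     # samples from the prior
simulated_heads = rng.binomial(n_flips, candidates)              # one simulated experiment per candidate
posterior_samples = candidates[simulated_heads == observed_heads]

plt.hist(posterior_samples, bins=50, density=True)
plt.xlabel("coin bias p")
plt.ylabel("approximate posterior density")
plt.show()
```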

Summary

This is another stats-heavy section, and some of the discrete problems you're solving may not seem particularly relevant to machine learning. We introduce them deliberately so that you'll have the foundations required to make thoughtful choices as we introduce a range of new machine learning models in the later modules.
