Entropy quantifies uncertainty in data, forming the mathematical foundation for decision trees, feature selection, and information theory applications.

A primer on the math, logic, and pragmatic application of JS Divergence — including how it is best used in drift monitoring
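The JS divergence this primer covers can be sketched in a few lines. A minimal illustration (not the article's code): the `kl_divergence` helper and the example reference/production distributions below are my own assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) in bits, for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetric and, with log base 2, bounded in [0, 1]."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture; m_i > 0 wherever p_i or q_i is, so the KL terms are safe
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Drift-monitoring flavour: compare a training-time feature distribution
# to the one observed in production (made-up numbers).
reference = [0.7, 0.2, 0.1]
production = [0.5, 0.3, 0.2]
print(js_divergence(reference, production))
```

The symmetry and boundedness are what make JS divergence convenient for drift monitoring, compared with plain KL divergence, which is asymmetric and unbounded.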

Information Theory: A Gentle Introduction
11 Jun 2021
towardsdatascience.com

This is the first in a series of articles about Information Theory and its relationship to data-driven enterprises and strategy. While…

Link Prediction and Information Theory: A Tutorial
17 Jan 2021
towardsdatascience.com

Using Mutual Information to measure the likelihood of candidate links in a graph.
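As a quick sketch of the measure behind that tutorial: mutual information between two discrete variables can be computed from their joint probability table. The joint distribution below (shared-neighbour indicator vs. link existence) is an invented example, not data from the article.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X; Y) in bits from a joint probability table P(X, Y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal P(X)
    py = joint.sum(axis=0, keepdims=True)   # marginal P(Y)
    mask = joint > 0                        # zero cells contribute nothing
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px * py)[mask])))

# Rows: candidate pair shares a neighbour (no/yes); columns: link exists (no/yes).
joint = [[0.45, 0.05],
         [0.10, 0.40]]
print(mutual_information(joint))
```

I(X; Y) is zero when the two variables are independent and grows as knowing one tells you more about the other, which is exactly the property used to score candidate links.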

Essential Math for Data Science: Information Theory
18 Dec 2020
towardsdatascience.com

Entropy, cross-entropy, log loss, and KL divergence

Information theory is a subfield of mathematics concerned with transmitting data across a noisy channel. A cornerstone of information theory is the idea of quantifying how much information there is in a message. More generally, this can be used to quantify the information in an event and in a random variable, the latter quantity being called entropy, and both are calculated using probability. Calculating information and…
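These two quantities take only a few lines each; a minimal sketch using base-2 logarithms, so the units are bits:

```python
import math

def information(p):
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy of a discrete random variable, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(information(0.5))        # 1.0 bit: a fair coin flip
print(entropy([0.5, 0.5]))     # 1.0 bit: maximal uncertainty for two outcomes
print(entropy([0.9, 0.1]))     # ~0.469 bits: a skewed coin is less uncertain
```

Rarer events carry more information, and entropy is just the expected information over all outcomes of the variable.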

Entropy and Information Gain
1 Jun 2020
towardsdatascience.com

Yet another tool used to make Decision Tree splits.
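The split criterion that article describes reduces to a small computation: information gain is the parent node's entropy minus the weighted entropy of the children. A toy sketch (the label lists are made up for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, splits):
    """Entropy of the parent node minus the weighted entropy of its children."""
    n = len(parent)
    weighted = sum(len(s) / n * entropy(s) for s in splits)
    return entropy(parent) - weighted

# A split that separates the classes perfectly recovers the full 1 bit.
parent = ["yes"] * 4 + ["no"] * 4
print(information_gain(parent, [["yes"] * 4, ["no"] * 4]))  # 1.0
```

A decision tree learner evaluates candidate splits this way and keeps the one with the highest gain.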

LambdaClass's blog about distributed systems, machine learning, compilers, operating systems, security, and cryptography.

Gini coefficient
9 Feb 2020
en.wikipedia.org

In economics, the Gini coefficient, also known as the Gini index or Gini ratio, is a measure of statistical dispersion intended to represent income, wealth, or consumption inequality within a nation or a social group. It was developed by the Italian statistician and sociologist Corrado Gini.
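One common way to compute it, sketched below, uses the closed form over sorted values; the example figures are invented:

```python
import numpy as np

def gini(values):
    """Gini coefficient of a sample, via the closed form over sorted values.

    Equivalent to the mean absolute difference between all pairs,
    normalised by twice the mean.
    """
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    index = np.arange(1, n + 1)  # 1-based ranks of the sorted values
    return float(np.sum((2 * index - n - 1) * x) / (n * np.sum(x)))

print(gini([1, 1, 1, 1]))    # 0.0: perfect equality
print(gini([0, 0, 0, 10]))   # 0.75: one person holds everything
```

The coefficient ranges from 0 (everyone has the same) towards 1 (one member holds everything, in the limit of a large group); the same statistic reappears in machine learning as Gini impurity for decision-tree splits, which is the connection to the rest of this list.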

Important concepts in information theory, machine learning, and statistics