Make stronger and simpler models by leveraging natural order

Scikit-learn 1.1 Comes with an Improved OneHotEncoder
24 Oct 2022
towardsdatascience.com

A simple yet highly practical feature
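
The improvement in question is, as I understand it, the infrequent-category handling added in 1.1. A minimal sketch, assuming scikit-learn ≥ 1.1 (the toy data and min_frequency value are made up):

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder

X = np.array([["dog"], ["dog"], ["cat"], ["cat"], ["snake"], ["ferret"]])

# Categories seen fewer than min_frequency times are collapsed into
# a single "infrequent" column instead of each getting its own.
enc = OneHotEncoder(min_frequency=2)
print(enc.fit_transform(X).toarray())
print(enc.get_feature_names_out())  # ['x0_cat', 'x0_dog', 'x0_infrequent_sklearn']
```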

Efficient vector quantization for machine learning optimizations (e.g., vector-quantized variational autoencoders), better than straight…
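
A minimal sketch of the core operation, quantizing vectors to their nearest codebook entry (sizes and data are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))   # 8 learned code vectors of dimension 4
z = rng.normal(size=(16, 4))         # 16 continuous vectors to quantize

# Squared Euclidean distance from every vector to every code vector.
d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
codes = d.argmin(axis=1)             # index of the nearest code vector
z_q = codebook[codes]                # each z replaced by its closest code

print(codes)      # small integers: this is all a VQ-VAE needs to store/transmit
print(z_q.shape)  # (16, 4)
```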

What is Huffman Coding?
22 Feb 2021
baseclass.io

The Huffman Coding algorithm is a building block of many compression algorithms, such as DEFLATE, which is used by the PNG image format and GZIP.
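
The algorithm itself fits in a few lines: repeatedly merge the two least frequent subtrees, prefixing a bit to the codes on each side. A minimal Python sketch:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free code where frequent symbols get shorter bit strings."""
    # Heap entries: (frequency, tiebreaker, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
print(codes)                                     # 'a' gets the shortest code
print("".join(codes[s] for s in "abracadabra"))  # the compressed bit string
```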

Comparing Binary, Gray, and One-Hot Encoding
7 Jan 2021
allaboutcircuits.com

This article shows a comparison of the implementations that result from using binary, Gray, and one-hot encodings to implement state machines in an FPGA. These encodings are often evaluated and applied by the synthesis and implementation tools, so it’s important to know why the software makes these decisions.
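
All three encodings are easy to generate in software, which makes the trade-off (fewer bits vs. simpler next-state logic) concrete. A quick Python illustration for a four-state machine:

```python
def binary(n, width):
    return format(n, f"0{width}b")

def gray(n, width):
    # Reflected Gray code: consecutive states differ in exactly one bit.
    return format(n ^ (n >> 1), f"0{width}b")

def one_hot(n, num_states):
    # One flip-flop per state: exactly one bit is set at a time.
    return format(1 << n, f"0{num_states}b")

for state in range(4):
    print(state, binary(state, 2), gray(state, 2), one_hot(state, 4))
```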

All the encodings that are worth knowing — from OrdinalEncoder to CatBoostEncoder — explained and coded from scratch in Python
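
In the same from-scratch spirit, the simplest of the family, an ordinal encoder, reduces to a dictionary (this toy version assigns codes by order of first appearance; scikit-learn's OrdinalEncoder sorts the categories instead):

```python
def ordinal_encode(column):
    # Map each distinct category to an integer code.
    mapping = {}
    for value in column:
        mapping.setdefault(value, len(mapping))
    return [mapping[v] for v in column], mapping

codes, mapping = ordinal_encode(["low", "high", "medium", "low", "high"])
print(mapping)  # {'low': 0, 'high': 1, 'medium': 2}
print(codes)    # [0, 1, 2, 0, 1]
```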

Stop One-Hot Encoding Your Categorical Variables.
18 Dec 2020
towardsdatascience.com

There are many better alternatives
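
One of the usual alternatives is target (mean) encoding, sketched below; note that a real implementation should use out-of-fold or smoothed means to avoid target leakage:

```python
from collections import defaultdict

def target_encode(categories, targets):
    # Replace each category with the mean target value observed for it.
    sums, counts = defaultdict(float), defaultdict(int)
    for cat, y in zip(categories, targets):
        sums[cat] += y
        counts[cat] += 1
    means = {cat: sums[cat] / counts[cat] for cat in sums}
    return [means[cat] for cat in categories]

print(target_encode(["a", "a", "b", "b", "b"], [1, 0, 1, 1, 0]))
# [0.5, 0.5, 0.666..., 0.666..., 0.666...]: one numeric column, not one per category
```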

A Mathematical Primer of Compression
20 Mar 2020
towardsdatascience.com

How to reduce storage size without losing information.
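
Run-length encoding is about the simplest lossless scheme and makes "without losing information" concrete: the round trip is exact. A minimal sketch:

```python
def rle_encode(s):
    # Store each run of repeated symbols as (symbol, run length).
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append((s[i], j - i))
        i = j
    return out

def rle_decode(pairs):
    return "".join(ch * n for ch, n in pairs)

packed = rle_encode("aaaabbbcca")
print(packed)                              # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(packed) == "aaaabbbcca"  # nothing was lost
```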

A gentle introduction to Hamming codes
9 Mar 2020
johndcook.com

A gentle introduction to Hamming codes, error-correcting binary codes whose words are all a Hamming distance of at least 3 apart.
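
A minimal Hamming(7,4) sketch using the textbook generator and parity-check matrices; the syndrome spells out the 1-based position of a single flipped bit, which is exactly what distance ≥ 3 buys you:

```python
import numpy as np

# Textbook Hamming(7,4): codeword layout p1 p2 d1 p3 d2 d3 d4,
# so column i of H is the binary representation of position i.
G = np.array([[1,1,0,1], [1,0,1,1], [1,0,0,0], [0,1,1,1],
              [0,1,0,0], [0,0,1,0], [0,0,0,1]])
H = np.array([[1,0,1,0,1,0,1],
              [0,1,1,0,0,1,1],
              [0,0,0,1,1,1,1]])

data = np.array([1, 0, 1, 1])
codeword = G @ data % 2

received = codeword.copy()
received[4] ^= 1                        # corrupt one bit in transit

syndrome = H @ received % 2             # zero iff no single-bit error
pos = int("".join(map(str, syndrome[::-1])), 2)
if pos:
    received[pos - 1] ^= 1              # flip the offending bit back
assert (received == codeword).all()
```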

BCrypt Explained
9 Mar 2020
dev.to

If you're into Cryptography For Beginners, you're in the right place…
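
For reference, typical usage with the Python bcrypt package (the password and work factor here are illustrative):

```python
import bcrypt  # pip install bcrypt

password = b"correct horse battery staple"

# gensalt() embeds a random salt and the work factor in the hash string,
# so hashing the same password twice gives different results.
hashed = bcrypt.hashpw(password, bcrypt.gensalt(rounds=12))
print(hashed)  # e.g. b'$2b$12$...'

# checkpw re-derives the hash using the stored salt and compares.
assert bcrypt.checkpw(password, hashed)
assert not bcrypt.checkpw(b"wrong guess", hashed)
```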

RSA Algorithm
19 Feb 2020
leimao.github.io

A Self-Contained Tutorial on RSA Algorithm Theories for Number Theory Noobs
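
The whole scheme fits in a few lines with the classic toy primes 61 and 53 (real keys are thousands of bits, with padding on top):

```python
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (2753)

m = 65                     # the message, encoded as an integer < n
c = pow(m, e, n)           # encrypt: c = m^e mod n
assert pow(c, d, n) == m   # decrypt: m = c^d mod n
```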

Audiophile On lists the audio codecs that deliver the best sound quality for your music, with descriptions of FLAC, OGG, ALAC, DSD, and MQA.

Research Guide: Model Distillation Techniques for Deep Learning

Knowledge distillation is a model compression technique whereby a small network (student) is taught by a larger trained neural network (teacher). The smaller network is trained to behave like the large neural network. This enables the deployment of such models…
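
A minimal sketch of the usual distillation loss (Hinton-style soft targets, written here in PyTorch; the temperature and weighting are illustrative):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: match the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * T * T  # rescale gradients after dividing the logits by T
    # Hard targets: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

student_logits = torch.randn(8, 10, requires_grad=True)  # stand-in for the student
teacher_logits = torch.randn(8, 10)                      # stand-in for the teacher
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```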