Module: MoreMath::Entropy

Defined in:
lib/more_math/entropy.rb

Overview

Provides entropy calculation utilities for measuring information content and randomness in text data.

This module implements Shannon entropy calculations to quantify the unpredictability or information content of text strings. It’s commonly used in cryptography, data compression, and information theory applications.

The entropy measures help determine how “random” or “predictable” a text is, which can be useful for:

  • Password strength analysis

  • Data compression efficiency estimation

  • Cryptographic security assessment

  • Text analysis and classification

Examples:

Basic usage

require 'more_math'
include MoreMath

text = "hello world"
puts entropy(text)         # => 2.8453...
puts entropy_ratio(text)   # => 0.8224...

Using with different text samples

MoreMath::Entropy.entropy("aaaa")           # => 0.0 (no entropy)
MoreMath::Entropy.entropy("abcd")           # => 2.0 (maximum entropy)

Instance Method Summary

  #entropy(text) ⇒ Float
    Calculates the Shannon entropy of a text string.

  #entropy_ideal(size) ⇒ Float
    Calculates the ideal (maximum) entropy for a given character set size.

  #entropy_ratio(text, size: text.each_char.size) ⇒ Float
    Calculates the normalized entropy ratio of a text string.

Instance Method Details

#entropy(text) ⇒ Float

Calculates the Shannon entropy of a text string.

Shannon entropy measures the average amount of information (in bits) needed to encode characters in the text based on their frequencies.
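
In symbols, with p(c) the relative frequency of character c in the text, the value computed is

  H(text) = -Σ p(c) * log2(p(c))    (summed over the distinct characters c)

which is what the implementation below accumulates before taking the absolute value.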

Examples:

MoreMath::Entropy.entropy("hello") # => 2.3219280948873626
MoreMath::Entropy.entropy("aaaa")  # => 0.0

Parameters:

  • text (String)

    The input text to calculate entropy for

Returns:

  • (Float)

    The Shannon entropy in bits



# File 'lib/more_math/entropy.rb', line 39

def entropy(text)
  chars = text.chars
  size  = chars.size

  chars.each_with_object(Hash.new(0.0)) { |c, h| h[c] += 1 }.
    each_value.reduce(0.0) do |entropy, count|
      frequency = count / size
      entropy + frequency * Math.log2(frequency)
    end.abs
end
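
As a quick hand check (illustration only): in "hello" the probabilities are 2/5 for "l" and 1/5 each for "h", "e" and "o", so the same value can be computed directly from the definition.

# Hand computation of entropy("hello") from the character probabilities.
probs = [0.2, 0.2, 0.4, 0.2]              # h, e, l (count 2), o
probs.sum { |p| -p * Math.log2(p) }       # => 1.9219... (matches entropy("hello"))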

#entropy_ideal(size) ⇒ Float

Calculates the ideal (maximum) entropy for a given character set size.

This represents the maximum possible entropy when all characters in the alphabet have equal probability of occurrence.
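
Because each of the size characters then occurs with probability 1/size, the formula collapses to -size * (1/size) * log2(1/size) = log2(size): the maximum entropy is simply log2 of the alphabet size.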

Examples:

MoreMath::Entropy.entropy_ideal(2)  # => 1.0
MoreMath::Entropy.entropy_ideal(256) # => 8.0

Parameters:

  • size (Integer)

    The number of unique characters in the alphabet

Returns:

  • (Float)

    The maximum possible entropy in bits



# File 'lib/more_math/entropy.rb', line 61

def entropy_ideal(size)
  size <= 1 and return 0.0
  frequency = 1.0 / size
  -1.0 * size * frequency * Math.log2(frequency)
end

#entropy_ratio(text, size: text.each_char.size) ⇒ Float

Calculates the normalized entropy ratio of a text string.

The ratio is calculated as actual entropy divided by ideal entropy, giving a value between 0 and 1 where:

  • 0 indicates no entropy (all characters are identical)

  • 1 indicates maximum entropy (uniform distribution across the alphabet)

The normalization uses the specified alphabet size to calculate the theoretical maximum entropy for that character set.

Examples:

MoreMath::Entropy.entropy_ratio("hello")     # => 0.6834
MoreMath::Entropy.entropy_ratio("aaaaa")     # => 0.0
MoreMath::Entropy.entropy_ratio("abcde")     # => 1.0

With custom alphabet size

# Normalizing against a 26-letter alphabet (English)
MoreMath::Entropy.entropy_ratio("hello", size: 26) # => 0.394...

Parameters:

  • text (String)

    The input text to calculate entropy ratio for

  • size (Integer) (defaults to: text.each_char.size)

    The size of the character set to normalize against. Defaults to the total length of the text (`text.each_char.size`), which normalizes the entropy relative to the text’s own character space. This allows comparison of texts of different lengths on the same scale.

Returns:

  • (Float)

    Normalized entropy ratio between 0 and 1



# File 'lib/more_math/entropy.rb', line 93

def entropy_ratio(text, size: text.each_char.size)
  size <= 1 and return 0.0
  entropy(text) / entropy_ideal(size)
end
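
A short sketch, using only the documented methods, of comparing two texts of different lengths against the same 26-letter alphabet so that their ratios share a scale; the sample strings are arbitrary.

require 'more_math'
include MoreMath

# Normalizing both texts against the same alphabet size makes the
# ratios directly comparable despite the differing text lengths.
entropy_ratio("cat",        size: 26)   # => 0.3371...
entropy_ratio("catalogues", size: 26)   # => 0.6641...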