cascada.primitives.simon_rf module
Simon-like round functions and their XOR and RX differential models.
Source: Observations on the SIMON block cipher family, https://eprint.iacr.org/2015/145
- class cascada.primitives.simon_rf.SimonRF(**kwargs)[source]
Bases: cascada.bitvector.operation.SecondaryOperation
The non-linear part of the round function of Simon.
This corresponds to f(x) = ((x <<< a) & (x <<< b)) ^ (x <<< c), where (a, b, c) = (8, 1, 2). A plain-Python sketch of f is given below, after the class members.
- classmethod eval(x)[source]
Evaluate the operator with given operands.
This is an internal method that assumes the list args has been parsed. To evaluate a bit-vector operation, instantiate a new object with the operands as the object arguments (i.e., use the Python operator ()).
- rx_model
- xor_model
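For reference, the non-linear function f above can be reproduced outside CASCADA with plain Python integers. This is only an illustrative sketch for 16-bit words with the fixed rotation amounts (8, 1, 2); rol and simon_f are local helper names, not part of the CASCADA API.

def rol(x, r, n=16):
    """Rotate the n-bit word x left by r positions."""
    mask = (1 << n) - 1
    return ((x << r) | (x >> (n - r))) & mask

def simon_f(x, n=16):
    """Plain-Python sketch of f(x) = ((x <<< 8) & (x <<< 1)) ^ (x <<< 2)."""
    return (rol(x, 8, n) & rol(x, 1, n)) ^ rol(x, 2, n)

assert simon_f(0x0000) == 0x0000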
- class cascada.primitives.simon_rf.XorModelSimonRF(input_prop)[source]
Bases: cascada.differential.opmodel.OpModel
Represent the XorDiff differential.opmodel.OpModel of SimonRF.

>>> from cascada.bitvector.core import Constant, Variable
>>> from cascada.differential.difference import XorDiff
>>> from cascada.primitives.simon_rf import XorModelSimonRF
>>> alpha = XorDiff(Constant(0, 16))
>>> f = XorModelSimonRF(alpha)
>>> print(f.vrepr())
XorModelSimonRF(XorDiff(Constant(0x0000, width=16)))
>>> x = Constant(0, 16)
>>> f.eval_derivative(x)  # f(x + alpha) - f(x)
XorDiff(0x0000)
>>> f.max_weight(), f.weight_width(), f.error(), f.num_frac_bits()
(15, 5, 0, 0)
- diff_type
- op
alias of cascada.primitives.simon_rf.SimonRF
- validity_constraint(output_diff)[source]
Return the validity constraint for a given output XorDiff difference.

>>> from cascada.bitvector.core import Constant, Variable
>>> from cascada.bitvector.printing import BvWrapPrinter
>>> from cascada.differential.difference import XorDiff
>>> from cascada.primitives.simon_rf import XorModelSimonRF
>>> alpha = XorDiff(Constant(0, 16))
>>> f = XorModelSimonRF(alpha)
>>> f.validity_constraint(XorDiff(Constant(0, 16)))
0b1
>>> u, v = Variable("u", 16), Variable("v", 16)
>>> f = XorModelSimonRF(XorDiff(u))
>>> result = f.validity_constraint(XorDiff(v))
>>> print(BvWrapPrinter().doprint(result))
Ite(u == 0xffff,
    (PopCount(v ^ (u <<< 2))[0]) == 0b0,
    BvComp(0x0000,
           BvOr((v ^ (u <<< 2)) & ~((u <<< 8) | (u <<< 1)),
                (v ^ (u <<< 2) ^ ((v ^ (u <<< 2)) <<< 7)) & (u <<< 1) & ~(u <<< 8) & (u <<< 15)
           )
    )
)
>>> result.xreplace({u: Constant(0, 16), v: Constant(0, 16)})
0b1
See OpModel.validity_constraint for more information.
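For constant differences, the validity expression printed above can be checked with plain Python integers. The sketch below follows that Ite expression for 16-bit words; rol and is_valid_xor_diff are illustrative local helpers, not CASCADA API.

def rol(x, r, n=16):
    """Rotate the n-bit word x left by r positions."""
    mask = (1 << n) - 1
    return ((x << r) | (x >> (n - r))) & mask

def is_valid_xor_diff(u, v, n=16):
    """Check the validity expression above for constant input/output differences u, v."""
    mask = (1 << n) - 1
    d = v ^ rol(u, 2, n)
    if u == mask:
        # all-ones input difference: v ^ (u <<< 2) must have even Hamming weight
        return bin(d).count("1") % 2 == 0
    t1 = d & (~(rol(u, 8, n) | rol(u, 1, n)) & mask)
    t2 = (d ^ rol(d, 7, n)) & rol(u, 1, n) & (~rol(u, 8, n) & mask) & rol(u, 15, n)
    return (t1 | t2) == 0

assert is_valid_xor_diff(0x0000, 0x0000)  # matches the 0b1 results above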
- pr_one_constraint(output_diff)[source]
Return the probability-one constraint for a given output XorDiff.

>>> from cascada.bitvector.core import Constant, Variable
>>> from cascada.differential.difference import XorDiff
>>> from cascada.primitives.simon_rf import XorModelSimonRF
>>> alpha = XorDiff(Constant(0, 16))
>>> f = XorModelSimonRF(alpha)
>>> f.pr_one_constraint(XorDiff(Constant(0, 16)))
0b1
See abstractproperty.opmodel.OpModel.pr_one_constraint for more information.
- bv_weight(output_diff)[source]
Return the bit-vector weight for a given output XorDiff.

>>> from cascada.bitvector.core import Constant, Variable
>>> from cascada.bitvector.printing import BvWrapPrinter
>>> from cascada.bitvector.secondaryop import PopCount
>>> from cascada.differential.difference import XorDiff
>>> from cascada.primitives.simon_rf import XorModelSimonRF
>>> alpha = XorDiff(Constant(0, 16))
>>> f = XorModelSimonRF(alpha)
>>> f.bv_weight(XorDiff(Constant(0, 16)))
0b00000
>>> alpha = XorDiff(Variable("u", 16))
>>> f = XorModelSimonRF(alpha)
>>> beta = XorDiff(Variable("v", 16))
>>> print(BvWrapPrinter().doprint(f.bv_weight(beta)))
Ite(u == 0xffff,
    0b01111,
    PopCount(((u <<< 8) | (u <<< 1)) ^ ((u <<< 1) & ~(u <<< 8) & (u <<< 15)))
)
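Likewise, the weight expression printed above can be evaluated for a constant input difference with plain Python integers (an illustrative sketch for 16-bit words; rol and simon_xor_weight are local helper names, not CASCADA API):

def rol(x, r, n=16):
    """Rotate the n-bit word x left by r positions."""
    mask = (1 << n) - 1
    return ((x << r) | (x >> (n - r))) & mask

def simon_xor_weight(u, n=16):
    """Weight of the XOR differential with input difference u through SimonRF,
    following the Ite/PopCount expression printed by bv_weight above."""
    mask = (1 << n) - 1
    if u == mask:                       # all-ones input difference
        return n - 1
    t = (rol(u, 8, n) | rol(u, 1, n)) ^ (rol(u, 1, n) & (~rol(u, 8, n) & mask) & rol(u, 15, n))
    return bin(t).count("1")            # PopCount

assert simon_xor_weight(0x0000) == 0    # matches f.bv_weight(XorDiff(Constant(0, 16)))
assert simon_xor_weight(0xffff) == 15   # matches the 0b01111 branch above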
- max_weight()[source]
Return the maximum value the weight variable can achieve in OpModel.weight_constraint.
- weight_width()[source]
Return the width of the weight variable used in OpModel.weight_constraint.
- decimal_weight(output_diff)[source]
Return the decimal.Decimal weight for a given constant output Property.
This method returns, as a decimal number, the weight (negative binary logarithm) of the probability of the input property propagating to the output property.
This method only works when the input property and the output property are constant values, but provides a better approximation than the bit-vector weight from OpModel.weight_constraint.
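To illustrate the weight/probability relation described above, a minimal sketch with a hypothetical probability (the value 2^{-3} is not tied to any particular SimonRF differential):

import math
from decimal import Decimal

probability = 0.125                        # hypothetical propagation probability 2^{-3}
weight = Decimal(-math.log2(probability))  # weight = negative binary logarithm
print(weight)                              # 3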
- num_frac_bits()[source]
Return the number of fractional bits used in the weight variable of OpModel.weight_constraint.
If the number of fractional bits is k, then the bit-vector weight variable w of OpModel.weight_constraint represents the number 2^{-k} * bv2int(w). In particular, if k == 0, then w represents an integer number. Otherwise, the k least significant bits of w denote the fractional part of the number represented by w.
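A small worked example of the interpretation above (illustrative values; XorModelSimonRF itself reports num_frac_bits() == 0, so its weight variable is an integer):

k = 2                          # number of fractional bits (illustrative)
w = 0b101                      # bit-vector weight variable, bv2int(w) == 5
represented = 2 ** (-k) * w    # 2^{-k} * bv2int(w)
print(represented)             # 1.25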
- error()[source]
Return the maximum difference between OpModel.weight_constraint and the exact weight.
The exact weight is the exact value (without error) of the negative binary logarithm (weight) of the propagation probability of \((\alpha, \beta)\).
Note
The exact weight can be computed in TestOpModelGeneric.get_empirical_weight_slow.
This method returns an upper bound (in absolute value) of the maximum difference (over all input and output properties) between the bit-vector weight from OpModel.weight_constraint and the exact weight.
Note that the exact weight might still differ from decimal_weight.
- class cascada.primitives.simon_rf.RXModelSimonRF(input_prop)[source]
Bases: cascada.primitives.simon_rf.XorModelSimonRF
Represent the RXDiff differential.opmodel.OpModel of SimonRF.

>>> from cascada.bitvector.core import Constant, Variable
>>> from cascada.differential.difference import RXDiff
>>> from cascada.primitives.simon_rf import RXModelSimonRF
>>> alpha = RXDiff(Constant(0, 16))
>>> f = RXModelSimonRF(alpha)
>>> print(f.vrepr())
RXModelSimonRF(RXDiff(Constant(0x0000, width=16)))
>>> x = Constant(0, 16)
>>> f.eval_derivative(x)  # f(x + alpha) - f(x)
RXDiff(0x0000)
>>> f.max_weight(), f.weight_width(), f.error(), f.num_frac_bits()
(15, 5, 0, 0)
- diff_type
- op
alias of cascada.primitives.simon_rf.SimonRF