furiosa.quantizer.frontend.onnx.transformer package

Submodules

furiosa.quantizer.frontend.onnx.transformer.convert_2d_sum_to_add module

class furiosa.quantizer.frontend.onnx.transformer.convert_2d_sum_to_add.Convert2dSumToAdd

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
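
Every transformer in this package follows the interface shown above: transform() takes an onnx ModelProto and returns the rewritten ModelProto. A minimal usage sketch, assuming the transformer is constructed without arguments (as its class signature suggests) and using a placeholder model path:

    import onnx

    from furiosa.quantizer.frontend.onnx.transformer.convert_2d_sum_to_add import (
        Convert2dSumToAdd,
    )

    model = onnx.load("model.onnx")                      # placeholder path
    transformed = Convert2dSumToAdd().transform(model)   # returns a rewritten ModelProto
    onnx.checker.check_model(transformed)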

furiosa.quantizer.frontend.onnx.transformer.convert_conv1d_to_conv2d module

class furiosa.quantizer.frontend.onnx.transformer.convert_conv1d_to_conv2d.ConvertConv1dToConv2d

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.convert_conv1d_to_conv2d.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Conv(1d) –> next

to

prev –> Reshape –> Conv(2d) –> Reshape –> next

if Conv(1d).input[0].ndim == 3

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
make_new_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Conv']
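
The rewrite is sound because a 1-d convolution over an (N, C, L) input equals a 2-d convolution over the same data viewed as (N, C, 1, L), with the kernel viewed as (oC, iC, 1, k). A numpy sketch of that equivalence (naive convolutions, stride 1, no padding; not the transformer's code):

    import numpy as np

    def conv1d(x, w):
        # Naive NCL convolution, stride 1, no padding.
        n, c, l = x.shape
        oc, _, k = w.shape
        y = np.zeros((n, oc, l - k + 1), dtype=x.dtype)
        for i in range(l - k + 1):
            y[:, :, i] = np.einsum("nck,ock->no", x[:, :, i:i + k], w)
        return y

    def conv2d(x, w):
        # Naive NCHW convolution, stride 1, no padding.
        n, c, h, w_in = x.shape
        oc, _, kh, kw = w.shape
        y = np.zeros((n, oc, h - kh + 1, w_in - kw + 1), dtype=x.dtype)
        for i in range(h - kh + 1):
            for j in range(w_in - kw + 1):
                y[:, :, i, j] = np.einsum("nchw,ochw->no", x[:, :, i:i + kh, j:j + kw], w)
        return y

    x = np.random.rand(1, 3, 8).astype(np.float32)   # Conv(1d) input, ndim == 3
    w = np.random.rand(4, 3, 3).astype(np.float32)   # (oC, iC, kernel)

    # prev --> Reshape --> Conv(2d) --> Reshape --> next
    y = conv2d(x.reshape(1, 3, 1, 8), w.reshape(4, 3, 1, 3)).reshape(1, 4, -1)

    assert np.allclose(conv1d(x, w), y, atol=1e-6)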

furiosa.quantizer.frontend.onnx.transformer.convert_negative_pads_to_slice module

class furiosa.quantizer.frontend.onnx.transformer.convert_negative_pads_to_slice.ConvertNegativePadsToSlice

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.convert_negative_pads_to_slice.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Pad –> next

to

prev –> Slice –> Pad –> next

if
  1. Pad’s pads input (Pad.input[1]) is an initializer

  2. at least one of the pads values is negative

  3. the sum of the negative pads values in axis i does not exceed the corresponding input_shape[i] (otherwise the model is invalid)

  4. the absolute value of each negative pads value in axis i is less than the corresponding input_shape[i] (otherwise it leads to an invalid model or ‘nan’ output)

make_new_init_and_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Pad']
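
Negative entries in the pads initializer crop the tensor rather than pad it, which is exactly what Slice expresses; the transformer therefore moves the cropping into a Slice and keeps only the non-negative amounts in the Pad. A numpy illustration with made-up pad values:

    import numpy as np

    x = np.arange(12, dtype=np.float32).reshape(3, 4)

    # ONNX Pad pads for a 2-D tensor: [begin_ax0, begin_ax1, end_ax0, end_ax1].
    # pads = [-1, 2, 0, -1] means: crop one row at the start of axis 0,
    # crop one column at the end of axis 1, pad two columns at the start of axis 1.

    # Rewritten form: Slice performs the cropping ...
    sliced = x[1:, :-1]
    # ... and the remaining Pad keeps only the non-negative amounts.
    result = np.pad(sliced, ((0, 0), (2, 0)), constant_values=0.0)

    print(result.shape)   # (2, 5): (3 - 1) rows, (4 - 1 + 2) columns
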
class furiosa.quantizer.frontend.onnx.transformer.convert_negative_pads_to_slice.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Pad –> next

to

prev –> next

if
  1. Pad’s pads values are all zero

pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Pad']

furiosa.quantizer.frontend.onnx.transformer.convert_prelu_to_relu module

class furiosa.quantizer.frontend.onnx.transformer.convert_prelu_to_relu.ConvertPReluToRelu

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.convert_prelu_to_relu.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

PRelu(x) = slope * x if x < 0, x if x >= 0

into

(1 - slope) * Relu(x) + slope * x

if
  1. PRelu’s input[1] is an initializer

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
make_new_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['PRelu']
class furiosa.quantizer.frontend.onnx.transformer.convert_prelu_to_relu.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

PRelu(x) = slope * x if x < 0, x if x >= 0

into

(1 - slope) * Relu(x) + slope * x

if
  1. PRelu’s input[1] is not an initializer

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
make_new_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['PRelu']
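
Both patterns rely on the same identity: for x >= 0, Relu(x) = x, so (1 - slope) * Relu(x) + slope * x = x; for x < 0, Relu(x) = 0 and the expression reduces to slope * x. A quick numpy check with an arbitrary per-channel slope:

    import numpy as np

    def prelu(x, slope):
        # PRelu(x) = slope * x if x < 0, x if x >= 0
        return np.where(x < 0, slope * x, x)

    x = np.random.randn(2, 4, 8, 8).astype(np.float32)
    slope = np.random.rand(1, 4, 1, 1).astype(np.float32)

    decomposed = (1 - slope) * np.maximum(x, 0) + slope * x   # the rewritten form

    assert np.allclose(prelu(x, slope), decomposed, atol=1e-6)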

furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern module

class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.EliminateRedundantShapePattern

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Flatten/Squeeze –> Unsqueeze –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_condition_checker(nodes_to_check)
pattern_matching(base_node)
pattern_to_match = ['Flatten/Squeeze', 'Unsqueeze']
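
The patterns in this module remove subgraphs that do not change the tensor contents. For Pattern_1, for instance, a Squeeze followed by an Unsqueeze that restores the original shape is an identity, which is what the shape condition checks. A numpy illustration:

    import numpy as np

    x = np.random.rand(1, 8, 1, 1).astype(np.float32)

    # Squeeze --> Unsqueeze that ends up with the original shape ...
    y = np.expand_dims(np.squeeze(x, axis=(2, 3)), axis=(2, 3))

    # ... is a no-op, so the whole subgraph can be dropped.
    assert y.shape == x.shape and np.array_equal(x, y)
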
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Reshape –> Flatten/Squeeze –> Unsqueeze –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Reshape', 'Flatten/Squeeze', 'Unsqueeze']
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_3(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Reshape –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Reshape']
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_4(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Reshape –> Expand –> Expand –> Reshape –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Reshape', 'Expand', 'Expand', 'Reshape']
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_5(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Reshape –> Expand –> Reshape –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Reshape', 'Expand', 'Reshape']
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_6(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Reshape –> Reshape –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Reshape', 'Reshape']
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_7(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Reshape –> Reshape –> Reshape –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Reshape', 'Reshape', 'Reshape']
class furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_8(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.eliminate_redundant_shape_pattern.Pattern_1

transform

prev –> Expand –> next

to

prev –> ( ) –> next

if prev.output[0].shape == next.input[0].shape

pattern_to_match = ['Expand']

furiosa.quantizer.frontend.onnx.transformer.fuse_batchnorm module

class furiosa.quantizer.frontend.onnx.transformer.fuse_batchnorm.FuseBatchNorm

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Conv –> BatchNormalization –> next

to

prev –> Conv –> next

pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Conv', 'BatchNormalization']
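
Folding BatchNormalization into the preceding Conv uses the standard rewrite: with scale gamma, bias beta, running mean, running variance, and epsilon eps, the fused parameters are W' = W * gamma / sqrt(var + eps) (broadcast over the output-channel axis) and b' = (b - mean) * gamma / sqrt(var + eps) + beta. A numpy sketch of that folding (not the transformer's code), checked at a single spatial position:

    import numpy as np

    def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-5):
        # w: (oC, iC, kH, kW) Conv weight, b: (oC,) Conv bias.
        scale = gamma / np.sqrt(var + eps)
        return w * scale[:, None, None, None], (b - mean) * scale + beta

    oc, ic, k = 4, 3, 3
    w = np.random.randn(oc, ic, k, k)
    b = np.random.randn(oc)
    gamma, beta = np.random.rand(oc), np.random.randn(oc)
    mean, var = np.random.randn(oc), np.random.rand(oc)

    w_f, b_f = fold_bn_into_conv(w, b, gamma, beta, mean, var)

    # At one spatial position, Conv is a dot product over an (iC, kH, kW) patch.
    patch = np.random.randn(ic, k, k)
    conv = np.einsum("cij,ocij->o", patch, w) + b
    bn = gamma * (conv - mean) / np.sqrt(var + 1e-5) + beta
    fused = np.einsum("cij,ocij->o", patch, w_f) + b_f

    assert np.allclose(bn, fused, atol=1e-6)
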
class furiosa.quantizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> ConvTranspose –> BatchNormalization –> next

to

prev –> ConvTranspose –> next

pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['ConvTranspose', 'BatchNormalization']
class furiosa.quantizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_3(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Conv –> Mul –> Add –> next

to

prev –> Conv –> next

if
  1. Mul has only one initializer

  2. Add has only one initializer

pattern_condition_checker(nodes_to_check: List[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Conv', 'Mul', 'Add']
class furiosa.quantizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_4(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> BatchNormalization –> next

to

prev –> Mul –> Add –> next

if prev.op_type != Conv

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.ValueInfoProto]
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['BatchNormalization']

furiosa.quantizer.frontend.onnx.transformer.fuse_conv module

class furiosa.quantizer.frontend.onnx.transformer.fuse_conv.FuseConv

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_conv.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> MatMul –> Add –> next

to

prev –> Unsqueeze –> Conv –> Squeeze –> next

if
  1. Opsetid is defined in the ai.onnx domain and its version is in [12, 13]

  2. rank(MatMul.input[i]) == 2 for i = 1, 2

  3. MatMul must have exactly one initializer

  4. Add must have exactly one initializer

  5. Add’s input with initializer is multidirectionally broadcastable to (1, oC)

check_condition_2(node: onnx.onnx_ml_pb2.NodeProto) bool
check_condition_3(node: onnx.onnx_ml_pb2.NodeProto) bool
check_condition_5(node: onnx.onnx_ml_pb2.NodeProto, node_1: onnx.onnx_ml_pb2.NodeProto) bool
make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
make_new_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['MatMul', 'Add']
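
The MatMul + Add pair is rewritten as a pointwise (1x1) Conv: the rank-2 activation (N, K) is unsqueezed to (N, K, 1, 1), the (K, M) weight initializer becomes an (M, K, 1, 1) kernel, and the Add initializer becomes the Conv bias. A numpy check of that equivalence:

    import numpy as np

    n, k, m = 2, 8, 5
    x = np.random.randn(n, k).astype(np.float32)
    w = np.random.randn(k, m).astype(np.float32)    # MatMul initializer
    b = np.random.randn(1, m).astype(np.float32)    # Add initializer, broadcastable to (1, oC)

    matmul_add = x @ w + b

    # Unsqueeze --> Conv(1x1) --> Squeeze form.
    x4 = x[:, :, None, None]                        # (n, k, 1, 1)
    w4 = w.T[:, :, None, None]                      # (m, k, 1, 1) Conv weight
    conv = np.einsum("nkhw,mkhw->nm", x4, w4) + b.reshape(-1)

    assert np.allclose(matmul_add, conv, atol=1e-4)
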
class furiosa.quantizer.frontend.onnx.transformer.fuse_conv.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Gemm –> next

to

prev –> Unsqueeze –> Conv –> Squeeze –> next

if
  1. Opsetid is defined in the ai.onnx domain and its version is in [12, 13]

  2. Gemm.B must be defined in the initializer whereas Gemm.A must not

  3. if Gemm.C is defined, Gemm.C must be an initializer and multidirectionally broadcastable to (1, oC)

  4. all of Gemm.input must have onnx.TensorProto.FLOAT dtype

check_condition_2(node: onnx.onnx_ml_pb2.NodeProto) bool
check_condition_3(node: onnx.onnx_ml_pb2.NodeProto) bool
check_condition_4(node: onnx.onnx_ml_pb2.NodeProto) bool
make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
make_new_vi(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Gemm']
class furiosa.quantizer.frontend.onnx.transformer.fuse_conv.Pattern_3(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Conv –> Add –> next

to

prev –> Conv –> next

if
  1. len(Conv.input) == 2 or (len(Conv.input) == 3 and Conv.input[2] has an initializer)

  2. Add has only one initializer

  3. Add’s input with initializer is multidirectionally broadcastable to (1, oC, 1, 1)

check_condition_1(node: onnx.onnx_ml_pb2.NodeProto) bool
check_condition_2(node: onnx.onnx_ml_pb2.NodeProto) bool
check_condition_3(node: onnx.onnx_ml_pb2.NodeProto, node_1: onnx.onnx_ml_pb2.NodeProto) bool
make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Conv', 'Add']
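
When the Add initializer is broadcastable to (1, oC, 1, 1), it is a per-output-channel offset, so it folds directly into the Conv bias: b' = b + add_init.reshape(oC). A small numpy sketch (the Conv output below is a stand-in array):

    import numpy as np

    oc = 6
    conv_bias = np.random.randn(oc).astype(np.float32)          # Conv.input[2]
    add_init = np.random.randn(1, oc, 1, 1).astype(np.float32)  # Add initializer

    fused_bias = conv_bias + add_init.reshape(oc)

    y = np.random.randn(2, oc, 4, 4).astype(np.float32)         # Conv output before bias
    unfused = (y + conv_bias[:, None, None]) + add_init
    fused = y + fused_bias[:, None, None]

    assert np.allclose(unfused, fused, atol=1e-6)
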
furiosa.quantizer.frontend.onnx.transformer.fuse_conv.check_opset_version(model: onnx.onnx_ml_pb2.ModelProto)

furiosa.quantizer.frontend.onnx.transformer.fuse_depth_to_space module

class furiosa.quantizer.frontend.onnx.transformer.fuse_depth_to_space.FuseDepthToSpace

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_depth_to_space.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Reshape –> Transpose –> Reshape –> next

to

prev –> DepthToSpace –> next

if Transpose.perm == [0, 1, 4, 2, 5, 3] or Transpose.perm == [0, 3, 4, 1, 5, 2]

get_attrs(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) Dict
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
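
The Reshape -> Transpose -> Reshape chain is the reference decomposition of DepthToSpace; perm == [0, 3, 4, 1, 5, 2] corresponds to ONNX's DCR mode and [0, 1, 4, 2, 5, 3] to CRD mode. A numpy sketch of the DCR variant with blocksize 2:

    import numpy as np

    n, c, h, w, b = 1, 8, 2, 3, 2          # b = blocksize; c must be divisible by b * b
    x = np.arange(n * c * h * w, dtype=np.float32).reshape(n, c, h, w)

    # Reshape -> Transpose(perm=[0, 3, 4, 1, 5, 2]) -> Reshape, i.e. DepthToSpace (DCR).
    y = (
        x.reshape(n, b, b, c // (b * b), h, w)
        .transpose(0, 3, 4, 1, 5, 2)
        .reshape(n, c // (b * b), h * b, w * b)
    )

    print(y.shape)   # (1, 2, 4, 6)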

furiosa.quantizer.frontend.onnx.transformer.fuse_gather_matmul module

class furiosa.quantizer.frontend.onnx.transformer.fuse_gather_matmul.FuseGatherMatMul

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_gather_matmul.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Gather –> MatMul –> next

to

prev –> Gather –> next

if
  1. MatMul must have exactly one initializer

  2. Gather.data must be defined in graph.initializer

  3. MatMul weight’s data_type == onnx.TensorProto.FLOAT

  4. rank(MatMul weight) == 2

  5. Gather.data’s data_type == onnx.TensorProto.FLOAT

  6. rank(Gather.data) == 2

  7. (Gather.axis == 0 and MatMul.input[1] is an initializer) or (Gather.axis == 1 and MatMul.input[0] is an initializer)

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
static make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Gather', 'MatMul']
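
For the Gather.axis == 0 case in condition 7, the fusion pre-multiplies the gathered table with the constant MatMul weight, because gathering rows commutes with a right matrix multiplication: Gather(T, idx) @ W == Gather(T @ W, idx). A numpy check:

    import numpy as np

    vocab, d, m = 10, 4, 6
    table = np.random.randn(vocab, d).astype(np.float32)   # Gather.data (initializer)
    w = np.random.randn(d, m).astype(np.float32)           # MatMul.input[1] (initializer)
    idx = np.array([[1, 3, 7]])

    original = table[idx] @ w          # Gather --> MatMul
    fused = (table @ w)[idx]           # MatMul folded into the gathered table

    assert np.allclose(original, fused, atol=1e-5)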

furiosa.quantizer.frontend.onnx.transformer.fuse_lp_normalization module

class furiosa.quantizer.frontend.onnx.transformer.fuse_lp_normalization.FuseLpNormalization

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_lp_normalization.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> ReduceL2/ReduceL1 –> Clip –> Expand –> Div –> next
     ------------------------------------------------->

to

prev –> LpNormalization –> next

# TODO Check if Div has no initializer

pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
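
The matched subgraph computes x / max(||x||_p, eps) along one axis, which is LpNormalization with that p and axis. A numpy sketch for p = 2 (axis and epsilon chosen for illustration; the Clip bound is kept in both forms so they agree when the norm is zero):

    import numpy as np

    x = np.random.randn(4, 16).astype(np.float32)
    axis, eps = 1, 1e-12

    # ReduceL2 --> Clip(min=eps) --> Expand --> Div
    norm = np.sqrt(np.sum(x * x, axis=axis, keepdims=True))
    norm = np.clip(norm, eps, None)
    subgraph = x / np.broadcast_to(norm, x.shape)

    # LpNormalization(p=2, axis=1)
    fused = x / np.clip(np.linalg.norm(x, axis=axis, keepdims=True), eps, None)

    assert np.allclose(subgraph, fused, atol=1e-6)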

furiosa.quantizer.frontend.onnx.transformer.fuse_pad module

class furiosa.quantizer.frontend.onnx.transformer.fuse_pad.FusePad

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_pad.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Pad –> MaxPool –> next

to

prev –> MaxPool –> next

if
  1. Pad.mode == ‘constant’

  2. Pad.constant_value == -inf

  3. padding is applied on spatial dimensions

  4. fused_pads[i] < kernel_shape[i] and fused_pads[i + kernel_rank] < kernel_shape[i] for all i

check_condition_2(node)
check_condition_3(pads_input)
check_condition_6(node_attrs, pad_input)
get_attrs(node)
make_maxpool_pad(pad_input)
make_new_node(matched_nodes)
pattern_condition_checker(nodes_to_check)
pattern_matching(base_node)
pattern_to_match = ['Pad', 'MaxPool']
update_attrs(attrs, pad_input)
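
The fusion is valid because MaxPool's implicit padding behaves as if the border were filled with -inf, so an explicit constant Pad with -inf on the spatial dimensions can be folded into MaxPool.pads, as long as the fused pads stay smaller than the kernel. A 1-d numpy illustration with stride 1:

    import numpy as np

    def maxpool1d(x, kernel, pads=(0, 0)):
        # Pads with -inf, then takes the max over each window (stride 1).
        x = np.pad(x, pads, constant_values=-np.inf)
        return np.array([x[i:i + kernel].max() for i in range(len(x) - kernel + 1)])

    x = np.random.randn(10).astype(np.float32)

    # Pad(mode='constant', value=-inf) followed by an unpadded MaxPool ...
    unfused = maxpool1d(np.pad(x, (1, 1), constant_values=-np.inf), kernel=3)

    # ... equals MaxPool with the pads folded in.
    fused = maxpool1d(x, kernel=3, pads=(1, 1))

    assert np.array_equal(unfused, fused)
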
class furiosa.quantizer.frontend.onnx.transformer.fuse_pad.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.fuse_pad.Pattern_1

transform

prev –> Pad –> AveragePool –> next

to

prev –> AveragePool –> next

if
  1. Pad.mode == ‘constant’

  2. Pad.constant_value == 0.0

  3. padding is applied on spatial dimensions

  4. AveragePool.count_include_pad == 1 or all AveragePool.pads == 0

  5. AveragePool.ceil_mode == 0

  6. fused_pads[i] < kernel_shape[i] and fused_pads[i + kernel_rank] < kernel_shape[i] for all i

check_condition_2(node)
check_condition_4(node)
check_condition_5(node)
get_attrs(node)
make_new_node(matched_nodes)
pattern_condition_checker(nodes_to_check)
pattern_to_match = ['Pad', 'AveragePool']
update_attrs(attrs, pad_input)

furiosa.quantizer.frontend.onnx.transformer.fuse_redundant_reshape_pattern module

class furiosa.quantizer.frontend.onnx.transformer.fuse_redundant_reshape_pattern.FuseRedundantReshapePattern

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.fuse_redundant_reshape_pattern.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Reshape –> Reshape –> next

to

prev –> Reshape –> next

if prev.output[0].shape != next.input[0].shape

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
static make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Reshape', 'Reshape']
class furiosa.quantizer.frontend.onnx.transformer.fuse_redundant_reshape_pattern.Pattern_2(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Reshape –> Reshape –> Reshape –> next

to

prev –> Reshape –> next

if prev.output[0].shape != next.input[0].shape

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
static make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Reshape', 'Reshape', 'Reshape']
class furiosa.quantizer.frontend.onnx.transformer.fuse_redundant_reshape_pattern.Pattern_3(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Flatten/Squeeze –> Unsqueeze –> next

to

prev –> Reshape –> next

if prev.output[0].shape != next.input[0].shape

make_new_init(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.TensorProto]
static make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Flatten/Squeeze', 'Unsqueeze']

furiosa.quantizer.frontend.onnx.transformer.infer_squeeze_axes module

class furiosa.quantizer.frontend.onnx.transformer.infer_squeeze_axes.InferSqueezeAxes

Bases: furiosa.quantizer.interfaces.transformer.Transformer

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
class furiosa.quantizer.frontend.onnx.transformer.infer_squeeze_axes.Pattern_1(model, name_nodes=True)

Bases: furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer

transform

prev –> Squeeze (axes attribute is None) –> next

to

prev –> Squeeze (axes attribute is filled using input’s value info) –> next

if
  1. model’s opset < 13

  2. axes attribute of Squeeze does not exist

  3. Squeeze.input[0] has shape info (graph input or shape-inferred value info)

make_new_node(matched_nodes: Iterable[onnx.onnx_ml_pb2.NodeProto]) List[onnx.onnx_ml_pb2.NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[onnx.onnx_ml_pb2.NodeProto]) bool
pattern_matching(base_node: onnx.onnx_ml_pb2.NodeProto) Iterable[str]
pattern_to_match = ['Squeeze']
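
For opset < 13, axes is an attribute of Squeeze, and when it is absent the squeezed axes are exactly the input dimensions equal to 1, which can be read off the input's value info. A sketch of that inference on a plain shape list (hypothetical helper, not the transformer's code):

    def infer_squeeze_axes(input_shape):
        # With no axes attribute, an opset-12 Squeeze removes every dimension of size 1.
        return [i for i, dim in enumerate(input_shape) if dim == 1]

    print(infer_squeeze_axes([1, 3, 1, 224, 224]))   # [0, 2]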

furiosa.quantizer.frontend.onnx.transformer.polish_model module

class furiosa.quantizer.frontend.onnx.transformer.polish_model.PolishModel(input_shapes: Optional[Dict[str, List[int]]] = None)

Bases: furiosa.quantizer.interfaces.transformer.Transformer[onnx.onnx_ml_pb2.ModelProto]

Essential graph transformer/optimizers

transform(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
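
A usage sketch, assuming a model whose single graph input is named "input" and should be fixed to a static shape (both the path and the name/shape mapping below are placeholders):

    import onnx

    from furiosa.quantizer.frontend.onnx.transformer.polish_model import PolishModel

    model = onnx.load("model.onnx")   # placeholder path

    # input_shapes maps graph input names to the static shapes to assume.
    polished = PolishModel(input_shapes={"input": [1, 3, 224, 224]}).transform(model)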

furiosa.quantizer.frontend.onnx.transformer.utils module

furiosa.quantizer.frontend.onnx.transformer.utils.check_value_info(model: onnx.onnx_ml_pb2.ModelProto) None
furiosa.quantizer.frontend.onnx.transformer.utils.eliminate_initializer_from_graph_input(model: onnx.onnx_ml_pb2.ModelProto) onnx.onnx_ml_pb2.ModelProto
furiosa.quantizer.frontend.onnx.transformer.utils.eliminate_unused_initializer(model)

This function eliminates every initializer that is not used as a node input, regardless of which graph field it is defined in.

furiosa.quantizer.frontend.onnx.transformer.utils.eliminate_unused_input(model)
furiosa.quantizer.frontend.onnx.transformer.utils.eliminate_unused_output(model)
furiosa.quantizer.frontend.onnx.transformer.utils.eliminate_unused_protos(model)
furiosa.quantizer.frontend.onnx.transformer.utils.eliminate_unused_value_info(model)
furiosa.quantizer.frontend.onnx.transformer.utils.fix_batch_size_as_one(model)

Fix batch_size to 1 if dim_param is given.

furiosa.quantizer.frontend.onnx.transformer.utils.fixed_point(x: furiosa.quantizer.frontend.onnx.transformer.utils.T, functions: Iterable[Callable[[furiosa.quantizer.frontend.onnx.transformer.utils.T], furiosa.quantizer.frontend.onnx.transformer.utils.T]]) furiosa.quantizer.frontend.onnx.transformer.utils.T
furiosa.quantizer.frontend.onnx.transformer.utils.get_attribute(attrs: Iterable[onnx.onnx_ml_pb2.AttributeProto], attr_name: str, default: Optional[Any] = None) Any
furiosa.quantizer.frontend.onnx.transformer.utils.get_node_attributes(node: onnx.onnx_ml_pb2.NodeProto) Dict[str, Any]
furiosa.quantizer.frontend.onnx.transformer.utils.get_node_input_names(model)
furiosa.quantizer.frontend.onnx.transformer.utils.get_node_output_names(model)
furiosa.quantizer.frontend.onnx.transformer.utils.is_op_type(op_type: str, target_op_types: Iterable[str]) bool
furiosa.quantizer.frontend.onnx.transformer.utils.make_initializer_name_unique(model)
furiosa.quantizer.frontend.onnx.transformer.utils.make_unhashables_unique(values)
furiosa.quantizer.frontend.onnx.transformer.utils.name_nodes(model)
furiosa.quantizer.frontend.onnx.transformer.utils.rebuild_model(model: onnx.onnx_ml_pb2.ModelProto, new_nodes: List[onnx.onnx_ml_pb2.NodeProto], eliminate: bool = True, renaming: bool = True)
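
fixed_point is a generic combinator; judging from its signature it re-applies the given functions until the value stops changing, which is how cleanup passes such as the eliminate_unused_* helpers are typically iterated to convergence. A minimal sketch under that assumed behaviour:

    from furiosa.quantizer.frontend.onnx.transformer.utils import fixed_point

    # Assumed behaviour: keep applying the function(s) until the value no longer changes.
    result = fixed_point(12, [lambda x: x // 2])
    print(result)   # 0 under the assumed semantics (12 -> 6 -> 3 -> 1 -> 0 -> 0)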

Module contents

class furiosa.quantizer.frontend.onnx.transformer.ONNXTransformer(model, name_nodes=True)

Bases: object

bridge_disconnected_nodes(node_0: onnx.onnx_ml_pb2.NodeProto, next_nodes: List[onnx.onnx_ml_pb2.NodeProto], new_input)
For a graph that has been changed, for example,

before) prev –> node_1 –> node_0 –> next
after) prev –> node_1 –> ( ) -/-> next

This function bridges node_1 and next as follows:

prev –> node_1 –> next, by assigning next.input[y] = node_1.output[x]

build_optimized_model(model, check=True)
check_runnable = True
copy_value_info(name)
find_next_node(node: onnx.onnx_ml_pb2.NodeProto) List[onnx.onnx_ml_pb2.NodeProto]
find_prev_node(node_input: str) onnx.onnx_ml_pb2.NodeProto
get_data_node_input(node)
get_init_node_input(node)
get_initializer_array(node_input)
get_map_values(field)
get_node_input_idx(node_input)
get_value_info_dtype(tensor_name: str) int
get_value_info_shape(tensor_name: str) List[int]
is_same_shape(input_1, input_2)
make_int64_initializer(name, target_name)
pattern_matcher(node, pattern_to_match: List[str])
pattern_matching(base_node)
pop_multiple_initializer_map(nodes: List[onnx.onnx_ml_pb2.TensorProto])
pop_multiple_optimizer_map(nodes: List[onnx.onnx_ml_pb2.NodeProto])
pop_multiple_value_info_map(vis: List[onnx.onnx_ml_pb2.ValueInfoProto])
pop_single_initializer_map(init: onnx.onnx_ml_pb2.TensorProto)
pop_single_optimizer_map(node: onnx.onnx_ml_pb2.NodeProto)
pop_single_value_info_map(vi: onnx.onnx_ml_pb2.ValueInfoProto)
transform()
transform_to_eliminate(nodes_to_remove: List[onnx.onnx_ml_pb2.NodeProto], new_input)

This function eliminates the designated nodes and bridges their previous and next nodes.

For example, if [B, C] is given to be removed from A - B - C - D, it removes B and C and connects A to D, yielding A - D.

transform_to_fuse(nodes_to_remove: List[onnx.onnx_ml_pb2.NodeProto], nodes_to_add: Optional[List[onnx.onnx_ml_pb2.NodeProto]] = None, inits_to_add: Optional[List[onnx.onnx_ml_pb2.TensorProto]] = None, vis_to_add: Optional[List[onnx.onnx_ml_pb2.ValueInfoProto]] = None)
traverse_prev_node(producer_map_key: str, target_op_types: List[str])
update_graph_fields(model)
update_multiple_initializer_map(initializers: List[onnx.onnx_ml_pb2.TensorProto])
update_multiple_optimizer_map(nodes: List[onnx.onnx_ml_pb2.NodeProto], dest_name)
update_multiple_value_info_map(value_infos: List[onnx.onnx_ml_pb2.ValueInfoProto])
update_single_initializer_map(initializer: onnx.onnx_ml_pb2.TensorProto)
update_single_optimizer_map(node: onnx.onnx_ml_pb2.NodeProto, dest_name)
update_single_value_info_map(value_info: onnx.onnx_ml_pb2.ValueInfoProto)