furiosa.optimizer.frontend.onnx.transformer package

Submodules

furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice module

class furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice.ConvertNegativePadsToSlice

Bases: Transformer

transform(model: ModelProto) → ModelProto
class furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice.Pattern_1(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> Pad –> next

to

prev –> Slice –> Pad –> next

if
  1. Pad’s pads input (pad.input[1]) is an initializer

  2. at least one of the pads values is negative

  3. the sum of the negative pads values along axis i does not exceed the corresponding input_shape[i] (if not, the model is invalid)

  4. the absolute value of each negative pads value along axis i is less than the corresponding input_shape[i] (if not, it leads to an invalid model or ’nan’ outputs)
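
A minimal numeric sketch of the equivalence this rewrite relies on (illustrative only, not library code; the tensor and pads values below are made up). A negative pad amount crops, so the negative portion can be moved into a preceding Slice while the remaining non-negative amounts stay in the Pad:

    import numpy as np

    x = np.arange(12, dtype=np.float32).reshape(3, 4)

    # ONNX-style pads for a 2-D tensor: [begin_0, begin_1, end_0, end_1]
    # pads = [-1, 0, 1, 0]: crop one row at the start, pad one row at the end.
    sliced = x[1:, :]                           # Slice takes over the negative part
    padded = np.pad(sliced, ((0, 1), (0, 0)))   # Pad keeps the non-negative part

    assert padded.shape == (3, 4)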

make_new_init_and_vi(matched_nodes: Iterable[NodeProto]) → Mapping[str, List[NodeProto] | List[ValueInfoProto] | List[TensorProto]]
make_new_node(matched_nodes: Iterable[NodeProto]) → List[NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['Pad']
class furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice.Pattern_2(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> Pad –> next

to

prev –> next

if
  1. Pad’s pads values are all zero
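
An illustrative check (not library code) of why a Pad whose pads values are all zero can simply be removed and its consumers rewired to its input:

    import numpy as np

    x = np.arange(6, dtype=np.float32).reshape(2, 3)
    assert np.array_equal(np.pad(x, ((0, 0), (0, 0))), x)   # an all-zero Pad is an identity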

pattern_condition_checker(nodes_to_check: Iterable[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['Pad']

furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu module

class furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu.ConvertPReluToRelu

Bases: Transformer

transform(model: ModelProto) → ModelProto
class furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu.Pattern_1(model, name_nodes=True)

Bases: ONNXTransformer

transform

PRelu(x) = slope * x if x < 0, x if x >= 0

into

(1 - slope) * Relu(x) + slope * x

if
  1. PRelu’s input[1] is an initializer
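
A quick numeric check (illustrative only, not library code) of the identity behind this rewrite, PRelu(x) == (1 - slope) * Relu(x) + slope * x:

    import numpy as np

    x = np.linspace(-2.0, 2.0, 9, dtype=np.float32)
    slope = np.float32(0.25)

    prelu = np.where(x < 0, slope * x, x)
    rewritten = (1 - slope) * np.maximum(x, 0) + slope * x

    assert np.allclose(prelu, rewritten)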

make_new_init(matched_nodes: Iterable[NodeProto]) → List[TensorProto]
make_new_node(matched_nodes: Iterable[NodeProto]) → List[NodeProto]
make_new_vi(matched_nodes: Iterable[NodeProto]) → List[ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['PRelu']
class furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu.Pattern_2(model, name_nodes=True)

Bases: ONNXTransformer

transform

PRelu(x) = slope * x if x < 0, x if x >= 0

into

(1 - slope) * Relu(x) + slope * x

if
  1. PRelu’s input[1] is not an initializer

make_new_init(matched_nodes: Iterable[NodeProto]) → List[TensorProto]
make_new_node(matched_nodes: Iterable[NodeProto]) → List[NodeProto]
make_new_vi(matched_nodes: Iterable[NodeProto]) → List[ValueInfoProto]
pattern_condition_checker(nodes_to_check: Iterable[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['PRelu']

furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm module

class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.FuseBatchNorm

Bases: Transformer

transform(model: ModelProto) → ModelProto
class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_1(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> Conv –> BatchNormalization –> next

to

prev –> Conv –> next
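
An illustrative scalar check (not the library's implementation) of the folding: a BatchNormalization applied to a Conv output, y = gamma * (conv(x) - mean) / sqrt(var + eps) + beta, equals a Conv with rescaled parameters, W' = W * gamma / sqrt(var + eps) and b' = (b - mean) * gamma / sqrt(var + eps) + beta:

    import numpy as np

    gamma, beta = 1.5, -0.2
    mean, var, eps = 0.3, 0.8, 1e-5
    w, b, x = 2.0, 0.1, 0.7          # single-channel stand-in for Conv weight/bias/input

    scale = gamma / np.sqrt(var + eps)
    bn_after_conv = scale * (w * x + b - mean) + beta
    fused = (w * scale) * x + ((b - mean) * scale + beta)

    assert np.isclose(bn_after_conv, fused)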

pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['Conv', 'BatchNormalization']
class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_2(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> ConvTranspose –> BatchNormalization –> next

to

prev –> ConvTranspose –> next

pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['ConvTranspose', 'BatchNormalization']
class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_3(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> Conv –> Mul –> Add –> next

to

prev –> Conv –> next

if
  1. Mul has only one initializer

  2. Add has only one initializer
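
An illustrative scalar check (not library code) of the folding: a per-channel Mul by s and Add of t after a Conv are absorbed as W' = W * s and b' = b * s + t:

    import numpy as np

    w, b, x = 1.2, -0.3, 0.9     # single-channel stand-in for Conv weight/bias/input
    s, t = 0.5, 2.0              # Mul and Add initializers

    assert np.isclose((w * x + b) * s + t, (w * s) * x + (b * s + t))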

pattern_condition_checker(nodes_to_check: List[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['Conv', 'Mul', 'Add']
class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_4(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> BatchNormalization –> next

to

prev –> Mul –> Add –> next

if prev.op_type != Conv
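
An illustrative scalar check (not library code) that a standalone BatchNormalization is equivalent to a Mul followed by an Add, with scale = gamma / sqrt(var + eps) and bias = beta - mean * scale:

    import numpy as np

    gamma, beta = 0.9, 0.4
    mean, var, eps = -0.1, 2.0, 1e-5
    x = 1.3

    scale = gamma / np.sqrt(var + eps)
    bias = beta - mean * scale

    assert np.isclose(gamma * (x - mean) / np.sqrt(var + eps), x * scale + bias)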

make_new_init(matched_nodes: Iterable[NodeProto]) → List[TensorProto]
make_new_vi(matched_nodes: Iterable[NodeProto]) → List[ValueInfoProto]
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['BatchNormalization']

furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul module

class furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul.FuseGatherMatMul

Bases: Transformer

transform(model: ModelProto) → ModelProto
class furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul.Pattern_1(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> Gather –> MatMul –> next

to

prev –> Gather –> next

if
  1. MatMul has exactly one initializer

  2. Gather.data is defined in graph.initializer

  3. MatMul weight’s data_type == onnx.TensorProto.FLOAT

  4. rank(MatMul weight) == 2

  5. Gather.data’s data_type == onnx.TensorProto.FLOAT

  6. rank(Gather.data) == 2

  7. (Gather.axis == 0 and MatMul.input[1] is an initializer) or (Gather.axis == 1 and MatMul.input[0] is an initializer)
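
An illustrative check (not library code) of the identity behind the fusion for the Gather.axis == 0 case: gathering rows and then multiplying by the MatMul weight gives the same result as gathering from the pre-multiplied table, so the MatMul can be folded into Gather.data:

    import numpy as np

    data = np.random.rand(10, 4).astype(np.float32)     # Gather.data initializer
    weight = np.random.rand(4, 3).astype(np.float32)    # MatMul initializer
    indices = np.array([2, 7, 7, 0])

    assert np.allclose(data[indices] @ weight, (data @ weight)[indices])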

make_new_init(matched_nodes: Iterable[NodeProto]) → List[TensorProto]
static make_new_node(matched_nodes: Iterable[NodeProto]) → List[NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['Gather', 'MatMul']

furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes module

class furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes.InferSqueezeAxes

Bases: Transformer

transform(model: ModelProto) → ModelProto
class furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes.Pattern_1(model, name_nodes=True)

Bases: ONNXTransformer

transform

prev –> Squeeze (axes attribute is None) –> next

to

prev –> Squeeze (axes attribute is filled using input’s value info) –> next

if
  1. model’s opset < 13

  2. axes attribute of Squeeze does not exist

  3. Squeeze.input[0] has shape info (graph input or shape-inferred value info)
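
An illustrative sketch (not library code, with a hypothetical shape) of how the missing axes attribute can be recovered from the input's shape information: the axes to squeeze are exactly the positions whose dimension is 1:

    input_shape = [1, 3, 1, 224]     # hypothetical shape taken from Squeeze.input[0]'s value info
    inferred_axes = [i for i, d in enumerate(input_shape) if d == 1]
    assert inferred_axes == [0, 2]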

make_new_node(matched_nodes: Iterable[NodeProto]) → List[NodeProto]
pattern_condition_checker(nodes_to_check: Iterable[NodeProto]) → bool
pattern_matching(base_node: NodeProto) → Iterable[str]
pattern_to_match = ['Squeeze']

furiosa.optimizer.frontend.onnx.transformer.polish_model module

class furiosa.optimizer.frontend.onnx.transformer.polish_model.PolishModel(input_shapes: Mapping[str, List[int | None]] | None = None)

Bases: Transformer[ModelProto]

Essential graph transformer/optimizers

transform(model: ModelProto) → ModelProto
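
A hedged usage sketch based only on the signatures above (the file path, input name, and input shape are assumptions, not part of the library):

    import onnx

    from furiosa.optimizer.frontend.onnx.transformer.polish_model import PolishModel

    model = onnx.load("model.onnx")                                      # assumed path
    transformer = PolishModel(input_shapes={"input": [1, 3, 224, 224]})  # assumed input name/shape
    polished = transformer.transform(model)
    onnx.save(polished, "model_polished.onnx")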

furiosa.optimizer.frontend.onnx.transformer.utils module

furiosa.optimizer.frontend.onnx.transformer.utils.check_value_info(model: ModelProto) → None
furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_initializer_from_graph_input(model: ModelProto) → ModelProto
furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_initializer(model)

This function eliminates every initializer that is not used by any node input, regardless of which graph field it is defined in.

furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_input(model)
furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_output(model)
furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_protos(model)
furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_value_info(model)
furiosa.optimizer.frontend.onnx.transformer.utils.fix_batch_size_as_one(model)

Fix the batch size to 1 if dim_param is given.

furiosa.optimizer.frontend.onnx.transformer.utils.fixed_point(x: T, functions: Iterable[Callable[[T], T]]) → T
furiosa.optimizer.frontend.onnx.transformer.utils.get_attribute(attrs: Iterable[AttributeProto], attr_name: str, default: Any | None = None) → Any
furiosa.optimizer.frontend.onnx.transformer.utils.get_node_attributes(node: NodeProto) → Dict[str, Any]
furiosa.optimizer.frontend.onnx.transformer.utils.get_node_input_names(model)
furiosa.optimizer.frontend.onnx.transformer.utils.get_node_output_names(model)
furiosa.optimizer.frontend.onnx.transformer.utils.is_op_type(op_type: str, target_op_types: Iterable[str]) → bool
furiosa.optimizer.frontend.onnx.transformer.utils.make_initializer_name_unique(model)
furiosa.optimizer.frontend.onnx.transformer.utils.make_unhashables_unique(values)
furiosa.optimizer.frontend.onnx.transformer.utils.name_nodes(model)
furiosa.optimizer.frontend.onnx.transformer.utils.rebuild_model(model: ModelProto, new_nodes: List[NodeProto], eliminate: bool = True, renaming: bool = True)

Module contents

class furiosa.optimizer.frontend.onnx.transformer.ONNXTransformer(model, name_nodes=True)

Bases: object

bridge_disconnected_nodes(node_0: NodeProto, next_nodes: List[NodeProto], new_input)
For a graph changed as in the following example:

before) prev –> node_1 –> node_0 –> next
after)  prev –> node_1 –> ( ) -/-> next

this function bridges node_1 and next as follows:

prev –> node_1 –> next

by assigning next.input[y] = node_1.output[x]

build_optimized_model(model, check=True)
check_runnable = True
copy_value_info(name)
find_next_node(node: NodeProto) → List[NodeProto]
find_prev_node(node_input: str) → NodeProto | None
get_data_node_input(node)
get_init_node_input(node)
get_initializer_array(node_input)
get_map_values(field)
get_value_info_dtype(tensor_name: str) → int
get_value_info_shape(tensor_name: str) → List[int]
is_same_shape(input_1, input_2)
pattern_matcher(node, pattern_to_match: List[str])
pattern_matching(base_node)
pop_multiple_optimizer_map(nodes: List[NodeProto])
pop_single_optimizer_map(node: NodeProto)
transform()
transform_to_eliminate(nodes_to_remove: List[NodeProto], new_input)

This function eliminates the designated nodes and bridges their previous and next nodes.

For example, if [B, C] is given to be removed from A - B - C - D, it removes B and C and connects A to D, producing A - D.

transform_to_fuse(nodes_to_remove: List[NodeProto], nodes_to_add: List[NodeProto], inits_to_add: List[TensorProto] | None = None, vis_to_add: List[ValueInfoProto] | None = None)
traverse_prev_node(producer_map_key: str, target_op_types: List[str])
update_graph_fields(model)
update_multiple_initializer_map(initializers: List[TensorProto])
update_multiple_optimizer_map(nodes: List[NodeProto], dest_name)
update_multiple_value_info_map(value_infos: List[ValueInfoProto])
update_single_initializer_map(initializer: TensorProto)
update_single_optimizer_map(node: NodeProto, dest_name)
update_single_value_info_map(value_info: ValueInfoProto)