furiosa.optimizer.frontend.onnx.transformer package
Subpackages
- furiosa.optimizer.frontend.onnx.transformer.experimental package
- Submodules
- furiosa.optimizer.frontend.onnx.transformer.experimental.eliminate_detection_postprocess module
- furiosa.optimizer.frontend.onnx.transformer.experimental.embedding_bag_porting module
- furiosa.optimizer.frontend.onnx.transformer.experimental.fuse_div_for_bert module
- furiosa.optimizer.frontend.onnx.transformer.experimental.reify_conv_for_bert module
- Module contents
Submodules
furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice module
- class furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice.ConvertNegativePadsToSlice
Bases: Transformer
- transform(model: ModelProto) → ModelProto
- class furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice.Pattern_1(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> Pad --> next
- to
prev --> Slice --> Pad --> next
- if
1. Pad’s pads input (pad.input[1]) is an initializer
2. at least one of the pads values is negative
3. the sum of the negative pads values in axis i does not exceed the corresponding input_shape[i] (if not, the model is invalid)
4. the absolute value of each negative pads value in axis i is less than the corresponding input_shape[i] (if not, it leads to an invalid model or ’nan’ output)
- make_new_init_and_vi(matched_nodes: Iterable[NodeProto]) → Mapping[str, List[NodeProto] | List[ValueInfoProto] | List[TensorProto]]
- pattern_to_match = ['Pad']
- class furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice.Pattern_2(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> Pad --> next
- to
prev --> next
- if
Pad’s pads values are all zero
- pattern_to_match = ['Pad']
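A minimal usage sketch for this module, assuming onnx is installed; the model paths below are hypothetical and stand in for any ONNX model containing Pad nodes with negative pads.

```python
import onnx

from furiosa.optimizer.frontend.onnx.transformer.convert_negative_pads_to_slice import (
    ConvertNegativePadsToSlice,
)

model = onnx.load("model.onnx")  # hypothetical input path
model = ConvertNegativePadsToSlice().transform(model)  # rewrites Pad nodes with negative pads
onnx.save(model, "model_slice_pad.onnx")  # hypothetical output path
```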
furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu module
- class furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu.ConvertPReluToRelu
Bases: Transformer
- transform(model: ModelProto) → ModelProto
- class furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu.Pattern_1(model, name_nodes=True)
Bases: ONNXTransformer
- transform
PRelu(x) = slope * x if x < 0, x if x >= 0
- into
(1 - slope) * Relu(x) + slope * x
- if
PRelu’s input[1] is an initializer
- pattern_to_match = ['PRelu']
- class furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu.Pattern_2(model, name_nodes=True)
Bases: ONNXTransformer
- transform
PRelu(x) = slope * x if x < 0, x if x >= 0
- into
(1 - slope) * Relu(x) + slope * x
- if
PRelu’s input[1] is not an initializer
- pattern_to_match = ['PRelu']
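The decomposition used by both patterns follows from a simple identity: for x >= 0 the Relu term reduces to x and the slope contributions cancel, and for x < 0 the Relu term vanishes, leaving slope * x. A small NumPy check (illustrative only, not library code):

```python
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5], dtype=np.float32)
slope = np.float32(0.25)

prelu = np.where(x < 0, slope * x, x)                    # PRelu(x)
rewritten = (1 - slope) * np.maximum(x, 0) + slope * x   # (1 - slope) * Relu(x) + slope * x

assert np.allclose(prelu, rewritten)
```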
furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm module
- class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.FuseBatchNorm
Bases: Transformer
- transform(model: ModelProto) → ModelProto
- class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_1(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> Conv --> BatchNormalization --> next
- to
prev --> Conv --> next
- pattern_to_match = ['Conv', 'BatchNormalization']
- class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_2(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> ConvTranspose --> BatchNormalization --> next
- to
prev --> ConvTranspose --> next
- pattern_to_match = ['ConvTranspose', 'BatchNormalization']
- class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_3(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> Conv --> Mul --> Add --> next
- to
prev --> Conv --> next
- if
1. Mul has only one initializer
2. Add has only one initializer
- pattern_to_match = ['Conv', 'Mul', 'Add']
- class furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm.Pattern_4(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> BatchNormalization --> next
- to
prev --> Mul --> Add --> next
- if
prev.op_type != Conv
- pattern_to_match = ['BatchNormalization']
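For Pattern_1 and Pattern_2, fusing BatchNormalization into the preceding convolution amounts to rescaling the convolution weights and bias per output channel. The NumPy sketch below illustrates the standard algebra; it is an assumption-level illustration of the technique, not the code FuseBatchNorm actually runs.

```python
import numpy as np

def fold_bn_into_conv(weight, bias, gamma, beta, mean, var, eps=1e-5):
    """Fold y = gamma * (conv(x) - mean) / sqrt(var + eps) + beta into the Conv itself."""
    # weight: (C_out, C_in, kH, kW); bias, gamma, beta, mean, var: (C_out,)
    scale = gamma / np.sqrt(var + eps)
    fused_weight = weight * scale[:, None, None, None]  # scale each output channel
    fused_bias = (bias - mean) * scale + beta
    return fused_weight, fused_bias
```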
furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul module
- class furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul.FuseGatherMatMul
Bases: Transformer
- transform(model: ModelProto) → ModelProto
- class furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul.Pattern_1(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> Gather --> MatMul --> next
- to
prev --> Gather --> next
- if
1. MatMul must have exactly one initializer
2. Gather.data must be defined in graph.initializer
3. MatMul weight’s data_type == onnx.TensorProto.FLOAT
4. rank(MatMul weight) == 2
5. Gather.data’s data_type == onnx.TensorProto.FLOAT
6. rank(Gather.data) == 2
7. (Gather.axis == 0 and MatMul.input[1] is an initializer) or (Gather.axis == 1 and MatMul.input[0] is an initializer)
- pattern_to_match = ['Gather', 'MatMul']
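The fusion works because gathering rows and then multiplying by a constant weight gives the same result as multiplying first and gathering from the product; for Gather.axis == 0, Gather(data, idx) @ W == Gather(data @ W, idx). A NumPy check (illustrative only, not library code):

```python
import numpy as np

data = np.random.rand(10, 4).astype(np.float32)    # Gather.data (rank 2, FLOAT)
weight = np.random.rand(4, 8).astype(np.float32)   # MatMul initializer (rank 2, FLOAT)
idx = np.array([1, 3, 7])

original = data[idx] @ weight      # Gather --> MatMul
fused = (data @ weight)[idx]       # a single Gather over the precomputed table
assert np.allclose(original, fused, atol=1e-5)
```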
furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes module
- class furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes.InferSqueezeAxes
Bases: Transformer
- transform(model: ModelProto) → ModelProto
- class furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes.Pattern_1(model, name_nodes=True)
Bases: ONNXTransformer
- transform
prev --> Squeeze (axes attribute is None) --> next
- to
prev --> Squeeze (axes attribute is filled using the input’s value info) --> next
- if
1. the model’s opset < 13
2. the axes attribute of Squeeze does not exist
3. Squeeze.input[0] has shape info (graph input or shape-inferred value info)
- pattern_to_match = ['Squeeze']
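For opset < 13, a Squeeze without an axes attribute removes every dimension of size 1, so the missing attribute can be reconstructed once the input shape is known. A small sketch of that rule; the helper below is hypothetical and only illustrates the inference, not the module’s internals.

```python
def infer_squeeze_axes(input_shape):
    """Return the axes a Squeeze with no axes attribute would remove (hypothetical helper)."""
    return [i for i, dim in enumerate(input_shape) if dim == 1]

print(infer_squeeze_axes([1, 3, 1, 224]))  # [0, 2]
```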
furiosa.optimizer.frontend.onnx.transformer.polish_model module
furiosa.optimizer.frontend.onnx.transformer.utils module
- furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_initializer_from_graph_input(model: ModelProto) → ModelProto
- furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_initializer(model)
This function eliminates every initializer that is not used as a node input, regardless of which graph field it is defined in.
- furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_input(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_output(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_protos(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.eliminate_unused_value_info(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.fix_batch_size_as_one(model)
Fixes batch_size to 1 if dim_param is given.
- furiosa.optimizer.frontend.onnx.transformer.utils.fixed_point(x: T, functions: Iterable[Callable[[T], T]]) → T
- furiosa.optimizer.frontend.onnx.transformer.utils.get_attribute(attrs: Iterable[AttributeProto], attr_name: str, default: Any | None = None) → Any
- furiosa.optimizer.frontend.onnx.transformer.utils.get_node_attributes(node: NodeProto) → Dict[str, Any]
- furiosa.optimizer.frontend.onnx.transformer.utils.get_node_input_names(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.get_node_output_names(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.is_op_type(op_type: str, target_op_types: Iterable[str]) → bool
- furiosa.optimizer.frontend.onnx.transformer.utils.make_initializer_name_unique(model)
- furiosa.optimizer.frontend.onnx.transformer.utils.make_unhashables_unique(values)
- furiosa.optimizer.frontend.onnx.transformer.utils.name_nodes(model)
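Judging from its signature, fixed_point appears to apply the given functions repeatedly until the value stops changing. The toy function below is hypothetical and only illustrates the call shape, not how the optimizer itself uses the helper.

```python
from furiosa.optimizer.frontend.onnx.transformer.utils import fixed_point

def drop_leading_zero(xs):
    # Remove at most one leading zero per application (hypothetical toy function).
    return xs[1:] if xs and xs[0] == 0 else xs

result = fixed_point([0, 0, 1, 2], [drop_leading_zero])
print(result)  # expected: [1, 2], if fixed_point iterates until convergence
```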
Module contents
- class furiosa.optimizer.frontend.onnx.transformer.ONNXTransformer(model, name_nodes=True)
Bases: object
- bridge_disconnected_nodes(node_0: NodeProto, next_nodes: List[NodeProto], new_input)
- For a graph changed, for example,
before) prev --> node_1 --> node_0 --> next
after) prev --> node_1 --> ( ) -/-> next
- This function bridges node_1 and next as follows:
prev --> node_1 --> next, by assigning next.input[y] = node_1.output[x]
- build_optimized_model(model, check=True)
- check_runnable = True
- copy_value_info(name)
- get_data_node_input(node)
- get_init_node_input(node)
- get_initializer_array(node_input)
- get_map_values(field)
- is_same_shape(input_1, input_2)
- pattern_matching(base_node)
- pop_single_optimizer_map(node: NodeProto)
- transform()
- transform_to_eliminate(nodes_to_remove: List[NodeProto], new_input)
This function eliminates the designated nodes and bridges their previous and next nodes.
For example, if [B, C] is given to be removed from A - B - C - D, it removes [B, C] and connects A and D to make A - D.
- transform_to_fuse(nodes_to_remove: List[NodeProto], nodes_to_add: List[NodeProto], inits_to_add: List[TensorProto] | None = None, vis_to_add: List[ValueInfoProto] | None = None)
- update_graph_fields(model)
- update_single_initializer_map(initializer: TensorProto)
- update_single_optimizer_map(node: NodeProto, dest_name)
- update_single_value_info_map(value_info: ValueInfoProto)
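Putting the pieces together: each top-level Transformer documented above exposes transform(model) → model, so they can be chained over a loaded ONNX model. The sketch below assumes onnx is installed and uses hypothetical file paths; the set and order of transformers is only an example, not a prescribed pipeline.

```python
import onnx

from furiosa.optimizer.frontend.onnx.transformer.convert_prelu_to_relu import ConvertPReluToRelu
from furiosa.optimizer.frontend.onnx.transformer.fuse_batchnorm import FuseBatchNorm
from furiosa.optimizer.frontend.onnx.transformer.fuse_gather_matmul import FuseGatherMatMul
from furiosa.optimizer.frontend.onnx.transformer.infer_squeeze_axes import InferSqueezeAxes

model = onnx.load("model.onnx")  # hypothetical input path
for transformer in (InferSqueezeAxes(), ConvertPReluToRelu(), FuseBatchNorm(), FuseGatherMatMul()):
    model = transformer.transform(model)
onnx.save(model, "model_optimized.onnx")  # hypothetical output path
```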