How to split a >2GB ONNX model with external data? #5407
Replies: 4 comments 1 reply
-
Did you try converting the model data to be external, as described here?
-
I want an answer to this question too. I have a model that is more than 10GB. How do I break the model into multiple parts?
-
Has anyone solved this issue?
-
For new readers, here is a pass that does it: microsoft/onnxscript#2119. Combined with
-
I have a large ONNX model (substantially larger than 2GB) with external data files, and I've been having trouble with what I thought would be a simple task: slicing off part of the model at a given node. Note that the resulting "small" model is also over 2GB.
Does anyone know a way to do this without converting the .onnx binary to Protocol Buffers text format and then spending ages hand-editing the resulting text file?
I've tried onnx.utils.extract_model, but it fails with:
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 28869931499
I tried with check_model=False, with the same result.
I also tried sne4onnx, a Python tool written to bypass the 2GB limitation of onnx.utils.extract_model, but it fails with the exact same error message. I suspect it can handle models over 2GB overall, but the chunk being sliced out has to stay under 2GB.
So, how do people split large (substantially larger than 2GB) ONNX models?