huggingface / exporters

Export Hugging Face models to Core ML and TensorFlow Lite

Exporter being killed

willswire opened this issue · comments

Similar to #61, my exporter process is being killed. I'd like to verify this is a resource constraint, and not an issue in the project. I am running `python3 -m exporters.coreml --model=mistralai/Mistral-7B-v0.1 mistral.mlpackage` on an M3 MacBook Pro with 18 GB of memory.
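For what it's worth, a rough back-of-envelope estimate (my own numbers, not from the exporter) suggests 18 GB is tight for this model: the fp32 weights alone exceed the machine's RAM, before counting the loaded checkpoint or tracing overhead.

```python
# Back-of-envelope memory estimate for converting Mistral-7B.
# Assumption (mine, not confirmed by the exporter): the conversion
# materializes a float32 copy of the weights in memory.
PARAMS = 7.24e9   # approximate Mistral-7B parameter count
BYTES_FP32 = 4
BYTES_FP16 = 2

fp32_gb = PARAMS * BYTES_FP32 / 1024**3
fp16_gb = PARAMS * BYTES_FP16 / 1024**3
print(f"fp32 weights alone: {fp32_gb:.1f} GiB")  # ~27 GiB, more than 18 GB of RAM
print(f"fp16 weights alone: {fp16_gb:.1f} GiB")  # ~13.5 GiB
```

If that assumption holds, the kill at the end of the MIL pipeline would be the OS out-of-memory reaper rather than a bug in the exporter.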

model-00001-of-00002.safetensors: 100%|████| 9.94G/9.94G [07:47<00:00, 21.3MB/s]
model-00002-of-00002.safetensors: 100%|████| 4.54G/4.54G [04:42<00:00, 16.1MB/s]
Downloading shards: 100%|████| 2/2 [12:31<00:00, 375.71s/it]
Loading checkpoint shards: 100%|████| 2/2 [00:25<00:00, 12.58s/it]
Using framework PyTorch: 2.1.0
Overriding 1 configuration item(s)
	- use_cache -> False
/opt/homebrew/lib/python3.11/site-packages/transformers/modeling_attn_mask_utils.py:114: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if (input_shape[-1] > 1 or self.sliding_window is not None) and self.is_causal:
/opt/homebrew/lib/python3.11/site-packages/transformers/modeling_attn_mask_utils.py:161: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if past_key_values_length > 0:
/opt/homebrew/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py:119: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if seq_len > self.max_seq_len_cached:
/opt/homebrew/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py:285: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_weights.size() != (bsz, self.num_heads, q_len, kv_seq_len):
/opt/homebrew/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py:292: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attention_mask.size() != (bsz, 1, q_len, kv_seq_len):
/opt/homebrew/lib/python3.11/site-packages/transformers/models/mistral/modeling_mistral.py:304: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_output.size() != (bsz, self.num_heads, q_len, self.head_dim):
Skipping token_type_ids input
Patching PyTorch conversion 'log' with <function MistralCoreMLConfig.patch_pytorch_ops.<locals>.log at 0x13a115300>
/opt/homebrew/lib/python3.11/site-packages/coremltools/models/_deprecation.py:27: FutureWarning: Function _TORCH_OPS_REGISTRY.__contains__ is deprecated and will be removed in 7.2.; Please use coremltools.converters.mil.frontend.torch.register_torch_op
  warnings.warn(msg, category=FutureWarning)
/opt/homebrew/lib/python3.11/site-packages/coremltools/models/_deprecation.py:27: FutureWarning: Function _TORCH_OPS_REGISTRY.__getitem__ is deprecated and will be removed in 7.2.; Please use coremltools.converters.mil.frontend.torch.register_torch_op
  warnings.warn(msg, category=FutureWarning)
/opt/homebrew/lib/python3.11/site-packages/coremltools/models/_deprecation.py:27: FutureWarning: Function _TORCH_OPS_REGISTRY.__delitem__ is deprecated and will be removed in 7.2.; Please use coremltools.converters.mil.frontend.torch.register_torch_op
  warnings.warn(msg, category=FutureWarning)
/opt/homebrew/lib/python3.11/site-packages/coremltools/models/_deprecation.py:27: FutureWarning: Function _TORCH_OPS_REGISTRY.__setitem__ is deprecated and will be removed in 7.2.; Please use coremltools.converters.mil.frontend.torch.register_torch_op
  warnings.warn(msg, category=FutureWarning)
Converting PyTorch Frontend ==> MIL Ops:   0%|    | 0/4506 [00:00<?, ? ops/s]
Saving value type of int64 into a builtin type of int32, might lose precision!
Saving value type of int64 into a builtin type of int32, might lose precision!
Converting PyTorch Frontend ==> MIL Ops: 100%|████| 4505/4506 [00:01<00:00, 3255.50 ops/s]
Running MIL frontend_pytorch pipeline: 100%|████| 5/5 [00:00<00:00, 13.02 passes/s]
Running MIL default pipeline:  14%|█    | 10/71 [00:00<00:03, 15.93 passes/s]
/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/mil/passes/defs/preprocess.py:267: UserWarning: Output, '5409', of the source model, has been renamed to 'var_5409' in the Core ML model.
  warnings.warn(msg.format(var.name, new_name))
Running MIL default pipeline:  73%|███    | 52/71 [03:36<02:09,  6.79s/ passes]
/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/mil/ops/defs/iOS15/elementwise_unary.py:894: RuntimeWarning: overflow encountered in cast
  return input_var.val.astype(dtype=string_to_nptype(dtype_val))
/opt/homebrew/lib/python3.11/site-packages/coremltools/converters/mil/mil/ops/defs/iOS15/elementwise_unary.py:896: RuntimeWarning: overflow encountered in cast
  return np.array(input_var.val).astype(dtype=string_to_nptype(dtype_val))
Running MIL default pipeline: 100%|████| 71/71 [07:27<00:00,  6.30s/ passes]
Running MIL backend_mlprogram pipeline: 100%|████| 12/12 [00:00<00:00, 168.96 passes/s]
zsh: killed     python3 -m exporters.coreml --model=mistralai/Mistral-7B-v0.1 
willwalker misty > /opt/homebrew/Cellar/python@3.11/3.11.7/Frameworks/Python.framework/Versions/3.11/lib/python3.11/multiprocessing/resource_tracker.py:254: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
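To confirm it's a memory kill rather than a crash, I can watch the process's peak resident set size. This is a minimal sketch using only the standard library (the platform scaling of `ru_maxrss` is my assumption based on the macOS/Linux man pages):

```python
# Minimal sketch: report the current process's peak resident set size,
# to check whether the exporter approaches the machine's 18 GB of RAM
# before being killed.
import resource
import sys

def peak_rss_gib() -> float:
    """Return peak RSS of this process in GiB."""
    ru_maxrss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is reported in bytes on macOS but in kilobytes on Linux.
    scale = 1 if sys.platform == "darwin" else 1024
    return ru_maxrss * scale / 1024**3

print(f"peak RSS so far: {peak_rss_gib():.2f} GiB")
```

Calling this periodically from inside the conversion (or just watching Activity Monitor / `top` in another terminal) should show whether memory climbs toward 18 GB right before `zsh: killed`.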