ELF.Binary.virtual_size returns a different value on different platforms
1337-42 opened this issue
Describe the bug
When I parse a simple ELF binary on a macOS system, the value returned by virtual_size
differs from the value returned when analyzing the same binary on a Linux system.
To Reproduce
Steps to reproduce the behavior:
# On macOS
> sha256sum /tmp/hello.elf
1c7459e525bb0e38e8e02b8483a27f62ae6b58c36b8f74ae70ec70b7fda3c2aa /tmp/hello.elf
> python
Python 3.12.3 (main, May 22 2024, 17:08:53) [Clang 15.0.0 (clang-1500.3.9.4)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import lief
>>> binary = lief.ELF.parse("/tmp/hello.elf")
>>> binary.virtual_size
32768
>>> lief.__version__
'0.14.1-bae887e0'
# On Linux
$ sha256sum /tmp/hello.elf
1c7459e525bb0e38e8e02b8483a27f62ae6b58c36b8f74ae70ec70b7fda3c2aa /tmp/hello.elf
$ python
Python 3.12.3 (main, Jun 6 2024, 10:02:01) [GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import lief
>>> binary = lief.ELF.parse("/tmp/hello.elf")
>>> binary.virtual_size
20480
>>> lief.__version__
'0.14.1-bae887e0'
Expected behavior
I would expect virtual_size to return the same value on both platforms, since the input binary is byte-for-byte identical (same SHA-256).
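One observation that may help triage: the two reported values differ exactly as if the same underlying span were rounded up to different page sizes, since aligning 20480 up to 16 KiB (the page size on Apple Silicon) yields 32768, while aligning it to 4 KiB (typical on x86-64 Linux) leaves it at 20480. The sketch below is only a hypothesis about the cause, not LIEF's confirmed behavior; `align_up` is a hypothetical helper, not a LIEF API:

```python
def align_up(value: int, page_size: int) -> int:
    """Round value up to the next multiple of page_size (a power of two)."""
    return (value + page_size - 1) & ~(page_size - 1)

span = 20480  # value reported on the Linux host

# 4 KiB pages (typical x86-64 Linux host)
print(align_up(span, 0x1000))  # -> 20480

# 16 KiB pages (Apple Silicon host)
print(align_up(span, 0x4000))  # -> 32768
```

If virtual_size (or something it depends on) aligns to the *host* page size rather than a fixed or target-derived one, that would explain the discrepancy.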
Environment (please complete the following information):
- System and Version: Ubuntu 22.04.4 LTS / macOS 14.5 (Apple M2)
- Target format: ELF
- LIEF commit version: 0.14.1-bae887e0