powerapi-ng / pyJoules

A Python library to capture the energy consumption of code snippets

OSError: [Errno 24] Too many open files: '/sys/class/powercap/intel-rapl/intel-rapl:1/name'

danglotb opened this issue · comments

Hello,

I'm running pyJoules on keon/algorithms by putting the decorator on all functions and calling the save of the csv_handler at the end of each unit test.

Basically, I have the following:

@pyJoules.energy_meter.measureit(handler=pyjoules_handler.handler)
def quick_sort(arr, simulation=False):
    """ Quick sort
        Complexity: best O(n log(n)), avg O(n log(n)), worst O(n^2)
    """
    iteration = 0
    if simulation:
        print("iteration", iteration, ":", *arr)
    arr, _ = quick_sort_recur(arr, 0, len(arr) - 1, iteration, simulation)
    return arr

and

def test_quick_sort(self):
    self.assertEqual([1, 5, 23, 57, 65, 1232],
                     quick_sort([1, 5, 65, 23, 57, 1232]))

    pyjoules_handler.handler.save_data()

for all functions/tests.
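(For anyone wanting to confirm a descriptor leak like the one below: on Linux you can count the process's open file descriptors between calls. This is a hypothetical diagnostic using only the standard library, not part of pyJoules.)

```python
import os

def open_fd_count() -> int:
    # Linux-specific: /proc/self/fd holds one entry per open descriptor.
    return len(os.listdir("/proc/self/fd"))

baseline = open_fd_count()
held = [open(os.devnull) for _ in range(3)]  # simulate a descriptor leak
assert open_fd_count() >= baseline + 3       # the leak is immediately visible
for f in held:
    f.close()
```

Calling `open_fd_count()` before and after each decorated function would show whether the count grows monotonically across tests.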

I have the following errors when I run the tests:

======================================================================
ERROR: test_huffman_coding (test_compression.TestHuffmanCoding)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/benjamin/workspace/algorithms/tests/test_compression.py", line 27, in test_huffman_coding
    HuffmanCoding.encode_file(self.file_in_name, self.file_out_bin_name)
  File "/home/benjamin/.local/lib/python3.7/site-packages/pyJoules/energy_meter.py", line 286, in wrapper_measure
    val = func(*args, **kwargs)
  File "/home/benjamin/workspace/algorithms/algorithms/compression/huffman_coding.py", line 292, in encode_file
    with open(file_in_name, "rb") as file_in, open(file_out_name, mode="wb+") as file_out:
OSError: [Errno 24] Too many open files: 'huffman_coding_out.bin'

======================================================================
ERROR: test_huffman_coding (test_compression.TestHuffmanCoding)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/benjamin/workspace/algorithms/tests/test_compression.py", line 40, in tearDown
    os.remove(self.file_out_bin_name)
FileNotFoundError: [Errno 2] No such file or directory: 'huffman_coding_out.bin'

======================================================================
ERROR: test_histogram (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: test_histogram
Traceback (most recent call last):
  File "/usr/lib/python3.7/unittest/loader.py", line 436, in _find_test_path
  File "/usr/lib/python3.7/unittest/loader.py", line 377, in _get_module_from_name
  File "/home/benjamin/workspace/algorithms/tests/test_histogram.py", line 1, in <module>
  File "/home/benjamin/workspace/algorithms/algorithms/distribution/histogram.py", line 20, in <module>
  File "/home/benjamin/.local/lib/python3.7/site-packages/pyJoules/energy_meter.py", line 280, in decorator_measure_energy
  File "/home/benjamin/.local/lib/python3.7/site-packages/pyJoules/energy_device/energy_device_factory.py", line 63, in create_devices
  File "/home/benjamin/.local/lib/python3.7/site-packages/pyJoules/energy_device/energy_device_factory.py", line 47, in _gen_all_available_domains
  File "/home/benjamin/.local/lib/python3.7/site-packages/pyJoules/energy_device/rapl_device.py", line 111, in available_domains
  File "/home/benjamin/.local/lib/python3.7/site-packages/pyJoules/energy_device/rapl_device.py", line 135, in available_package_domains
OSError: [Errno 24] Too many open files: '/sys/class/powercap/intel-rapl/intel-rapl:1/name'

I've shown the first 3 errors; after that, the repeated error is OSError: [Errno 24] Too many open files: '/sys/class/powercap/intel-rapl/intel-rapl:1/name'

It seems like pyJoules opens files without closing them. Do you have any clue what's happening? Is there a solution, or should I just run the tests one by one?
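(A possible stopgap while this is investigated: raise the process's soft limit on open file descriptors to its hard limit. This is a minimal sketch using Python's standard `resource` module, Unix only; it only buys headroom for the test run and does not fix any underlying leak.)

```python
import resource

# Stopgap only: lift the soft limit on open files up to the hard limit.
# An unprivileged process may always raise its soft limit this far.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
new_soft, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
```

Running this once at the top of the test suite (e.g. in a `setUpModule`) would delay, though not eliminate, the `Errno 24` failures.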

Best, thank you very much!