pytorch / kineto

A CPU+GPU Profiling library that provides access to timeline traces and hardware performance counters.

How to remove log output similar to “ActivityProfilerController.cpp:294] Completed Stage: Warm Up”

SolenoidWGT opened this issue · comments

What I encounter

When I use torch.profiler to profile a large model, I find my log file contains many lines like:

STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101898:101898 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101899:101899 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101898:101898 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101899:101899 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101898:101898 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101899:101899 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101898:101898 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101899:101899 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101898:101898 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101899:101899 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:300] Completed Stage: Collection
STAGE:2023-02-21 15:15:48 101903:101903 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101902:101902 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
STAGE:2023-02-21 15:15:48 101898:101898 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
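For context, a minimal profiling loop of the kind that produces these messages might look like the sketch below (the workload is a placeholder). Each warm-up/collection transition in the profiler schedule is what emits a "Completed Stage" line on stderr:

```python
import torch
from torch.profiler import ProfilerActivity, profile, schedule

def train_step():
    # placeholder workload standing in for a real model step
    x = torch.randn(64, 64)
    (x @ x).sum()

# wait/warmup/active phases cycle `repeat` times; every warm-up and
# collection phase completion triggers one of the STAGE log lines
with profile(
    activities=[ProfilerActivity.CPU],
    schedule=schedule(wait=1, warmup=1, active=2, repeat=2),
) as prof:
    for _ in range(8):
        train_step()
        prof.step()  # advances the profiler schedule
```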

What I expect

Is there any way to ignore or turn off these useless log lines? I tried setting the environment variable KINETO_LOG_LEVEL to 99, but it didn't work. Thank you all.

import os
os.environ.update({'KINETO_LOG_LEVEL' : '99'})

Version and platform

CentOS-7 Linux
torch 1.13.1+cu117
torch-tb-profiler 0.4.1
torchaudio 0.13.1+cu117
torchvision 0.14.1+cu117

Hi @SolenoidWGT, thank you for your report. PR #714 should suppress that log when you set the environment variable, KINETO_LOG_LEVEL=3. Another reason might be that you need a newer version of PyTorch + Kineto (one that includes the PR).
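For reference, the variable needs to be in the environment before Kineto initializes, so a sketch like the following sets it before the first `import torch`. (The value `3` follows the suggestion above; whether it fully silences the stage messages depends on running a build that includes the PR.)

```python
import os

# Set the level before torch (and with it libkineto) is first imported;
# the library reads KINETO_LOG_LEVEL once at initialization, so assigning
# it after `import torch` has no effect in the same process.
os.environ["KINETO_LOG_LEVEL"] = "3"

import torch  # noqa: E402  (deliberately placed after the assignment)
```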

Thanks for your quick reply, I'll update to the latest version.

Is Kineto included in the torch pip package? I'm on torch 2.0.1 and it doesn't work. Or is the PR just not included yet?
Thanks :)

@lukasbm Kineto should be included in pytorch as part of the profiler, but the changes from #714 probably aren't in 2.0.1.