jerryscript-project / iotjs

Platform for Internet of Things with JavaScript http://www.iotjs.net

Why is the GC time increasing?

jerry-weng opened this issue · comments

I added a patch to measure the execution time of the ecma_gc_run function, which is located in the deps/jerry/jerry-core/ecma/base/ecma-gc.c file, and print the time to stdout, so I can observe the time consumed by GC.

Then I run the following script:

!function next(){setTimeout(next, 10)}()

The log shows that the GC time keeps increasing:

jerry gc time consume: 0.054932
jerry gc time consume: 0.047852
jerry gc time consume: 0.352051
jerry gc time consume: 0.500977
jerry gc time consume: 0.703125
jerry gc time consume: 0.441162
jerry gc time consume: 0.473145
jerry gc time consume: 1.086914

...

jerry gc time consume: 1.934082
jerry gc time consume: 4.325928
jerry gc time consume: 2.452148
jerry gc time consume: 4.174072
jerry gc time consume: 5.083008
jerry gc time consume: 5.315918

...

jerry gc time consume: 6.258057
jerry gc time consume: 6.624023
jerry gc time consume: 6.314941
jerry gc time consume: 6.881836

Is this reasonable?

This has never happened for me when printing the GC time. Would you mind showing me your test code?

It is tested on macOS 10.13.6.

The JS script file test.js contains only one line: !function next(){setTimeout(next, 10)}().

This is my patch to jerryscript.

diff --git a/jerry-core/ecma/base/ecma-gc.c b/jerry-core/ecma/base/ecma-gc.c
index c1627567..cdef432a 100644
--- a/jerry-core/ecma/base/ecma-gc.c
+++ b/jerry-core/ecma/base/ecma-gc.c
@@ -30,6 +30,8 @@
 #include "vm-defines.h"
 #include "vm-stack.h"
 
+#include <sys/time.h>
+
 #ifndef CONFIG_DISABLE_ES2015_TYPEDARRAY_BUILTIN
 #include "ecma-typedarray-object.h"
 #endif /* !CONFIG_DISABLE_ES2015_TYPEDARRAY_BUILTIN */
@@ -818,6 +820,12 @@ ecma_gc_free_object (ecma_object_t *object_p) /**< object to free */
 void
 ecma_gc_run (jmem_free_unused_memory_severity_t severity) /**< gc severity */
 {
+
+  struct timeval tv;
+  double t1,t2;
+  gettimeofday (&tv, NULL);
+  t1 = ((double) tv.tv_sec) * 1000.0 + ((double) tv.tv_usec) / 1000.0;
+
   JERRY_CONTEXT (ecma_gc_new_objects) = 0;
 
   ecma_object_t *white_gray_objects_p = JERRY_CONTEXT (ecma_gc_objects_p);
@@ -964,6 +972,11 @@ ecma_gc_run (jmem_free_unused_memory_severity_t severity) /**< gc severity */
   /* Free RegExp bytecodes stored in cache */
   re_cache_gc_run ();
 #endif /* !CONFIG_DISABLE_REGEXP_BUILTIN */
+
+  gettimeofday (&tv, NULL);
+  t2 = ((double) tv.tv_sec) * 1000.0 + ((double) tv.tv_usec) / 1000.0;
+  printf("jerry gc time consume: %f\n", t2 - t1);
+
 } /* ecma_gc_run */
 
 /**

I build iotjs with the tools/build.py command and run test.js with ./build/x86_64-darwin/debug/bin/iotjs test.js.

I used your code, but the issue doesn't happen on my machine. Maybe you have a memory leak?

It seems I can reproduce this issue easily on an Ubuntu setup (with the latest iot.js):
just git clone the latest iot.js from GitHub, sync the submodules, and apply the patch provided by jerry-weng.

GC times keep increasing, from 0.086182 the first time to 110.652100 when I stop with Ctrl-C.

Here is my setup:

➜  iotjs git:(master) ✗ sudo lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 18.04.1 LTS
Release:	18.04
Codename:	bionic

Log output:

jerry gc time consume: 0.059082
jerry gc time consume: 0.049072
jerry gc time consume: 0.350098
jerry gc time consume: 0.420898
... ...
jerry gc time consume: 90.711914
jerry gc time consume: 106.624023
jerry gc time consume: 106.192139
jerry gc time consume: 121.405029
jerry gc time consume: 108.804199
jerry gc time consume: 111.093018
jerry gc time consume: 110.652100

I used your code, but the issue doesn't happen on my machine. Maybe you have a memory leak?

@fighterkin I don't think any memory leak is involved, since no significant code was added to iotjs or jerryscript. Can you describe your test results?

I have checked this issue, and it seems the number of root objects (marked externally) is increasing into the thousands. Probably something is not dereferenced after use => this is a strong leak.

I think the behavior is correct. Every time you pass next to setTimeout, a new next function instance is created, which refers to the previous next function instance. The result is that a long chain of functions is created.

Recommended change:

function next(){
  setTimeout(next, 10)
}
next();

Maybe this reveals a bit more insight:

function f() {
  return function() {}
}

Every time f is called a new function is returned, which uses the lexical environment of f.

function g() {}
function f() {
  return g
}

In this case the same function is returned (with the same lexical environment).

Note: lexical environments are special purpose JS objects.
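The difference between the two cases above can be checked directly with an identity comparison (a quick sketch, runnable in node.js; the two versions of f are renamed f1 and f2 here for clarity):

```javascript
function f1() {
  return function () {}; // a new closure is created on every call
}

function g() {}
function f2() {
  return g; // always returns the same function object
}

console.log(f1() === f1()); // false: two distinct function objects
console.log(f2() === f2()); // true: g is reused
```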

I changed the timeout from 10 to 1 and tested the same script in node.js with the command node --trace_gc test.js.

The log is:

[46370:0x104802a00]       38 ms: Scavenge 2.7 (4.2) -> 2.3 (5.2) MB, 1.1 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]      793 ms: Scavenge 3.1 (5.7) -> 2.8 (6.7) MB, 1.5 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]     6781 ms: Scavenge 4.2 (8.2) -> 3.3 (7.7) MB, 0.8 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    17325 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    27623 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    38098 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    48723 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    59358 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    69948 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    80729 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.4 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]    91400 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   102159 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   112967 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   123554 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   134247 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   144817 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   155533 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   166233 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   176730 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[46370:0x104802a00]   187497 ms: Scavenge 4.3 (7.7) -> 3.3 (7.7) MB, 0.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 

Referring to https://github.com/nodejs/node/blob/8aca934009225b30bff7d7927d2eb9f667dbff9f/deps/v8/src/heap/gc-tracer.cc, I think the value before the / in the log represents the duration of GC. It seems there is no GC issue.

Could this be a bug in JerryScript?

Yes, it looks like there is a difference:

var old_next = null;

!function next(){
  if (!old_next)
    old_next = next;
  else
    console.log(old_next == next);

  setTimeout(next, 1)
}()

This code prints true in node.js and false in iotjs. It looks like node.js reuses the function object instead of creating a new one.

I am new to the ECMA-262 spec. Can you tell me whether this node.js behavior is spec-compliant or an optimization?

See the third part of http://www.ecma-international.org/ecma-262/5.1/#sec-13.

It is not an optimization. JerryScript creates a new closure instead of reusing the original one.
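Concretely, per ES5.1 §13 the identifier of a named function expression is bound in its own environment record to the function object produced by evaluating the expression, so the name refers to the same object on every invocation. A small check (runnable in node.js):

```javascript
// The function expression is evaluated once; 'self' is bound to the
// resulting function object in a dedicated environment record.
var f = function self() {
  return self;
};

console.log(f() === f);   // true: 'self' refers to the enclosing function
console.log(f() === f()); // true: the same object on every call
```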

JerryScript creates a new closure instead of reusing the original one.

@zherczeg I think this might be a bug in JerryScript, which causes the following to behave incorrectly, too:

var emitter = new EventEmitter();
emitter.on('rm', function foo() {
  console.log('trigger');
  emitter.removeListener('rm', foo);
})
emitter.emit('rm'); // trigger
emitter.emit('rm'); // trigger

In removeListener(), the check against foo returns false because they are different objects; yodaos-project/ShadowNode@325e279 should address this issue :)

@yorkie Your patch also works for my test case. Would you mind pushing this patch to the main branch of jerryscript?