GifflarJS-Framework / gifflar-iotcocoa-case-study


Gifflar IoTCocoa case study and performance evaluation

This repo is a case study of using Gifflar in an IoT project called IoTCocoa, which aims to create a system for monitoring gourmet cocoa production. IoTCocoa uses sensors and actuators to capture data and later store it on the blockchain through smart contracts.

To simulate these sensors and actuators, this repository defines four types of device objects that could be used by the IoTCocoa project: a relay (Rele), a DHT11 sensor (temperature and humidity), a Servo Motor, and an Air Conditioner. As you can see in src/performance/sensors.ts, the device objects' JSON data grows in that order. This was intentional, so we can learn more about Gifflar's performance as the object JSON data increases.
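A minimal TypeScript sketch of what such device objects might look like is shown below. The Device interface and the attribute names are hypothetical, made up for illustration (the actual definitions live in src/performance/sensors.ts); the only point is that the JSON payload grows from the Rele to the Air Conditioner.

    // Hypothetical device shapes (not the repo's actual sensors.ts).
    interface Device {
      name: string;
      type: "actuator" | "sensor";
      attributes: Record<string, string | number | boolean>;
    }

    const rele: Device = {
      name: "Rele",
      type: "actuator",
      attributes: { state: false },
    };

    const dht11: Device = {
      name: "DHT11",
      type: "sensor",
      attributes: { temperature: 0, humidity: 0, unit: "celsius" },
    };

    const servoMotor: Device = {
      name: "ServoMotor",
      type: "actuator",
      attributes: { angle: 0, minAngle: 0, maxAngle: 180, speed: 1 },
    };

    const airConditioner: Device = {
      name: "AirConditioner",
      type: "actuator",
      attributes: {
        powered: false,
        targetTemperature: 22,
        mode: "cool",
        fanSpeed: 1,
        swing: false,
        timerMinutes: 0,
      },
    };

    // The serialized payloads grow in this order, which is the variable
    // the performance tests exercise.
    console.log(
      [rele, dht11, servoMotor, airConditioner].map((d) => JSON.stringify(d).length),
    );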


Dependencies

  • Install the dependencies:
yarn

or

npm i
  • Install Gifflar globally:
npm i -g @gifflar/core
  • Generate the Gifflar config file gifflarconfig.json:
gifflar init .

This command will ask whether you want to init a Gifflar config file inside an existing project; confirm by typing y in the terminal.

Run the application

Run the following command to execute the application and generate the sensor/actuator smart contracts with Gifflar.

yarn start

or

npm start

This command will generate the smart contracts inside the src/contracts folder. Every time you run the command, the contracts are overwritten.

Run all performance tests

yarn performance

This will generate a CSV file inside the src/out folder for each performance test. The CSV file will contain the mean, max, and min values across all the repetitions the code made. The default is 200 repetitions, that is, each performance test runs 200 times (you can change this later).
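As a rough illustration of how such a summary could be produced, here is a hedged TypeScript sketch that repeats one measurement and reduces the samples to mean, max, and min before writing a CSV row. The function names, CSV layout, and output path are assumptions for illustration, not the repo's actual code.

    import { writeFileSync } from "fs";

    // Reduce a list of per-repetition samples to summary statistics.
    function summarize(samples: number[]): { mean: number; max: number; min: number } {
      const mean = samples.reduce((acc, s) => acc + s, 0) / samples.length;
      return { mean, max: Math.max(...samples), min: Math.min(...samples) };
    }

    // Run one measurement `repetitions` times and write a single CSV row.
    function runPerformanceTest(
      label: string,              // hypothetical, e.g. "rele_modeling_time"
      measureOnce: () => number,  // returns one sample (seconds, KB, or CPU %)
      repetitions = 200,          // the default described above
    ): void {
      const samples = Array.from({ length: repetitions }, () => measureOnce());
      const { mean, max, min } = summarize(samples);
      writeFileSync(`src/out/${label}.csv`, `mean,max,min\n${mean},${max},${min}\n`);
    }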

Run individual performance test

yarn performance:it --sensor=rele --step=modeling --measure=time
  • measure: "time" | "memory" | "cpu"
  • sensor: "rele" | "dht11" | "servoMotor" | "airConditioner"
  • step: "modeling" | "writing"

This will generate a CSV file inside the src/out folder for this single performance test. The CSV file will contain the mean, max, and min values across all the repetitions the code made. The default is 200 repetitions, that is, the performance test runs 200 times (you can change this later).

Generating all the repetition results

You might want to get the result of each individual repetition, that is, generate all 200 output values. To do that, go to src/performance/measuring-functions and uncomment a single line in each measuring function.

measureTimeOf:

    [...]
    
    // *Uncomment this next line to get the data for each repetition
    // console.log(secs);

    [...]

measureMemoryOf:

    [...]
    
    // *Uncomment this next line to get the data for each repetition
    // console.log(memoryUsedInKb);

    [...]

measureCpuOf:

    [...]
    
    // *Uncomment this next line to get the data for each repetition
    // console.log(cpuUsagePercentage);

    [...]
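For reference, here is a hedged sketch of how the three measuring functions could be implemented with standard Node.js APIs (process.hrtime, process.memoryUsage, process.cpuUsage); the repo's actual measuring-functions may differ in detail. The commented console.log lines mirror the snippets above.

    // Wall-clock time of one call, in seconds.
    function measureTimeOf(fn: () => void): number {
      const start = process.hrtime.bigint();
      fn();
      const secs = Number(process.hrtime.bigint() - start) / 1e9;
      // console.log(secs); // per-repetition output
      return secs;
    }

    // Heap growth caused by one call, in kilobytes.
    function measureMemoryOf(fn: () => void): number {
      const before = process.memoryUsage().heapUsed;
      fn();
      const memoryUsedInKb = (process.memoryUsage().heapUsed - before) / 1024;
      // console.log(memoryUsedInKb); // per-repetition output
      return memoryUsedInKb;
    }

    // CPU time consumed by one call, as a percentage of wall-clock time.
    function measureCpuOf(fn: () => void): number {
      const startUsage = process.cpuUsage();
      const start = process.hrtime.bigint();
      fn();
      const elapsedMicros = Number(process.hrtime.bigint() - start) / 1e3;
      const { user, system } = process.cpuUsage(startUsage);
      const cpuUsagePercentage = ((user + system) / elapsedMicros) * 100;
      // console.log(cpuUsagePercentage); // per-repetition output
      return cpuUsagePercentage;
    }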

Then you can use the scripts in src/scripts/performance to execute each performance test. For example, to run Gifflar's modeling step using the rele actuator as test input and measuring execution time, run:

./src/scripts/performance/rele/time_modeling.sh

Before running that command, you might need to make the script executable:

chmod 777 ./src/scripts/performance/rele/time_modeling.sh

These scripts will generate .txt files inside the src/out folder, each corresponding to the individual performance test executed. They will look like the files inside the results/ folder.

Note: Remember to run these tests on an isolated machine. Close all unneeded apps, close the text editor, turn off WiFi and Bluetooth, and run the tests from the terminal.
