polkastats.io backend repository

Home Page: http://polkastats.io

PolkaStats Backend v3

New improved backend for https://polkastats.io!

Installation Instructions

git clone https://github.com/Colm3na/polkastats-backend-v3.git
cd polkastats-backend-v3
npm install

Usage Instructions

To launch all docker containers at once:

npm run docker

To run them separately:

npm run docker:<container-name>

List of current containers

  • substrate-node
  • postgres
  • graphql-engine
  • crawler
  • phragmen (temporarily disabled)

Updating containers

git pull
npm run docker:clean
npm run docker:build
npm run docker

Crawler

This crawler container listens for new blocks and fills the database. A number of processes run inside this container; some of them are triggered on a schedule that is configured in backend.config.js. The crawler can detect and fill gaps in the Postgres database by harvesting all the missing data, so it is safe and resilient against node outages or restarts.
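The gap-filling idea described above can be sketched as follows. This is a minimal illustration, not the crawler's actual code: the function name `findGaps` and the input shapes are hypothetical, assuming the crawler compares the block numbers already stored in Postgres against the chain height.

```javascript
// Hypothetical sketch of gap detection: given the block numbers already
// stored in the database and the current chain height, compute the missing
// ranges that still need to be harvested.
function findGaps(storedBlockNumbers, chainHeight) {
  const stored = new Set(storedBlockNumbers);
  const gaps = [];
  let start = null;
  for (let n = 0; n <= chainHeight; n++) {
    if (!stored.has(n)) {
      // Entering (or continuing) a gap.
      if (start === null) start = n;
    } else if (start !== null) {
      // Gap just ended at the previous block.
      gaps.push([start, n - 1]);
      start = null;
    }
  }
  // A gap may run all the way up to the chain height.
  if (start !== null) gaps.push([start, chainHeight]);
  return gaps;
}

console.log(findGaps([0, 1, 2, 5, 6, 9], 10)); // → [ [ 3, 4 ], [ 7, 8 ], [ 10, 10 ] ]
```

Each returned pair is an inclusive range of missing blocks, which a harvester process could then fetch and insert, making restarts idempotent.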

Phragmen

This container includes an offline-phragmen binary. It is a forked modification of Kianenigma's repository.

Hasura demo

The crawler needs to wait for your substrate-node container to sync before it starts collecting data. For instant testing, you can point it at an already-synced external RPC node by changing the WS_PROVIDER_URL environment variable in the docker-compose.yml file:

crawler:
  image: polkastats-backend:latest
  build:
    context: ../../
    dockerfile: ./docker/polkastats-backend/backend/Dockerfile
  depends_on:
    - "postgres"
    - "substrate-node"
  restart: on-failure
  environment:
    - NODE_ENV=production
    - WS_PROVIDER_URL=wss://kusama-rpc.polkadot.io # Change this line

Point WS_PROVIDER_URL at the external endpoint as shown above, then rebuild the containers:

npm run docker:clean
npm run docker
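Inside the crawler, picking up this setting could look like the sketch below. The variable name WS_PROVIDER_URL comes from docker-compose.yml; the fallback endpoint and helper name are illustrative assumptions, not the project's actual code.

```javascript
// Sketch: resolve the WebSocket endpoint from the environment, falling back
// to the local substrate-node container when the variable is unset.
// The default value here is an assumption for illustration.
const DEFAULT_WS_PROVIDER_URL = 'ws://substrate-node:9944';

function getWsProviderUrl(env = process.env) {
  return env.WS_PROVIDER_URL || DEFAULT_WS_PROVIDER_URL;
}

console.log(getWsProviderUrl({ WS_PROVIDER_URL: 'wss://kusama-rpc.polkadot.io' }));
// → wss://kusama-rpc.polkadot.io
```

With this pattern, switching between the local node and an external RPC is purely a docker-compose.yml change; no code edits are needed.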

Then browse to http://localhost:8082

Click on "Data" in the top menu

Then add all tables to the tracking process

From now on, Hasura will track all changes in the database.

To see it in action, start a new subscription or run an example query such as one of these:

Query examples (static)

  • Block query example:
query {
  block {
    block_hash
    block_author
    block_number
    block_author_name
    current_era
    current_index
    new_accounts
    session_length
    session_per_era
    session_progress
  }
}
  • Rewards query example:
query {
  rewards {
    era_index
    era_rewards
    stash_id
    timestamp
  }
}
  • Validator by number of nominators example:
query {
  validator_num_nominators {
    block_number
    nominators
    timestamp
  }
}
  • Account query example:
query {
  account {
    account_id
    balances
    identity
  }
}
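The queries above can also be run over HTTP instead of the Hasura console. The sketch below assumes Hasura's /v1/graphql endpoint on the port mentioned earlier and Node 18+ for the built-in fetch; adjust the path to match your Hasura version.

```javascript
// Sketch: POST one of the example queries to Hasura's GraphQL endpoint.
// The URL is an assumption based on the console address above.
const HASURA_URL = 'http://localhost:8082/v1/graphql';

function buildGraphQLBody(query) {
  // Hasura expects a JSON body with a "query" field.
  return JSON.stringify({ query });
}

async function runQuery(query) {
  const res = await fetch(HASURA_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildGraphQLBody(query),
  });
  return res.json();
}

// Example usage (requires the Hasura container to be running):
// runQuery('query { block { block_number block_hash } }')
//   .then((data) => console.log(data));
```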

Subscription examples (dynamic)

  • Block subscription example:
subscription {
  block {
    block_number
    block_hash
    current_era
    current_index
  }
}
  • Validator active subscription example:
subscription MySubscription {
  validator_active {
    account_id
    active
    block_number
    session_index
    timestamp
  }
}
  • Account subscription example:
subscription MySubscription {
  account {
    account_id
    balances
  }
}

License: Apache License 2.0