station

Comlink metadata for self-integrating applications.

Home page: https://superface.ai/catalog


Where use-cases are born. In this repository we build curated use-cases; they are an ideal starting point for writing your own.

Table of Contents

  • Background
  • Install
  • Usage
  • Security
  • Support
  • Adding new use-case
  • Debugging maps
  • Environment variables
  • Automated publishing
  • Contributing
  • License

Background

Superface (super-interface) is a higher-order API, an abstraction on top of modern APIs like GraphQL and REST. Superface is one interface to discover, connect, and query any use cases available via conventional APIs.

Through its focus on application-level semantics, Superface decouples the clients from servers, enabling fully autonomous evolution. As such, it minimizes the code base size as well as errors and downtimes while providing unmatched resiliency and redundancy.

Superface allows switching providers at runtime, in milliseconds, without additional development. Furthermore, Superface decentralizes composition and aggregation, and thus creates an Autonomous Integration Mesh.

The motivation behind Superface is nicely described in this video from the APIdays conference.

You can learn more at https://superface.ai and https://superface.ai/docs.

Install

Install dependencies:

yarn install

Usage

# Check that all files are correctly linked together
$ yarn check

# Run the linter on profiles and maps
$ yarn lint

# Run tests
$ yarn test

# Record new traffic with live API calls
$ yarn test:record grid/path/to/test.ts

Security

Superface is not a man-in-the-middle, so it does not require access to the secrets needed to communicate with a provider's API. The Superface CLI only prepares the super.json file with authorization fields in the form of environment variables. You set the correct variables and your application communicates directly with the provider's API.
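For illustration only (the provider name and environment variable below are made up), a security entry in super.json references a secret through an environment variable instead of storing it:

{
  "providers": {
    "sendgrid": {
      "security": [
        {
          "id": "bearer_token",
          "token": "$SENDGRID_TOKEN"
        }
      ]
    }
  }
}

OneSDK resolves $SENDGRID_TOKEN from the environment at runtime, so no secret ever lands in the repository.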

You can find more information in the OneSDK repository.

Support

If you need any additional support, have questions, or just want to talk, you can reach us through our support page.

Adding new use-case

If you are starting with authoring, check our guide.

The Station repository has a defined structure. The Superface CLI commands below create boilerplate code for profiles, providers, maps, and mock maps.
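For orientation, each use-case lives under grid/[scope]/[name]. A rough sketch of the layout (file names are illustrative and may differ slightly):

grid/
  [scope]/
    [name]/
      profile.supr            # Comlink profile describing the use-case
      maps/
        [provider].suma       # Comlink map for one provider
        [provider].test.ts    # tests for that map
        __snapshots__/        # Jest snapshots from recorded traffic
providers/
  [provider_name].json        # provider definitions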

Create new profile

yarn create:profile [scope](optional)/[name]

Create new provider

yarn create:provider [provider_name]
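This scaffolds a Comlink provider definition (typically providers/[provider_name].json). A minimal sketch, with an illustrative base URL and security scheme:

{
  "name": "provider_name",
  "defaultService": "default",
  "services": [
    {
      "id": "default",
      "baseUrl": "https://api.example.com"
    }
  ],
  "securitySchemes": [
    {
      "id": "api_key",
      "type": "apiKey",
      "in": "header",
      "name": "X-API-Key"
    }
  ]
}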

Create map for profile and provider

yarn create:map [scope](optional)/[name] [provider_name]

Create mock map

yarn create:mock-map [scope](optional)/[name]
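For example, scaffolding a hypothetical communication/send-email use-case with a sendgrid provider chains the commands above (the names are illustrative):

yarn create:profile communication/send-email
yarn create:provider sendgrid
yarn create:map communication/send-email sendgrid
yarn create:mock-map communication/send-email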

Test the map

We encourage using the Superface Testing library to write tests.

1. Generate test boilerplate code

For the mock provider, call:

yarn create:mock-map-test [scope](optional)/[name]

For a real provider, call:

yarn create:test [scope](optional)/[name] [provider_name]

The create:test command creates a test file alongside the map's .suma file. The test inputs and expected results are pre-generated from the profile examples.

The created code looks like this:

import { SuperfaceTest } from '@superfaceai/testing';

import { buildSuperfaceTest } from '../../../test-config';

describe(`scope/name/provider_name`, () => {
  let superface: SuperfaceTest;

  beforeEach(() => {
    superface = buildSuperfaceTest({
      profile: 'scope/name',
      provider: 'provider_name',
    });
  });

  describe('UseCase', () => {
    it('performs successfully', async () => {
      const result = await superface.run({
        useCase: 'UseCase',
        input: {
          field1: '',
          field2: '',
        },
      });

      expect(() => result.unwrap()).not.toThrow();
      expect(result).toMatchSnapshot();
    });
  });
});

All inputs should be written directly to the test file and shouldn't use environment variables.

2. Make a call against the live API to record traffic and create a snapshot

$ yarn test:record grid/[scope]/[name]/maps/[provider].test.ts

3. Check the result in the snapshot

A snapshot for the test run should be created at:

grid/[scope]/[name]/maps/__snapshots__/[provider].test.ts.snap

4. Post-process the traffic recording

We try to sanitize recordings and remove any sensitive data. However, you should still review the recording and make sure it doesn't contain credentials, personal information, or anything else that shouldn't be public.
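Assuming the recordings are stored as nock-style HTTP fixtures (the exact shape and location may differ), a sanitized entry might look roughly like this, with any credentials replaced by placeholders:

{
  "scope": "https://api.example.com:443",
  "method": "GET",
  "path": "/v1/resource",
  "status": 200,
  "response": { "id": "123" },
  "reqheaders": {
    "authorization": "Bearer <REDACTED>"
  }
}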

5. Run tests with recorded traffic

$ yarn test grid/[scope]/[name]/maps/[provider].test.ts

Debugging maps

You can set the OneSDK DEBUG environment variable to enable logging for debugging purposes:

DEBUG="superface:http*"

For example, when recording traffic:

$ DEBUG="superface:http*" yarn test:record grid/[scope]/[name]/maps/[provider].test.ts

Environment variables

Secrets used for authentication during tests are stored in .env and loaded using dotenv. Run cp .env.example .env to start from the template.
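A minimal .env might look like the line below; the variable name here is illustrative, the real ones are listed in .env.example:

SENDGRID_TOKEN=<your token>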

Automated publishing

Station uses a GitHub Actions workflow to automate publishing. For details, see the CI / CD workflow.

Contributing

Please open an issue first if you want to make larger changes.

Feel free to contribute! Please follow the Contribution Guide.

Licenses of node_modules are checked during CI/CD for every commit. Only the following licenses are allowed:

  • 0BSD
  • MIT
  • Apache-2.0
  • ISC
  • BSD-3-Clause
  • BSD-2-Clause
  • CC-BY-4.0
  • CC-BY-3.0;BSD
  • CC0-1.0
  • Unlicense
  • UNLICENSED

Note: If editing the README, please conform to the standard-readme specification.

License

Superface is licensed under the MIT license. © 2022 Superface
