
# DEPRECATED

This code pattern is no longer supported. You can find the newly supported Visual Recognition Code Pattern here.

# Visual Recognition Code Pattern 📷

The Visual Recognition service uses deep learning algorithms to analyze images for scenes, objects, text, and other subjects.


✨ Demo: https://visual-recognition-code-pattern.ng.bluemix.net/ ✨

## Flow

*(architecture diagram)*

  1. The user sends an image to the application (running locally or in IBM Cloud).
  2. The application sends the image to the IBM Watson Visual Recognition service (see the sketch of this call after the list).
  3. Watson Visual Recognition uses deep learning algorithms to analyze the image for scenes, objects, text, and other subjects. The service can be provisioned on IBM Cloud.
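
For reference, steps 2 and 3 boil down to a call to the service's `/v3/classify` endpoint. A minimal sketch with `curl` (the API key, service URL, and image path are placeholders; the Prerequisites below explain where the real values come from):

    # Classify a local image with Watson Visual Recognition v3
    curl -X POST -u "apikey:<your-apikey>" \
      --form "images_file=@./my-image.jpg" \
      "<your-service-url>/v3/classify?version=2018-03-19"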

## Prerequisites

### Public Cloud

  1. Sign up for an IBM Cloud account.
  2. Download the IBM Cloud CLI.
  3. Create an instance of the Visual Recognition service and get your credentials (or use the CLI sketch after this list):
    - Go to the Visual Recognition page in the IBM Cloud Catalog.
    - Log in to your IBM Cloud account.
    - Click Create.
    - Click Show to view the service credentials.
    - Copy the `apikey` value.
    - Copy the `url` value.
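
If you prefer the command line, the same instance and credentials can be created with the IBM Cloud CLI. A sketch under a few assumptions: the instance and key names below are examples, and `watson-vision-combined` / `lite` are taken to be the service's catalog name and free plan, so adjust them to match your account and region:

    ibmcloud login
    ibmcloud target -g default -r us-south
    # Create a Visual Recognition instance and a credentials key for it
    ibmcloud resource service-instance-create my-visual-recognition watson-vision-combined lite us-south
    ibmcloud resource service-key-create my-visual-recognition-creds Manager --instance-name my-visual-recognition
    # Print the credentials (apikey and url) for the new key
    ibmcloud resource service-key my-visual-recognition-creds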

## Configuring the application

Depending on where your service instance is provisioned, there are different ways to download the credentials file.

Need more information? See the authentication wiki.

### Automatically

Copy the credential file to the application folder.

#### Public Cloud

*(animation: downloading the credentials file and copying it into the application folder)*
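
A sketch of that copy step, assuming the file downloaded from the service's Manage page is named `ibm-credentials.env` and landed in your Downloads folder:

    cp ~/Downloads/ibm-credentials.env .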

### Manually

  1. In the application folder, copy the `.env.example` file and create a file called `.env`.

    cp .env.example .env
    
  2. Open the .env file and add the service credentials depending on your environment.

    Example `.env` file that configures the `apikey` and `url` for a Watson Visual Recognition service instance hosted in the US East region:

    WATSON_VISION_COMBINED_APIKEY=X4rbi8vwZmKpXfowaS3GAsA7vdy17Qh7km5D6EzKLHL2
    WATSON_VISION_COMBINED_URL=https://gateway-wdc.watsonplatform.net/visual-recognition/api
    

## Running locally

  1. Install the dependencies

    npm install
    
  2. Build the application

    npm run build
    
  3. Run the application

    npm run dev
    
  4. View the application in a browser at localhost:3000
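
The repository also ships a Dockerfile, so a container-based run is an alternative. A sketch, assuming the image serves the app on port 3000 (the port used above) and reads its credentials from the `.env` file:

    docker build -t visual-recognition-code-pattern .
    docker run --rm -p 3000:3000 --env-file .env visual-recognition-code-pattern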

## Deploying to IBM Cloud as a Cloud Foundry Application

Click on the button below to deploy this demo to the IBM Cloud.

Deploy to IBM Cloud

### Manually

  1. Build the application

    npm run build
    
  2. Log in to IBM Cloud with the IBM Cloud CLI

    ibmcloud login
    
  3. Target a Cloud Foundry organization and space.

    ibmcloud target --cf
    
  4. Edit the manifest.yml file. Change the name field to something unique, for example `- name: my-app-name` (an illustrative manifest excerpt follows these steps).

  5. Deploy the application

    ibmcloud app push
    
  6. View the application online at the app URL, for example: https://my-app-name.mybluemix.net
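
For reference, the `name` change in step 4 lands in a manifest along these lines (an illustrative sketch, not the repository's exact file):

    applications:
      - name: my-app-name   # change this to something unique
        memory: 256M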

## Tests

### Unit tests

Run unit tests with:

    npm run test:components

See the output for more info.

### Integration tests

First, make sure the code is built:

    npm run build

Then run the integration tests with:

    npm run test:integration

## Directory structure

    .
    ├── app.js                      // express routes
    ├── config                      // express configuration
    │   ├── error-handler.js
    │   ├── express.js
    │   └── security.js
    ├── package.json
    ├── public                      // static resources
    ├── server.js                   // entry point
    ├── test                        // integration tests
    └── src                         // react client
        ├── __test__                // unit tests
        └── index.js                // app entry point

## License

This sample code is licensed under the MIT License.

## Open Source @ IBM

Find more open source projects on the IBM Github Page.
