This repo contains a microservice exposing an API that sends text to IBM Watson Natural Language Understanding (NLU) for analysis and returns the identified location, concepts, and entities. The service can then be integrated with a Watson Assistant virtual assistant through a custom extension. The repo also includes the OpenAPI document required to create a custom extension in Watson Assistant.
- `app.js`: Routes NLU requests for the `/nlu` path over to `routes/nlu_handler.js`
- `nlu_handler.js`: Takes text from the `text` query parameter and sends it to NLU for text analysis. Returns an object containing the identified location, an array of entities, and an array of concepts.
- `openapi.json`: OpenAPI document that is required for creating a custom extension in Watson Assistant
- `Dockerfile`: The file used to build the container image
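To illustrate the response shape described above, here is a sketch of how the handler might reduce an NLU `analyze()` result to the `{location, entities, concepts}` object. This is not the actual code from `nlu_handler.js`; the `summarize` helper name and the rule of treating the first `Location`-typed entity as the location are assumptions.

```javascript
// Illustrative sketch only: the summarize() helper and its location rule
// are assumptions, not the repo's actual nlu_handler.js implementation.
function summarize(nluResult) {
  const entities = nluResult.entities || [];
  // Treat the first entity typed "Location" as the identified location.
  const locationEntity = entities.find((e) => e.type === 'Location');
  return {
    location: locationEntity ? locationEntity.text : null,
    entities: entities
      .filter((e) => e.type !== 'Location')
      .map((e) => e.text),
    concepts: (nluResult.concepts || []).map((c) => c.text),
  };
}

// Sample input in the shape NLU's entities/concepts features return:
const sample = {
  entities: [
    { type: 'Location', text: 'Austin' },
    { type: 'Organization', text: 'IBM' },
    { type: 'Organization', text: 'Salesforce' },
  ],
  concepts: [{ text: 'DevOps' }, { text: 'Data science' }],
};
console.log(summarize(sample)); // location plus entity and concept arrays
```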
- Node.js (I used Node 18)
- An IBM Cloud Account
- Docker or Podman (If you want to run as a container)
- Build the application with:

  ```shell
  npm install
  ```

- Create an environment variable for the NLU apikey:

  ```shell
  export nlu_apikey=xxxxxxxxxxxxxxxx
  ```

- Create an environment variable for the NLU service endpoint:

  ```shell
  export nlu_url=xxxxxxxxxxxxxxxxxxxx
  ```

- Run the application:

  ```shell
  npm start
  ```
There is no frontend for this service. You can access the application by sending a GET request with a REST client or by using your browser.
- Come up with some text that you want NLU to analyze. For example:

  > Austin has a lot of tech communities involving data science and DevOps with members from companies such as IBM and Salesforce.
- Reach your locally running application at `localhost:3000/nlu`, adding the `text` query parameter with the text you want to analyze, as seen below:
  - If you use a browser, you can enter the following directly into your address bar:

    ```
    localhost:3000/nlu?text=Austin has a lot of tech communities involving data science and DevOps with members from companies such as IBM and Salesforce
    ```

  - If you are using curl:

    ```shell
    curl "localhost:3000/nlu?text=Austin has a lot of tech communities involving data science and DevOps with members from companies such as IBM and Salesforce"
    ```
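Browsers and curl tolerate the literal spaces shown above, but a client built in code should URL-encode the text first. A minimal Node.js sketch (the port and path match the steps above):

```javascript
// Build a properly encoded request URL for the /nlu endpoint.
const text =
  'Austin has a lot of tech communities involving data science and DevOps ' +
  'with members from companies such as IBM and Salesforce';
const url = `http://localhost:3000/nlu?text=${encodeURIComponent(text)}`;
console.log(url); // spaces become %20, safe for any HTTP client
```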
- View the results:

  ```json
  {"location":"Austin","concepts":["DevOps","Data science","Science"],"entities":["IBM","Salesforce"]}
  ```
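A caller, such as code sitting behind the Watson Assistant extension, can parse that response body and pick out the fields. A minimal sketch using the example output above:

```javascript
// Parse the service's JSON response body shown above.
const body =
  '{"location":"Austin","concepts":["DevOps","Data science","Science"],"entities":["IBM","Salesforce"]}';
const { location, concepts, entities } = JSON.parse(body);
console.log(location);        // Austin
console.log(entities.length); // 2
console.log(concepts[0]);     // DevOps
```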
If you want to containerize the application, you can use Podman or Docker to build the container image from the included Dockerfile. You will need to replace all values in `<>` with their respective values.
- To build:

  ```shell
  docker build -t <image registry>/<organization>/<image name>:<your image tag here> .
  ```
- To push to your registry:

  ```shell
  docker push <image registry>/<organization>/<image name>:<your image tag here>
  ```
- If you want to run the container locally, you can do the following, replacing the `nlu_apikey` and `nlu_url` values with the credentials from your own instance:

  ```shell
  docker run -p 3000:3000 -e nlu_apikey=xxxxxxxx -e nlu_url=xxxxxxxxxx <image registry>/<organization>/<image name>:<your image tag here>
  ```
There are many ways to deploy the application, whether you use a PaaS, a container platform like OpenShift or Kubernetes, or any other container hosting solution.
For my demo, I used IBM Code Engine, which lets you run a container workload without managing the container orchestration platform underneath; it was the easiest option for me to implement.
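For reference, deploying the pushed image to Code Engine can be sketched with the IBM Cloud CLI. This is a sketch under assumptions, not a definitive recipe: the application name `nlu-service` is made up, and it assumes the Code Engine CLI plugin is installed and a project is already selected.

```shell
# Sketch only: assumes the IBM Cloud CLI with the Code Engine plugin
# and an already-selected Code Engine project; nlu-service is a made-up name.
ibmcloud ce application create \
  --name nlu-service \
  --image <image registry>/<organization>/<image name>:<your image tag here> \
  --port 3000 \
  --env nlu_apikey=xxxxxxxx \
  --env nlu_url=xxxxxxxxxx
```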