node-red-contrib-model-asset-exchange

Node-RED node for Model Asset eXchange

Node-RED nodes for deep learning microservices from the Model Asset eXchange, providing support for common audio, image, video, and text processing tasks.
Sample Node-RED Flow for MAX Object Detector

Getting started

To get started follow the comprehensive tutorial or complete the quick start steps listed below.

Setup

Docker installation

If you have Docker installed, you can use the project's Docker image to try out the examples.

Native installation

  1. Install Node-RED.
> Before you can install Node-RED, you'll need a working install of Node.js. We recommend Node.js LTS 8.x or 10.x, as Node-RED no longer supports Node.js 6.x or earlier.
  2. Run the following commands in your Node-RED user directory (typically ~/.node-red) to install the node-red-contrib-model-asset-exchange module:
$ cd ~/.node-red
$ npm install node-red-contrib-model-asset-exchange
> You can also install the module from the Node-RED editor. Open the menu, choose Manage palette > Install, and enter model-asset as the search term.
  3. Launch Node-RED and open the displayed URL in a web browser to access the flow editor:
$ node-red
  ...
  ... - [info] Server now running at http://127.0.0.1:1880/
  4. The nodes are displayed in the palette under the Model-Asset-eXchange category.

Explore the sample flows

The node-red-contrib-model-asset-exchange module includes a couple of example flows to get you started. To import the flows into the workspace:
  1. In the Node-RED editor, open the menu and choose Import > Examples > model asset exchange.
  2. Select one of the sub-directories: the basic flows in getting started, more complex examples in beyond the basics, or flows designed to run on the Raspberry Pi.
  3. Choose a flow.
![import sample flows](/docs/images/import_sample_flows.png) 
Note: The flows use nodes from the node-red-contrib-browser-utils and node-red-contrib-image-output modules. See the flow description for details on which nodes are used in a particular example.

You can deploy and run these flows as is. The deep learning nodes in these flows have been pre-configured (service: cloud) to connect to hosted evaluation instances of the deep learning microservices.

Use the nodes in your own flows

Microservice evaluation instances are not suitable for production use. We recommend running microservice instance(s) on your local machine or in the cloud, e.g. using the IBM Cloud Kubernetes Service, Azure Kubernetes Service, or Google Kubernetes Engine:
  1. Deploy the deep learning microservice in the desired environment.
  2. Take note of its URL (e.g. http://localhost:5000).
  3. Add the corresponding deep learning node to your canvas.
  4. Open the node properties.
  5. Add a service entry for the URL and assign it a unique name.
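Once a node is pointed at a service, the microservice reply arrives in msg.payload and can be post-processed in a downstream function node. The sketch below keeps only high-confidence detections; the property names (predictions, probability, label) follow the MAX Object Detector response shape and are an assumption for other models:

```javascript
// Sketch of a downstream Node-RED function node that keeps only
// high-confidence detections. Assumes the microservice reply in
// msg.payload follows the MAX Object Detector response shape:
//   { status: "ok", predictions: [{ label, probability, ... }] }
// Other models return different shapes; adjust the property names.
function filterPredictions(payload, threshold) {
  const predictions = (payload && payload.predictions) || [];
  return predictions.filter((p) => p.probability >= threshold);
}

// In the Node-RED function node editor this would read:
//   msg.payload = filterPredictions(msg.payload, 0.7);
//   return msg;
```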

configure microservice connectivity

Supported application domains

This Node-RED node module supports the following application domains:
  - Generate captions that describe the contents of images.
  - Localize and identify multiple objects in a single image.
  - Detect humans in an image and estimate the pose for each person.
  - Identify sounds in short audio clips.
  - Identify objects in images using a third-generation deep residual network.
  - Identify objects in an image, additionally assigning each pixel of the image to a particular object.
  - Classify images according to the place/location labels in the Places365 data set.
Note: The file inject node in node-red-contrib-browser-utils is useful for testing these nodes.

License

Apache-2.0