1. Access Server

  1. Access a server for the Apache Nifi installation

    1. Linux (Ubuntu, CentOS, etc.) or Windows Server

      1. 4 CPU Cores

      2. 8 GB Memory

      3. At least 10 GB of hard disk storage

    2. For Windows, Active Directory (AD) services may sometimes need to be added to the server (if not already enabled)

  2. You may need to ask for VPN and RDP access to the server to complete the installation

2. Firewall Rules and IP Whitelisting

Important: This step must be completed before connecting your on-premises API to the Lingk app.

Ports to open in the server's firewall:

:443, :3001

Other firewall ports may need to be opened depending on your configuration. Each port represents a different base API endpoint in the default configuration.
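On a Linux server, opening these ports might look like the following sketch (commands vary by distribution and firewall tool, and require root):

```shell
# Ubuntu (ufw) - allow the default Lingk API Plugin ports
sudo ufw allow 443/tcp
sudo ufw allow 3001/tcp

# CentOS/RHEL (firewalld)
sudo firewall-cmd --permanent --add-port=443/tcp
sudo firewall-cmd --permanent --add-port=3001/tcp
sudo firewall-cmd --reload
```

On Windows Server, the equivalent inbound rules can be added through Windows Defender Firewall with Advanced Security.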

The AWS Lingk Cloud IP addresses for whitelisting:


These IP addresses enable Lingk batch recipes to communicate with your instance.

3. Install Apache Nifi (est. time 10 minutes)

  1. Install Nifi on Windows or Linux (Hortonworks DataFlow - Linux, Windows)

    The Data Flow installation page has instructions and download links for Linux and Windows (MSI) installations.

    1. Supported version: Apache Nifi v1.11.0 (which is Hortonworks DataFlow v3.5.1)

    2. On Windows

      1. Install java into a C:/java directory

      2. Install Nifi into a C:/nifi directory

      3. Set up Nifi as a Windows Service

        1. http://nssm.cc/
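With NSSM, registering Nifi as a Windows service might look like this sketch (the service name and install paths are examples; adjust them to your install):

```shell
rem Register the Nifi start script as a Windows service named "nifi"
nssm install nifi "C:\nifi\nifi-1.11.0\bin\run-nifi.bat"
rem Set the working directory for the service
nssm set nifi AppDirectory "C:\nifi\nifi-1.11.0\bin"
rem Start the service
nssm start nifi
```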

You may need to install Active Directory PowerShell scripts on the server to complete the installation.

Open your browser to confirm Nifi is running (typically http://localhost:8080/nifi)
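You can also check from the command line (this assumes the default unsecured HTTP port 8080):

```shell
# Expect HTTP 200 once Nifi has finished starting up
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/nifi/
```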

4. Add Lingk API Plugin  (est. time 15 minutes)


Download Lingk API Plugin for Apache Nifi (v1.1 beta)

Download the Lingk API Plugin Nifi Flow template - v1.1.5 (xml file)

v1.1.5 Changes (2019 Oct 7):

  • Added complete Handlebars support to API templates for stronger JSON path and iterator support.
  • Added Handlebars helper function for replacing blank strings with null values (nullValue()) for SQL scripts.
  • Improved support for JSON encoded values.
  • Improved support for errors coming from JAR file response payloads.

v1.1.4 Changes (2019 Aug 2):

  • Fixed bug related to Java running out of heap space after a template processing error.
  • Changed to mustache templates from Schnauzer templates to support JSON keys with special characters.
  • Added support for HTTP POST in JDBC flow with a response value.
  • Updated all responses to be JSON formatted.

v1.1.3 Changes (2019 Jun 4):

  • Changed to Schnauzer templates from micromustache templates to support iterators in SQL and JSON API templates.

v1.1.2 Changes (2019 Apr 22):

  • Bug fixes for JDBC Oracle and MS SQL Server usage


  1. Drag the Lingk API Plugin extras folder to the Nifi Root directory (C:/nifi/nifi_xxxx/) 

  2. Drag the Lingk API Plugin custom processors (under "/extras/processors") into the Nifi lib directory (C:/nifi/nifi_xxxx/lib) 

  3. Start Nifi up via command line (bin/nifi-start.bat for Windows or bin/nifi.sh start for Linux)

  4. Open your browser to http://localhost:8080/nifi

    Starting Nifi can take several minutes. You can reload your browser and check the logs until start-up completes.
  5. Upload the API Plugin process flow template from the extras folder using the template upload icon in the "Operate" pane.

    Download the latest template from the download link above.
    Note: You can delete templates under “Templates” in the main right hamburger menu.
  6. From the top icons, drag the Template icon onto the screen.

  7. Select the uploaded template.

    Your Nifi instance will look similar to this screenshot after dropping the Lingk API Plugin template.

  8. Configure your Process Group variables by right-clicking on the Process Group

    Configuration key: authType

    • jwt (default)
    • key

    To connect to APIs from the Lingk Adapter connector, use an authType of “jwt”.

    To connect to APIs from the HTTP Connector use the authType of “key”. 

    In both cases, you need to set the Client ID and Secret inside the process group.


    • default (default)

    Create a process group per environment (dev, prod, etc.). This value determines where queries for the environment are saved in the “extras” folder.


    Empty by default.

    Connection string for Oracle SQLcl or Microsoft SQLcmd connections

    When writing data to a database using a full SQL script beyond the capabilities of a JDBC request, configure connection strings for SQLcl or SQLcmd.
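Invocations might look like the following sketch (host names, ports, credentials, and script names are placeholders):

```shell
# Oracle SQLcl - run a SQL script against a pluggable database
sql myuser/mypassword@//dbhost:1521/MYPDB @write_script.sql

# Microsoft sqlcmd - run a SQL script against a SQL Server instance
sqlcmd -S dbhost,1433 -U myuser -P mypassword -i write_script.sql
```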

    • ./template-cache (default)


    • postgres (default)
    • mssql
    • oracle

    Change based on your database type. This configuration dictates how data is returned in pages from a JDBC SELECT statement.


    Empty by default.

    The name of the nifiConnectJar file for connecting to Unidata Colleague.

  9. Click on the gear icon in the "Operate" pane (you may need to click the background on the process flow to ensure focus on the processor group).

  10. Configure SSL Context Service

    1. Create a JKS keystore (learn how to make a self-signed keystore) or, for testing purposes, use the pre-built keystore in the "extras" folder (selfsigned.jks)

      1. Values for both the "Keystore password"  and "Key password" properties: Welcome@123

    2. Set custom properties on the SSL Context Service if you are not doing a test installation.

    3. Click the "lightning bolt" icon to enable the controller service on all the HTTP processors.

  11. Configure the database controllers (DBCPConnectionPool) with the JDBC connection string and credentials for your database(s)

    1. Click the "lightning bolt" icon to enable the controller service on all the Database processors.

  12. Turn all processors on at the process group level by clicking the "play" button in the "Operate" pane.

    1. You should see no processors with Yellow or Red indicators. If you do, there is a configuration problem.

  13. Go to https://localhost:9001/dashboard in your browser to view the "API Dashboard" to configure APIs.

If you have problems getting the processors or Nifi itself to start, check the Nifi logs under "nifi_xxx/logs" for detailed information.

5. Configure database queries for APIs (depends on use case)

Lingk will automatically apply paging to your database queries to support high-performance reads of large datasets. Each Lingk API Plugin template (for Oracle, MS SQL, Postgres, etc.) automatically adds paging support.

To build queries that work with paging, you MUST add a sort order that does not change from one request to the next, to ensure consistent ordering from page to page.

Use Case: Read Data - GET
  • JDBC SELECT Statement
  • SQL Script as File passed to Java executable
  • Raw data passed to Java executable
  • Flat file (CSV) stored on server

Use Case: Write Data - PUT, POST
  • JDBC INSERT or UPDATE Statement
  • SQL Script as File passed to Java executable
  • Raw data passed to Java executable
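For example, a Postgres SELECT shaped for stable paging might look like the sketch below. The table and column names are illustrative, and the LIMIT/OFFSET values shown here are normally supplied by Lingk's paging:

```sql
-- Ordering by a unique column keeps rows in the same order on every page
SELECT student_id, first_name, last_name
FROM students
ORDER BY student_id
LIMIT 500 OFFSET 1000;
```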

6. Connect Apache Nifi APIs to Lingk App (est. time 2 minutes)

Important: You must have ports :443 and :3001 (or others) accessible through your firewall before this step in order to connect to Lingk cloud services.

  1. Copy the public URL of the Apache Nifi APIs
    Example: https://mydomain.edu:3001 or https://[ipaddress]:3001 (testing only).
  2. Go into an Environment in the Lingk App.
  3. Add the Lingk Adapter (i.e. REST Plugin for Apache Nifi) connector.
  4. Paste URL value into Environment for Lingk Adapter URL field.
  5. Click "Create credentials" for the Lingk Adapter.
  6. In Nifi, stop the Client Secret processor after clicking into the main process group.
  7. Double click into the Client Secret processor and go to properties.
  8. Add a new property.
    1. Paste in the Client Key
    2. Paste in the Client Secret
  9. Start the Client Secret processor.
  10. In the Lingk App, click "Test Connection".

You will see a green indicator in the Lingk app if you are successful.
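If the connection test fails, you can sanity-check that the adapter is reachable from outside your network (the hostname is an example; -k skips certificate verification for self-signed keystores):

```shell
# Expect an HTTP status code rather than a connection timeout
curl -sk -o /dev/null -w "%{http_code}\n" https://mydomain.edu:3001/
```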

7. Configure Recipe for Integration Between Systems (depends on use case)

Choose a Lingk Adapter (i.e. Apache Nifi) reader recipe from the template library and customize the properties for your configuration.

Click Run to see data from your database pulled into the recipe.