Enhancements

Event-Triggered Data Processing 
Bring your data processing to the next level with event-triggered wizards and recipes. The new Events tab in a workspace lets you subscribe to system and custom events with wizards, recipes, and webhooks. Event sources include the Lingk REST API, Lingk DX platform events, and BrickFTP file uploads.

Easy Webhooks 
Use Lingk to add events to any of your processes with our REST Event API. Create custom objects and start sending events to subscriptions in a matter of minutes.
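As a rough sketch, sending a custom event boils down to an authenticated POST with a JSON body. The endpoint URL, payload shape, and bearer-token auth below are placeholders, not the actual Lingk API contract; consult the REST Event API documentation for the real details.

```python
import json
import urllib.request

def build_event_request(api_key, event_type, payload):
    """Build (but do not send) an HTTP request for a custom event.

    The URL, field names, and auth header are illustrative assumptions.
    """
    body = json.dumps({"eventType": event_type, "data": payload}).encode("utf-8")
    return urllib.request.Request(
        url="https://api.example.com/v1/events",  # placeholder endpoint
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
    )

req = build_event_request("MY_KEY", "student.enrolled", {"studentId": "1234"})
# Once the request is built, sending it is a one-liner:
# urllib.request.urlopen(req)
```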

PowerCampus by Ellucian provider 
Connect to PowerCampus by Ellucian through the Lingk Adapter to power wizard-based integration experiences.

Ellucian Ethos support in Environments
With a valid Ethos token, you can test the connection in Environments and centrally manage your Ethos credentials.

Enhanced DataDiff Capability for Recipes
Now any provider can have a data diff applied. In Spark SQL queries, you can easily identify which records were added, updated, or deleted relative to a prior data set, based on specified keys. Diff data can be stored in AWS S3 or Google Cloud Storage buckets.
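The idea behind a key-based data diff can be sketched in plain Python: compare the current record set against the prior one by key, and classify each record as added, updated, or deleted. This is an illustration of the concept only; the function name and record shape are invented, and Lingk's DataDiff is exposed through Spark SQL rather than this code.

```python
def diff_by_key(prior, current, key):
    """Classify records as added/updated/deleted by comparing on a key field."""
    prior_by_key = {row[key]: row for row in prior}
    current_by_key = {row[key]: row for row in current}
    # Added: key appears only in the current set.
    added = [r for k, r in current_by_key.items() if k not in prior_by_key]
    # Deleted: key appears only in the prior set.
    deleted = [r for k, r in prior_by_key.items() if k not in current_by_key]
    # Updated: key appears in both, but the record's contents changed.
    updated = [r for k, r in current_by_key.items()
               if k in prior_by_key and r != prior_by_key[k]]
    return added, updated, deleted

prior = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]
current = [{"id": 2, "name": "Grace H."}, {"id": 3, "name": "Alan"}]
added, updated, deleted = diff_by_key(prior, current, "id")
# added   -> [{"id": 3, "name": "Alan"}]
# updated -> [{"id": 2, "name": "Grace H."}]
# deleted -> [{"id": 1, "name": "Ada"}]
```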

Salesforce Provider: Allow System Resources to be included in metadata
A new toggle button in a Wizard enables these additional fields to be included in mapping.

Other enhancements

  • Non-admin users of an org can now see who the org's admin users are under Org Settings
  • Display Modified By for Wizards, Recipes, and Environments
  • Recipe text editor enables different color themes to support developer preferences
  • Permanently store how many records were created/updated/deleted per provider and recipe (GDPR)

Bug Fixes

  • Non-admin users of an org no longer see the org listed in Workspace list when creating a new workspace
  • Run button in recipe editor no longer intermittently loses state when closing the execution plan
  • In the execution log, the tabular view now shows the same values for numeric data as the JSON view
  • Flat File to Salesforce: spaces in column names are now respected in joins
  • CanvasWriter no longer returns 'empty collection' error when there are no records to be written
  • Learning Stream reader no longer returns an error when the record set contains 0 records