When building recipes, it is valuable to understand the stages of recipe execution when configuring connectors and debugging errors.

The steps of recipe processing are:

  1. Start Job
    The Spark job takes a few seconds to start up and could be queued depending on the number of jobs being processed. All jobs are processed asynchronously.

    Common issues:
    Jobs take a long time to start. When this occurs, reach out to Lingk customer support.
  2. Parse YAML
    The Recipe Editor provides inline general YAML validation to surface errors before a job starts. After starting a job, the server processes the YAML and ensures that all the connectors and types are valid per the Lingk domain model.

    Common issues:
    YAML errors. If this happens, you can paste your recipe into another YAML parser, such as http://yaml-online-parser.appspot.com, to identify any general YAML issues.

    Also, ensure that all top-level structures are in order and that statements are not intermingled with connectors or schemas.
  3. Read Data
    1. Expression language processing (Jinja)
    2. Read credentials
    3. Execute providers 
    4. Execute post-processing (Data Diff)
  4. Execute Statements (Lingk Query Language [LQL] & Spark SQL)
    1. SELECT statements are pure Spark SQL
    2. INSERT, UPDATE, DELETE, and other statements are LQL
  5. Complete
    Once all statements have executed, the recipe is marked complete.

    Common issues:
    The recipe is not completing. This usually means a specific connector is not responding and the recipe is waiting for it to finish (e.g. a Salesforce bulk job). In other cases, there may be performance issues when reading or writing data based on recipe logic.
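The top-level layout requirement in step 2 can be illustrated with a minimal sketch. The key names below are illustrative, not necessarily Lingk's exact domain model; the point is that connectors, schemas, and statements each live in their own top-level block rather than being intermingled:

```yaml
# Illustrative layout only; actual key names may differ from Lingk's schema.
connectors:
  - name: sourceDb        # hypothetical connector name
    type: postgres
schemas:
  - name: Contact
statements:
  - SELECT id, email FROM Contact
```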
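The split in step 4 between Spark SQL and LQL can be sketched as follows. Table and connector names are hypothetical, and the write statement is a generic SQL-style approximation rather than verified LQL syntax:

```sql
-- SELECT statements are pure Spark SQL, run against data read in step 3
SELECT Id, Email FROM Contact WHERE Email IS NOT NULL

-- INSERT/UPDATE/DELETE statements are LQL (syntax approximated here)
INSERT INTO targetConnector SELECT Id, Email FROM Contact
```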
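The read pipeline in step 3 can be sketched as a sequence of transformations. This is a simplified illustration of the ordering only, not Lingk's implementation: the Jinja step is mimicked with a plain regex substitution, and the credential, provider, and Data Diff steps are stubs.

```python
import re

def render_expressions(config: dict, variables: dict) -> dict:
    """Step 1: resolve {{ var }} expressions (Jinja-style) in config values."""
    def render(value):
        if isinstance(value, str):
            return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                          lambda m: str(variables[m.group(1)]), value)
        return value
    return {k: render(v) for k, v in config.items()}

def read_credentials(config: dict) -> dict:
    """Step 2: attach stored credentials (stubbed here)."""
    return {**config, "token": "<secret>"}

def execute_provider(config: dict) -> list:
    """Step 3: call the data provider (stubbed with fixed rows)."""
    return [{"id": 1}, {"id": 2}]

def data_diff(rows: list, previous: list) -> list:
    """Step 4: post-process, keeping only rows not seen before."""
    return [r for r in rows if r not in previous]

config = render_expressions({"object": "{{ object_name }}"},
                            {"object_name": "Contact"})
config = read_credentials(config)
rows = data_diff(execute_provider(config), previous=[{"id": 1}])
print(config["object"], rows)  # → Contact [{'id': 2}]
```

The key point is the ordering: expressions are rendered before credentials are read, and the provider runs before any post-processing such as Data Diff.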
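When diagnosing a recipe stuck in step 5, it helps to picture the general pattern for waiting on a long-running connector operation such as a Salesforce bulk job: poll the job status until it reaches a terminal state or a timeout expires. The sketch below is a generic poll-with-timeout pattern, not Lingk's code; `check_job_status` is a hypothetical stand-in for a connector status call.

```python
import time

def wait_for_job(check_job_status, timeout_seconds=600, interval=5):
    """Poll a long-running job until it finishes or the timeout expires.

    check_job_status: hypothetical callable returning one of
    "Queued", "InProgress", "Completed", or "Failed".
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = check_job_status()
        if status in ("Completed", "Failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("job did not finish within the timeout")

# Usage with a stub that completes on the third poll
statuses = iter(["Queued", "InProgress", "Completed"])
print(wait_for_job(lambda: next(statuses), timeout_seconds=30, interval=0))
# → Completed
```

A recipe that never completes corresponds to the case where the status never reaches a terminal state, which is why the symptom is a job that appears to hang rather than fail.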