The Recipe Execution Lifecycle
When building recipes, it is valuable to understand the stages of recipe execution when configuring connectors and debugging errors.
The steps of recipe processing are:
- Start Job
The Spark job takes a few seconds to start up and may be queued, depending on the number of jobs being processed. All jobs are processed asynchronously.
If a job takes an unusually long time to start, reach out to Lingk customer support.
- Parse YAML
The Recipe Editor provides inline YAML validation to surface general errors before a job starts. After a job starts, the server processes the YAML and ensures that all connectors and types are valid per the Lingk domain model.
If a YAML error occurs, you can paste your recipe into an external YAML parser to identify any general YAML issues.
Also, ensure that all top-level structures are in order and that statements are not intermingled with connectors or schemas.
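As a quick sanity check outside the Recipe Editor, you can run a recipe through a standard YAML parser before submitting it. The sketch below uses PyYAML (an assumption — any YAML parser works the same way); the `check_yaml` helper is illustrative and not part of the Lingk toolchain.

```python
import yaml  # PyYAML; assumes it is installed (pip install pyyaml)

def check_yaml(text):
    """Return None if the text parses as YAML, else a short error description."""
    try:
        yaml.safe_load(text)
        return None
    except yaml.YAMLError as exc:
        return str(exc)

good = "connectors:\n  - name: source\n"
bad = "connectors:\n\t- name: source\n"  # tab indentation is invalid YAML

print(check_yaml(good))              # None (parses cleanly)
print(check_yaml(bad) is not None)   # True (parser reports the tab)
```

A parser error message usually includes the line and column of the problem, which is often enough to spot a statement that drifted into a connectors or schemas block.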
- Read Data
- Expression language processing (Jinja)
- Read credentials
- Execute providers
- Execute post-processing (Data Diff)
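To give a feel for the expression-language step above, here is a deliberately simplified sketch of Jinja-style `{{ expr }}` substitution. The real engine is Jinja itself, with far more features (filters, control flow, etc.); the `render` helper below is purely illustrative.

```python
import re

def render(template: str, context: dict) -> str:
    """Replace each {{ name }} placeholder with its value from context."""
    def repl(match):
        return str(context.get(match.group(1).strip(), ""))
    return re.sub(r"\{\{([^}]+)\}\}", repl, template)

print(render("endpoint: /courses/{{ term_id }}", {"term_id": "2024FA"}))
# endpoint: /courses/2024FA
```

Because expressions are resolved before providers execute, a typo in a placeholder name typically shows up as an empty or malformed value at the Read Data stage rather than as a YAML parse error.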
- Execute Statements (Lingk Query Language [LQL] & Spark SQL)
- SELECT statements are pure Spark SQL
- INSERT, UPDATE, DELETE, and other statements are LQL
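The split described above can be pictured as a dispatch on the statement's leading keyword. This is an illustrative sketch only, not Lingk's actual engine: SELECT statements route to Spark SQL, while mutating statements route to LQL.

```python
import re

# Keywords handled by LQL per the list above (INSERT, UPDATE, DELETE, ...).
LQL_KEYWORDS = {"INSERT", "UPDATE", "DELETE"}

def classify_statement(statement: str) -> str:
    """Return which engine would handle the statement, by leading keyword."""
    match = re.match(r"\s*(\w+)", statement)
    if not match:
        return "unknown"
    keyword = match.group(1).upper()
    if keyword == "SELECT":
        return "spark-sql"
    if keyword in LQL_KEYWORDS:
        return "lql"
    return "unknown"

print(classify_statement("SELECT id FROM source"))   # spark-sql
print(classify_statement("UPDATE target SET x = 1")) # lql
```

In practice this means SELECT statements can use the full Spark SQL dialect, while write operations follow LQL semantics.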
Once all stages finish, the recipe is marked as completed.
If a recipe is not completing, a specific connector may not be responding and the recipe is waiting for it to finish (e.g., a Salesforce bulk job). In other cases, there may be performance issues when reading or writing data, depending on the recipe logic.