Import Jobs to Projects
Projects is a solution for continuously optimizing and monitoring repeat production Databricks workloads. To implement projects, the Sync library must be integrated into your orchestration system (e.g. Airflow, Databricks Workflows).
Once integrated, the Gradient UI provides high-level metrics and easy-to-use controls to monitor and manage your Apache Spark clusters.
From the Projects tab, click on the button.
Use the Databricks Auto Import wizard to easily create multiple projects, each linked to a Databricks Job in your workspace.
NOTICE: The import wizard will make the following changes to your selected Databricks Jobs:
Add the webhook notification destination to the job so that Gradient is notified on every successful run.
Update the job cluster with the init script, env vars, and instance profile to collect worker instance and volume information.
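For reference, the changes above map roughly onto a single Databricks Jobs API `jobs/update` call. The sketch below is illustrative only: the wizard applies the equivalent changes for you, and the notification destination ID, init script path, environment variable name, cluster settings, and instance profile ARN shown here are placeholders, not the values Gradient actually uses.

```python
import os
import requests

# Illustrative sketch only -- the Auto Import wizard applies equivalent changes for you.
# All IDs, paths, names, and ARNs below are placeholders.
DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # workspace access token

job_id = 123  # the Databricks Job selected in the import wizard

new_settings = {
    # 1. Webhook notification so Gradient is notified on every successful run
    "webhook_notifications": {
        "on_success": [{"id": "<gradient-notification-destination-id>"}]
    },
    # 2. Job cluster updates: init script, env vars, and instance profile used to
    #    collect worker instance and volume information
    "job_clusters": [
        {
            "job_cluster_key": "main",
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
                "init_scripts": [
                    {"workspace": {"destination": "/Shared/<gradient-init-script>.sh"}}
                ],
                "spark_env_vars": {"EXAMPLE_GRADIENT_ENV_VAR": "<set-by-the-wizard>"},
                "aws_attributes": {
                    "instance_profile_arn": "arn:aws:iam::<account-id>:instance-profile/<profile>"
                },
            },
        }
    ],
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json={"job_id": job_id, "new_settings": new_settings},
    timeout=30,
)
resp.raise_for_status()
```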
Review the compatible Databricks jobs, select the jobs for which you would like to create a Gradient project, and then select Create Projects to create a project for each selected job. When a project is created, the following properties are added for each job.
You should now see the project(s) you created on your Projects summary dashboard. New projects will have a status of "Pending Setup" until the project is configured to receive logs for recommendations.
The Auto Import wizard connects to your specified Databricks workspace using the access token provided during workspace setup.
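As a quick sanity check that the token can reach the workspace, you can list the jobs the wizard will be able to see. This is a minimal sketch; the `DATABRICKS_HOST` and `DATABRICKS_TOKEN` environment variable names are assumptions, not part of the wizard.

```python
import os
import requests

# Minimal sketch: confirm the access token can enumerate jobs in the workspace,
# similar to what the Auto Import wizard does when listing compatible jobs.
DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.1/jobs/list",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    params={"limit": 25},
    timeout=30,
)
resp.raise_for_status()
for job in resp.json().get("jobs", []):
    print(job["job_id"], job["settings"]["name"])
```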
If you want to manually import a single job, follow the manual import instructions instead.