A company is looking for a Databricks Data Engineer to work remotely in California for a 12-month duration.

Key Responsibilities
- Manage and optimize the Databricks platform, including workspace management and job scheduling
- Develop and maintain ETL pipelines using Azure Data Factory and ensure data quality and governance
- Utilize Spark and Python for data manipulation, automation, and advanced querying

Required Qualifications
- Expertise in the Databricks platform and Spark (PySpark/Scala)
- Experience with SQL optimization and data warehousing concepts, including Delta Lake
- Proficiency in Python scripting and familiarity with Azure Cloud services
- Knowledge of data governance, security, and compliance standards
- Experience with BI/reporting tools such as Power BI or Tableau
Job Type
Full-time role
Skills required
Databricks, Spark (PySpark/Scala), Python, SQL, Delta Lake, Azure Data Factory, data governance, Power BI/Tableau
Location
Fresno, California
Salary
No salary information provided.
Date Posted
April 12, 2025
A California-based company is seeking a Databricks Data Engineer to manage and optimize their Databricks platform remotely. The role involves developing ETL pipelines and ensuring data quality using Azure Data Factory.