Databricks Engineer Contractor (Onsite)

Cincinnati, Ohio, United States

Position Summary

Contractor

Location: Cincinnati, OH


Work You’ll Do

The services you will provide to the Deloitte Project Team as a Databricks Engineer Contractor include:

• Design, develop, and optimize scalable data pipelines and workflows using Databricks and Spark
• Collaborate with data scientists, analysts, and business teams to understand data requirements and deliver solutions
• Implement and maintain ETL/ELT processes for ingesting, transforming, and loading data from diverse sources
• Ensure data quality, reliability, and performance across all data engineering solutions
• Manage and monitor Databricks clusters, jobs, and resources for cost and performance efficiency
• Apply best practices for data security, privacy, and compliance within the Databricks environment
• Document data engineering processes, workflows, and technical solutions
• Troubleshoot and resolve data pipeline and processing issues in a timely manner
• Stay current with emerging Databricks features, Spark advancements, and cloud data engineering trends
• Databricks expert who can design and drive architecture decisions across GCP and Azure for data federation on Databricks using medallion architecture
• Solution architect who can manage and liaise with both Deloitte and Client teams on the ground
• Technical SME who can help provide a point of view (PoV) on the Client’s broader data strategy
• Support the legacy pipelines (Java, XML) and GCP managed services (Pub/Sub, BigQuery, Dataflow)

Qualifications:

6-9 years of experience in the following areas:

• Strong experience with Databricks (preferably Azure Databricks) for data engineering and analytics workloads
• Proficiency in Apache Spark (PySpark, Scala, or Spark SQL)
• Solid programming skills in Python and/or Scala
• Hands-on experience with cloud platforms (Azure preferred; AWS or GCP also valuable)
• Expertise in building ETL/ELT pipelines and data integration workflows
• Strong SQL skills for data manipulation and transformation
• Experience with data lake architectures (e.g., Delta Lake, Azure Data Lake Storage)
• Familiarity with CI/CD practices for data engineering (e.g., DevOps, version control)
• Understanding of data modeling, data warehousing, and data governance concepts
• Excellent problem-solving and communication skills

The expected pay range for this contract assignment is $75 to $80 per hour. The exact pay rate will vary based on skills, experience, and location and will be determined by the third party whose employees provide services to Deloitte.

Candidates interested in applying for this opportunity must be geographically based in the United States and must be legally authorized to work in the United States without the need for employer sponsorship.

We do not accept agency resumes and are not responsible for any fees related to unsolicited resumes.

Deloitte is not the employer for this role.

 

This work is contracted through a third party whose employees provide services to Deloitte.

#LI-AR8

#LI-Contract

Expected Work Schedule

During the team’s core business hours

Approximate hours per week

About Deloitte

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It makes Deloitte one of the most rewarding places to work. 

As used in this posting, “Deloitte” means , a subsidiary of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries.

Requisition code: 312538