We are seeking an experienced HVR (now Fivetran HVR) Implementation Contractor to manage, optimize, and support high-volume data replication across diverse database and cloud environments. The ideal candidate will have deep expertise in HVR/Fivetran, strong database and cloud integration skills, and a proven ability to ensure reliable, high-performance data flows through automation and advanced troubleshooting.
Roles and Responsibilities
• Platform Management: Install, configure, and upgrade HVR (now Fivetran HVR) software, including the hub, agents, and associated components, across source and target systems.
• Connection & Replication Setup: Establish and manage data source connections for diverse databases (e.g., Oracle, SQL Server, SAP HANA, PostgreSQL, MySQL) and cloud platforms (e.g., Snowflake, Databricks, BigQuery, Redshift). Configure robust replication channels, defining source and target locations, selecting tables, and setting up data transformations.
• Performance Optimization: Fine-tune HVR configurations to maximize data replication performance, including adjusting batch sizes, parallelization, and network settings. Proactively identify and address system capacity bottlenecks (e.g., storage, CPU, memory, network bandwidth) by recommending improvements.
• High-Volume CDC Implementation: Develop and execute strategies for high-volume data replication and efficient log-based Change Data Capture (CDC).
• Operational Health: Ensure HVR agents are running optimally and maintaining reliable connectivity to all source and target endpoints.
Qualifications:
• HVR/Fivetran Expertise: Minimum of 5 years of hands-on experience implementing, administering, configuring, and troubleshooting HVR (now Fivetran HVR).
• Database Proficiency: Strong command of various database technologies (e.g., Oracle, SQL Server, SAP HANA, PostgreSQL, MySQL) and their respective Change Data Capture (CDC) mechanisms.
• Cloud Data Integration: Proven experience with major cloud platforms (AWS, Azure, GCP) and cloud data warehouses (Snowflake, BigQuery, Redshift).
• Data Concepts: Solid understanding of data warehousing concepts, ETL/ELT processes, and common data integration patterns.
• Problem-Solving: Excellent analytical and problem-solving abilities to diagnose and resolve complex data flow issues.
• Communication: Strong communication and collaboration skills to work effectively with diverse technical and business teams.
• Automation: Scripting proficiency (e.g., Shell scripting, Python) for automating tasks and enhancing operational efficiency.
• Networking & Security: Fundamental knowledge of network configurations and security principles as they apply to data replication.
The expected pay range for this contract assignment is $65-$70 per hour. The exact pay rate will vary based on skills, experience, and location and will be determined by the third-party whose employees provide services to Deloitte.
Candidates interested in applying for this opportunity must be geographically based in the United States and must be legally authorized to work in the United States without the need for employer sponsorship.
We do not accept agency resumes and are not responsible for any fees related to unsolicited resumes.
Deloitte is not the employer for this role.
This work is contracted through a third-party whose employees provide services to Deloitte.
#LI-AR8
#Remote