
Databricks Architect/Admin

Posted On: Apr 27, 2026

Location: Hartford, CT 06183

Duration: 6 Months, Contract

Mode of Work: Hybrid Remote


Job Summary

Job Title: Databricks Architect/Admin
Posted Date: Apr 27, 2026
Duration: 6 Months, Contract
Shift(s): 08:00 - 16:00 EST
Salary ($): 48.00 - 50.00 per hour (compensation based on experience and qualifications)

Talk to our Recruiter

Name: Anurag Vohra
Email: anurag@rangam.com
Phone: 908-704-8843

Description

Temp-to-Perm

Job Title: Databricks Architect/Admin
Department: Data & Analytics Platform
Job Type: Full-Time
Travel: Minimal

POSITION SUMMARY

  • The Databricks Architect/Admin is a senior individual contributor responsible for the design, implementation, and continuous optimization of the enterprise Databricks platform.
  • This role serves as the technical authority for all aspects of the Databricks environment — including workspace governance, Unity Catalog, cluster and compute strategy, data pipeline architecture, and cost management.
  • The Architect works in close partnership with data engineering, analytics, and infrastructure teams, and operates within a broader multi-platform data ecosystem that includes Ab Initio and Fivetran.
  • A strong background in Unix/Linux systems administration and scripting is essential, as the role requires deep engagement with the underlying compute infrastructure supporting the platform.

KEY RESPONSIBILITIES

Platform Architecture & Design

  • Architect and govern the enterprise Databricks environment, including workspace topology, Unity Catalog structure, and access control frameworks.
  • Define and enforce standards for cluster configuration, runtime versions, instance pool utilization, and auto-scaling policies.
  • Design scalable, performant data pipeline patterns using Delta Live Tables, Databricks Workflows, and Structured Streaming.
  • Establish architectural standards for Delta Lake — including table formats, partitioning strategies, Z-ordering, and OPTIMIZE/VACUUM scheduling.
  • Lead platform integration design with upstream ingestion tools including Fivetran and Ab Initio, ensuring reliable, governed data delivery.

Unix/Linux Infrastructure & Operations

  • Administer and troubleshoot Unix/Linux environments underpinning Databricks compute nodes, init scripts, and cluster lifecycle management.
  • Develop and maintain shell scripts (Bash) and Python automation for platform operations, monitoring, log aggregation, and maintenance tasks.

Automation & Artificial Intelligence

  • Design and implement end-to-end automation frameworks for platform operations, including cluster lifecycle management, job scheduling, alerting, and self-healing workflows.
  • Leverage Databricks AutoML, MLflow, and Model Serving capabilities to support the operationalization of machine learning models within the enterprise data platform.
  • Integrate AI-assisted development tooling (e.g., Databricks Assistant, GitHub Copilot) into engineering workflows to accelerate pipeline development and reduce manual effort.
  • Evaluate and recommend emerging AI/ML platform capabilities, including generative AI integrations and LLM-backed data workflows, in alignment with enterprise strategy.

REQUIRED QUALIFICATIONS

  • 7+ years of experience in data engineering or data platform roles, with a minimum of 4 years hands-on Databricks implementation experience.
  • Demonstrated expertise with Databricks platform capabilities: Unity Catalog, Delta Lake, Databricks Workflows, Delta Live Tables, and SQL Warehouses.
  • Strong Unix/Linux proficiency — shell scripting, process management, file system operations, cron scheduling, and environment configuration.
  • Proficiency in Python and PySpark for distributed data processing, pipeline development, and platform automation.
  • Experience with cloud infrastructure (AWS, Azure, or GCP), including compute, storage, networking, and IAM/security constructs.
  • Familiarity with AI/ML concepts and tooling within the Databricks ecosystem, including MLflow, AutoML, and Model Serving; exposure to generative AI or LLM-integrated workflows is a plus.
  • Experience with Oracle database environments, including SQL development, schema design, and integration patterns for data extraction and pipeline sourcing.
  • Strong written and verbal communication skills, with the ability to convey complex architectural concepts to both technical and non-technical audiences.

AI-Assisted Application Screening

As part of our recruitment process, we may use automated tools or AI-enabled technologies to assist with resume screening and candidate matching. These tools help our recruitment team review applications more efficiently, but they do not make hiring decisions. All final decisions are made by human reviewers.