Role - Platform Architect: Databricks
Location - KY / Remote option possible
Tentative Duration - 6 months
Rate - All-inclusive
Required skills
Deep understanding of Databricks architecture, including clusters, notebooks, jobs, and the underlying compute and storage layers
Experience building out Databricks as a global, multi-region platform
Proficiency in Apache Spark, including its core components (Spark SQL, Spark Streaming, and MLlib)
Knowledge of Delta Lake and its features (ACID transactions, time travel, etc.)
Experience in using Databricks SQL for data querying, analysis, and visualization
Ability to create and manage complex data pipelines and workflows using Databricks Jobs
Understanding of cluster configurations, autoscaling, and performance optimization
Experience with Unity Catalog for centralized governance and access control
Deep understanding of AWS or Azure cloud essentials, including Storage, Networking, Identity and Access Management, and Data Security
Understanding of network configurations, VPCs, and security groups for Databricks deployments
Ability to analyze and optimize Databricks costs
Nice-to-haves
Certification in any major cloud platform (Azure, AWS, or GCP)
Experience working with code repositories and continuous integration pipelines using AWS CodeBuild / CodePipeline or similar tools
Experience in data governance and lineage implementation
Multi-geo and distributed delivery experience in large programs