Job Title : Senior Data Engineer
Location : Santa Monica, CA; Los Angeles, CA; New Jersey; or Seattle, WA
Work Type : Hybrid (4 days onsite per week)
Experience : 5+ Years
Work Authorization : US Citizens and Green Card Holders only
About the Role :
We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines that drive business insights and analytics capabilities. The ideal candidate will have strong expertise in data engineering, ETL pipeline development, SQL performance tuning, and modern data platforms. You will collaborate cross-functionally with analytics, infrastructure, and business teams to support enterprise-wide reporting, KPIs, and data-driven decision-making.
This role requires a hands-on technologist with strong analytical skills and the ability to operate in a fast-paced, highly collaborative environment.
Key Responsibilities :
- Collaborate with technical and non-technical teams to gather and translate data and reporting requirements.
- Design and build robust, scalable, and efficient ETL pipelines for internal and external data sources.
- Develop and maintain data models and schemas optimized for analytics and reporting.
- Implement and automate data quality checks to ensure accuracy and reliability of critical datasets.
- Create and manage ETL workflows using orchestration tools such as Airflow or Prefect.
- Utilize Snowflake, Databricks, and Spark to process and transform large-scale data efficiently.
- Conduct SQL and ETL performance tuning for high-performance data operations.
- Develop automated deployment pipelines using CI/CD tools (e.g., Jenkins, GitHub Actions).
- Collaborate with infrastructure and DevOps teams to monitor pipelines using tools like Datadog.
- Perform ad hoc analysis, contribute to architecture reviews, and support the implementation of data governance best practices.
Required Qualifications :
- 5+ years of hands-on data engineering experience in a modern cloud or hybrid data environment.
- Proficiency in:
  - Advanced SQL (complex queries, optimization, performance tuning)
  - Python for data processing and automation
  - Spark, Snowflake, Databricks, and Airflow (or Prefect)
- Strong understanding of data modeling principles (dimensional modeling, normalization).
- Experience working with at least one major cloud platform (AWS, Azure, or GCP).
- Excellent analytical, problem-solving, and strategic thinking abilities.
- Strong communication skills, both written and verbal, with the ability to explain complex concepts clearly.
- Comfortable working in an Agile/Scrum environment.
Preferred Qualifications :
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Experience with:
  - Kafka or other streaming data tools
  - Data integration and visualization platforms (e.g., Datorama, Improvado, Fivetran)
  - Docker containers for deployment and scalability
  - Monitoring tools such as Datadog or similar
- Background in media, entertainment, or digital analytics environments is a plus.
Additional Information :
Work Location : 4 days per week onsite from one of the following hubs:
- Santa Monica, CA
- Los Angeles, CA
- New Jersey
- Seattle, WA
Work Authorization : US Citizens and Green Card Holders only
Stability : Seeking candidates with consistent, long-term tenure; frequent job changes are a concern.
#SeniorDataEngineer #DataEngineering #ETLDevelopment #SnowflakeJobs #Databricks #PythonDeveloper #BigDataEngineer #SparkDeveloper #DataPipeline #SQLDeveloper #Airflow #Prefect #AWSJobs #AzureJobs #GCPJobs #Kafka #CloudDataEngineering #CI_CD #Docker #AnalyticsJobs #MediaAndEntertainment #TechCareers #DataAnalytics #DataArchitecture #CloudComputing #DataDriven #DigitalTransformation #NowHiring #HiringAlert #HiringInCalifornia #HiringInSeattle #HiringInLosAngeles #HiringInNewJersey #USJobs #HybridJobs #TechJobsUSA