Senior Data Engineer
Recruiting Location
US-IL-Chicago
Department
Data and AI
Summary
The Senior Data Engineer will design, build, and maintain the scalable data pipelines, models, and infrastructure that power analytics, business intelligence, and machine learning products across the company. Partnering closely with business, product, and analytics teams, you will translate complex requirements into elegant, reliable data solutions and help drive the delivery of innovative data products. This role reports to the Senior Manager, Data Engineering.
Duties and Responsibilities
- Design, develop, and maintain robust, scalable data pipelines and ETL processes, ensuring efficient ingestion, transformation, and storage of data.
- Build and optimize data models and schemas for analytics, reporting, and operational data stores.
- Implement and maintain data quality frameworks, including data validation, monitoring, and alerting mechanisms.
- Collaborate closely with data architects, analysts, data scientists, and product teams to align data engineering activities with business goals.
- Leverage cloud data platforms (AWS, Azure, GCP) to build and optimize data storage solutions, including data warehouses, data lakehouses, and real-time data processing.
- Develop automation processes and frameworks for CI/CD supported by version control, linting, automated testing, security scanning, and monitoring.
- Contribute to the maintenance and improvement of data governance practices, helping to ensure data integrity, accessibility, and compliance with regulations such as GDPR.
- Provide technical mentorship and guidance to junior team members, promoting best practices in software engineering, data engineering, and agile development.
- Troubleshoot and resolve complex data infrastructure and pipeline issues, ensuring minimal downtime and optimal performance.
Salaries vary by location and are based on numerous factors, including, but not limited to, the relevant market, skills, experience, and education of the selected candidate. If an estimated salary range for this role is available, it will be provided in our Target Salary Range section. Our compensation package also includes bonus eligibility and a comprehensive benefits program. Benefits information can be found at Sidley.com/Benefits.
Target Salary Range
$148,000 - $164,000 if located in Illinois
Qualifications
To perform this job successfully, an individual must be able to perform the Duties and Responsibilities (Duties) above satisfactorily and meet the requirements below. The requirements listed below are representative of the minimum knowledge, skill, and/or ability required. Reasonable accommodations will be made to enable individuals with disabilities to perform the essential functions of the job. If you need such an accommodation, please email staffrecruiting@sidley.com (current employees should contact Human Resources).
Education and/or Experience:
Required:
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
- A minimum of 5 years of hands-on experience in data engineering, building scalable data pipelines and ETL/ELT processes
- Extensive experience with cloud data platforms such as Azure, AWS, or Google Cloud
- Strong proficiency with Python, SQL, and Apache Spark for data processing
- Hands-on experience with modern data-platform components (object storage, lakehouse engines, orchestration tools, columnar warehouses, streaming services)
- Proven experience with data modeling, schema design, and performance tuning of large-scale data systems
- Deep understanding of data engineering best practices: code repositories, CI/CD pipelines, test automation, monitoring, and alerting systems
- Skilled at crafting compelling data narratives through tables, reports, dashboards, and other visualization tools
- Strong problem-solving and analytical skills with excellent attention to detail
- Excellent communication skills and experience collaborating with technical and business stakeholders
Preferred:
- Master's degree in Computer Science or Engineering
- Experience building data pipelines in an Azure Databricks environment
- Experience migrating to, or building, data platforms from the ground up
- Experience with Infrastructure as Code (IaC) and Governance as Code
- Familiarity with machine-learning workloads and partnering on feature engineering
- Experience working in an Agile delivery model
Other Skills and Abilities:
The following will also be required of the successful candidate :
- Strong organizational skills
- Strong attention to detail
- Good judgment
- Strong interpersonal communication skills
- Strong analytical and problem-solving skills
- Able to work harmoniously and effectively with others
- Able to preserve confidentiality and exercise discretion
- Able to work under pressure
- Able to manage multiple projects with competing deadlines and priorities
Sidley Austin LLP is an Equal Opportunity Employer
#LI-Hybrid
#LI-OE1