Job Description
Title : Data Engineer (Python, PySpark and AWS)
Location : Columbus, OH
Client : Mphasis / JPMC
Type : Contract
Position summary
A Data Engineer at CMS is a software engineer who specializes in data. The data engineer will build and maintain the CMS data warehouse, which is used for both reporting and analytics across the company.
The individual works cross-functionally with technical and business teams to identify opportunities to better leverage data.
The data comes from a variety of sources, and it is the data engineer's responsibility to make sense of it using cloud-based systems (AWS) and provide a reliable, structured format that meets the different business needs at CMS.
Duties and responsibilities
Collaborate with the team to build out features for the data platform and consolidate data assets
Build, maintain and optimize Spark-based data pipelines
Advise, consult, and coach other data professionals on standards and practices
Work with the team to define company data assets
Migrate CMS’ data platform into Chase’s environment
Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives
Build libraries to standardize how we process data
Loves to teach and learn, and knows that continuous learning is the cornerstone of every successful engineer
Has a solid understanding of AWS tools such as EMR and Glue, including their pros and cons, and can clearly convey that knowledge
Implement automation on applicable processes
The ideal candidate
5+ years of experience in a data engineering position
Proficiency in Python (or similar) and SQL
Strong experience building data pipelines with Spark
Strong verbal and written communication skills
Strong analytical and problem-solving skills
Experience with relational datastores, NoSQL datastores and cloud object stores
Experience building data processing infrastructure in AWS
Bonus : Experience with infrastructure-as-code solutions, preferably Terraform
Bonus : Cloud certification
Bonus : Production experience with ACID-compliant formats such as Hudi, Iceberg or Delta Lake
Bonus : Familiarity with data observability solutions and data governance frameworks