We have a 12-month contract, with opportunity to hire, for a highly motivated and collaborative professional with 2-4 years of experience developing and operationalizing data pipelines for ingestion, transformation, validation, and optimization. The ideal candidate has strong expertise in Snowflake, SQL, and Python, with a proven ability to source and manage data from diverse systems, including relational databases such as Oracle and SQL Server and JSON-based sources such as MongoDB and Kafka. They have hands-on experience with tools in the AWS ecosystem and vendor platforms like Upsolver and Informatica IDMC, and use Git for version control. They thrive in agile, sprint-based environments, can work independently after onboarding, and excel at collaborating with source and downstream teams to deliver clean, structured data for reporting and analytics. A strong problem-solver and communicator, they contribute valuable insights to team discussions and help drive continuous improvement in data engineering practices. This role is 100% remote.
Job Description: Job Profile Summary
Position Purpose:
Develops and operationalizes data pipelines to make data available for consumption (reports and advanced analytics), including data ingestion, data transformation, data validation/quality, data pipeline optimization, and orchestration. Engages with the DevSecOps Engineer during continuous integration and continuous deployment (CI/CD).
Education / Experience:
A Bachelor's degree in a quantitative or business field (e.g., statistics, mathematics, engineering, computer science) and 2-4 years of related experience.
Or equivalent experience acquired through applicable knowledge, duties, scope, and skills reflective of the level of this position.
Technical Skills:
One or more of the following skills are desired.
Experience with Big Data; Data Processing
Experience with Other: diagnosing system issues, engaging in data validation, and providing quality assurance testing
Experience with Data Manipulation; Data Mining
Experience with one or more of the following: C# (Programming Language); Java (Programming Language); Programming Concepts; Programming Tools; Python (Programming Language); SQL (Programming Language)
Knowledge of Microsoft SQL Server; SQL (Programming Language)
Soft Skills:
Intermediate - Seeks to acquire knowledge in area of specialty
Intermediate - Ability to identify basic problems and procedural irregularities, collect data, establish facts, and draw valid conclusions
Intermediate - Ability to work independently
Data Engineer (Python) • St. Louis, MO, United States