Job Title : Data Engineer
Location : Chicago, IL
Duration : Long Term
Position Summary :
The Senior Data Engineer will be responsible for the design, development, implementation and support of the Data Initiatives, ensuring that an optimal data delivery architecture is consistent across ongoing projects. You will support the data analysts, data scientists, and data needs of multiple teams, systems and products.
Essential Duties and Responsibilities
- Drive requirements, scope, and technical design of the integration workflows, ensuring the build is conducted accurately and according to spec.
- Develop and maintain requirements, design documentation and test plans.
- Seek out, design, and implement internal process improvements : automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
- Coordinate with BI Engineers, Financial Applications and Oracle HR teams around data management including schemas, failure conditions, reconciliation, test data set up, etc.
- Build the infrastructure required for optimal ETL / ELT pipelines to ingest data from a wide variety of data sources using Microsoft Azure technologies such as Azure Data Factory and Databricks.
- Construct and maintain enterprise-level integrations using the Snowflake platform, Azure Synapse, Azure SQL and SQL Server.
- Create data tools for data analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Design analytics tools that utilize the data pipeline to deliver actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Troubleshoot issues, helping to drive root-cause analysis, and work with infrastructure teams to resolve incidents and arrive at a permanent resolution.
- Partner with data and analytics teams to strive for greater functionality in our data systems.
- Provide direction and coordination for development and support teams, including globally located resources.
- Understand the layout and working of existing integrations that send and receive data between Oracle, Concur, JDE, Corporate Data Platform and other systems.
Required :
- A relevant technical BS Degree in Information Technology
- 3 years writing SQL queries against any RDBMS, with query optimization
- 3 years of data engineering experience leveraging technologies such as Snowflake, Azure Data Factory, ADLS Gen 2, Logic Apps, Azure Functions, Databricks, Apache Spark, Scala, Synapse, SQL Server
- Experience with scripting tools such as PowerShell, Python, Scala, Java and XML
- Understanding of the pros and cons, and best practices, of implementing a Data Lake using Microsoft Azure Data Lake Storage
- Experience structuring a Data Lake for reliability, security and performance
- Experience implementing ETL for Data Warehouse and Business Intelligence solutions
- Skills to read and write effective, modular, dynamic, parameterized and robust code, and to follow established code standards and the ETL framework
- Strong analytical, problem-solving, and troubleshooting abilities
- Good understanding of unit testing, software change management, and software release management
- Knowledge of DevOps processes (including CI / CD) and Infrastructure as Code fundamentals
- Experience performing root-cause analysis on data and processes to answer specific business questions and identify opportunities for improvement
- Experience working within an agile team
- Excellent communication skills