Data Engineer

TRG
Okemos, MI
Full-time

RESPONSIBILITIES

  • Participate in the analysis and development of technical specifications, and in the programming and testing of Data Engineering components.
  • Create data pipelines that use change data capture (CDC) mechanisms to move data to a cloud provider, then transform the data so it is available for customers to consume.
  • Perform general data extraction, transformation and load (ETL) work, along with traditional Enterprise Data Warehousing (EDW) work.
  • Participate in creating data pipelines and ETL workflows, ensuring that design and enterprise programming standards and guidelines are followed. Assist with updating the enterprise standards when gaps are identified.

  • Follow technology best practices and standards, and escalate issues as appropriate. Follow architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architectural team).
  • Assist in configuration and scripting to implement fully automated data pipelines, stored procedures, functions, and ETL workflows that allow data to flow from on-premises Oracle databases to Snowflake, where it will be consumable by end customers.
  • Follow standard change control and configuration management practices.
  • Participate in a 24-hour on-call rotation in support of the platform.
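The CDC-style extract-transform-load flow described above can be sketched in miniature. This is only an illustration under stated assumptions: the `orders` table, the `updated_at` watermark column, and the cents conversion are all hypothetical, and in-memory SQLite stands in for both the on-premises Oracle source and the Snowflake target (a real pipeline would use a tool such as StreamSets or Snowflake Streams and Tasks rather than hand-rolled polling).

```python
import sqlite3

# In-memory databases standing in for the Oracle source and Snowflake target.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at TEXT)")
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (2, 25.5, "2024-01-02"), (3, 7.25, "2024-01-03")],
)
target.execute("CREATE TABLE orders_clean (id INTEGER, amount_cents INTEGER)")

def incremental_load(watermark: str) -> str:
    """Extract rows changed since `watermark`, transform, load; return new watermark."""
    rows = source.execute(
        "SELECT id, amount, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    for row_id, amount, updated_at in rows:
        # Transform step: store amounts as integer cents for downstream consumers.
        target.execute(
            "INSERT INTO orders_clean VALUES (?, ?)", (row_id, round(amount * 100))
        )
        watermark = updated_at
    target.commit()
    return watermark

wm = incremental_load("2024-01-01")  # picks up only rows changed after the watermark
```

Tracking a watermark (or a Snowflake stream offset) is what makes the load incremental: each run moves only the changed rows, not the full table.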

REQUIREMENTS

  • 3+ years of experience
  • Database Platforms - Snowflake, Oracle and SQL Server
  • OS Platforms - Linux and Windows Server
  • Languages and Tools - PL/SQL, Python, T-SQL, StreamSets, Snowflake Streams and Tasks, Informatica PowerCenter, and DBeaver
  • Drive and desire to automate repeatable processes.
  • Excellent interpersonal skills and communication, as well as the willingness to collaborate with teams across the organization.
  • Experience loading data from files in Snowflake file stages into existing tables.
  • Experience creating and working with near-real-time data pipelines between relational sources and destinations.
  • Experience working with StreamSets Data Collector or similar data streaming / pipelining tools (Fivetran, Striim, Airbyte, etc.)
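The stage-to-table load pattern in the requirements above can be sketched locally. In Snowflake this would be a `COPY INTO <table> FROM @<stage>` statement; here, as a hedged stand-in, a temporary directory plays the role of the file stage and SQLite plays the warehouse. The `customers` table, file names, and CSV layout are hypothetical.

```python
import csv
import sqlite3
import tempfile
from pathlib import Path

# A temporary directory stands in for a Snowflake file stage.
stage = Path(tempfile.mkdtemp())
(stage / "customers_001.csv").write_text("id,name\n1,Acme\n2,Globex\n")
(stage / "customers_002.csv").write_text("id,name\n3,Initech\n")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")

def copy_into(table: str, pattern: str) -> int:
    """Load every staged file matching `pattern` into an existing table.

    Returns the number of rows loaded (mirroring COPY INTO's load summary).
    """
    loaded = 0
    for path in sorted(stage.glob(pattern)):
        with path.open(newline="") as fh:
            for row in csv.DictReader(fh):
                db.execute(
                    f"INSERT INTO {table} VALUES (?, ?)",
                    (int(row["id"]), row["name"]),
                )
                loaded += 1
    db.commit()
    return loaded

n = copy_into("customers", "customers_*.csv")
```

The file-pattern parameter mirrors the `PATTERN` option on Snowflake's `COPY INTO`, which lets one statement pick up every new file dropped into the stage.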

WORK ENVIRONMENT

  • Hybrid - at least 2 days a week on site in Okemos, MI