Location and Workplace Flexibility:
We have offices in Atlanta, GA; Boston, MA; Morristown, NJ; Plano, TX; St. Louis, MO; St. Petersburg, FL; and Hyderabad, India.
We foster a hybrid- and remote-friendly culture, and all of our employees' work locations are based on the needs of the position and determined by the Leadership team.
In-office work and activities, if applicable, vary based on the work and team objectives in accordance with Company policies.
Senior Data Engineer
Responsibilities:
Build high-level technical designs for both streaming and batch processing systems
Design and build reusable components, frameworks, and libraries at scale to support analytics data products
Perform POCs on new technologies and architecture patterns
Design and implement product features in collaboration with business and technology stakeholders
Anticipate, identify, and solve issues concerning data management to improve data quality
Clean, prepare and optimize data at scale for ingestion and consumption
Drive the implementation of new data management projects and the restructuring of the current data architecture
Implement complex automated workflows and routines using workflow scheduling tools
Build continuous integration, test-driven development and production deployment frameworks
Drive collaborative reviews of design, code, test plans and dataset implementation performed by other data engineers in support of maintaining data engineering standards
Analyze and profile data for the purpose of designing scalable solutions
Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues
Lead, mentor, and develop offshore data engineers in adopting best practices and delivering data products
Partner closely with product management to understand business requirements and break down Epics
Partner with Engineering Managers to define technology roadmaps, align on design, architecture, and enterprise strategy
Requirements:
Minimum of 6 years' experience with the following:
Snowflake (Columnar MPP Cloud data warehouse)
ETL tools, including any of the following: DBT, Informatica, Matillion, Talend, or Azure Data Factory
Python
Experience designing and implementing data warehouses
Preferred Skills:
SQL objects (procedures, triggers, views, functions) in SQL Server, and SQL query optimization
Understanding of T-SQL, indexes, stored procedures, triggers, functions, views, etc.
Design and development of Azure / AWS Data Factory pipelines
Design and development of data marts in Snowflake
Working knowledge of Azure / AWS Architecture, Data Lake, Data Factory
Business analysis experience, with the ability to analyze data in order to write code and drive solutions
Knowledge of Git, Azure DevOps, Agile, Jira, and Confluence
Working knowledge of Erwin for data modeling