Must Have:
- AWS Platform
- Snowflake
- ETL Tools
Informatica PowerCenter (required); Informatica IICS, Fivetran, dbt Labs, AWS Airflow (nice to have)
- Python, SQL – Required
Familiarity with:
- Cloud Technologies – AWS services such as Lambda, SQS/SNS, EMR, CloudWatch, RDS, EKS
- Compute Technologies – Spark/Hadoop
- Streaming Technologies – Kafka, AWS Kinesis
- API Technologies – MuleSoft, AWS API Gateway
- CI/CD – Atlassian Bitbucket and Bamboo
- Build tools – Apache Ant and Maven
RESPONSIBILITIES:
- Be an expert in the Data Warehouse domain and the relevant business function.
- Work with minimal supervision and provide status updates.
- Provide application support as part of an on-call rotation to resolve outages and user questions.
- Build large-scale datasets for a wide range of consumer needs.
- Build, test, and implement highly efficient data pipelines using a variety of technologies.
- Analyze new data sources to understand quality, availability, and content.
- Create and execute unit tests.
- Support the QA team during testing.
Knowledge / Skills / Abilities:
- Proven ability to model data for reporting and analytics needs.
- Proven ability to design and implement applications using best practices.
- Proven ability to analyze and understand existing processes and code.
- Works and communicates effectively with all levels of management.
- Excellent written, verbal, and interpersonal skills.
- Must work well within team-oriented environments.
- Proven ability to work in an Agile development methodology.
Mandatory Skills: Snowflake, dbt, ETL, AWS