Job Description
- Development of ETL/ELT data mappings and workflows for data pipelines using Informatica Cloud Data Integration.
- Practical experience using and building Informatica Mass Ingestion Pipelines.
- Demonstrated experience with Oracle Database as a Data Connector source.
- Expert-level experience with Snowflake as a target database platform.
- Experience with the Snowflake platform and ecosystem. Knowledge of Snowflake data sharing and Snowpark is a plus.
- Knowledge of the advantages of, and previous experience working with, Informatica pushdown optimization.
- Experience with Snowflake database creation and optimization, and an understanding of its architectural advantages.
- Practical experience with Snowflake SQL.
- Determines database requirements by analyzing business operations, applications, and programming; reviewing business objectives; and evaluating current systems.
- Obtains data model requirements, develops and implements data models for new projects, and maintains existing data models and data architectures.
- Creates graphics and flow diagrams (including ERDs) to present complex database designs and data models more simply.
- Performs related work as assigned.
- Practical experience with one-time data loads as well as Change Data Capture (CDC) for bulk data movement (see the sketch following this list).
- Creating technical process and interface documentation is a key element of this role, as the team is working on release two of a multi-release effort.
- Ability to review the work of others, troubleshoot, and provide feedback and guidance to meet tight deliverable deadlines is required.
- Ability to promote code from development environments to production.
- Familiarity with GitHub or equivalent version control systems.
- Experience working with state agencies, including their security protocols and processes.
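
To illustrate the Snowflake SQL and CDC work described above, the following is a minimal sketch (not project code) of applying a batch of CDC changes to a Snowflake target with a MERGE statement, executed through the snowflake-connector-python package. The connection parameters, table names, and columns are hypothetical placeholders.

    # Minimal sketch: applying CDC changes from a staging table to a Snowflake
    # target with a MERGE, via snowflake-connector-python. All identifiers and
    # credentials below are illustrative placeholders.
    import snowflake.connector

    MERGE_SQL = """
        MERGE INTO analytics.public.customer AS tgt
        USING staging.public.customer_changes AS src
            ON tgt.customer_id = src.customer_id
        WHEN MATCHED AND src.op = 'D' THEN DELETE
        WHEN MATCHED AND src.op = 'U' THEN UPDATE SET
            name = src.name,
            updated_at = src.updated_at
        WHEN NOT MATCHED AND src.op IN ('I', 'U') THEN INSERT
            (customer_id, name, updated_at)
            VALUES (src.customer_id, src.name, src.updated_at)
    """

    def apply_cdc_batch() -> None:
        # Credentials would normally come from a secrets manager, not literals.
        conn = snowflake.connector.connect(
            account="my_account",   # hypothetical account identifier
            user="etl_user",        # placeholder
            password="...",
            warehouse="ETL_WH",
            database="ANALYTICS",
        )
        try:
            with conn.cursor() as cur:
                cur.execute(MERGE_SQL)
                print(f"Rows affected: {cur.rowcount}")
        finally:
            conn.close()

A one-time bulk load would follow the same pattern, with a COPY INTO or INSERT ... SELECT statement in place of the MERGE.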
Salary Range: 100K - 110K per annum
CANDIDATE SKILLS AND QUALIFICATIONS:
8 years of experience: 6+ years with Informatica products, including at least 4 years of direct experience with Informatica Cloud Data Integration.
5 years of experience: Generating advanced SQL queries and using other data interrogation methods.
5 years of experience: Reviewing, interpreting, and translating business requirements into data mappings and pipeline development using all major data integration patterns.
4 years of experience: Relational database design concepts, including direct experience with Oracle RDBMS.
3 years of experience: Data warehouse architectural patterns, including modeling Facts and Dimensions.
3 years of experience: Static mappings and Change Data Capture (CDC) for bulk data movement.
2 years of experience: Informatica Mass Ingestion.
2 years of experience: Using Snowflake, including creating, managing, and optimizing databases, and utilizing Time Travel and other technical advantages of cloud data warehousing (see the sketch below).
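
As a point of reference for the Time Travel item above, here is a minimal illustrative sketch of querying a table's historical state and recovering a dropped object; the table name and offset are assumptions, not project details.

    # Minimal Time Travel sketch; identifiers and credentials are placeholders.
    import snowflake.connector

    # Query the table as it existed one hour ago (OFFSET is in seconds).
    TIME_TRAVEL_SQL = """
        SELECT COUNT(*)
        FROM analytics.public.customer
        AT (OFFSET => -3600)
    """

    # Shown for reference: recover an accidentally dropped table while it is
    # still within the Time Travel retention period.
    UNDROP_SQL = "UNDROP TABLE analytics.public.customer"

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="..."
    )
    try:
        cur = conn.cursor()
        cur.execute(TIME_TRAVEL_SQL)
        print(cur.fetchone())  # row count as of one hour ago
    finally:
        conn.close()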
Preferred Requirements:
1 year of experience: Any experience working with Snowpark, servicing the needs of Python programmers, or understanding data scientists' requirements against Snowflake databases (see the Snowpark sketch below).
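
For the Snowpark item above, a minimal illustrative sketch of the Python API that data scientists would typically use against Snowflake databases; connection parameters and table/column names are hypothetical.

    # Minimal Snowpark sketch; all identifiers below are placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col

    connection_parameters = {
        "account": "my_account",     # hypothetical
        "user": "analyst",           # placeholder
        "password": "...",
        "warehouse": "ANALYTICS_WH",
        "database": "ANALYTICS",
        "schema": "PUBLIC",
    }

    session = Session.builder.configs(connection_parameters).create()

    # Lazily evaluated DataFrame: the filter and aggregation below are pushed
    # down and executed inside Snowflake, not on the client.
    customers_by_city = (
        session.table("CUSTOMER")
        .filter(col("STATE") == "TX")
        .group_by("CITY")
        .count()
    )
    customers_by_city.show()

    session.close()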