Job Description
This is a remote position.
Develops specifications and plans the full range of programming actions to produce data integration components. Develops and maintains complex automated ETL pipeline architecture using assigned tools and programming languages.
Leads moderately complex projects or supports larger initiatives as part of a team and partners with users to understand business requirements.
Develops complex data objects for business analytics using data modeling techniques.
Essential Skills :
Data Analysis :
- Develops, tests, and maintains pipeline architecture and infrastructure
- Experienced with data modeling and data stores on enterprise applications. Training on our internal Operational Data Stores will be provided once hired.
- Experience with Life and Annuity business applications.
- Investigates and analyzes feasibility of data integration prototypes and program requirements
- Develops specifications and plans the full range of programming actions to produce data integration components
- Implements and supports reporting and analytics infrastructure for internal business customers using data integration services (Kafka, Mulesoft, Salesforce).
- Guides internal business customers to develop, troubleshoot, and optimize complex SQL solutions to solve reporting and analytics problems.
- Develops, tests, and deploys code using internal software development toolsets, including code for deploying infrastructure and solutions for secure data storage, data catalogs, and data queries.
- Leads complex projects or supports larger initiatives as part of a team
- Partners with stakeholders to understand business requirements
- Ability to conceptualize and develop new data solutions to meet the business requirements
- Researches, performs analysis, and proposes effective solutions related to system developments and enhancements
- Analyzes and reviews potential adjustments or modifications for impacts on other programs
- Collaborates with business areas to develop solutions to meet business requirements
Beneficial, but not required, Skills :
Data modeling :
- Develops complex data objects for business analytics using data modeling techniques
- Develops and optimizes complex data models using best practices for data definition language (DDL), physical and logical tables, data partitioning, compression and parallelization
Documentation :
- Works with internal business customers and software development teams to gather and document requirements for data publishing, data consumption, and analytic solutions.
- Develops and maintains the master test plan, test cases, and solutions within ADO (Azure DevOps).
Requirements
Education and Experience :
- Bachelor’s degree, preferably in a computer related field or equivalent relevant experience
- Five years of data integration development experience, or related experience
- Minimum of two years with Salesforce development platform
Knowledge, Skills and Abilities :
- Strong knowledge in processes supporting data transformation, data structures, metadata, dependency and workload management
- Strong SQL knowledge and experience working with relational databases
- Experience with SOQL and knowledge of the Salesforce data model
- Experience with Agile Methodology
- Excellent analytic skills to work with unstructured datasets
- Experience with applicable technologies, such as MongoDB, DataWeave, and Informatica
- Strong ability to manipulate, process and extract value from large disconnected datasets
- Excellent verbal and written communication skills
- Strong attention to detail, organizational and multi-tasking skills required with the ability to adapt to changing priorities
- Ability to maintain confidentiality
The skills that are a must-have for these roles are :
- Salesforce experience creating custom queries using SOQL and integrating data with external systems.
- Understanding of Salesforce objects, data structures, and best practices for data management.
- Develops, tests, and maintains pipeline architecture and infrastructure.
- Experienced in data modeling and data stores on enterprise applications; training on our internal Operational Data Stores will be provided once hired.