Greetings,
Our client, Navy Federal Credit Union, is the world's largest credit union with over 10 million members, more than $149 billion in assets, and over 23,000 employees.
They are seeking an IT Engineer - Data-15507-Hybrid. Your profile appears to be a strong match for this position.
Please review the following information for clarity on the role:
Basic Purpose
Develop strategies for data acquisition, pipelines, and database implementation. Responsible for designing and building data solutions, integrating data from various sources, and managing big data. Develop complex queries to optimize the performance of Navy Federal's big data ecosystem within CI/CD pipelines. Recognized as a subject-matter expert who solves highly complex problems, leads projects, and works independently.
Responsibilities
- Provide Data Intelligence and Data Warehousing solutions, leveraging project standards and data platforms.
- Build and maintain Azure data pipelines using DevSecOps processes.
- Define and develop data integration processes across the organization.
- Create conceptual and logical data models for stakeholders and management.
- Collaborate with business leadership to understand data needs and develop solutions to support decision-making and business objectives.
- Develop detailed project implementation plans with milestones and deliverables.
- Document processes to create technical and non-technical reference materials.
- Identify potential issues and risks during analytics projects and suggest mitigation strategies.
- Coach and mentor team members in analytics project activities.
- Manage large datasets, including manipulation and merging processes.
- Perform other duties as assigned.
Qualifications and Education Requirements
- Master's degree in Information Systems, Computer Science, Engineering, or a related field, or equivalent experience.
- Expertise in Azure Data Factory and Databricks.
- Advanced skills in Azure SQL, Azure Data Lake, Azure App Service, Python, and T-SQL.
- Experience sourcing, maintaining, and updating data in on-premises and cloud environments.
- Knowledge of basic statistical analysis.
- Thorough understanding of SQL.
- Experience with ETL tools and techniques.
- Experience designing data pipelines with API and streaming ingestion methods.
- Ability to understand business problems and articulate optimization needs clearly.
- Capability to consolidate analytical needs across projects or functions.
- Strong change management and communication skills.
- Understanding of data warehousing, cleaning, pipelines, and analytical techniques.
- Deep understanding of data concepts, data mapping, and requirements building.
- Knowledge of data models, large datasets, BI tools, and statistical programming languages.
- Experience with Git and source control.
- Skilled in managing data source updates and requirements implementation.