What Working at Hexaware Offers:
Hexaware is a dynamic and innovative IT organization committed to delivering cutting-edge solutions to our clients worldwide.
We pride ourselves on fostering a collaborative and inclusive work environment where every team member is valued and empowered to succeed.
Hexaware provides access to a vast array of tools that enhance and advance your professional profile. We complete the package with excellent growth opportunities, chances to collaborate with highly visible customers, the chance to work alongside bright minds, and a healthy work-life balance.
With an ever-expanding portfolio of capabilities, we look inward to identify the source of our motivation. Although technology is at the core of our solutions, it is the people and their passion that fuel Hexaware's commitment to creating smiles.
At Hexaware, we encourage you to challenge yourself to achieve your full potential and propel your growth. We trust and empower you to disrupt the status quo and innovate for a better future.
We encourage an open and inspiring culture that fosters learning and brings talented, passionate, and caring people together.
We are always interested in, and want to support, both the professional and the personal you. We offer a wide array of programs to help you expand your skills and supercharge your career.
We help you discover your passion: the driving force that makes you smile, innovate, create, and make a difference every day.
The Hexaware Advantage: Your Workplace Benefits
- Excellent health benefits with low-cost employee premiums
- Wide range of voluntary benefits, such as legal, identity theft, and critical care coverage
- Unlimited training and upskilling opportunities through Udemy and Hexavarsity
Role: Data Governance
Location: Fort Worth, Texas
Work Mode: Hybrid
Salary Range: $120K - $130K
Proven experience (3+ years) in data governance roles, preferably with hands-on experience using Alation (Admin) or Collibra
Responsibilities
- Leads the delivery of data extraction, transformation, and load processes from disparate sources into a form consumable by analytics, for projects of moderate complexity, applying strong technical capabilities and a sense of database performance
- Designs, develops, and produces data models of relatively high complexity, leveraging a sound understanding of data modelling standards to recommend the right model for each requirement
- Batch Processing - Capability to design an efficient way of processing high volumes of data where a group of transactions is collected over a period
- Data Integration (Sourcing, Storage and Migration) - Capability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.). This includes the data models, storage requirements, and migration of data from one system to another
- Data Quality, Profiling and Cleansing - Capability to review (profile) a data set to establish its quality against a defined set of parameters and to highlight data where corrective action (cleansing) is required to remediate the data
- Stream Systems - Capability to discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format, and at any quality
- Excellent interpersonal skills to build a network across a variety of departments in the business, to understand data and deliver business value; may interface and communicate with program teams, management, and stakeholders as required to deliver small to medium-sized projects
- Understands the differences between on-premises and cloud-based data integration technologies
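To make the profiling-and-cleansing responsibility above concrete, here is a minimal sketch of the idea, not a description of Hexaware's actual tooling: a dataset (modelled here as a list of dictionaries) is scored against a hypothetical rule set of required fields and value ranges, and rows needing corrective action are flagged.

```python
# Minimal data-profiling sketch. The rule set and sample data are
# hypothetical illustrations, not part of the role description.

def profile(rows, required_fields, ranges):
    """Return (quality_metrics, rows_needing_cleansing)."""
    flagged = []
    null_counts = {f: 0 for f in required_fields}
    for row in rows:
        bad = False
        for f in required_fields:
            if row.get(f) in (None, ""):   # missing required value
                null_counts[f] += 1
                bad = True
        for f, (lo, hi) in ranges.items():
            v = row.get(f)
            if v is not None and not (lo <= v <= hi):   # out of range
                bad = True
        if bad:
            flagged.append(row)
    metrics = {
        "row_count": len(rows),
        "null_counts": null_counts,
        "pct_clean": round(100 * (1 - len(flagged) / len(rows)), 1) if rows else 100.0,
    }
    return metrics, flagged

if __name__ == "__main__":
    data = [
        {"id": 1, "age": 34},
        {"id": 2, "age": None},   # missing value -> needs cleansing
        {"id": 3, "age": 250},    # out of range  -> needs cleansing
    ]
    metrics, to_fix = profile(data, required_fields=["id", "age"],
                              ranges={"age": (0, 120)})
    print(metrics, len(to_fix))
```

In practice the parameters (required fields, valid ranges, formats) would come from a governance catalog such as Alation or Collibra rather than being hard-coded.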
The Role Offers
- Opportunity to join a global team to do meaningful work that contributes to global strategy and individual development
- An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations
- An opportunity to showcase your strong analytical skills and problem-solving ability
- Learning and growth opportunities in the cloud and big data engineering spaces
Essential Skills
- 6+ years' experience developing large-scale data pipelines in cloud or on-prem environments
- Highly proficient in one or more market-leading ETL tools, such as Informatica, DataStage, SSIS, or Talend
- Deep knowledge of data warehouse / data mart architecture and modelling
- Ability to define and develop data ingestion, validation, and transformation pipelines
- Deep knowledge of distributed data processing and storage
- Deep knowledge of working with structured, unstructured, and semi structured data
- Working experience with ETL / ELT patterns
- Extensive experience applying analytics, insights, and data mining to real-world commercial problems
- Technical experience in at least one programming language, preferably Java, .NET, or Python
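The ETL / ELT pattern referenced in the skills above can be sketched in a few lines. This is an illustrative toy, not Hexaware's implementation: the in-memory "source" and "target" lists stand in for real systems (databases, files, queues), and all field names are hypothetical.

```python
# Minimal ETL-pattern sketch: extract from a source, transform the
# records, load them into a target. All names are illustrative.

def extract(source):
    # Pull raw records from the source system.
    return list(source)

def transform(records):
    # Normalize field names and derive a new column.
    out = []
    for r in records:
        out.append({
            "customer_id": r["id"],
            "name": r["name"].strip().title(),
            "is_active": r.get("status") == "active",
        })
    return out

def load(records, target):
    # Append the transformed records to the target store.
    target.extend(records)
    return len(records)

if __name__ == "__main__":
    source = [
        {"id": 1, "name": "  ada lovelace ", "status": "active"},
        {"id": 2, "name": "alan turing", "status": "inactive"},
    ]
    warehouse = []
    loaded = load(transform(extract(source)), warehouse)
    print(loaded, warehouse[0]["name"])
```

In an ELT variant, the `load` step would run before `transform`, with the transformation pushed down into the target warehouse; tools such as Informatica, DataStage, SSIS, and Talend orchestrate the same three stages at scale.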
Essential Qualification
BE / BTech in Computer Science, Engineering, or a relevant field.
Privacy Statement:
The information you provide will be used in accordance with the terms of our Privacy Policy, and specifically for the business / processing purposes of this event. Please be aware that we may share your details with our approved vendors so that this event can be handled successfully.