Location : Charlotte, NC or Richmond, VA or Kennesaw, GA (onsite 3x a week)
Hadoop SME (hands-on Hadoop expertise, not just experience working in a Hadoop environment)
Day to Day Responsibilities / project specifics : Joining an ongoing Incentive Comp project in Global Human Resources as the team transitions to Oracle and Hadoop. The team lacks the necessary Hadoop experience and needs an SME to come in and help guide and teach them.
Job Description :
As a Hadoop & ETL Developer, the individual will develop code on the Hadoop platform to support Incentive Compensation Management (ICM) requirements. The individual will be responsible for understanding designs, proposing high-level and detailed design solutions, proposing out-of-the-box technical solutions to business requirements, and resolving production issues. The individual should be flexible in working with the offshore team and provide SME knowledge of Hadoop and related tools. As an individual contributor, the person should have strong analytical skills to make quick decisions during production outages and process abends. Responsibilities include engaging with the Information Architecture team on design solutions, proposing new technology adoption ideas, attending project meetings, partnering with offshore teammates, and coordinating with other support teams such as APS, testing, and upstream and downstream partners.
Job Responsibilities :
BAU development activities such as analysis, coding, and proposing improvement ideas; drive development activities at offshore
Work on multiple projects concurrently; take ownership of and pride in the work; attend project meetings, understand requirements, design solutions, and develop the code
Partner with offshore teammates and work with other support teams such as L2, testing, and upstream / downstream partners
Work with the Business Analysts to understand requirements, and work on high-level & detailed design to address real-time issues in production
Partner with the Information Architecture team to propose technical solutions to business problems
Identify gaps in technology and propose viable solutions
Take accountability for the technical deliveries from offshore
Work with the development teams and QA during the post code development phase
Identify improvement areas within the application and work with the respective teams to implement them
Ensure adherence to defined processes & quality standards, best practices, and high quality in all deliverables
Required Qualifications :
10+ years of experience developing ETL solutions using an ETL tool such as Informatica
10+ years of experience developing applications using Hadoop, Spark, Impala, Hive & Python
10+ years of experience running, using, and troubleshooting ETL and the Cloudera Hadoop ecosystem, i.e. HDFS, Hive, Impala, Spark, Kafka, Hue, Oozie, YARN, Sqoop, Flume
Experience with Autosys JIL scripting
Proficient scripting skills, i.e. Unix shell & Perl scripting
Knowledge of troubleshooting data-related issues
Experience processing large amounts of structured and unstructured data with MapReduce
Experience with SQL and relational databases for developing data extraction applications
Experience with data movement and transformation technologies
Skills :
Application Development
Automation
Solution Design
Technical Strategy Development
Architecture