Senior Kafka Administrator with Ansible at Centurion Consulting Group LLC
Summary :
Senior Kafka Administrator with expertise in designing, implementing, and managing Kafka platforms using Ansible automation and Linux administration. Responsible for architecting data streaming and event-driven platforms and for their reliability, automation, monitoring, and incident triage. Experienced in DevOps practices, scripting, containerization, and infrastructure-as-code in support of mission-critical federal government systems.
Job Description
Centurion is looking for a Senior Kafka Administrator with Ansible for our federal government client. This is a long-term position that is 100% onsite in Woodlawn, MD.
Key Required Skills :
Kafka Architecture, Ansible Automation, RHEL / Linux Administration, Scripting (Bash, Shell, Python), Availability Monitoring / Triage (Splunk, Dynatrace, Prometheus).
Position Description :
- Architect, design, develop, and implement a next-generation data streaming and event-based architecture / platform using software engineering best practices and the latest technologies :
o Data Streaming, Event Driven Architecture, Event Processing Frameworks
o DevOps (Jenkins, Red Hat OpenShift, Docker, SonarQube)
o Infrastructure-as-Code and Configuration-as-Code (Ansible, Terraform / CloudFormation, Scripting)
- Administer Kafka on Linux, including automating, installing, migrating, upgrading, deploying, troubleshooting, and configuring the platform.
- Provide expertise in one or more of these areas : Kafka administration, event-driven architecture, automation, application integration, monitoring and alerting, security, business process management / business rules processing, CI / CD pipelines and containerization, or data ingestion / data modeling.
- Investigate, repair, and actively ensure business continuity regardless of the impacted component : Kafka platform, business logic, middleware, networking, CI / CD pipeline, or database (PL / SQL and data modeling).
- Brief management, customers, the team, or vendors, in writing or verbally, at the appropriate technical level for the audience.
- All other duties as assigned or directed.
Skills Requirements :
- Bachelor's Degree in Computer Science, Mathematics, Engineering, or a related field. A Master's or Doctorate degree may substitute for required experience.
- 8+ years of combined experience with Site Reliability Engineering, providing DevOps support, and / or RHEL administration for mission-critical platforms, ideally Kafka.
- 4+ years of combined experience with Kafka (Confluent Kafka, Apache Kafka, Amazon MSK).
- 4+ years of experience with Ansible automation.
- Must be able to obtain and maintain a Public Trust. Contract requirement.
- Selected candidate must be willing to work on-site in Woodlawn, MD five days a week.
- Strong experience with Ansible automation, including authoring playbooks and roles for installing, maintaining, or upgrading platforms.
- Solid experience using version control software such as Git / Bitbucket, including peer reviewing Ansible playbooks.
- Hands-on experience administering a Kafka platform (Confluent Kafka, Apache Kafka, Amazon MSK) via Ansible playbooks or other automation.
- Understanding of Kafka architecture, including partition strategy, replication, transactions, tiered storage, and disaster recovery strategies.
- Strong experience automating tasks with scripting languages such as Bash, Shell, or Python (see the triage sketch after this list).
- Solid foundation in Red Hat Enterprise Linux (RHEL) administration.
- Basic networking skills.
- Solid experience triaging and monitoring complex issues, outages, and incidents.
- Experience integrating / maintaining various third-party tools such as ZooKeeper, Flink, Pinot, Prometheus, and Grafana.
- Experience with Platform-as-a-Service (PaaS) using Red Hat OpenShift / Kubernetes and Docker containers.
- Experience working on Agile projects and understanding Agile terminology.
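As an illustration of the scripted triage and monitoring skills listed above, the minimal sketch below pulls cluster metadata and flags under-replicated or leaderless partitions. It assumes the confluent-kafka Python client; the broker address is a hypothetical placeholder, not a value from this posting.

```python
# Minimal triage sketch, assuming the confluent-kafka Python client is installed
# (pip install confluent-kafka). The bootstrap address below is a placeholder.
from confluent_kafka.admin import AdminClient

BOOTSTRAP = "broker1.example.internal:9092"  # hypothetical broker address

admin = AdminClient({"bootstrap.servers": BOOTSTRAP})

# list_topics() returns cluster metadata: brokers, topics, and per-partition state.
metadata = admin.list_topics(timeout=10)
print(f"Brokers visible: {len(metadata.brokers)}")

for topic_name, topic in metadata.topics.items():
    for pid, part in topic.partitions.items():
        # Under-replicated: fewer in-sync replicas than assigned replicas.
        if len(part.isrs) < len(part.replicas):
            print(f"UNDER-REPLICATED {topic_name}[{pid}]: "
                  f"isr={part.isrs} replicas={part.replicas}")
        # A leader of -1 means the partition currently has no elected leader.
        if part.leader == -1:
            print(f"OFFLINE {topic_name}[{pid}]: no leader elected")
```

In practice, a check like this would typically run from an Ansible play or a scheduled job, with its output forwarded to the monitoring stack named above (Prometheus, Splunk, or Dynatrace).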
Desired :
- Confluent Certified Administrator for Apache Kafka (CCAAK) or Confluent Certified Developer for Apache Kafka (CCDAK) preferred.
- Practical experience with event-driven applications and at least one event processing framework, such as Kafka Streams, Apache Flink, or ksqlDB.
- Understanding of Domain-Driven Design (DDD) and experience applying DDD patterns in software development.
- Experience working with Kafka connectors and / or supporting operation of the Kafka Connect API.
- Experience with Avro / JSON data serialization and schema governance with Confluent Schema Registry (see the sketch after this list).
- Preferred experience with AWS cloud technologies or other cloud providers; AWS cloud certifications.
- Experience with Infrastructure-as-Code (CloudFormation / Terraform, scripting).
- Solid knowledge of relational databases (PostgreSQL, DB2, or Oracle), NoSQL databases (MongoDB, Cassandra, DynamoDB), SQL, and / or ORM technologies (JPA2, Hibernate, or Spring JPA).
- Knowledge of the Social Security Administration (SSA).
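To illustrate the Avro serialization and schema governance item above, here is a minimal sketch using the confluent-kafka Python client with Confluent Schema Registry. The registry URL, broker address, topic, and schema are hypothetical placeholders.

```python
# Minimal sketch of Avro serialization with Confluent Schema Registry, assuming
# the confluent-kafka Python client. Registry URL, broker, topic, and schema are
# hypothetical placeholders, not values from this posting.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

schema_str = """
{
  "type": "record",
  "name": "ExampleEvent",
  "fields": [
    {"name": "event_id", "type": "string"},
    {"name": "status", "type": "string"}
  ]
}
"""

registry = SchemaRegistryClient({"url": "http://schema-registry.example.internal:8081"})
serializer = AvroSerializer(registry, schema_str)
producer = Producer({"bootstrap.servers": "broker1.example.internal:9092"})

event = {"event_id": "12345", "status": "RECEIVED"}
topic = "example-events"

# The serializer registers the schema under the subject "example-events-value"
# and encodes the record in the Confluent wire format before producing.
producer.produce(
    topic=topic,
    value=serializer(event, SerializationContext(topic, MessageField.VALUE)),
)
producer.flush()
```

Schema governance comes from the registry side: once a compatibility policy is set on the subject, incompatible schema changes are rejected at registration time.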
Education :
- Bachelor's Degree with 7+ years of experience.
- Must be able to obtain and maintain a Public Trust. Contract requirement.
Keywords :
Kafka Administration, Ansible Automation, RHEL Linux, Data Streaming, Event-Driven Architecture, DevOps, Infrastructure as Code, Scripting Bash Python, Monitoring and Alerting, Containerization Docker Kubernetes