
Similar Jobs

  • Confluent

    Engineering Manager - Kafka Coordinators

    Raleigh, NC, United States

With Confluent, organizations can harness the full power of continuously flowing data to innovate and win in the modern digital world. We have a purpose that drives us to do better every day – we're creating an entirely new category within data infrastructure - data streaming. This technology will allow every organization to create experiences and…

    Job Source: Confluent
  • Confluent

Senior Software Engineer - Kafka Storage

    Raleigh

With Confluent, organizations can harness the full power of continuously flowing data to innovate and win in the modern digital world. We have a purpose that drives us to do better every day – we're creating an entirely new category within data infrastructure - data streaming. This technology will allow every organization to create experiences and…

    Job Source: Confluent
  • Irvine Technology Corporation

    Confluent Kafka Engineer 100% Remote

Raleigh, NC

Confluent Kafka Engineer (Remote). Location: Remote. This job expects to pay about $65-70 per hour plus benefits. What Gets You The Job: 5+ years of experience programming in a backend language (Java/Python), with a good understanding of troubleshooting errors. 3-5 years of experience with Confluent Kafka…

    Job Source: Irvine Technology Corporation
  • Confluent

    Senior Software Engineer-Kafka Storage

    Raleigh, NC, United States

With Confluent, organizations can harness the full power of continuously flowing data to innovate and win in the modern digital world. We have a purpose that drives us to do better every day - we're creating an entirely new category within data infrastructure - data streaming. This technology will allow every organization to create experiences and…

    Job Source: Confluent
  • Canonical - Jobs

    Software Engineer - Data Infrastructure - Kafka

    Raleigh

Canonical is building a comprehensive automation suite to provide multi-cloud and on-premise data solutions for the enterprise. The data platform team is a collaborative team that develops managed solutions for a full range of data stores and data technologies, spanning from big data, through NoSQL, cache-layer…

    Job Source: Canonical - Jobs
  • Omni Inclusive

    Sr. Big Data Admin

    Raleigh, NC, United States

    • Ending Soon

Job Description: Consistent, comprehensive and integrated technical platforms provide a solid foundation for scalable, reliable business applications, data management and product development. The technical platform management team ensures the availability, maintenance and evolution of the FSG Cloudera platform technology stack. Core responsibilities…

    Job Source: Omni Inclusive
  • eTeam

    Dynatrace Admin

    Raleigh, NC, United States

    • Ending Soon

5+ years of Dynatrace monitoring experience and hands-on Dynatrace config, install & support, creating dashboards, with solid problem-diagnosing skills. Strong knowledge in configuring PurePath, collector and agent for Dynatrace. Must be able to troubleshoot and diagnose issues. Hands-on expertise with other monitoring tools like SolarWinds, PRTG…

    Job Source: eTeam
  • eTeam

    Oracle Admin

    Morrisville, NC, United States

    • Ending Soon

Pay range: $55-$60. Key Skills Required: Installation of Oracle software/database pre-requisites. Oracle database basic administration and latest features. Maintaining instance parameters and system settings. Migrating databases using tools such as SQL*Loader, Import/Export and Data Pump. Applying and testing security patches to the…

    Job Source: eTeam

Kafka Admin

Cary, NC, United States

Experience

Overall 5+ years of experience, of which 2+ years are in Confluent Platform administration

Mandatory Job Requirements

Manage single- and multi-node Kafka clusters deployed on VMs, Docker, and Kubernetes container platforms. Experience with Confluent Platform running on-prem

Perform Kafka cluster builds, including design, infrastructure planning, high availability, and disaster recovery

Implement wire encryption using SSL, authentication using SASL/LDAP, and authorization using Kafka ACLs across ZooKeeper, brokers/clients, Connect clusters/connectors, Schema Registry, the REST API, producers/consumers, and ksqlDB
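
For illustration only (not part of the posting): a minimal sketch of what ACL administration over a SASL_SSL listener can look like, using the confluent-kafka Python client. The broker address, mechanism, credentials, principal, and topic name are all hypothetical placeholders.

    from confluent_kafka.admin import (
        AdminClient, AclBinding, AclOperation, AclPermissionType,
        ResourceType, ResourcePatternType,
    )

    # Hypothetical SASL_SSL connection settings; mechanism and paths vary by site.
    admin = AdminClient({
        "bootstrap.servers": "broker1:9093",
        "security.protocol": "SASL_SSL",     # TLS wire encryption + SASL auth
        "sasl.mechanism": "PLAIN",           # could be SCRAM or GSSAPI instead
        "sasl.username": "kafka-admin",
        "sasl.password": "********",
        "ssl.ca.location": "/etc/kafka/ca.pem",
    })

    # Allow a consumer principal to READ one topic via a Kafka ACL.
    acl = AclBinding(
        ResourceType.TOPIC, "orders", ResourcePatternType.LITERAL,
        "User:app-consumer", "*",
        AclOperation.READ, AclPermissionType.ALLOW,
    )
    for binding, fut in admin.create_acls([acl]).items():
        fut.result()  # raises if the broker rejects the binding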

Perform high-level, day-to-day administration and support functions

Perform upgrades across the Kafka cluster landscape comprising Development, Test, Staging, and Production/DR systems

Create key performance metrics measuring the utilization, performance, and overall health of the cluster.

Perform capacity planning and implement new/upgraded hardware and software releases, as well as storage infrastructure.

Research and recommend innovative ways to maintain the environment and, where possible, automate key administration tasks.

Ability to work with various infrastructure, administration, and development teams across business units

Document and share design, build, upgrade, and standard operating procedures. Conduct knowledge-transfer sessions and workshops for other team members. Provide technical expertise and guidance to new and junior team members

Create topics; set up Apache Kafka MirrorMaker 2 and Confluent Replicator to replicate topics; create Connect clusters; and create schemas for topics using Confluent Schema Registry
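
As a rough sketch of the topic and schema part of that workflow (topic names, hosts, and the example Avro schema are hypothetical; MirrorMaker 2 and Replicator have their own configuration and are omitted), again with the confluent-kafka Python client:

    from confluent_kafka.admin import AdminClient, NewTopic
    from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

    admin = AdminClient({"bootstrap.servers": "broker1:9092"})

    # Create a replicated topic.
    for topic, fut in admin.create_topics(
        [NewTopic("orders", num_partitions=6, replication_factor=3)]
    ).items():
        fut.result()  # raises on failure

    # Register an Avro value schema for the topic in Confluent Schema Registry.
    sr = SchemaRegistryClient({"url": "http://schema-registry:8081"})
    schema = Schema(
        '{"type": "record", "name": "Order",'
        ' "fields": [{"name": "id", "type": "string"}]}',
        "AVRO",
    )
    sr.register_schema("orders-value", schema)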

Configure various open-source and licensed Kafka source/sink connectors such as Kafka Connect for SAP HANA, the Debezium Oracle and MySQL connectors, Confluent JDBC source/sink, the Confluent ADLS2 sink connector, and the Confluent Oracle CDC source connector…
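
Connector configuration is typically submitted to the Kafka Connect REST API. Below is a hedged sketch for the Confluent JDBC source connector; the worker URL, database coordinates, and table name are made up for illustration.

    import requests

    connector = {
        "name": "jdbc-orders-source",  # hypothetical connector name
        "config": {
            "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
            "connection.url": "jdbc:oracle:thin:@db-host:1521/ORCL",
            "connection.user": "kafka_connect",
            "connection.password": "********",
            "table.whitelist": "ORDERS",
            "mode": "incrementing",
            "incrementing.column.name": "ID",
            "topic.prefix": "oracle-",
            "tasks.max": "1",
        },
    }

    # POST the config to a Connect worker; 201 means the connector was created.
    resp = requests.post("http://connect:8083/connectors", json=connector, timeout=10)
    resp.raise_for_status()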

Develop and maintain Unix scripts to perform day-to-day Kafka administration and security-related functions using the Confluent REST Proxy server
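
Such scripts usually wrap REST Proxy calls like the following sketch, which produces one JSON record through the proxy's v2 API (the proxy host and topic are placeholders):

    import requests

    headers = {"Content-Type": "application/vnd.kafka.json.v2+json"}
    payload = {"records": [{"key": "order-42", "value": {"id": "42", "status": "NEW"}}]}

    # Produce through the REST Proxy rather than a native Kafka client.
    resp = requests.post(
        "http://rest-proxy:8082/topics/orders",
        json=payload, headers=headers, timeout=10,
    )
    resp.raise_for_status()
    print(resp.json()["offsets"])  # partition/offset for each accepted record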

Set up monitoring tools such as Prometheus and Grafana to scrape metrics from the various Kafka cluster components (broker, ZooKeeper, Connect, REST Proxy, MirrorMaker, Schema Registry…) and other endpoints such as web servers, databases, and logs, and configure alerts for the Kafka cluster and supporting infrastructure to measure availability and performance SLAs
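
Once a JMX exporter is being scraped by Prometheus, alert conditions can be checked against its HTTP query API. A hedged sketch follows; the server URL is a placeholder, and the metric name depends entirely on your exporter's mapping rules.

    import requests

    # Metric name assumes a common JMX-exporter mapping; adjust to your rules.
    query = "sum(kafka_server_replicamanager_underreplicatedpartitions)"

    resp = requests.get(
        "http://prometheus:9090/api/v1/query",
        params={"query": query}, timeout=10,
    )
    resp.raise_for_status()
    result = resp.json()["data"]["result"]
    urp = float(result[0]["value"][1]) if result else 0.0
    if urp > 0:
        print(f"ALERT: {urp:.0f} under-replicated partitions")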

Experience with Confluent ksqlDB to query and process Kafka streams

Knowledge of Kafka Producer and Consumer APIs, Kafka Streams processing, and Confluent ksqlDB
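
For reference, a minimal produce-then-consume round trip with those APIs via the confluent-kafka Python client (the broker, topic, and group id are hypothetical):

    from confluent_kafka import Producer, Consumer

    conf = {"bootstrap.servers": "broker1:9092"}

    # Produce one message and block until the broker acknowledges it.
    producer = Producer(conf)
    producer.produce("orders", key="order-42", value='{"status": "NEW"}')
    producer.flush()

    # Consume it back from the beginning of the topic.
    consumer = Consumer({**conf, "group.id": "demo", "auto.offset.reset": "earliest"})
    consumer.subscribe(["orders"])
    msg = consumer.poll(timeout=10.0)
    if msg is not None and msg.error() is None:
        print(msg.key(), msg.value())
    consumer.close()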

Availability to work shifts and extended hours and to provide on-call support as required; weekend work may be needed at times depending on project needs.

Must have excellent communication and interpersonal skills

Preferred but Optional Skills

Linux (SLES or RHEL) system administration (basic or advanced), creating shell scripts…

Working experience with Docker and Kubernetes clusters (open source, Rancher, Red Hat OCP, Client Tanzu) involving administration of containers (operator-level skills), deployments, updates, and integration with products running outside the cluster

Working knowledge of container registries such as Harbor, Quay, Nexus, etc. Exposure to container/artifact scanners such as Trivy, Clair…

Security-related configuration for the above-listed software or other tools: SSL for wire encryption, integration with AD for authentication, and RBAC for authorization

Implementation and support of enterprise products such as well-known ERP products, data warehouses, middleware, etc.

Database administration skills in Oracle, MSSQL, SAP HANA, DB2, Aerospike, Postgres…

Exposure to SaaS-based observability platforms like New Relic

Deployment of container images and pods via CI/CD pipelines using Jenkins or comparable tools.

Experience building Kafka deployment pipelines using Terraform, Ansible, CloudFormation templates, shell scripts, etc.

Experience working in a public cloud environment such as Azure, AWS, or GCP, preferably Azure
