
A bit about us:

We are a top technology company in the logistics industry, committed to building an incredible company. Our people are specialists, experienced in their respective fields and dedicated to continuous improvement, both professionally and personally.

The challenges in this industry are big and exciting! We are tackling everything from fast, efficient data input, to ingesting large amounts of data and applying AI, to exploring blockchain as a way to securely digitize paperwork. If you are passionate about humanizing an industry, automating in innovative ways, building for quality and scale, making people's lives easier, and touching every part of our economy, then this is the place for you.

Why join us?


  • Medical, dental, and vision insurance covering 90% of premium costs
  • Legal, AD&D, additional life, and other employee assistance benefits
  • 401k savings plan with a 4% match
  • Professional growth and development opportunities
  • A flexible paid time off program that lets you manage your own life and schedule

Job Details

Responsibilities:
  • Develop and implement solutions using Kafka.
  • Administer and improve the use of Kafka across the organization, including Kafka Connect, ksqlDB, Kafka Streams, and custom implementations.
  • Work with multiple teams to ensure best use of Kafka and data-safe event streaming.
  • Understand and apply event-driven architecture patterns and Kafka best practices. Enable development teams to do the same.
  • Assist developers in choosing correct patterns, event modeling, and ensuring data integrity.
  • Learn continuously to serve as a Confluent/Kafka subject matter expert.
  • Work with Kafka and Confluent APIs (e.g., metadata, metrics, admin) to provide proactive insights and automation; a minimal Admin API sketch follows this list.
  • Work with SREs to ensure Kafka-related metrics are exported to New Relic.
  • Perform regular reviews of performance data to ensure efficiency and resiliency.
  • Contribute regularly to event-driven patterns, best practices, and guidance.
  • Review feature release and change logs for Kafka, Confluent, and other related components to ensure best use of these systems across the organization.
  • Work with lead to ensure all teams are aware of technology changes and impact.
  • Develop an expert-level understanding of data migration and change data capture (CDC) as they relate to Kafka, using Kafka Connect and Debezium; a sample connector registration also follows this list.
  • Acquire a deep understanding of source and sink connector technical details for a variety of platforms, including PostgreSQL, MS SQL Server, Snowflake, and others as required.
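
For a concrete picture of the Admin API work above, here is a minimal sketch in Java, assuming the standard kafka-clients library and a broker reachable at localhost:9092 (a placeholder address): it describes the cluster and lists topics, the raw material for proactive health checks.

import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.DescribeClusterResult;

public class ClusterHealthCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder bootstrap address; substitute your cluster's brokers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Broker count and controller identity are basic health signals.
            DescribeClusterResult cluster = admin.describeCluster();
            System.out.printf("Cluster %s: %d broker(s), controller %s%n",
                    cluster.clusterId().get(),
                    cluster.nodes().get().size(),
                    cluster.controller().get());

            // Listing topics lets automation flag unexpected or missing topics.
            Set<String> topics = admin.listTopics().names().get();
            topics.forEach(t -> System.out.println("topic: " + t));
        }
    }
}

The same client also exposes configuration and metrics-related APIs, so a sketch like this naturally grows into the automated insight tooling the role calls for.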
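To make the CDC responsibility concrete, the following hedged sketch registers a Debezium PostgreSQL source connector through the Kafka Connect REST API using Java's built-in HTTP client. The Connect worker URL, database coordinates, credentials, and table name are all hypothetical, and the topic.prefix key assumes Debezium 2.x (earlier releases used database.server.name instead).

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterDebeziumConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical Connect worker endpoint.
        String connectUrl = "http://localhost:8083/connectors";

        // Debezium PostgreSQL source connector: streams row-level changes (CDC)
        // from the hypothetical "inventory" database into Kafka topics
        // prefixed with "shipments".
        String body = """
            {
              "name": "shipments-postgres-source",
              "config": {
                "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
                "plugin.name": "pgoutput",
                "database.hostname": "postgres.internal",
                "database.port": "5432",
                "database.user": "cdc_user",
                "database.password": "change-me",
                "database.dbname": "inventory",
                "topic.prefix": "shipments",
                "table.include.list": "public.orders"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder(URI.create(connectUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}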

Skills and Experience:
  • Be able to describe the primary components of Kafka and their functions (brokers, ZooKeeper, topics); a minimal producer sketch follows this list.
  • At least two years of experience supporting applications in a production environment.
  • Proficiency in at least one programming language and one scripting language.
  • Proficiency with Docker containers.
  • Ability to participate in and contribute to code management in GitHub, including actively collaborating in peer reviews, working in feature branches, and resolving merge conflicts.
  • Microservices experience is a plus.
  • Distributed tracing experience a plus.
  • An understanding of any cloud provider's infrastructure and components (Azure preferred) is a plus but not required.
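
As a baseline for the first skills bullet, the minimal producer below (broker address, topic name, and record contents are placeholders) touches each core component: the client bootstraps against a broker, and the broker appends the record to a named topic; ZooKeeper, or KRaft in newer Kafka releases, manages cluster metadata behind the scenes.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HelloKafka {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Brokers store and serve records; clients only need a bootstrap address.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // A topic is a named, partitioned log; the broker appends this record to it.
            producer.send(new ProducerRecord<>("shipment-events", "order-42", "created"));
            producer.flush();
        }
    }
}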

Jobot is an Equal Opportunity Employer. We provide an inclusive work environment that celebrates diversity, and all qualified candidates receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

Sometimes Jobot is required to perform background checks with your authorization. Jobot will consider qualified candidates with criminal histories in a manner consistent with any applicable federal, state, or local law regarding criminal backgrounds, including but not limited to the Los Angeles Fair Chance Initiative for Hiring and the San Francisco Fair Chance Ordinance.