Data Engineer
- $100,000 - $135,000
- Boulder, CO
- Remote
Our client, a US-based startup, is looking to hire a Data Engineer!
A bit about us:
We are seeking a highly skilled and experienced Data Engineer to join our dynamic team. As an integral member of our technology department, you will be responsible for designing, developing, and maintaining robust, scalable data infrastructure and pipelines that support our company's data-driven initiatives. This exciting role provides an opportunity to work with a variety of cutting-edge technologies and tools, including Scala, Kafka, Google Cloud Platform (GCP), AWS, and Python. The ideal candidate will have a strong background in data warehousing and big data solutions, with a keen interest in leveraging data to drive business decisions.
Why join us?
- Pre-IPO equity offering with a 4-year vesting period
- 401(k) match
- Remote work
- Ability to learn, take on new challenges, and grow your career in the data engineering world
- Unlimited PTO, with the company taking one week off in both summer and winter
Job Details
Responsibilities:
1. Design, build, and maintain efficient, reusable, and reliable data pipelines using Scala, Kafka, and Python (see the illustrative sketch after this list).
2. Work closely with data scientists and analysts to understand and implement data requirements.
3. Ensure the performance, security, and availability of databases.
4. Develop and maintain scalable data processing systems for our Big Data platform.
5. Build and maintain ETL scripts and data pipelines to support new data integrations.
6. Manage and optimize data retrieval, storage, and distribution across our data warehouse and Big Data platforms.
7. Collaborate with other team members to integrate systems and data quickly and effectively.
8. Troubleshoot data-related issues and work to identify and implement optimal solutions.
9. Continually evaluate the efficiency of existing systems and propose strategic improvements or new technologies.
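To give a concrete sense of the pipeline work described in responsibility 1, here is a minimal sketch of a Kafka consumer step in Scala. It is illustrative only: the broker address, consumer group, and topic name are placeholder assumptions, not details of our actual stack, and the downstream load into the warehouse is stubbed out.

import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer
import scala.jdk.CollectionConverters._

object EventPipelineSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")   // placeholder broker address
    props.put("group.id", "event-pipeline")            // hypothetical consumer group
    props.put("key.deserializer", classOf[StringDeserializer].getName)
    props.put("value.deserializer", classOf[StringDeserializer].getName)

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(List("events").asJava)          // hypothetical topic name

    // Poll in a loop and hand each record to a downstream sink (stubbed here).
    while (true) {
      val records = consumer.poll(Duration.ofMillis(500))
      records.asScala.foreach { record =>
        // In a real pipeline this step would transform the record and load it into the warehouse.
        println(s"offset=${record.offset()} value=${record.value()}")
      }
    }
  }
}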
Qualifications:
1. Bachelor's degree in Computer Science, Engineering, or a related field.
2. A minimum of 3 years of experience as a Data Engineer or in a similar role.
3. Proven experience with Scala, Kafka, Google Cloud Platform (GCP), AWS, and Python.
4. Strong knowledge of data warehousing, ETL processes, and big data platforms.
5. Experience with data pipeline and workflow management tools.
6. Strong analytical skills and ability to process complex, large-scale data.
7. Excellent problem-solving skills, attention to detail, and ability to work independently as well as in a team.
8. Strong communication skills, with the ability to present complex ideas in a clear and concise manner.
9. Familiarity with machine learning algorithms and data visualization tools is a plus.
Jobot is an Equal Opportunity Employer. We provide an inclusive work environment that celebrates diversity and all qualified candidates receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
Sometimes Jobot is required to perform background checks with your authorization. Jobot will consider qualified candidates with criminal histories in a manner consistent with any applicable federal, state, or local law regarding criminal backgrounds, including but not limited to the Los Angeles Fair Chance Initiative for Hiring and the San Francisco Fair Chance Ordinance.