Senior Data Engineer
- $130,000 - $150,000
- Kansas City, MO
A global organization is looking for a Senior Data Engineer to help drive its IT transformation.
A bit about us:
We are on the hunt for a dynamic, innovative, and experienced Senior Data Engineer to join our team. The successful candidate will be responsible for developing, maintaining, and testing architectures such as databases and large-scale processing systems. As a Senior Data Engineer, you will be instrumental in transforming data into a format that can be easily analyzed and will play a vital role in our company's data-driven decision-making process. You will be working with a team of dedicated professionals in a fast-paced, challenging, and rewarding environment.
Why join us?
You will join a global organization in the midst of a major IT transformation and play a central role in its data-driven decision-making.
Job Details
Responsibilities:
1. Design, construct, install, test, and maintain highly scalable data management systems.
2. Ensure systems meet business requirements and industry practices.
3. Build high-performance algorithms, prototypes, predictive models, and proof of concepts.
4. Research opportunities for data acquisition and new uses for existing data.
5. Develop data set processes for data modeling, mining, and production.
6. Integrate new data management technologies and software engineering tools into existing structures.
7. Create custom software components and analytics applications.
8. Employ a variety of languages and tools to marry systems together.
9. Recommend ways to improve data reliability, efficiency, and quality.
10. Collaborate with data architects, modelers, and IT team members on project goals.
Qualifications:
1. Bachelor's degree in Computer Science, Engineering, or a related field. A Master's degree is a plus.
2. Proven experience as a Data Engineer, Software Developer, or similar.
3. A minimum of five years in the field, with hands-on experience in Snowflake SQL, Databricks, ELT, database architecture, Azure Data Factory, data warehousing, NoSQL, big data, and ETL.
4. Strong analytical skills for working with unstructured datasets.
5. Proficiency in programming and scripting languages such as Python, Java, and Scala.
6. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
7. Experience with data pipeline and workflow management tools.
8. Experience with AWS cloud services.
9. Experience with big data tools such as Hadoop, Spark, and Kafka (a brief illustrative sketch of this kind of pipeline work follows this list).
10. Strong project management and organizational skills.
11. Excellent communication skills and the ability to work as part of a team.
12. Ability to translate complex technical terminology, concepts, and issues into terms understandable to both technical and non-technical management and staff.
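To give candidates a concrete sense of the day-to-day work, here is a minimal, illustrative ELT sketch in PySpark. Everything in it is hypothetical: the file paths, column names, and table layout are invented for illustration, and in practice the target would more likely be Snowflake, Databricks Delta, or another warehouse platform named above rather than local Parquet files.

```python
# Minimal illustrative ELT sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_elt_sketch").getOrCreate()

# Extract: read raw landing files (path and schema are illustrative).
raw = spark.read.option("header", "true").csv("/landing/orders/*.csv")

# Transform: light cleansing and typing before the modeled layer.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .dropDuplicates(["order_id"])
)

# Load: write a partitioned, columnar table for downstream analytics.
orders.write.mode("overwrite").partitionBy("order_date").parquet("/warehouse/orders")
```

In a production setting, the same pattern would typically run inside an orchestrated pipeline (for example, Azure Data Factory or another workflow management tool from the qualifications list), with data quality checks added at each step.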
Jobot is an Equal Opportunity Employer. We provide an inclusive work environment that celebrates diversity and all qualified candidates receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
Sometimes Jobot is required to perform background checks with your authorization. Jobot will consider qualified candidates with criminal histories in a manner consistent with any applicable federal, state, or local law regarding criminal backgrounds, including but not limited to the Los Angeles Fair Chance Initiative for Hiring and the San Francisco Fair Chance Ordinance.