Senior Data Engineer
- $160,000 - $180,000
- Home, KS +1
- Remote
Come join a flexible, friendly, and collaborative environment with plenty of opportunities to take charge of your career!
A bit about us:
Our client provides specialty property and casualty insurance for small to middle-market businesses – and is on a mission to be the best-in-class while achieving steady, profitable growth. Their guiding principles include the core belief that their people are number one. They also strongly emphasize a customer-centric mentality and disciplined underwriting practices. Their work environment is flexible, friendly, and collaborative, with plenty of opportunities to take charge of your career.
Why join us?
Generous paid time off (PTO)
Educational assistance program, which covers up to $5,250 in educational costs per year
Comprehensive health insurance plan (with vision and dental)
No-cost health insurance plan available
Life insurance
401(k) retirement plan with up to 6% company match and immediate vesting
Healthcare and dependent care flexible spending accounts
Short-term and long-term disability
Company-sponsored social events
Various committees to get involved in, including our Diversity, Equity, and Inclusion Committee, Charitable Giving Committee, and Employee Wellness Committee
Job Details
Job Summary
Build and maintain managed data pipelines, each a series of stages through which data flows, for our data warehouse (DWH) and related data stores and data marts. These pipelines must be created, maintained, and optimized as workloads move from development to production for use cases that support long-term strategic goals and short-term tactical plans for creating, managing, and maintaining corporate data systems and software (a brief illustrative sketch follows this list).
Ensure data applications are created to the highest standards and meet all requirements by implementing and maintaining unit tests as well as automated regression, integration, and performance tests.
Help maintain and extend software development coding/data standards – including but not limited to naming standards, documentation standards, and design pattern recommendations.
Develop and maintain data documentation so that new and existing databases, flows, and stores can be added to an operations manual.
Work with traditional and agile software life cycle methodologies, including working with users and business analysts to define requirements and create design documents.
Conduct code reviews to ensure developers are following recommended coding practices and adhering to coding standards when releasing changes.
Adhere to and maintain source control and help develop branching and merging strategies to control code promotion and effectively segregate development, test, and production environments.
Create and maintain CI/CD pipelines with Azure DevOps.
Liaise across App Delivery and IT, including network administrators, systems analysts, testers, and software engineers, to help resolve problems with software products or company software systems.
Draw on experience to recommend, schedule, and perform data landscape software improvements and upgrades, supporting other team members in decision making.
Provide regular status updates and guidance to teammates.
Work independently as well as thrive in a collaborative environment.
Leverage analytical skills and critical thinking to understand the current environment and business needs, and propose innovative ideas to improve processes.
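To make the pipeline and testing responsibilities above concrete, the following is a minimal, hypothetical Python sketch of a staged pipeline with a unit test. The stage names, the policy schema, and the sample data are illustrative assumptions, not the client's actual system.

```python
import unittest

# Hypothetical staged pipeline: extract -> transform -> load.
# Stage names and the policy schema are illustrative assumptions only.

def extract(rows):
    """Extract stage: drop records missing required fields."""
    return [r for r in rows if r.get("policy_id") and r.get("premium") is not None]

def transform(rows):
    """Transform stage: normalize types and derive a reporting field."""
    out = []
    for r in rows:
        out.append({
            "policy_id": str(r["policy_id"]),
            "premium": round(float(r["premium"]), 2),
            "is_large_account": float(r["premium"]) >= 50_000,
        })
    return out

def load(rows, target):
    """Load stage: append transformed rows to a target store (a list here)."""
    target.extend(rows)
    return len(rows)

def run_pipeline(source_rows, target):
    """Run the stages in order, as data flows from source to warehouse."""
    return load(transform(extract(source_rows)), target)

class PipelineTests(unittest.TestCase):
    def test_end_to_end(self):
        source = [
            {"policy_id": 101, "premium": "75000"},
            {"policy_id": None, "premium": "10"},  # dropped by extract
        ]
        warehouse = []
        loaded = run_pipeline(source, warehouse)
        self.assertEqual(loaded, 1)
        self.assertTrue(warehouse[0]["is_large_account"])

if __name__ == "__main__":
    unittest.main()
```

In practice each stage would read from and write to real stores (for example, ADF or Databricks jobs feeding the DWH), with automated regression, integration, and performance tests layered on top of unit tests like this one.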
Requirements and Qualifications:
Relevant third-level qualification (bachelor's or master's degree) in a related discipline (Data Analytics, Data Science, Computer Science, Technology) or equivalent workplace experience.
Minimum 7 years of experience working on Data and Analytics projects.
Minimum 5 years of experience working on cloud platforms (Azure, AWS, Google Cloud Platform, etc.); cloud certifications (e.g., Azure Data Engineer Associate) preferred.
Minimum 3 years of experience working on Data projects in an Agile environment.
Insurance / Financial industry experience desirable.
Minimum 7 years with ETL tools in a highly available DWH setting.
Minimum 7 years of DWH development with star/snowflake schema methodologies.
Minimum 7 years working with relational databases, preferably Microsoft SQL Server (Oracle, DB2, Teradata, Netezza, etc. accepted), including proven database design and delivery.
Required Technical Skills
Expertise in SQL coding, including stored procedures, extended stored procedures, T-SQL, and PL/SQL.
Hands-on experience using Microsoft SQL Server, Visual Studio, and the Microsoft BI stack (SSIS, SSRS, SSAS).
Architect-level experience leveraging Azure storage: ADLS Gen2 (Blobs, Containers).
Architect-level experience leveraging Azure data ingestion tools: Azure Data Factory (ADF), Azure Event Hubs, Azure IoT Hub, Azure Event Grid.
Architect-level experience using Azure data prep/train tools: Azure Databricks (ADB), Azure Stream Analytics, Azure Synapse, Azure PowerShell.
Experience in complex data warehouse and data lakehouse design.
Developer experience using Azure databases: SQL databases, SQL Server, Azure Synapse Analytics, Azure SQL, dedicated SQL pools, SQL elastic pools, etc.
Experience using Azure DevOps: Repos, CI/CD Pipelines, Test Plans.
Strong experience with NoSQL implementations (MongoDB, Cassandra, Cosmos DB).
Well versed in programming languages such as Python and SQL.
Experience with any of the following message/file formats: Parquet, Avro, ORC, Protobuf (see the sketch after this list).
Experience with version control systems like Git.
Experience building BI reports and data visualizations using Microsoft Power BI or another high-scale reporting service (MicroStrategy and Tableau acceptable) a plus.
Strong ability to design, build, and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata, and workload management.
DWH performance optimization using intelligent sampling and caching a plus.
Experience with related technologies such as JSON, XSLT, XQuery, and XPath a plus.
Strong organizational skills and mindfulness.
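As a minimal illustration of the columnar file formats named above, here is a hypothetical Python sketch that writes and reads Parquet with pandas and pyarrow; the schema, values, and file name are illustrative assumptions.

```python
# Minimal sketch, assuming pandas and pyarrow are installed
# (pip install pandas pyarrow). Schema and file name are illustrative.
import pandas as pd

# A tiny frame standing in for warehouse staging data.
df = pd.DataFrame({
    "policy_id": ["P-101", "P-102"],
    "premium": [75000.00, 1200.50],
})

# Parquet is a columnar format well suited to analytics workloads.
df.to_parquet("policies.parquet", engine="pyarrow", index=False)

# Reading back only the columns you need avoids scanning the full file.
premiums = pd.read_parquet("policies.parquet", columns=["premium"])
print(premiums.describe())
```

Avro, ORC, and Protobuf follow the same write/read pattern with their respective libraries; Parquet is shown here only because it is a common choice for warehouse staging data.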
Jobot is an Equal Opportunity Employer. We provide an inclusive work environment that celebrates diversity and all qualified candidates receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
Sometimes Jobot is required to perform background checks with your authorization. Jobot will consider qualified candidates with criminal histories in a manner consistent with any applicable federal, state, or local law regarding criminal backgrounds, including but not limited to the Los Angeles Fair Chance Initiative for Hiring and the San Francisco Fair Chance Ordinance.