02/16/2026

Senior Data Engineer

Job Description

Location(s)

Birmingham, Alabama; Chicago, Illinois; Dallas, Texas; Jacksonville, Florida

Details

Kemper is one of the nation’s leading specialized insurers. Our success is a direct reflection of the talented and diverse people who make a positive difference in the lives of our customers every day. We believe a high-performing culture, valuable opportunities for personal development and professional challenge, and a healthy work-life balance can be highly motivating and productive. Kemper’s products and services are making a real difference to our customers, who have unique and evolving needs. By joining our team, you are helping to provide an experience to our stakeholders that delivers on our promises.   

Position Summary: 

We are seeking a highly skilled Senior Data Engineer to guide the design, development, and delivery of our enterprise data platforms. This role requires deep technical expertise with Snowflake, Core AWS Services, cloud-based architecture, and modern data engineering frameworks. As a senior engineer, you will define architectural patterns, mentor engineering teams, and ensure the successful execution of strategic data initiatives that support analytics, reporting, and business intelligence efforts across the organization. 

Position Responsibilities: 

Technical Leadership & Architecture 

  • Provide technical leadership to data engineers, setting standards for solution design, coding practices, data governance, and quality. 

  • Define and evolve the enterprise data architecture leveraging Snowflake, Data Vault 2.0, and modern event-driven and ELT frameworks. 

  • Lead architecture reviews, establish engineering best practices, and guide platform modernization efforts. 

  • Mentor engineering team members, facilitate code reviews, and promote continuous learning and innovation. 

Solution Design & Data Engineering 

  • Architect and oversee the delivery of scalable data pipelines, ingestion frameworks, and transformation processes using Snowflake, Python, Spark, Informatica (PowerCenter/IDMC), AWS Glue, and cloud-native tooling. 

  • Design and maintain enterprise data models, including Data Vault 2.0 (Hubs, Links, Satellites, PIT, Bridge structures) and dimensional models. 

  • Develop and optimize Snowflake platform capabilities, including: 

      • Snowpipe, Streams, Tasks, File Formats, External Stages 

      • Dynamic Tables and ELT pipeline automation 

      • Warehouse sizing, performance tuning, and cost optimization 

  • Implement scalable batch, micro-batch, and real-time ingestion solutions following best practices. 

Stakeholder Engagement & Delivery Ownership 

  • Translate complex business requirements into technical design specifications and actionable development plans. 

  • Partner with product owners, business stakeholders, architects, and analytics teams to deliver high-impact data solutions. 

  • Manage technical execution across multiple initiatives, ensuring alignment with enterprise priorities, data strategies, and delivery timelines. 

  • Oversee documentation, deployment readiness, testing processes, and quality assurance for production releases. 

Operational Excellence & Production Support 

  • Ensure operational reliability, data accuracy, and performance of enterprise data warehouse and analytical environments. 

  • Lead root-cause analysis for production issues and drive implementation of long-term preventative solutions. 

  • Implement robust monitoring frameworks using tools such as Snowflake Resource Monitors, CloudWatch, Datadog, or equivalent observability platforms. 

  • Conduct performance optimization on Snowflake workloads, SQL queries, pipelines, and integrations. 

 

Position Qualifications: 

  • 8+ years of experience in data engineering, with 2+ years in a senior, lead, or architect capacity. 

  • Advanced hands-on expertise in Snowflake, including staying current with new and advanced platform features. 

  • Strong experience implementing Data Vault 2.0 models and automated ELT frameworks. 

  • Proficiency with ETL/ELT and data integration tools such as Informatica IDMC, PowerCenter, Pentaho, AWS Glue, Control-M, and Python-based pipelines. 

  • Deep understanding of: 

      • Data warehousing and analytics engineering principles 

      • Dimensional modeling and relational structures 

      • ELT patterns, CDC frameworks, and metadata-driven design 

      • Orchestration frameworks (Control-M, Airflow) 

  • Strong SQL expertise with experience in stored procedures, Snowflake Scripting, and complex query optimization. 

  • Hands-on experience designing and delivering large-scale AWS-based data platforms. 

  • Demonstrated leadership in guiding teams, influencing architecture decisions, and managing technical delivery. 

  • Excellent communication and collaboration skills with the ability to partner across technical and business teams. 

 

Preferred Qualifications: 

  • Experience with advanced Snowflake features: 

      • Snowpark (Python/Java/Scala) 

      • Snowflake Warehouses, Databases, Schemas, RBAC 

      • Snowpipe, Tasks, Streams, Stages, File Formats 

      • Performance tuning, warehouse optimization, clustering 

      • Data sharing, secure data sharing, reader accounts 

      • AI/ML functions 

  • Experience with modern DataOps, DevOps, and automation frameworks: 

      • CI/CD (GitHub Actions, GitLab CI, Azure DevOps) 

      • Infrastructure-as-Code (Terraform, CloudFormation) 

      • Automated testing frameworks for data pipelines 

  • Familiarity with data quality and governance tools such as Great Expectations, Monte Carlo, Datafold, Soda, Alation, or Collibra. 

  • Experience with streaming technologies such as Kafka, Kinesis, or MSK. 

  • Exposure to ML feature engineering pipelines or MLOps practices. 

  • P&C insurance industry experience, specifically auto insurance experience, is strongly preferred. 

  • Prior data warehousing experience with Guidewire PolicyCenter, ClaimCenter, or BillingCenter and Agency Licensing data is preferred. 

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent relevant experience/certifications. 

 

Why Join Us 

  • Opportunity to lead the technical direction of an enterprise-wide data platform. 

  • Work with modern cloud-native technologies and shape the future of the organization’s data strategy. 

  • Collaborative, growth-oriented environment with strong investment in data modernization. 

  • Sponsorship is not available for this opportunity. 

  • This position is hybrid and can be based out of a local Kemper office, including Chicago, IL; Birmingham, AL; Richardson, TX; or Jacksonville, FL.

  • The salary range for this position is $99,000 to $164,800. When determining candidate offers, we consider experience, skills, education, certifications, and geographic location, among other factors. This job is eligible for an annual discretionary bonus and Kemper benefits (Medical, Dental, Vision, PTO, 401(k), etc.). 

 

 

Kemper is proud to be an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, disability status or any other status protected by the laws or regulations in the locations where we operate. We are committed to supporting diversity and equality across our organization and we work diligently to maintain a workplace free from discrimination.   

Kemper does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Kemper and Kemper will not be obligated to pay a placement fee. 

Kemper will never request personal information, such as your social security number or banking information, via text or email. Additionally, Kemper does not use external messaging applications like WireApp or Skype to communicate with candidates. If you receive such a message, delete it. 

#LI-JO1 

#LI-Hybrid 
