Imperial PFS
Location: Kansas City, MO (4 days in-office, 1 day remote)
Experience Level: Mid-level (2-4 years)
Department: Data Analytics Team
About The Role
An Analytics Engineer builds and maintains the foundational data infrastructure that transforms raw business data into reliable, analysis-ready insights for decision-making. This position is critical to establishing scalable data models, pipelines, and governance practices that will support our growing analytical needs.
This role exists to bridge the gap between raw data ingestion and analytics needs, ensuring that our analysts and stakeholders have access to high-quality, well-documented, and performant data models. It contributes directly to the company's mission by enabling data-driven decision-making across all departments, improving operational efficiency, and supporting strategic initiatives through reliable analytics infrastructure.
Key Contributions To The Company Include
- Building standardized data models that reduce time-to-insight for business users
- Implementing data quality and governance frameworks that ensure information is accurate and compliant
- Creating reliable, well-documented data pipelines that enable consistent reporting and analytics across all business functions

What You'll Do
Data Modeling and Pipeline Development
- Design, build, and manage data models and ELT/ETL pipelines that transform raw data into structured formats within Snowflake using dbt (a brief sketch follows this list)
- Create conformed and analytics layers that standardize data for business consumption
- Develop dimensional models and data marts tailored to business requirements
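For illustration only, a minimal dbt model of the kind this role builds might look like the sketch below; the file, table, and column names are hypothetical and not part of this posting.

    -- models/marts/fct_policy_payments.sql (hypothetical names)
    -- Minimal dbt model: promotes a staging table into an analytics-layer
    -- fact table materialized in Snowflake.
    {{ config(materialized='table') }}

    select
        payment_id,
        policy_id,
        cast(payment_date as date) as payment_date,
        amount_usd
    from {{ ref('stg_payments') }}
    where payment_id is not null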
Data Quality and Governance

- Implement best practices for data quality, integrity, and performance monitoring
- Contribute to data governance frameworks, including maintaining data lineage and definitions
- Establish data quality checks and validation processes within dbt workflows (see the sketch below)
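For illustration, one way to express such a check in dbt is a singular test: a SQL file under tests/ that passes when it returns zero rows. The model and column names below are hypothetical.

    -- tests/assert_no_negative_payments.sql (hypothetical names)
    -- dbt singular test: surfaces payment rows with a negative amount;
    -- the test passes only if this query returns no rows.
    select
        payment_id,
        amount_usd
    from {{ ref('fct_policy_payments') }}
    where amount_usd < 0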
Performance Optimization

- Optimize data storage and retrieval processes within Snowflake to ensure scalable, reliable, and cost-effective data solutions
- Fine-tune SQL queries and data transformations for optimal performance (see the example after this list)
- Monitor and improve pipeline efficiency and resource utilization
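As one hypothetical example of Snowflake-side tuning, a large table that is frequently filtered on the same keys can be given a clustering key so Snowflake prunes micro-partitions during scans (the table and column names are invented):

    -- Co-locate rows by common filter columns to improve partition pruning.
    alter table analytics.fct_policy_payments
        cluster by (policy_id, payment_date);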
Technical Documentation

- Create and maintain clear, comprehensive documentation for data models, processes, and key metrics
- Document data lineage and maintain metadata for analytical datasets
- Establish documentation standards and best practices for the team
Software Engineering Practices

- Leverage version control and CI/CD pipelines for streamlined, reliable development processes
- Ensure code quality through testing, peer reviews, and automated deployment
Collaboration and Stakeholder Management

- Work closely with analysts and business subject matter experts to understand requirements
- Translate business needs into effective technical data solutions
- Participate in regular stakeholder meetings to refine and expand data capabilities

What You Bring
Required Qualifications
- 2-4 years of experience in analytics engineering, data engineering, or a similar role
- Advanced SQL proficiency (ANSI SQL) for querying, data transformation, and building data infrastructure
- Snowflake expertise, with hands-on experience in its architecture and best practices for data management and processing
- Strong data modeling skills, with a deep understanding of data modeling principles and experience designing efficient structures for analytics
- dbt proficiency for managing data transformations and building data models within Snowflake
- Passion for detail and quality: skilled at spotting data and process gaps and committed to driving continuous improvement
- Strong communication skills and comfort collaborating with non-technical stakeholders
- Intellectual curiosity and drive to understand business problems through data

Preferred Qualifications
- SnowPro certification or equivalent advanced Snowflake expertise
- Experience with modern cloud data architecture (data lakes, data lakehouses, cloud data platforms)
- Experience with ELT/ETL tools (Fivetran experience a plus)
- Knowledge of data visualization tools (Tableau, Looker, Power BI)
- Previous experience in insurance, finance, or other regulated industries
- Python programming experience for data parsing, transformation, and scripting within data pipelines

Imperial PFS is an equal opportunity employer.