Blue Margin

Senior Data Engineer

Blue Margin, Fort Collins, Colorado, US, 80523


Why This Role Exists

We help mid-market companies turn their data into a strategic asset. Our clients rely on us to design and deliver reporting platforms that fuel better, faster decision-making. We're passionate about helping clients increase company value through better analysis and decision-making, and we're looking for a Senior Data Engineer to strengthen our team.

As a Senior Data Engineer, you will lead the design, optimization, and scalability of data platforms that power analytics for our clients. You will be hands-on with data pipelines, large-scale data processing, and modern cloud data stacks while mentoring team members and helping shape best practices.

The Role

This role requires strong expertise in Python (PySpark/Apache Spark), deep knowledge of working with high-volume data, and experience optimizing Delta Lake–based architectures. Exposure to Snowflake or Microsoft Fabric, and to tools like Fivetran, Azure Data Factory, and Synapse Pipelines, is highly valued. If you're motivated by solving complex data challenges, thrive in a collaborative environment, and enjoy applying AI to increase engineering productivity, this role offers the opportunity to have significant technical and strategic impact.

What You'll Do

- Architect, design, and optimize large-scale data pipelines using PySpark, SparkSQL, Delta Lake, and other cloud-native tools.
- Drive efficiency in incremental/delta data loading, partitioning, and performance tuning.
- Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments.
- Collaborate with stakeholders and analysts to translate business needs into scalable data solutions.
- Evaluate and incorporate AI/automation to improve development speed, testing, and data quality.
- Oversee and mentor junior data engineers, establishing coding standards and best practices.
- Ensure high standards for data quality, security, and governance.
- Participate in solution design for client engagements, balancing technical depth with practical outcomes.

What You Bring

- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of professional experience in data engineering, with an emphasis on Python and PySpark/Apache Spark.
- Proven ability to manage large datasets and optimize for speed, scalability, and reliability.
- Strong SQL skills and understanding of relational and distributed data systems.
- Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake.
- Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices.
- Familiarity with CI/CD, version control, and DevOps practices for data pipelines.
- Experience leveraging AI-assisted tools to accelerate engineering workflows.
- Strong communication skills; ability to convey complex technical details to both engineers and business stakeholders.
- Relevant certifications (Azure, Snowflake, or Fabric) are a plus.

Why Join Blue Margin

- We're a performance-driven team that values clarity, accountability, and doing things right.
- You'll work directly with business leaders and see the impact of your strategic guidance.
- We solve real problems for clients who respect our expertise and rely on us to deliver.
- You'll be surrounded by smart, motivated teammates who care about quality and outcomes.
- Competitive pay ($110K–$140K), strong benefits, and a flexible hybrid work setup, with in-office collaboration based in Fort Collins.

This role is for someone who takes ownership, builds lasting relationships, and performs at a high level.
If that's you, we'd love to meet you.