Beacon Resources

Data Engineer

Beacon Resources, Chicago, Illinois, United States, 60290


Data Engineer II / III

Location: Chicago, IL

Salary: $100,000 - $150,000

As a Data Engineer II / III, you will play a key role in designing, building, and maintaining the company’s modern data platform. You’ll own complex data pipelines and integrations that support strategic decision‑making and business operations. As a mid‑level engineer, you'll collaborate closely with product, analytics, and engineering teams to improve data quality, performance, and accessibility. You’ll also contribute to architectural decisions, mentor junior engineers, and help raise the bar for data engineering across the organization.

This position is ideal for someone who has already built robust pipelines, thrives on solving data challenges at scale, and wants to deepen their impact in a growing, mission‑driven company.

This role reports to the Executive Director of Technical Strategy and Operations and is based in Chicago, with a hybrid work arrangement: a minimum of three days in the office each week, plus additional days as business needs arise.

Responsibilities

Design and implement scalable, maintainable ETL/ELT pipelines for a variety of use cases (analytics, operations, product enablement)

Build and optimize integrations with cloud services, databases, APIs, and third‑party platforms

Own production data workflows end‑to‑end, including testing, deployment, monitoring, and troubleshooting

Collaborate with cross‑functional stakeholders to understand business needs and translate them into technical data solutions

Lead technical discussions and participate in architecture reviews to shape our evolving data platform

Write clean, well‑documented, production‑grade code in Python and SQL

Improve data model design and data warehouse performance (partitioning, indexing, denormalization strategies)

Champion best practices around testing, observability, CI/CD, and data governance

Mentor junior team members and contribute to peer code reviews

Qualifications

3+ years of experience in a data engineering or software engineering role, with a strong track record of delivering robust data solutions

Proficiency in Python and advanced SQL for complex data transformations and performance tuning

Experience building and maintaining production pipelines using workflow orchestration tools such as Airflow, dbt, or similar

Strong understanding of cloud‑based data infrastructure (AWS, GCP, or Azure)

Knowledge of data modeling techniques and data warehouse design (star/snowflake schemas)

Experience working with structured and semi‑structured data from APIs, SaaS tools, and databases

Familiarity with version control (Git), CI/CD, and Agile development methodologies

Strong communication and collaboration skills

Preferred

Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or related technical field

Experience with modern data warehouses like Redshift, BigQuery, or Snowflake

Exposure to modern DevOps/DataOps practices (Terraform, Docker, dbt Cloud)

Experience integrating with Salesforce or other CRM / marketing platforms

Knowledge of data privacy and compliance considerations (FERPA, GDPR)

Benefits

Hybrid work arrangement

Paid parental leave

Medical, dental, and vision insurance

Flexible Spending Account (FSA)

Health Savings Account (HSA)

Employer‑paid short‑term disability insurance

Optional long‑term disability insurance

401(k) with immediately vested employer match

Generous PTO plan with accrual increasing by tenure

Tuition reimbursement program

Discounted onsite gym access

Optional pet insurance

Additional perks and benefits
