J Lee Engineering
Job Summary
We're looking for a talented Junior Analytics Engineer to help us turn raw data into actionable insights. You'll play a key role in building and maintaining ELT pipelines, modeling data, and defining metrics that drive business decisions. If you're passionate about data and eager to learn, we'd love to have you on board!
Key Responsibilities
- Build, test, and maintain scalable data models and transformations (e.g., dbt) for analytics and BI use cases.
- Develop and support ELT pipelines from various sources (product, marketing, finance) into the data warehouse.
- Define, document, and operationalize business metrics and semantic layers in collaboration with stakeholders.
- Create and maintain data quality tests; monitor and resolve data issues and pipeline incidents.
- Contribute to BI development (e.g., Looker, Tableau, Power BI): build datasets, explores, dashboards, and semantic definitions.
- Optimize query and model performance and manage warehouse cost and efficiency.
- Write clear documentation for datasets, models, metrics, and data lineage; contribute to the data catalog.
- Use version control (Git) and CI/CD to ensure reliable, reviewable analytics code.
- Partner with analytics, product, and engineering teams to translate requirements into data solutions.
- Support ad hoc inquiries by guiding teams to the right, certified data assets.
- Follow data governance, privacy, and security practices; help enforce naming and modeling standards.
- Participate in sprint ceremonies, code reviews, and continuous improvement of analytics engineering practices.

Skills
- Strong foundational knowledge of SQL and data modeling (dimensional/star schemas, slowly changing dimensions).
- Proficiency with modern ELT workflows and dbt (Core or Cloud).
- Familiarity with at least one cloud data warehouse, such as Databricks SQL, BigQuery, Redshift, or Snowflake.
- Proficiency with Git (GitHub or GitLab) and basic Python for data jobs and tooling.
- Experience with CI/CD and orchestration (Airflow, Dagster, dbt Cloud jobs, or comparable).
- Understanding of observability principles and data quality testing (dbt tests, Great Expectations).
- Proficiency with BI tools and metrics layers, such as Tableau, Power BI, Looker/LookML, and dbt metrics.
- Familiarity with APIs and event-collection tools (such as Segment and Snowplow).
- A learning mindset, analytical problem-solving skills, and attention to detail.
- The ability to manage work, set priorities, and deliver in a fast-paced environment.

Education and Experience
- Bachelor's degree in a quantitative field (e.g., Computer Science, Data Science, Engineering, Statistics) or equivalent practical experience.
- 2 years of experience in analytics engineering, data analytics, or data engineering; internships and projects welcome.
- Hands-on experience with SQL and a modern data stack project (e.g., dbt + Snowflake/BigQuery/Redshift).
- Experience building dashboards or datasets in a BI tool.
- Nice to have: Python for data workflows; LookML/dbt certifications; exposure to Airflow/Dagster; privacy and compliance basics (GDPR/CCPA).

Annual Salary
Expected base salary range: USD $70,000-$120,000, depending on location, skills, and experience. Candidates in higher cost-of-living markets may fall toward the upper end of the range. Exact compensation will be determined during the interview process.

Compensation & Benefits
- Performance bonus: 5-10% of base salary
- Equity/stock options: eligible
- Comprehensive health benefits: medical, dental, and vision coverage
- Retirement plan: company match (e.g., 401(k))
- Paid time off: generous vacation, holidays, and sick leave
- Paid parental leave and benefits
- Annual development budget for courses, certifications, and conferences
- Wellness: home office stipend, equipment, and mental health resources

If you're excited to build reliable analytics foundations and learn from a collaborative data team, we'd love to hear from you. Apply with your resume, links to projects or a portfolio, and a brief note on why this role interests you. We are unfortunately unable to consider candidates residing outside of US territories for this position.