CereCore
Position Summary
We are seeking a highly skilled Full-Stack Business Intelligence Developer to join our data-driven team. This role is ideal for someone who thrives on transforming complex datasets into actionable insights and enjoys working across the full BI stack, from data ingestion to visualization.
Employment type
Contract
Contract length
7 months
Responsibilities
- Build and optimize ETL pipelines using Python, Apache Airflow, and Cloud Composer.
- Model and transform data in GCP BigQuery and other cloud-based data warehouses.
- Collaborate with stakeholders to gather requirements and translate business needs into technical solutions.
- Ensure data integrity, performance, and scalability across very large datasets and databases.
- Implement best practices for data governance, security, and compliance.
- Design, develop, and maintain scalable BI solutions using Power BI, SSRS, and custom dashboards.
Qualifications
- Proficiency in Python, SQL, and GCP BigQuery.
- Hands-on experience with Apache Airflow and Cloud Composer for workflow orchestration.
- Strong understanding of data modeling, star/snowflake schemas, and dimensional modeling.
- Experience working with very large datasets (terabytes or more) and optimizing performance.
Nice to have
- Experience designing, developing, and maintaining scalable BI solutions using Power BI, SSRS, and custom dashboards.
- Excellent problem-solving and communication skills.
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience with CI/CD pipelines for BI deployments.
- Exposure to data governance frameworks and role-based access control.
- Certification in Power BI, GCP, or related technologies.
Seniority level
Entry level
Job function
Information Technology
Industries
IT Services and IT Consulting