Apolis
Google Cloud Platform Data Engineer
Location: Sunnyvale, CA (Remote)
Job Description:
10 years of experience with Python and SQL.
Key Responsibilities
Advanced Data Analysis
Perform deep-dive data analysis using SQL, Python, and R to uncover trends, patterns, and actionable insights.
Collaborate with stakeholders to translate business questions into analytical solutions.
ETL & Dashboard Development
Design, build, and maintain automated ETL pipelines on Google Cloud Platform using tools like Dataflow, BigQuery, and Cloud Composer (a minimal pipeline sketch follows this list).
Develop and deploy real-time dashboards using Looker, Data Studio, or other BI tools to support data‑driven decision‑making.
Workflow Optimization
Identify and implement opportunities to streamline data workflows across cross‑functional teams.
Drive process improvements and automation to enhance data accessibility and reliability.
High-Impact Delivery
Thrive in a fast‑paced, high‑impact environment, managing multiple priorities and delivering results under tight deadlines.
Communicate findings and recommendations clearly to both technical and non‑technical audiences.
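To illustrate the kind of ETL pipeline work described above, here is a minimal sketch of a Cloud Composer (Airflow 2.x) DAG that runs a daily BigQuery transformation. The DAG ID, dataset, table, and SQL are hypothetical placeholders, not details taken from this posting.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",    # hypothetical pipeline name
    schedule_interval="@daily",     # run once per day (Airflow 2.x syntax)
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Rebuild a daily rollup table in BigQuery; the dataset, table,
    # and query below are illustrative only.
    rollup_sales = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.daily_sales AS "
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM raw.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )

In Cloud Composer, placing a file like this in the environment's DAGs bucket is typically enough for the scheduler to pick it up and run it on the defined cadence.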
Required Skills & Qualifications
Proficiency in SQL for complex data querying and transformation.
Intermediate experience with Python and R for data manipulation and statistical analysis.
Hands‑on experience with Google Cloud Platform services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Functions.
Experience building automated ETL pipelines and working with streaming data (see the streaming sketch after this list).
Familiarity with dashboarding tools like Looker, Tableau, or Google Data Studio.
Strong problem‑solving skills and ability to work independently and collaboratively.
Excellent communication and stakeholder management skills.
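As a small illustration of the streaming-data requirement, the sketch below consumes messages from a Pub/Sub subscription using the google-cloud-pubsub Python client; the project and subscription IDs are hypothetical placeholders.

from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "orders-sub")

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    # Parse or transform the payload here before loading it downstream
    # (for example into BigQuery), then acknowledge so it is not redelivered.
    print(message.data.decode("utf-8"))
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=handle_message)
try:
    streaming_pull_future.result(timeout=30)  # listen for 30 seconds, then shut down
except TimeoutError:
    streaming_pull_future.cancel()
    streaming_pull_future.result()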
Preferred Qualifications
Google Cloud Platform certification (e.g., Professional Data Engineer).
Experience with CI/CD pipelines and infrastructure as code (e.g., Terraform).
Background in data science or machine learning is a plus.