AI Data Engineer
Globe Telecom, Inc. - Snowflake, Arizona, United States, 85937
At Globe, our goal is to create a wonderful world for our people, business, and nation. By uniting people of passion who believe they can make a difference, we are confident that we can achieve this goal.
Job Description
We are seeking an AI Data Engineer with specialized expertise in Databricks and comprehensive experience implementing bronze, silver, and gold data pipelines. You will leverage your skills in Snowflake, Google BigQuery (GBQ), Python, and Google Cloud Platform (GCP) to build, optimize, and maintain robust data solutions supporting AI and analytics workloads. In this critical role, you'll collaborate closely with data scientists, AI specialists, cloud engineers, and business stakeholders to deliver scalable, secure, and performant data platforms.
DUTIES AND RESPONSIBILITIES:
Data Pipeline Development & Optimization
Design, build, and maintain data pipelines using Databricks, implementing bronze, silver, and gold data layers (a brief illustrative sketch follows this section).
Continuously optimize data ingestion, transformation, and loading processes to improve performance, reliability, and scalability.
Ensure high data quality standards through robust validation, auditing, and governance frameworks.
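For illustration only, the following is a minimal sketch of the bronze/silver/gold (medallion) pattern referenced above, written with PySpark and Delta Lake as typically used on Databricks. All paths, table names, and columns (for example, /mnt/landing/events/ and bronze.events) are hypothetical placeholders and are not part of this posting.

```python
# Illustrative sketch only: minimal bronze -> silver -> gold flow with PySpark and Delta Lake.
# Paths, table names, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Bronze: land raw events as-is, with ingestion metadata.
raw = (spark.read.json("/mnt/landing/events/")            # hypothetical landing path
            .withColumn("_ingested_at", F.current_timestamp()))
raw.write.format("delta").mode("append").saveAsTable("bronze.events")

# Silver: clean and conform (deduplicate, enforce types, drop bad records).
silver = (spark.table("bronze.events")
               .dropDuplicates(["event_id"])
               .withColumn("event_ts", F.to_timestamp("event_ts"))
               .filter(F.col("event_id").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")

# Gold: business-level aggregate ready for analytics and feature engineering.
gold = (spark.table("silver.events")
             .groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
             .agg(F.count("*").alias("daily_events")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_customer_events")
```

In practice each layer is usually a separate, scheduled job with its own validation checks rather than a single script; the sketch only shows the layering idea.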
Cloud Data Platforms Expertise
Manage and optimize data solutions on Snowflake and Google BigQuery, ensuring efficient querying and resource utilization.
Develop strategies to migrate, integrate, and synchronize data between various cloud data warehouses.
Implement best practices for cloud data management, ensuring cost-effective and secure operations.
Python & Automation
Leverage Python to automate data processes, streamline workflows, and develop efficient data transformations (see the sketch after this section).
Build and manage automation scripts and workflows for data extraction, cleaning, and loading into various platforms.
Collaborate with AI teams to develop data integration points and data access layers supporting machine learning workloads.
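As an illustration of the kind of Python automation described above, here is a minimal sketch that loads curated Parquet files from Cloud Storage into BigQuery using the google-cloud-bigquery client. The project, bucket, dataset, and table names are hypothetical placeholders.

```python
# Illustrative sketch only: load Parquet files from Cloud Storage into BigQuery and verify the result.
# Project, bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load curated Parquet files into a BigQuery table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/silver/events/*.parquet",   # hypothetical GCS path
    "example-project.analytics.events",              # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

# Basic post-load check as a simple data-quality gate.
table = client.get_table("example-project.analytics.events")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```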
GCP Infrastructure
Utilize GCP services (Cloud Storage, BigQuery, Dataflow, Pub/Sub, Composer) to architect and deploy scalable data systems (an orchestration sketch follows this section).
Integrate GCP infrastructure seamlessly with Databricks and other analytics environments.
Ensure robust cloud infrastructure monitoring, logging, and alerting mechanisms to proactively identify and mitigate data pipeline issues.
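For illustration, a minimal Cloud Composer (Apache Airflow) DAG sketch showing how bronze, silver, and gold steps might be sequenced. The DAG id and task callables are hypothetical; in practice each step would trigger a Databricks job or a BigQuery load rather than a print statement.

```python
# Illustrative sketch only: a minimal Airflow DAG that sequences bronze -> silver -> gold steps.
# The DAG id and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_layer(layer: str) -> None:
    # Placeholder: in practice this would trigger a Databricks job or a BigQuery load.
    print(f"running {layer} step")


with DAG(
    dag_id="medallion_pipeline",       # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    bronze = PythonOperator(task_id="bronze", python_callable=run_layer, op_args=["bronze"])
    silver = PythonOperator(task_id="silver", python_callable=run_layer, op_args=["silver"])
    gold = PythonOperator(task_id="gold", python_callable=run_layer, op_args=["gold"])

    bronze >> silver >> gold  # enforce layer ordering
```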
Data Governance & Security
Establish comprehensive data governance practices, ensuring compliance with regulatory standards (e.g., GDPR, HIPAA).
Implement robust data security practices including encryption, role-based access control, and auditing mechanisms.
Collaborate closely with security and compliance teams to maintain secure data operations.
Collaboration & Communication
Partner with data scientists, engineers, analysts, and business stakeholders to understand and fulfill data infrastructure requirements.
Clearly communicate complex data engineering concepts and solutions to technical and non-technical audiences.
Actively participate in agile methodologies, contributing to sprint planning, retrospectives, and continuous improvement initiatives.
REQUIREMENTS:
Education
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or related fields (or equivalent experience).
Experience
3-5+ years of experience as a data engineer, with specific expertise in Databricks and structured data pipeline architectures (bronze, silver, gold).
Demonstrable experience managing data warehouses and data lakes with Snowflake and GBQ.
Technical Skills
Proficient in Databricks platform management, data pipeline construction, and optimization.
Strong expertise in Snowflake and GBQ, including data modeling, query optimization, and performance tuning.
Advanced proficiency in Python, particularly for data manipulation, ETL processes, and automation.
Extensive experience with GCP data services including Cloud Storage, BigQuery, Pub/Sub, and Dataflow.
AI & Analytics Integration
Understanding of AI and analytics data requirements, including data preparation and feature engineering.
Experience building data solutions supporting machine learning model training, validation, and deployment.
Portfolio
Evidence of successful data engineering projects involving Databricks and cloud data warehouses.
Examples demonstrating complex pipeline management and data architecture contributions.
If you're passionate about developing advanced data solutions, leveraging cutting-edge technologies like Databricks, Snowflake, GBQ, and GCP, and collaborating closely with innovative AI teams, we encourage you to apply. Join us to shape the future of data-driven AI solutions in a dynamic and collaborative environment.
KPIs:
Timely completion of bronze, silver, and gold data pipeline implementations.
Data pipeline availability and reliability (measured by uptime percentage and successful job completions).
Efficiency and performance improvement in data ingestion and query execution times.
Reduction in data-related incidents, including quality and security breaches.
Improved compliance levels with data governance and security standards (GDPR, HIPAA).
Top 3-5 Deliverables:
Design, implement, and optimize structured bronze, silver, and gold data pipelines on Databricks.
Develop and manage scalable data warehouses leveraging Snowflake and Google BigQuery for analytics and AI workloads.
Create robust Python-based automation scripts to streamline data processing, transformations, and integrations.
Establish and maintain effective data governance frameworks, ensuring high data quality, security, and regulatory compliance.
Architect and deploy scalable GCP infrastructure to integrate seamlessly with Databricks and analytics platforms.
Equal Opportunity Employer
Globe’s hiring process promotes equal opportunity for applicants. Any form of discrimination is not tolerated throughout the entire employee lifecycle, including the hiring process, such as in posting vacancies, selecting, and interviewing applicants.
Globe’s Diversity, Equity and Inclusion Policy Commitment can be accessed here.
Make Your Passion Part of Your Profession
Attracting the best and brightest talents is pivotal to our success. If you are ready to share our purpose of Creating a Globe of Good, explore opportunities with us.
Globe Telecom, Inc. (Globe) is the leading telecommunications company in the Philippines and the purveyor of the Filipino digital lifestyle. We provide cellular, broadband, and mobile data services by focusing on enriching our content offerings amid customers' growing preference for multimedia platforms across multiple screens and devices. We want to enrich lives through communications by simplifying technology, so that we bring customers closer to what matters most. Globe was also recognized as one of the top companies to work for in Asia by the Asia Corporate Excellence & Sustainability (ACES) Awards and as Best Employer in the Telco category by the Stevie Awards in New York. Our principal shareholders are Ayala Corp. and Singtel, both industry leaders in their respective countries and in the region. We are also a member of Bridge Alliance, Asia Pacific's leading mobile alliance of 36 mobile carriers.
Purpose
In everything we do, we treat people right to create a Globe of Good.
Vision
We see a Philippines where families' dreams come true, businesses flourish, and the nation is admired.
Mission
We create wonderful experiences for people to have choices, overcome challenges, and discover new ways to enjoy life.