USA Jobs
Sr. Data Platform Engineer
Salesforce Inc. seeks Sr. Data Platform Engineer in Dallas, TX.

Job Duties: Architect and build data engineering infrastructure on AWS, utilizing services such as Airflow, EMR, ECS, S3, and Glue. Implement and manage data orchestration workflows using Apache Airflow to schedule, monitor, and optimize data processing tasks. Enhance dbt (data build tool) functionality to integrate with the cloud data platform through custom utilities, improving CI/CD processes. Advocate for metadata-driven design principles to improve data lineage, documentation, and governance. Monitor and optimize data pipelines for efficiency, scalability, and cost-effectiveness, employing abstract class methodology to ensure platform performance. Adhere to security and compliance standards, including data encryption, access controls, and auditing. Collaborate with data engineers, scientists, analysts, and other stakeholders to understand data requirements and deliver tailored solutions. Maintain comprehensive documentation of data engineering processes, pipelines, and architecture.

Minimum Requirements: Master's degree (or its foreign degree equivalent) in Systems Engineering, Computer Science, Engineering (any field), or a related quantitative discipline, and two (2) years of experience in the job offered or in any occupation in a related field. A related technical degree is required (Systems Engineering, Computer Science, or Engineering (any field)).

Special Skill Requirements: Python; Apache Airflow and MWAA; AWS services (VPC, EC2, Lambda, S3, or Glue); Terraform; PySpark; React; C++; SQL; Metadata-driven design; Object-Oriented Programming (OOP); CI/CD (GitHub Actions or Jenkins); Data governance (RBAC); Data modeling (dbt or Snowflake); Query optimization. Any suitable combination of education, training, and/or experience is acceptable.