Overview
The Boeing Company is seeking an Advanced Information Technologist - Cloud Data Engineer to join the Data Analytics Engineering & Integration team in Ridley Park, PA; Seattle, WA; Arlington, VA; San Diego, CA; Long Beach, CA; Mesa, AZ; or Hazelwood, MO. The Solutions team ensures datasets are discoverable, secure, performant, and reliably delivered to downstream consumers to enable analytics, operations, and program decisioning across Boeing Defense, Space and Security (BDS).
Responsibilities
Own both DBaaS/platform lifecycle and consumer-facing dataset delivery:
- Provision and operate DBaaS instances; implement dataset provisioning and access patterns; optimize query performance; enforce security/compliance (including GovCloud constraints); and collaborate with data architects to design, operationalize, and govern enterprise data ontologies and canonical models for consistent semantics and discoverability.
- Participate in all aspects of agile delivery, including design, implementation, testing, and deployment of solution-space features and operational tooling.
- Own provisioning, configuration, scaling, tuning, patching, backup/restore, and lifecycle management of DBaaS instances (e.g., RDS/Aurora or equivalent), including capacity planning, disaster recovery, and incident response.
- Implement and maintain dataset provisioning and delivery processes (cataloging, access controls, dataset packaging, freshness/latency SLAs) to support downstream consumers and integrations.
- Optimize consumer performance and cost-to-serve through data layout and query optimization (indexing, partitioning, materialized views, caching) and by advising solution owners on access/query patterns.
- Design, implement, and operationalize enterprise data ontologies and canonical models: create semantic models, map solution datasets to ontologies, enforce taxonomy/versioning, and partner with governance for discoverability and lineage.
- Design and enforce authorization models: implement RBAC for role-level permissions and ABAC for fine-grained, attribute-driven policies (dataset sensitivity, clearance, environment, ontology tags), integrated via a centralized policy engine.
- Build and maintain infrastructure-as-code (Terraform) and CI/CD for database and dataset lifecycle changes; automate entitlement provisioning and deprovisioning workflows; author runbooks and participate in on-call rotations.
- Enforce security, compliance, and GovCloud requirements: manage RBAC/ABAC controls, encryption at rest and in transit, auditing and immutable logging, data classification/tags, masking/anonymization where required, and periodic access attestation.
- Integrate policy and observability: centralize policy evaluation, instrument audit logs and alerts for policy decisions and anomalous access, and include policy tests in pipelines.
- Design, deploy, and maintain data integrations and operational patterns within enterprise data platforms (e.g., Palantir Foundry) where applicable, including dataset modeling, Foundry Ontology alignment, transforms, and operationalization.
- Provide stakeholder support: advise solution owners on SLAs, access patterns, and best practices; validate consumer requirements; perform dataset handoffs and document usage guides.
- Continuously review and recommend platform improvements for reliability, security, performance, and cost efficiency.
Qualifications
Basic Qualifications (Required Skills/Experience)
- 3+ years of experience in data engineering or cloud data platform operations, with hands-on experience managing cloud-managed databases/DBaaS (e.g., AWS RDS/Aurora) and delivering datasets to downstream consumers.
- Practical experience implementing Infrastructure-as-Code (Terraform) for database and dataset provisioning and lifecycle management.
- Proficiency in Python and SQL for automation, operational tooling, and query optimization.
- Experience with version control and CI/CD pipelines (e.g., GitLab CI/CD) and containerization (Docker, Kubernetes).
- Demonstrated experience with backups/DR, performance tuning, capacity planning, runbooks, and production incident response for DB-backed services.
- Experience translating consumer requirements into dataset provisioning, access specifications, and SLAs (freshness, latency, availability).
- Working knowledge of access control models and enforcement (RBAC) and practical exposure to attribute-based or policy-driven controls (ABAC or equivalent).
- Experience implementing security/compliance controls for data (encryption, auditing/logging, data classification, masking/anonymization) in cloud or hybrid environments.
- Technical bachelor's degree or equivalent experience.
Preferred Qualifications (Highly Preferred / Desired)
- Hands-on experience with Palantir Foundry (dataset design, Foundry Ontology, Transforms, Code Repositories) and operational dataset patterns.
- Experience designing, implementing, and operationalizing data ontologies, canonical models, semantic layers, or knowledge graphs to improve discoverability, lineage, and reuse.
- Experience implementing RBAC + ABAC end-to-end, including attribute/tag definitions, centralized policy evaluation (e.g., OPA, IAM condition keys, Foundry policies), entitlement workflows, and access attestation.
- Experience operating in AWS GovCloud or other regulated cloud environments and applying compliance controls in cloud deployments.
- Experience optimizing consumer performance (indexing, partitioning, materialized views, caching) and cost-to-serve tradeoffs for query-heavy consumption workloads.
- Experience implementing observability for access and policy decisions (audit logging, alerting on anomalous access, policy denial metrics) and integrating those signals into incident response.
- Experience in aviation, defense, or other regulated industries, and working in large matrixed organizations.
- Advanced degree or relevant certifications (e.g., AWS Specialty certifications, Certified Data Management Professional, Palantir Foundry training) a plus.
Additional Information
Relocation: Relocation assistance is not a negotiable benefit. Candidates must live in the immediate area or relocate at their own expense.
Drug-Free Workplace: Boeing is a Drug-Free Workplace with post-offer drug testing as outlined in company policy.
Shift: This position is for 1st shift.
Security Clearance: Requires the ability to obtain a U.S. Security Clearance, for which U.S. Citizenship is required. An interim and/or final U.S. Secret Clearance post-start is preferred.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
EEO: Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status, or other characteristics protected by law.