DataSync

Senior Technical Reporting Manager, ETL Reporting Engineer - AWS Data Platform/Tableau

DataSync, Henderson, Nevada, US 89077



Design and maintain enterprise-scale data pipelines using AWS cloud services, handling schema evolution in data feeds and delivering analytics-ready datasets to BI platforms. This role requires hands-on expertise with the full AWS data stack and a proven ability to build enterprise-grade data solutions that scale.

Responsibilities:
- Build and orchestrate ETL/ELT workflows using Apache Airflow for complex data pipeline management
- Develop serverless data processing with AWS Lambda and EventBridge for real-time transformations
- Create scalable ETL jobs using AWS Glue with automated schema discovery and catalog management
- Execute database migrations and continuous replication using AWS DMS
- Design and optimize Amazon Redshift data warehouses and Amazon Athena federated queries
- Implement streaming data pipelines with Apache Kafka for real-time ingestion
- Manage schema changes in data feeds with automated detection and pipeline adaptation
- Create data feeds for Tableau and BusinessObjects reporting platforms

Requirements:
- Airflow: DAG development, custom operators, workflow orchestration, production deployment
- Lambda: serverless functions, event triggers, performance optimization
- EventBridge: event-driven architecture, rule configuration, cross-service integration
- Glue: ETL job development, crawlers, Data Catalog, schema management
- DMS: database migrations, continuous replication, heterogeneous database integration
- Redshift: cluster management, query optimization, workload management
- Athena: serverless analytics, partitioning strategies, federated queries
- Tableau (expert level): develop and maintain data sources, data cubes, queries, data visualizations, and reports

Education and Experience:
- 5+ years of AWS data platform development
- 3+ years of production Airflow experience with complex workflow orchestration
- Proven experience managing high-volume data feeds (TB+ daily) with schema evolution
- Database migration expertise using DMS for enterprise-scale projects
- BI integration experience with Tableau and BusinessObjects platforms

Key Competencies:
- Design fault-tolerant data pipelines with automated error handling and recovery
- Handle schema changes in real-time and batch data feeds without pipeline disruption
- Optimize performance across streaming and batch processing architectures
- Implement data quality validation and monitoring frameworks
- Coordinate cross-platform data synchronization and lineage tracking

Preferred Qualifications:
- AWS Data Analytics Specialty or Solutions Architect Professional certification
- Experience with Infrastructure as Code (Terraform, CloudFormation)
- Knowledge of DataOps practices and CI/CD for data pipelines
- Containerization experience (Docker, ECS, EKS) for data workloads

We are an equal opportunities employer and welcome applications from all qualified candidates.
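To give a flavor of the serverless transformation work described above, here is a minimal sketch of an EventBridge-triggered Lambda handler. The payload field names (`order_id`, `amount_cents`) are hypothetical examples, not part of this role's actual feeds; only the `(event, context)` handler signature and the EventBridge `source`/`detail` envelope follow the standard AWS contract.

```python
# Minimal sketch of an AWS Lambda handler invoked by an EventBridge rule.
# It flattens the event's "detail" payload and normalises a money field
# before the record would be forwarded downstream (e.g. to Kinesis or S3).
# Payload field names are hypothetical.

def handler(event, context=None):
    """Transform one EventBridge event into an analytics-ready record."""
    detail = event.get("detail", {})
    return {
        "source": event.get("source"),
        "order_id": detail.get("order_id"),
        # Convert integer cents to dollars for downstream BI tools.
        "amount_usd": detail.get("amount_cents", 0) / 100,
    }

# Example EventBridge event, truncated to the fields used above.
event = {
    "source": "orders.feed",
    "detail": {"order_id": "A-42", "amount_cents": 1999},
}
print(handler(event))  # {'source': 'orders.feed', 'order_id': 'A-42', 'amount_usd': 19.99}
```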
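The schema-evolution competency above can likewise be sketched in a few lines. This is a pure-Python illustration with hypothetical names; in practice the registered schema would come from the AWS Glue Data Catalog rather than a local dict.

```python
# Minimal sketch of automated schema-drift detection for an incoming feed.
# The pipeline compares each record's fields against the registered schema
# and reports added/missing columns so it can adapt instead of failing.
# All names are hypothetical illustrations.

def detect_schema_drift(registered: dict, record: dict) -> dict:
    """Return the field names added to and missing from an incoming record,
    relative to the registered schema (field name -> type)."""
    incoming, known = set(record), set(registered)
    return {
        "added": sorted(incoming - known),
        "missing": sorted(known - incoming),
    }

# Example: the upstream feed starts sending 'region' and stops sending 'zip'.
registered_schema = {"id": "bigint", "amount": "double", "zip": "string"}
new_record = {"id": 1, "amount": 9.99, "region": "NV"}

print(detect_schema_drift(registered_schema, new_record))
# {'added': ['region'], 'missing': ['zip']}
```

A real pipeline would act on the drift report, for example by evolving the target table for added columns and null-filling missing ones, rather than merely printing it.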
