DCM INFOTECH LIMITED
Position: Lead Data Engineer
Job Type: Fulltime Opportunity
Location: Fort Worth, TX (Hybrid/2 days)
We are seeking a skilled and motivated Lead Data Engineer with expertise in FastAPI, Pub/Sub messaging systems, and Apache Airflow to build and maintain scalable, cloud-native applications on AWS. The ideal candidate has strong experience in modern Python development and hands-on experience with event-driven architectures and data workflow orchestration in AWS cloud environments.
Required Qualifications:
- Bachelor's degree in computer science, data science, or a related technical discipline.
- 10+ years of hands-on experience in data engineering, including developing ETL/ELT data pipelines, API integration (FastAPI preferred), data platforms/products, and/or data warehouses.
- 3+ years of hands-on experience developing data-intensive solutions on AWS for operational and analytics workloads.
- 3+ years of experience designing both ETL/ELT batch-processing and data streaming architectures for real-time or near-real-time data ingestion and processing.
- 3+ years of experience developing and orchestrating complex data workflows using Apache Airflow (mandatory), including DAG authoring, scheduling, and monitoring.
- 2+ years of experience building and managing event-driven microservices using Pub/Sub systems (e.g., AWS SNS/SQS, Kafka).
- 3+ years of hands-on experience with two or more database technologies (e.g., MySQL, PostgreSQL, MongoDB) and data warehouses (e.g., Redshift, BigQuery, or Snowflake), as well as cloud-based data engineering technologies.
- Proficiency with dashboard/BI and data visualization tools (e.g., Tableau, QuickSight).
- Experience developing conceptual, logical, and physical data models using ERDs.