CoreAi Consulting

Overview

CoreAi Consulting is seeking a highly skilled Big Data Developer to join our growing team. This role involves designing, developing, and optimizing large-scale data pipelines, ETL processes, and big data applications that enable actionable insights and support mission-critical business decisions. The ideal candidate is a hands-on problem solver with a deep understanding of distributed data systems, cloud platforms, and modern software engineering practices.

Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL workflows across large data sets.
- Implement and optimize big data solutions using technologies such as Hadoop, Spark, Hive, Kafka, and HBase.
- Collaborate with cross-functional teams to integrate data from multiple sources and ensure reliability and performance.
- Build and maintain data warehouses and support advanced analytics and reporting requirements.
- Leverage cloud platforms (AWS, Azure, or GCP) to design and deploy big data solutions in production environments.
- Apply best practices in distributed computing, data modeling, and performance optimization.
- Partner with data scientists and analysts to support machine learning, analytics, and visualization initiatives.
- Ensure data quality, security, and compliance with enterprise standards.

Qualifications
- 5+ years of hands-on experience in large-scale application development using the Big Data ecosystem.
- Strong expertise with big data frameworks (Hadoop, Spark, Hive, Kafka, HBase, etc.).
- Proficiency in at least one programming language: Java, Scala, or Python.
- Practical experience with cloud-based data platforms (AWS EMR/Glue, Azure Data Lake/Databricks, GCP BigQuery, etc.).
- Strong understanding of distributed computing, parallel processing, and data processing patterns.
- Experience with data warehousing and ETL concepts and tools.
- Familiarity with data visualization and analytics tools (Tableau, Power BI, or similar).
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
- Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
- Relevant Big Data or Cloud certifications (AWS Big Data Specialty, Databricks, Cloudera, etc.) are a plus.

Seniority level
Mid-Senior level

Employment type
Full-time

Job function
Information Technology

Industries
IT Services and IT Consulting