Cynet Systems Inc
Job Description
Pay Range: $70/hr - $87.97/hr
The Data Application Developer will design, build, and maintain scalable data-driven applications, pipelines, and APIs to support data processing and analytics.
This role involves developing efficient ETL/ELT workflows, integrating multiple data sources, and collaborating with cross-functional teams to deliver high-quality, reliable data solutions.
Responsibilities
Build and maintain data‑centric applications, tools, and APIs for real‑time and batch data processing.
Design and implement data ingestion pipelines integrating data from multiple sources such as databases, APIs, and file systems.
Create reusable ETL/ELT pipelines to process and transform raw data into usable formats using Snowflake, DBT, or Python.
Collaborate with analysts and stakeholders to understand requirements and translate them into scalable technical solutions.
Maintain comprehensive documentation for data applications, workflows, and processes.
Qualifications
Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
Proficiency in programming languages such as Python, C#, and ASP.NET (Core).
Strong understanding of SQL and database design, with experience using relational databases such as Snowflake or SQL Server.
Hands‑on experience with ETL/ELT tools and frameworks like Apache Airflow (DBT experience is a plus).
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud and related data services (e.g., S3, AWS Lambda).
Experience with real‑time data processing tools such as Kafka or Spark, as well as batch data processing.
Skilled in designing and integrating RESTful APIs for data access and communication.
Knowledge of version control systems like Git for code management.
Strong analytical and problem‑solving skills for troubleshooting complex data issues.
Preferred Skills
Knowledge of containerization tools like Docker and orchestration platforms such as Kubernetes.
Experience with business intelligence tools like Tableau, Power BI, or Looker.
Soft Skills
Excellent communication and collaboration skills for effective teamwork across departments.
Ability to prioritize and manage multiple tasks in a fast‑paced environment.
Strong attention to detail and commitment to delivering high‑quality results.