Damco Solutions

Sr. Data Engineer

Damco Solutions, New York, NY 10261, US

Title: Sr. Data Engineer
Location: New York, NY

Job Description

Required Skills:

- Proficiency in data engineering programming languages (preferably Python; alternatively Scala or Java)
- Proficiency in at least one cluster computing framework (preferably Spark; alternatively Flink or Storm)
- Proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks; alternatively Hadoop), at least one relational data store (Postgres, Oracle, or similar), and at least one NoSQL data store (Cassandra, Dynamo, MongoDB, or similar)
- Proficiency in at least one scheduling/orchestration tool (preferably Airflow; alternatively AWS Step Functions or similar)
- Proficiency with data structures, data serialization formats (JSON, Avro, Protobuf, or similar), big-data storage formats (Parquet, Iceberg, or similar), and data processing methodologies (batch, micro-batch, and streaming); see the sketch after this list
- Proficiency with one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.), Agile methodology (developing PI plans and roadmaps), TDD (or BDD), and CI/CD tools (Jenkins, Git)
- Strong organizational, problem-solving, and critical thinking skills; strong documentation skills
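By way of illustration, the following is a minimal PySpark sketch of the batch pattern referenced above: JSON (a serialization format) in, Parquet (a big-data storage format) out. The bucket, paths, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-ingest-sketch").getOrCreate()

# Read raw JSON events from a landing zone (hypothetical path).
events = spark.read.json("s3://example-bucket/landing/events/")

# A simple batch transformation: keep valid rows and stamp a load date.
cleaned = (
    events
    .filter(F.col("event_id").isNotNull())
    .withColumn("load_date", F.current_date())
)

# Write to the lake in Parquet, partitioned by load date.
cleaned.write.mode("append").partitionBy("load_date").parquet(
    "s3://example-bucket/curated/events/"
)

spark.stop()
```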

Preferred skills:

- Experience using AWS Bedrock APIs (see the sketch after this list)
- Knowledge of Generative AI concepts (such as RAG, vector embeddings, model fine-tuning, agentic AI)
- Experience in IaC (preferably Terraform; alternatively AWS CloudFormation)
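For reference, here is a minimal sketch of what calling a Bedrock model through boto3's "bedrock-runtime" client can look like. The model ID and request body are illustrative; each model family on Bedrock expects its own request schema.

```python
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical model ID; substitute one enabled in your account.
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Summarize this table in two lines."}
        ],
    }),
)

# The response body is a streaming payload; read and parse it.
payload = json.loads(response["body"].read())
print(payload)
```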

Technology Cover Letter (TCL)

Skill | Rating (0-5) | Years of Experience | Write-up (2 lines per skill on your experience)
Python | 5 | 10 | Designed and developed backends in Python with the Django framework, used to build internal frameworks. Developed scripts to automate tasks and ingest data.
Spark | 3 | 2 | Used to design and develop data lake systems on AWS and Azure.
AWS | 4 | 6 | Used to design and develop backend applications; deployed applications using CI/CD.
Airflow | 3 | 1 | Used to orchestrate extraction tasks, calling SQL scripts from Amazon S3 (see the sketch below).
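Below is a minimal sketch (assuming Airflow 2.4+) of the orchestration pattern described in the Airflow row: a DAG task that pulls a SQL extraction script from S3 for a downstream execution step. The bucket, key, and DAG details are hypothetical.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def fetch_sql_from_s3():
    # Download the extraction script (hypothetical bucket/key).
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="example-bucket", Key="sql/extract_orders.sql")
    # Returning the script text pushes it to XCom for downstream tasks.
    return obj["Body"].read().decode("utf-8")


with DAG(
    dag_id="s3_sql_extraction_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    fetch_sql = PythonOperator(
        task_id="fetch_sql",
        python_callable=fetch_sql_from_s3,
    )
```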