Danta Technologies

Senior Data Architect

Danta Technologies, Trenton, New Jersey, United States


Overview

We are currently hiring a Senior Data Architect. This position is located in the USA and is 100% remote. If you are interested, please apply here.

Role Details

Role: Senior Data Architect
Location: Remote, USA only (must be willing to work EST hours)
Duration: Contract, 6+ months with possibility of extension – the duration is subject to change based on the project's requirements and/or the client's sole discretion.
Rate: $60/hr – $70/hr on W2, all inclusive, on Danta Technologies payroll
Client Industry: Healthcare / Life Sciences

Job Description

The client is looking for a senior-level data infrastructure expert who can design, build, and optimize scalable data pipelines and architectures, particularly in a cloud-native environment, and support analytics, reporting, and machine learning workflows.

Key Responsibilities

Data Pipeline Development: Design, build, and maintain scalable data pipelines to ingest, process, and transform structured and unstructured data.
Data Modeling: Create optimized data models to support analytics, reporting, and machine learning workflows.
ETL/ELT Processes: Develop and manage ETL/ELT workflows to ensure clean, reliable, and high-quality data.
Database Management: Work with relational and NoSQL databases to ensure efficient storage and retrieval of large datasets.
Cloud Data Solutions: Implement and optimize data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform.
Data Quality & Governance: Ensure data integrity, security, compliance, and quality across systems.
Collaboration: Partner with data scientists, analysts, and software engineers to deliver reliable data infrastructure.
Automation: Streamline data processes using orchestration tools and automation frameworks.
Monitoring & Optimization: Implement monitoring, logging, and performance tuning of data systems.
Documentation: Maintain detailed documentation of data pipelines, architecture, and workflows.

Skills

Programming Skills: Proficiency in Python and SQL; familiarity with Java/Scala.
Data Pipelines & ETL: Experience with ETL tools (Airflow, dbt, Informatica, Talend).
Big Data Frameworks: Knowledge of Spark, Hadoop, Kafka, or Flink.
Data Warehousing: Hands-on experience with Snowflake, Redshift, BigQuery, or Synapse.
Cloud Platforms: Proficiency in AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse), or Google Cloud Platform (BigQuery, Dataflow).
Databases: Strong experience with relational databases (PostgreSQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra).
Data Modeling: Expertise in designing star/snowflake schemas and OLTP/OLAP systems.
DevOps & Version Control: Familiarity with Git, CI/CD pipelines, and Infrastructure as Code (Terraform).
Data Governance & Security: Knowledge of GDPR, HIPAA, encryption, and role-based access controls.
Analytical Skills: Strong problem-solving and optimization skills in handling big data.
Collaboration & Communication: Ability to work in cross-functional teams and clearly document technical processes.

Notes: All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.

Benefits

Danta offers all W2 employees a compensation package that is competitive in the industry. It includes competitive pay, the option to elect healthcare insurance (dental, medical, vision), major holidays, and paid sick leave as required by state law. The rate/salary range depends on numerous factors, including qualifications, experience, and location.
