Cyber Space Technologies LLC
Location: Minneapolis, MN (Hybrid/Onsite as required)
Key Responsibilities
Design, develop, and maintain ETL/ELT pipelines on Azure Data Services (Azure Data Factory, Synapse, Databricks, Data Lake).
Write clean, efficient, and scalable code using Python and PySpark for data ingestion, transformation, and processing.
Build and optimize data models for analytics, BI, and reporting use cases.
Implement data quality, lineage, and governance frameworks within Azure environments.
Collaborate with business stakeholders, analysts, and data scientists to deliver high-quality datasets and solutions.
Troubleshoot and optimize pipelines for performance, scalability, and cost-efficiency.
Ensure adherence to security, compliance, and best practices in Azure data engineering.
Required Qualifications
Experience as a Data Architect with a strong ETL design and development background.
Hands-on expertise with Azure Data Factory, Azure Synapse, Azure Databricks, and Azure Data Lake.
Strong proficiency in Python and PySpark for large-scale data processing.
Solid SQL knowledge and experience with relational and NoSQL databases.
Experience in data modeling and building data warehouse/lakehouse solutions.
Strong problem‑solving, debugging, and performance optimization skills.
Preferred Qualifications
Experience in the BFSI (banking, financial services, and insurance) or Wealth Management domain.
Familiarity with DevOps and CI/CD pipelines for data workflows.
Knowledge of BI tools like Power BI, Tableau, or Looker.
Education
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Seniority level: Not Applicable
Employment type: Contract