Job Description
Job Title: Solution Architect
Location: Remote | Employment Type: Contract
About the Role
We are looking for a seasoned Solution Architect to design and implement scalable data and cloud architectures for modern enterprises. The ideal candidate will have extensive hands-on experience with Databricks, Delta Lake, and other enterprise-grade cloud solutions. You will work closely with cross-functional teams to define and execute technical roadmaps, deliver high-performance solutions, and drive innovation across analytics platforms and data-driven systems.
Key Responsibilities
- Lead the design and implementation of cloud data platforms, leveraging tools such as Databricks, Delta Lake, and MLflow.
- Architect large-scale ETL/ELT pipelines, data lakes, and real-time/streaming data solutions for diverse business needs.
- Collaborate with data engineers, data scientists, and stakeholders to translate business requirements into scalable technical solutions.
- Integrate Databricks notebooks, Apache Spark, and cloud services (e.g., AWS Glue, Azure Data Factory) for both batch and real-time data processing.
- Define and enforce data governance and security best practices, using tools like Unity Catalog, IAM, and encryption at rest/in transit.
- Implement integration patterns using REST APIs, event-driven messaging (Kafka, Pub/Sub), and distributed systems design.
- Participate in architectural reviews, performance tuning, and optimization across distributed compute frameworks.
- Stay ahead of emerging technologies in data architecture, cloud infrastructure, and MLOps practices.
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 10+ years of experience in enterprise software or data architecture roles, specifically working with cloud platforms.
- Strong hands-on expertise with Databricks, Apache Spark, and Delta Lake for building scalable data solutions.
- Proficiency in at least one cloud platform (AWS, Azure, or GCP), and working knowledge of key services such as S3, ADLS, BigQuery, or Redshift.
- Familiarity with streaming platforms such as Kafka, Kinesis, or Azure Event Hubs.
- Experience designing and deploying data lakehouses or analytics platforms.
- Solid understanding of data modeling, data governance, and pipeline orchestration (e.g., Airflow, dbt).
- Skilled in performance optimization, data security best practices, and cloud cost management.
- Excellent communication skills, with the ability to manage stakeholders and collaborate across teams.
Preferred Skills
- Certifications in Databricks, AWS/Azure/GCP Solution Architecture, or TOGAF.
- Knowledge of ML/AI workflows, model versioning, and MLOps practices.
- Familiarity with Unity Catalog, Great Expectations, or other data quality frameworks.
- Previous experience working in regulated environments such as healthcare, finance, or insurance is a plus.