CoreMedical Group
Lead Data Engineer/Architect at KANINI
Nashville, Tennessee, United States, 37247
Title: Lead Data Engineer/Architect
Location: US Remote
Work Authorization: US Citizen or Green Card

Scope:
KANINI is seeking a highly skilled Lead Data Engineer/Architect with deep expertise in Azure data services, Power BI, and Snowflake. This role plays a key part in supporting our data-driven initiatives by designing and maintaining scalable data infrastructure and ensuring seamless access to critical insights across the organization.

Key Responsibilities:
- Data Infrastructure & Architecture: Design, build, and maintain high-performance, scalable, and reliable data pipelines and data architectures on Azure platforms.
- Data Warehousing: Develop and manage cloud-based data warehouse solutions, particularly using Snowflake, ensuring optimized storage and query performance.
- ETL/ELT Development: Create robust ETL/ELT processes to ingest structured and unstructured data from diverse sources, including point-of-sale (POS) systems, product usage logs, web/eCommerce platforms, and geolocation data.
- Analytics & Reporting Enablement: Enable data access for business users and analysts by building effective reporting layers and dashboards using tools such as Power BI.
- Collaboration & Stakeholder Engagement: Work closely with data analysts, data scientists, and business stakeholders to define data requirements and deliver actionable insights aligned with business goals.
- Performance Optimization & Troubleshooting: Monitor data pipelines for performance, reliability, and integrity. Optimize queries, manage data partitioning and clustering, and resolve technical issues swiftly.
- Governance & Security: Ensure adherence to data governance, quality, and security best practices across all data-handling processes.
- Tooling: Utilize modern data orchestration and workflow tools such as Cloud Composer (Apache Airflow), Azure Data Factory (ADF), Pub/Sub, and Cloud Storage to support data movement and transformation.

Required Qualifications:
- Strong proficiency in SQL, with expertise in advanced query tuning and performance optimization.
- Solid hands-on experience with PySpark for data engineering tasks.
- Proven experience working with cloud platforms, especially integrating with Snowflake.
- Familiarity with Azure Data Factory, Synapse Analytics, and Microsoft Fabric.
- Deep understanding of data modeling and the ability to design scalable data solutions.
- Demonstrated ability to implement and enforce data governance and security practices.
- Experience in agile and product-centric environments.
- Excellent communication and collaboration skills, with a proactive approach to stakeholder engagement.