Interon IT Solutions

Databricks Architect – AWS Cloud – W2 – MN – Local

Interon IT Solutions, Minneapolis, Minnesota, United States, 55400


Company

Interon IT Solutions specializes in delivering innovative IT services, helping businesses drive sustainable growth through advanced technologies. With expertise in Digital Transformation, Cloud Technologies, DevSecOps, Robotic Process Automation (RPA), Data Insights, and Healthcare IT, the company enables organizations to adapt and excel in competitive environments. Serving industries such as Healthcare, Financial Services, Retail, Government, and Manufacturing, Interon IT Solutions is committed to delivering secure, scalable, and tailored solutions that cater to specific business needs. With a customer‑focused approach, the team is dedicated to excellence and measurable success.

Position: Databricks Architect

Location: Minnesota (local candidates only)

Duration: Long‑term Contract

Role Summary

This contract role is for a Databricks Architect specializing in AWS Cloud. The position requires working on‑site at our Minneapolis, MN location. Day‑to‑day responsibilities include designing and implementing scalable data architecture, leading the integration of cloud‑based platforms with Databricks, assessing requirements, and defining architectural solutions. The role also involves collaborating with cross‑functional teams to support project execution and ensuring technical alignment with business objectives.

Key Responsibilities

Lead end‑to‑end architecture of Databricks‑based data and analytics platforms on AWS.

Design scalable ELT/ETL pipelines using Databricks, Spark, and Delta Lake.

Architect data ingestion, data quality, governance, and lineage frameworks.

Implement CI/CD pipelines for Databricks notebooks and jobs.

Optimize Spark workloads and cluster configurations for performance and cost.

Integrate Databricks with AWS services (S3, Glue, Lambda, Redshift, EMR, Kinesis, IAM).

Provide architectural guidance and governance across engineering teams.

Collaborate with product owners, data engineers, and business stakeholders.

Required Skills

10+ years in data engineering/architecture; 4+ years in Databricks.

Strong hands‑on Spark (PySpark/Scala) experience.

Expert knowledge of AWS cloud architecture (S3, IAM, Glue, Lambda, Step Functions, EMR, Redshift).

Experience implementing Delta Lakehouse architecture.

Strong knowledge of CI/CD (Azure DevOps, GitHub Actions, Jenkins, etc.).

Excellent communication and stakeholder management skills.

Contact

Best Regards,
Sushil N.
571‑616‑8875 (c)
Sushil.s@interonit.com
https://interonit.com/
