Sr. Systems Architect

Bixal, Baltimore, Maryland, United States

Important Notice for Applicants: At Bixal, we want to ensure a transparent and secure application process for all candidates. Official communication will come from an email address ending in @bixal.com or @bixal.na.teamtailor-mail.com. Messages from other sources may be fraudulent; please exercise caution and avoid clicking any links or opening any attachments they include.

Bixal will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.

If you need assistance or a reasonable accommodation to complete your application, we're here to help. Please reach out to us at talent@bixal.com and let us know how we can support you. You do not need to share personal details or disclose the nature of your request. You can expect a response from a team member within 24 hours during the regular work week, or on the next business day if you reach out over a weekend or holiday.

About the Role

The Systems Architect plays a central role in designing, implementing, and optimizing cloud data solutions that power a large data lakehouse at CMS. You will work directly with engineers, data analysts, and stakeholders to develop solutions in Databricks and AWS that are scalable, compliant, and efficient. This position is hands‑on, focused on delivering practical, secure, and maintainable architectures that support data ingestion, transformation, and visualization. This is a full‑time position on a currently funded Federal contract. The role offers the opportunity to make a meaningful impact aligned with Bixal’s mission of delivering innovative, human‑centered solutions.

Compensation

The salary range for this role is $190,000 – $200,000. We make compensation decisions thoughtfully, considering experience, skills, and internal equity.

Responsibilities

Solution Architecture and Design

Design and document end-to-end data solutions in Databricks and AWS, including ingestion pipelines, transformations, and storage patterns.

Define technical standards and configurations that ensure performance, reliability, and security.

Develop and maintain architecture diagrams, schemas, and documentation for engineering teams.

Review requirements and propose efficient, cost‑effective cloud solutions that align with client standards.

Support integration of Databricks with AWS services such as QuickSight and with other client systems.

Perform other relevant duties as assigned, consistent with your qualifications and training.

Implementation and Delivery

Work closely with data engineers to design and implement pipelines using Spark, Delta Lake, and Databricks Workflows (an illustrative sketch follows this list).

Optimize cluster configurations, job performance, and data access for large‑scale workloads.

Support automation of deployment and monitoring processes using CI/CD and Infrastructure as Code tools.

Troubleshoot and resolve technical issues across the Databricks and AWS environments.

Design, deploy, and operate secure hosting for custom and proprietary models using Amazon Bedrock, including model access controls, network isolation, and MLOps‑ready deployment pipelines.
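
To give a concrete flavor of the pipeline work these bullets describe, the following is a minimal, illustrative PySpark sketch of a Delta Lake ingestion step. The bucket paths, column handling, and job name are assumptions made for illustration, not details from this posting.

```python
# Illustrative only: a minimal bronze-layer ingestion job of the kind described
# above. All paths, bucket names, and column handling are hypothetical.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` already exists; getOrCreate() reuses it.
spark = SparkSession.builder.appName("example-claims-ingest").getOrCreate()

# Read raw CSV files from a hypothetical S3 landing zone.
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-landing-zone/claims/")
)

# Standardize column names and stamp each row with its load time.
cleaned = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("load_ts", F.current_timestamp())
)

# Append the result to a Delta Lake table at a hypothetical lakehouse path.
(
    cleaned.write
    .format("delta")
    .mode("append")
    .save("s3://example-lakehouse/bronze/claims/")
)
```

In practice, a job like this would typically be orchestrated with Databricks Workflows and promoted through the CI/CD and Infrastructure as Code processes mentioned above.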

Performance and Cost Optimization

Monitor resource usage and performance to ensure stability and efficiency.

Provide recommendations for compute optimization, storage lifecycle policies, and job scheduling to control costs.

Partner with the client to review cloud usage metrics and cost forecasts for Databricks and AWS services.

Security and Compliance

Work with the Security and DevOps teams to maintain compliance with FedRAMP and CMS ATO requirements.

Implement data security best practices for encryption, access control, and audit logging (see the illustrative sketch after this list).

Contribute to technical documentation for security reviews and system authorization.
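
As a hedged illustration of the encryption and audit-logging practices listed above, the sketch below uses boto3 to apply default KMS encryption and server access logging to a hypothetical S3 bucket. The bucket names and KMS key alias are placeholders; real controls on this program would follow the approved FedRAMP/CMS tooling and Infrastructure as Code processes.

```python
# Illustrative only: apply default KMS encryption and access logging to a bucket.
# Bucket names and the KMS key alias are placeholders, not values from this posting.
import boto3

s3 = boto3.client("s3")

BUCKET = "example-lakehouse-bucket"      # hypothetical data bucket
LOG_BUCKET = "example-audit-log-bucket"  # hypothetical audit-log bucket
KMS_KEY_ID = "alias/example-data-key"    # hypothetical customer-managed key

# Enforce default server-side encryption with a customer-managed KMS key.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": KMS_KEY_ID,
                }
            }
        ]
    },
)

# Turn on server access logging so object-level requests can be audited.
s3.put_bucket_logging(
    Bucket=BUCKET,
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": LOG_BUCKET,
            "TargetPrefix": f"{BUCKET}/",
        }
    },
)
```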

Collaboration and Communication

Collaborate with client and partner teams on solution planning and design decisions.

Present technical concepts and recommendations to both technical and non‑technical stakeholders.

Participate in architecture review and design sessions to ensure consistency across the platform.

Partner with CMS stakeholders to assess data maturity and establish governance, quality, and transformation approaches that produce AI‑ready datasets within the lakehouse.

Qualifications

Bachelor's degree in a related field.

8 or more years of experience designing and implementing solutions in AWS.

4 or more years of hands‑on experience with Databricks, including Spark, SQL, Delta Lake, and Python.

Strong background in data engineering, ETL/ELT design, and cloud data architecture.

Experience designing secure, compliant, and scalable data environments.

Strong understanding of networking, IAM, and infrastructure concepts within AWS.

Ability to lead solutioning discussions and provide practical technical guidance.

Excellent written and verbal communication skills.

Excellent presentation and relationship‑building skills.

Ability to obtain and maintain a Public Trust Security Clearance.

Nice-to-Have Skills and Experience

Experience with federal healthcare data.

Familiarity with QuickSight or similar BI tools.

Experience with AI agents and producing AI‑ready datasets.

Knowledge of CI/CD pipelines, GitHub integration, and DevOps practices.

AWS Certified Solutions Architect or Databricks certification.

Familiarity with Agile or SAFe delivery environments.

How We Support Our Team

Flex hours

401(k) with matching incentive

Parental Leave

Medical/dental/vision benefits

Flex Spending Account

Company provided short-term disability and life insurance

Commuter benefits

Paid Time Off (PTO)

11 paid holidays

Our company is committed to providing equal employment opportunities for all individuals and complies with all applicable federal, state, and local anti-discrimination laws. Employment decisions are based on merit, qualifications, and business needs.
