SS&C Technologies Holdings

Senior Data Platform Software Engineer

SS&C Technologies Holdings, Kansas City, Missouri, United States, 64101


Job Description

We are seeking a skilled Senior Data Platform Software Engineer to join our Data Platform team in Kansas City, MO. In this role, you will be instrumental in designing, coding, implementing, and optimizing a cloud-native data stack that leverages best-in-class open-source tools. The ideal candidate will design, build, and maintain an opinionated, resilient, and scalable data platform in a private cloud environment, enabling data-driven decision-making, analytics, and machine learning, while providing deep insights out of the box. This role blends data engineering, software development, and infrastructure management, leveraging tools such as Apache Iceberg, Java, Airflow, Spark, Kafka, and Superset.

Core Responsibilities

Data Pipelines
- Develop and maintain robust, fault-tolerant data ingestion and transformation pipelines using Java, Python, and Spark.
- Define flexible and scalable data schemas using Apache Iceberg.
- Support both batch and real-time data processing, including integration with Apache Kafka.
- Ensure the reliability, observability, and integrity of data pipelines.

Data APIs
- Design and implement scalable, secure RESTful and data APIs for data access and integration.
- Build APIs for data ingestion, transformation, and consumption across internal services.
- Ensure API performance, consistency, and proper access controls.
- Apply best practices for API design, versioning, and Swagger documentation.
- Integrate APIs with orchestration tools like Apache Airflow and metadata platforms.
- Enable seamless interoperability between data consumers, pipelines, and governance systems.

Metadata Management & Data Governance
- Evaluate and implement metadata management platforms such as DataHub, Apache Atlas, or OpenMetadata to support data cataloging, lineage, and governance use cases.
- Collaborate with data stakeholders to align metadata solutions with organizational needs.
- Define and enforce governance policies related to data quality, privacy, and compliance (e.g., GDPR, CCPA).
- Implement fine-grained access controls, encryption, and auditing with a focus on regulatory compliance and data traceability.

Automation & CI/CD
- Automate data workflows, infrastructure provisioning, and deployments using tools like Airflow, Ansible, Salt, and Kubernetes.
- Implement CI/CD pipelines for data platform updates and enhancements.

Performance Optimization
- Optimize data storage and queries using Apache Iceberg and Spark to ensure high performance and low-latency access.
- Identify and address performance bottlenecks; implement partitioning, caching, and indexing strategies.

Monitoring and Alerting
- Monitor data platform health using tools such as Prometheus and Grafana dashboards.
- Configure real-time alerts to proactively detect and resolve pipeline failures or data issues.
- Troubleshoot and resolve platform outages and data incidents promptly.

Collaboration
- Work closely with data scientists, analysts, and engineers to understand data needs and deliver performant, scalable solutions.
- Collaborate with cross-functional teams (Cloud Engineering, Network, and DevOps/Solutions Engineering) to troubleshoot and resolve infrastructure issues.

Qualifications

Education
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

Experience
- 5-8+ years of experience in data engineering, with a strong focus on cloud-based data platforms.

Technical Skills
- Strong programming skills in Java.
- Deep knowledge of Apache Iceberg, Spark, Superset, and Kafka.
- Familiarity with metadata management platforms like DataHub, Apache Atlas, or OpenMetadata, and experience with their evaluation or implementation.
- Experience with cloud-native infrastructure tools such as Kubernetes, Ansible, Salt, etc.

Soft Skills
- Strong analytical and problem-solving skills.
- Effective communication and collaboration with cross-functional teams.