HRU

Programmer Analyst

HRU, Falls Church, Virginia, United States, 22042


Overview

Programmer Analyst, Falls Church, VA

Responsibilities

- Design and implement ETL data pipelines to ingest, transform, and store large datasets from various sources (a minimal sketch follows this list)
- Build and maintain data warehouses, including data modeling, data governance, and data quality
- Ensure data quality, integrity, and security by implementing data validation, data cleansing, and data governance policies
- Optimize data systems for performance, scalability, and reliability
- Collaborate with customers to understand their technical requirements and provide guidance on best practices for using Amazon Redshift
- Work with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver data solutions
- Provide technical support for Amazon Redshift, including troubleshooting, performance optimization, and data modeling
- Identify and resolve data-related issues, including data pipeline failures, data quality issues, and performance bottlenecks
- Develop technical documentation and knowledge base articles to help customers and AWS engineers troubleshoot common issues
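For context, pipelines like those described above are typically written in PySpark, which AWS Glue jobs run under the hood. The following is a minimal, hypothetical sketch of the extract-transform-load pattern; the S3 paths, column names, and validation rules are illustrative placeholders, not HRU's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

    # Extract: read raw source data (the S3 path is a placeholder)
    raw = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Transform: basic cleansing and validation: deduplicate on the key,
    # drop rows that fail a simple data-quality rule, normalize a timestamp
    clean = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_total") >= 0)
           .withColumn("order_ts", F.to_timestamp("order_ts"))
    )

    # Load: write the curated dataset for downstream warehouse loads
    clean.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")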

Key Skills / Qualifications

- Bachelor's or Master's degree in Computer Science or a related field, with at least 6 years of experience in Information Technology
- Proficiency in one or more programming languages (e.g., Python, Java, Scala)
- 8+ years of experience in data engineering, with a focus on designing and implementing large-scale data systems
- 5+ years of hands-on experience writing complex, highly optimized queries across large data sets using Oracle, SQL Server, and Redshift
- 5+ years of hands-on experience using AWS Glue and Python/PySpark to build ETL pipelines in a production setting, including writing test cases (see the sketch after this list)
- Strong understanding of database design principles, data modeling, and data governance
- Proficiency in SQL, including query optimization, indexing, and performance tuning
- Experience with data warehousing concepts, including star and snowflake schemas
- Strong analytical and problem-solving skills, with the ability to break down complex problems into manageable components
- Experience with data storage solutions such as relational databases (Oracle, SQL Server), NoSQL databases, or cloud-based data warehouses (Redshift)
- Experience with data processing frameworks such as Apache Kafka and Fivetran
- Experience building ETL pipelines using AWS Glue, Apache Airflow, and programming languages including Python and PySpark
- Understanding of data quality and governance principles and best practices
- Experience with agile development methodologies such as Scrum or Kanban

EOE (Veteran/Disability)
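As a rough illustration of the test-case expectation above, here is a minimal pytest-style sketch against a hypothetical cleansing transform; the function, column names, and sample rows are made up for the example and do not reflect HRU's actual test suite.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    def clean_orders(df):
        # Hypothetical transform: drop duplicate keys and negative totals
        return (df.dropDuplicates(["order_id"])
                  .filter(F.col("order_total") >= 0))

    def test_clean_orders_drops_bad_rows():
        spark = (SparkSession.builder
                 .master("local[1]")
                 .appName("etl_test_sketch")
                 .getOrCreate())
        df = spark.createDataFrame(
            [(1, 10.0), (1, 10.0), (2, -5.0)],
            ["order_id", "order_total"],
        )
        # One duplicate row and one negative-total row should be removed
        assert clean_orders(df).count() == 1
        spark.stop()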

How to Apply

To apply for this position, please follow these steps:

1. Apply for this job with your current resume.
2. We'll get back to you the same day with some feedback on your application.
3. You'll be invited to an online screening conversation so we can go over the job requirements together and you can ask questions.
4. After this stage we'll give you more feedback on your application and select the candidates to move forward in the process.

Typically, steps 1 to 4 take less than 24 hours.

#HRUJobs
