Source Code Technologies LLC
Overview
Role
Data Analytics Engineer
Location
Austin, TX or Sunnyvale, CA (hybrid onsite)
Duration
6-12 months
NOTE: Only candidates who do not require sponsorship are encouraged to apply.
Core Professional Competencies
++Communication++: Clearly communicates technical work to diverse audiences, verbally and in writing. Participates in peer reviews and team discussions with clarity and purpose.
++Documentation++: Maintains clear, structured documentation of project logic, decisions, and maintenance. Contributes to team standards for reproducibility and transparency.
++Collaboration++: Works effectively with cross-functional partners. Values shared ownership and ensures continuity through knowledge sharing.
++Initiative++: Comfortable with ambiguity; proactively identifies issues and opportunities. Demonstrates curiosity and critical thinking.
++Attention to Detail++: Delivers high-quality, consistent code and documentation that supports long-term maintainability and trust in data systems.
Data Engineering Expertise (5 years)
Experienced in building and maintaining data pipelines (ETL/ELT)
Proficient with orchestration tools (e.g., Airflow, dbt, Prefect)
Comfortable working with cloud platforms (e.g., AWS) and tools such as Snowflake
Familiar with data lake and warehouse architecture (e.g., S3, Athena, Delta Lake)
Strong Python skills for data manipulation (e.g., pandas, pyarrow, pyspark)
Data Infrastructure & Management (5 years)
Expertise in data modeling (star/snowflake schemas, normalization, dimensional modeling)
Skilled in maintaining data quality and integrity (data monitoring, validation, deduplication, anomaly detection)
Familiar with version control and CI/CD practices for data workflows (e.g., Git)