GSOC
Overview
Georgia System Operations is a progressive organization offering opportunities for engineers, technicians, project managers, and more, and has been honored as a Best Place to Work in Georgia. Our people-over-profit culture and competitive compensation and benefits packages reflect our commitment to attracting and retaining top talent. We offer comprehensive medical, dental, and vision coverage, a strong retirement program, career development, and flexible work schedules, and we are focused on wellness and on being a supportive member of the community.

Benefits include:
- Basic accidental death and dismemberment, long-term disability, and life insurance at no cost, with the option to purchase additional coverage.
- A competitive retirement plan with company match and company contributions for full-time employees.
- Well-being resources, including an employee assistance program, an on-site fitness center, and several wellness-focused programs.
- Educational reimbursement for full-time employees and an optional 529 college savings plan.
- Voluntary benefits covering hospitalization and critical illness, legal and ID theft protection, and pet insurance.
- Vacation and sick leave for full-time positions through the paid time off program; GSOC is closed for 9 national holidays annually.
- Growth and development support through an on-site training program, online learning tools, and programs designed to build industry knowledge.
- Paid volunteer time off every year to contribute to the community service organization of each employee's choice.

Department: Shared Services IT
Employment Type: Full Time
Salary: $96,600 - $168,800 per year
Location: Tucker, Georgia, United States

Responsibilities
Job Duties:
- Data Pipeline Engineering: Design, build, and maintain reliable ETL/ELT pipelines to extract data from diverse sources, transform and validate it, and load it to enterprise storage/warehouse layers; optimize for scalability, performance, and cost.
- Integration & Modeling: Integrate data from databases, APIs, and external systems; enforce consistency and integrity; contribute to dimensional and lakehouse modeling patterns that support BI/AI use cases.
- Platform Engineering: Leverage Azure Data Factory, Synapse, Databricks, and Spark to standardize ingestion/processing frameworks; automate jobs, monitoring, and alerting for resilient operations.
- Performance & Reliability: Tune pipelines, queries, and clusters; address bottlenecks; apply caching, indexing/partitioning, and workload management to meet dependable SLAs.
- Quality & Governance: Implement in-pipeline data quality checks and validation rules; document lineage and assumptions; contribute to cataloging and stewardship practices in partnership with data governance.
- Collaboration: Partner with data analysts and scientists to productionize data for dashboards and models; translate business needs into technical designs and reusable data products.
- Continuous Improvement: Evaluate emerging tools and methods (e.g., orchestration, streaming, cost/performance optimization); proactively enhance standards, templates, and developer experience.

Qualifications
Required Qualifications:

Education: Bachelor's degree in Computer Science, Data Science, Software Engineering, Information Systems, or a related quantitative field; Master's degree preferred.

Experience:
- Level III: Minimum of 4 years in data engineering (or a closely related field), including hands-on pipeline development and operations.
- Level IV: Minimum of 6 years designing and managing large-scale data solutions; leads project workstreams and cross-functional delivery.
- Level V: Minimum of 8 years architecting and operating enterprise data platforms; standardizes patterns and provides technical leadership across IT.

Equivalent Experience (in lieu of the degree requirements above):
- Level III: A minimum of 8 years of relevant experience may also be considered.
- Level IV: A minimum of 10 years of relevant experience may also be considered.
- Level V: A minimum of 12 years of relevant experience may also be considered.

Responsibility:
- Level III: Independently delivers production-grade pipelines and data models; contributes to standards; begins leading small initiatives.
- Level IV: Leads the design and rollout of new data domains and frameworks; mentors junior engineers; partners with stakeholders to improve data product reliability and usability.
- Level V: Oversees major data platform initiatives; sets best practices for modeling, orchestration, performance, security, and governance; recognized as a subject matter expert across the IT function.

Licenses, Certifications, and/or Registrations (a plus, not required): ITIL Foundation.

Specialized Skills
Technical Expertise:
- Proficiency with distributed data processing/orchestration tools (e.g., Apache Spark, Airflow, Kafka) to build scalable pipelines and streaming/batch workloads.
- Strong programming skills in Python and/or Java; expert SQL for transformation and performance-minded querying.
- Experience designing and deploying solutions on modern cloud data platforms, especially Azure (Data Factory, Synapse, Databricks, ADLS); exposure to Snowflake is a plus.

Data Architecture & Warehousing:
- Knowledge of lakehouse/warehouse concepts (e.g., medallion layering, dimensional modeling, partitioning); experience with relational and NoSQL stores.

Data Governance & Security:
- Implement data quality checks, schema enforcement, and lineage; align with stewardship, cataloging, and compliance standards (e.g., SOX) in partnership with IT and Security.

Soft Skills:
- Excellent problem-solving/analytical skills and attention to detail; strong communication with both technical and business stakeholders; customer-service orientation and positive attitude.
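For candidates wondering what "in-pipeline data quality checks and schema enforcement" look like in practice, here is a minimal sketch in plain Python. It is purely illustrative: the schema, column names, and rules below are hypothetical examples, not GSOC's actual data or standards.

```python
# Minimal sketch of an in-pipeline data quality gate.
# The schema and rules are hypothetical, for illustration only.

EXPECTED_SCHEMA = {"meter_id": str, "reading_kwh": float, "ts": str}

def validate_row(row):
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    # Schema enforcement: every expected column present with the right type.
    for col, typ in EXPECTED_SCHEMA.items():
        if col not in row:
            errors.append(f"missing column: {col}")
        elif not isinstance(row[col], typ):
            errors.append(f"bad type for {col}: {type(row[col]).__name__}")
    # Validation rule: readings must be non-negative.
    if isinstance(row.get("reading_kwh"), float) and row["reading_kwh"] < 0:
        errors.append("negative reading_kwh")
    return errors

def quality_gate(rows):
    """Split a batch into clean rows and quarantined (row, reasons) pairs."""
    clean, quarantined = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            quarantined.append((row, errs))
        else:
            clean.append(row)
    return clean, quarantined

batch = [
    {"meter_id": "m-1", "reading_kwh": 12.5, "ts": "2024-01-01T00:00:00"},
    {"meter_id": "m-2", "reading_kwh": -3.0, "ts": "2024-01-01T00:00:00"},
    {"meter_id": "m-3", "ts": "2024-01-01T00:00:00"},
]
clean, quarantined = quality_gate(batch)
print(len(clean), len(quarantined))  # 1 clean row, 2 quarantined
```

In a production pipeline the same pattern typically runs inside the transform stage (e.g., as a Spark or Data Factory step), with quarantined records routed to a reject store and surfaced through monitoring and alerting rather than printed.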