Orlando Utilities Commission (OUC - The Reliable One)

Data Engineer

Orlando Utilities Commission (OUC - The Reliable One), Orlando, Florida, US 32885


OUC – The Reliable One is presently seeking a Data Engineer (Cloud & Semantic Modeling) to join the Transformation division. At OUC, we don’t just work; we’re building a bright future of innovation and transformation for future generations.

We’re seeking a Data Engineer who’s passionate about building scalable, intelligent, and high-performance data systems. In this mid-level role, you’ll be responsible for designing and implementing robust ETL/ELT pipelines, optimizing data workflows, and supporting cloud-based and hybrid data infrastructure that powers analytics, business intelligence, and AI/ML initiatives.

You’ll work at the intersection of data integration, modeling, and cloud architecture, ensuring data is clean, accessible, and efficiently processed across platforms like Snowflake, AWS S3, Azure, and Databricks. Your work will enable seamless data delivery for reporting, predictive modeling, and machine learning.

If you’re fluent in SQL or Python, experienced with cloud data platforms, and excited about building semantic models, knowledge graphs, and performance-optimized pipelines, we’d love to hear from you.

OUC’s mission is to provide exceptional value to our customers and community by delivering sustainable and reliable services and solutions.

Click here to learn more about what we do.

The ideal candidate will have:

Bachelor’s Degree in Information Technology, Computer Science, Computer Engineering, Management Information Systems, Mathematics, Statistics, or a related field of study from an accredited college or university. In lieu of a degree, an equivalent combination of education, certifications, and experience may be substituted on a 1:1 basis.

Master’s Degree in Data Engineering, Information Systems, Computer Science, Mathematics or Statistics, or a related field of study from an accredited college or university (preferred).

Minimum of 3 years of experience in SQL or Python programming, ETL/ELT processes, data warehousing & cloud, data modeling and architecture, data security and governance, and visualization and reporting.


OUC offers a very competitive compensation and benefits package. Our Total Rewards package includes, among other benefits:

Competitive compensation

Low-cost medical, dental, and vision benefits and paid life insurance premiums with no probationary period. Retirement benefits include a cash balance account with employer matching, along with a health reimbursement account.

Paid vacation, holidays, and sick time

Paid parental leave

Educational and Professional assistance programs; Paid Memberships in Professional Associations

Access to workout facilities at each location

Paid Conference and Training Opportunities

Free downtown parking

Hybrid work schedule

Click Here To View Our Benefits Summary.

Salary Range:

$100,000.00 to $125,000.00 annually, commensurate with experience.

Location: Gardenia

Please see below the complete job description for this position.

Job Purpose:

Performs ETL/ELT processes, designs data models, and implements multidimensional data architectures to optimize data processing for analytics, reporting, business intelligence, and operational decision-making.

Develops and enhances scalable data pipelines, integrations, and storage solutions across on-premises and cloud environments, ensuring high-performance, reliable data accessibility.

Builds and maintains AI/ML-ready data infrastructure, enabling seamless integration of machine learning models and automated analytics workflows. Supports feature engineering, model training, and inference to enhance predictive and prescriptive analytics capabilities. Executes query auditing, code reviews, quality assurance (QA), and data governance to uphold best practices in security, performance, and compliance, ensuring integrity, reliability, and accessibility of data assets.

Translates business requirements into technical designs, optimized data flows, and infrastructure solutions, ensuring alignment with organizational objectives. Collaborates with data scientists, ML engineers, analysts, and IT teams to streamline data workflows, troubleshoot performance bottlenecks, and maintain scalable architectures for structured and unstructured data.

Generates effort assessments for data engineering projects, defining Work Breakdown Structures (WBS) to scope and estimate tasks, timelines, and resource requirements. Supports project planning by forecasting development efforts, infrastructure needs, and optimization strategies, ensuring efficiency in execution and delivery.

Primary Functions:

Data Engineering & Integration

Gather and interpret business requirements, translating them into optimized data pipelines, models, and infrastructure solutions that support BI, analytics, and AI-driven decision-making.

Perform data modeling using foundational and modern principles, methodologies, and automation strategies to structure data for predictive analytics, reporting, and AI-driven decision-making.

Implement secure, high-performance integrations between on-premises and cloud-based storage solutions, optimizing data accessibility for BI, analytics, and operational insights.

Develop and maintain scalable ETL/ELT pipelines for efficient data extraction, transformation, and loading, ensuring integration with AI/ML workflows or the data analytics reporting layer (a brief illustrative sketch follows this list).

Ensure real-time and batch processing capabilities, supporting streaming and event-driven architectures, while implementing robust data access management solutions to facilitate secure, compliant data accessibility for business intelligence (BI), analytics, and operational decision-making.
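To make the pipeline item above concrete, here is a minimal, hypothetical ETL sketch in Python; the CSV layout, column names, and SQLite target are assumptions for illustration only and do not describe OUC's actual systems:

```python
# Minimal illustrative ETL sketch: extract a CSV, apply a simple transformation,
# and load the result into a local SQLite table. All file, column, and table
# names are hypothetical.
import sqlite3

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    # Extract: read raw meter readings from a CSV file (assumed layout).
    return pd.read_csv(path, parse_dates=["reading_ts"])


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop incomplete rows and aggregate usage to a daily grain.
    df = df.dropna(subset=["meter_id", "kwh"])
    df["reading_date"] = df["reading_ts"].dt.date
    return df.groupby(["meter_id", "reading_date"], as_index=False)["kwh"].sum()


def load(df: pd.DataFrame, db_path: str) -> None:
    # Load: write the aggregated table where a reporting layer can query it.
    with sqlite3.connect(db_path) as conn:
        df.to_sql("daily_usage", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    load(transform(extract("meter_readings.csv")), "analytics.db")
```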

Data Warehousing & Cloud Infrastructure

Develop and implement hybrid cloud and on-prem data storage solutions with future-proof scalability and security.

Design and optimize multidimensional data models, ensuring efficient storage and retrieval for AI-assisted data science, analytics and reporting.

Implement star and snowflake schemas, enhancing analytical performance and query optimization (see the sketch following this list).

Utilize data visualization, AI-powered analytics, and business intelligence tools to structure data for strategic decision-making and automation.
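As a rough illustration of the star-schema item above, the sketch below creates one hypothetical fact table and two dimension tables in SQLite and runs the kind of fact-to-dimension join such schemas are designed to speed up; all table and column names are assumptions, not OUC's model:

```python
# Minimal star-schema sketch: one fact table keyed to two dimension tables.
# Table and column names are hypothetical illustrations.
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    rate_class    TEXT
);
CREATE TABLE IF NOT EXISTS dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20240131
    calendar_date TEXT,
    month         INTEGER,
    year          INTEGER
);
CREATE TABLE IF NOT EXISTS fact_energy_usage (
    customer_key  INTEGER REFERENCES dim_customer(customer_key),
    date_key      INTEGER REFERENCES dim_date(date_key),
    kwh           REAL,
    billed_amount REAL
);
"""

with sqlite3.connect("warehouse.db") as conn:
    conn.executescript(DDL)
    # Analytical queries join the central fact table to its dimensions;
    # this access pattern is what a star schema is laid out to optimize.
    conn.execute(
        "SELECT d.year, d.month, SUM(f.kwh) "
        "FROM fact_energy_usage f "
        "JOIN dim_date d ON d.date_key = f.date_key "
        "GROUP BY d.year, d.month"
    )
```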

Problem-Solving, Query Auditing, Optimization & Quality Assurance (QA)

Review and audit complex SQL queries and ETL/ELT pipelines, ensuring efficiency, compliance, and cloud and AI readiness, adhering to best practices and established standards.

Evaluate data engineering scripts and transformation logic, ensuring integrity, scalability, and security of structured and unstructured datasets.

Conduct troubleshooting for advanced data inconsistencies, optimizing high-performance analytics pipelines across hybrid data environments, leveraging cloud tools or AI-enhanced tuning when needed.

Collaborate with data scientists, reporting analysts, programmers, business systems analysts, solution architects, and IT teams to deliver reliable data solutions.

Identify opportunities and recommend automation, AI-driven improvements, cloud efficiency options, and other process and performance improvements in the data engineering process.

General

Define and generate effort assessments for data engineering tasks, establishing work breakdown structures (WBS) to scope, estimate, and track project execution.

Forecast development efforts, infrastructure requirements, and reporting, cloud, and AI/ML data processing needs, ensuring efficient resource allocation.

Support Agile methodologies, aligning engineering workflows with business strategy and evolving project demands.

Apply data governance, AI-augmented security, privacy protocols, and compliance best practices.

Adhere to ethical guidelines and legal regulations governing the collection and use of data, including but not limited to closed-source intelligence, open-source intelligence data, and AI-generated insights.

Maintain security and confidentiality of sensitive information and data.

Perform other duties as assigned.

Technical Requirements:

Programming Languages & Query Optimization:

Expertise in SQL, Python, and Scala, ensuring optimized query performance for structured and unstructured data processing.

Proficiency in two or more of the following languages for diverse technical implementations: JavaScript (Snowpark), DAX, Power Query (M), WDL, Power Fx, C#, Perl, VBA, R, Julia.

Strong understanding of big data frameworks and distributed computing for scalable analytics.

ETL/ELT Tools:

Proficiency in ETL/ELT tools, including Talend, Dataiku, Alteryx, Snowpipe, Azure Data Factory, and SSIS.

Experience in workflow automation using Power Automate and cloud-based integration platforms.

Expertise in real-time and batch processing, supporting streaming data and event-driven architectures.

Data Warehousing & Cloud:

Strong knowledge of cloud-native data solutions, including Snowflake, Databricks, AWS S3, and Azure Synapse, and relational databases such as Oracle and MySQL.

Proficiency in VM-based deployments using Azure VMs, AWS EC2, and Google Compute Engine for scalable data processing.

Familiarity with IBM Framework for structured data modeling, governance, and analytics.

Experience in data lakes, warehouse optimization, and hybrid cloud architectures for scalable analytics.

Data Modeling & Architecture:

Expertise in data modeling methodologies (Inmon, Kimball, Data Vault), ensuring robust analytics solutions.

Proficiency in IBM Framework, Dataiku, Alteryx, Cognos Analytics, Power BI, and DBeaver to structure enterprise-wide data solutions.

Ability to design AI/ML-ready architectures, supporting predictive and prescriptive analytics.

Data Security & Governance:

Strong knowledge of role-based access control (RBAC), encryption strategies, data lineage, and compliance frameworks (a toy RBAC illustration follows this list).

Familiarity with GDPR, CCPA, HIPAA, NIST, BCSI, and security protocols for data governance and regulatory compliance.

Ability to implement industry best practices for secure data management in cloud and VM environments.
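Role-based access control is normally enforced with a platform's native grants (for example, warehouse roles); the toy Python sketch below only illustrates the deny-by-default idea behind RBAC, with role and action names that are purely hypothetical:

```python
# Toy role-based access control (RBAC) check: each role maps to the set of
# actions it may perform on data assets. Role and action names are hypothetical;
# production systems rely on the data platform's native grants instead.
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "data_engineer": {"read", "write"},
    "admin": {"read", "write", "grant"},
}


def is_allowed(role: str, action: str) -> bool:
    # Deny by default: unknown roles or actions get no access.
    return action in ROLE_PERMISSIONS.get(role, set())


assert is_allowed("data_engineer", "write")
assert not is_allowed("analyst", "write")
```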

Visualization & Reporting:

Proficiency in Power BI, Tableau, and Cognos Analytics, enabling data-driven decision-making through interactive dashboards.

Strong critical thinking and problem-solving capabilities for optimizing insights.

Ability to translate business needs into technical solutions, ensuring structured data workflows.

Business Alignment & Project Estimation:

Ability to effectively communicate insights to non-technical stakeholders and to gather and translate business requirements into scalable data architectures.

Expertise in Work Breakdown Structures (WBS), effort estimation, and resource planning for efficient project execution.

Strong collaboration skills with data scientists, ML engineers, analysts, and IT teams.

Strong critical thinking and problem-solving capabilities.

Business acumen with a curiosity for data-driven decision-making.

Ability to rapidly learn and adapt to evolving analytics and reporting tools.

Ability to interpret and analyze data.

Ability to make arithmetic computations using whole numbers, fractions, and decimals, and compute rates, ratios, and percentages.

Ability to use Microsoft Office Suite (Outlook, Excel, Word, etc.) and standard office equipment (computer, telephone, fax, copier, etc.).

Education/ Certification/ Years of Experience Requirements:

Bachelor’s Degree in Information Technology, Computer Science, Computer Engineering, Management Information Systems, Mathematics, Statistics, or a related field of study from an accredited college or university. In lieu of a degree, an equivalent combination of education, certifications, and experience may be substituted on a 1:1 basis.

Master’s Degree in Data Engineering, Information Systems, Computer Science, Mathematics or Statistics, or a related field of study from an accredited college or university (preferred).

Minimum of 3 years of experience in SQL or Python programming, ETL/ELT processes, data warehousing & cloud, data modeling and architecture, data security and governance, and visualization and reporting.

Certifications (Preferred):

SnowPro® Advanced: Data Engineer

Databricks Certified Data Engineer

IBM Certified Data Engineer — Cognos Analytics; IBM Data Engineering Professional Certificate (Coursera)

Cloudera Certified Professional (CCP) Data Engineer

Google Cloud Professional Database Engineer

Microsoft Azure Data Engineer Associate

Snowflake SnowPro Certification

SnowPro® Associate: Platform Certification; SnowPro® Core Certification; SnowPro® Specialty: Snowpark; SnowPro® Specialty: Native Apps; SnowPro® Advanced: Architect

IBM Certified Solution Architect — Cloud Pak for Data

AWS Certified Data Analytics; AWS Certified Solutions Architect — Professional

Microsoft Certified: Azure Solutions Architect Expert

TOGAF 9 Certification

Google Data Engineering Professional Certificate

Oracle Certified Professional or MySQL Database Administrator

Coursera/Udemy Data Engineering Bootcamps

Zachman Certified Enterprise Architect

Working Conditions: This job is free of disagreeable working conditions and is performed in an office work environment.

Physical Requirements: This job consists of speaking, hearing, reading, typing, and writing. It requires frequent sitting, occasional standing and walking, and may require lifting up to twenty (20) lbs., bending/stooping, and reaching overhead.

OUC–The Reliable One is an Equal Opportunity Employer who is committed through responsible management policies to recruit, hire, promote, train, transfer, compensate, and administer all other personnel actions without regard to race, color, ethnicity, national origin, age, religion, disability, marital status, gender, sexual orientation, gender identity or expression, genetic information and any other factor prohibited under applicable federal, state, and local civil rights laws, rules, and regulations.

EOE M/F/Vets/Disabled
