Kilroy Realty
Opportunity Requirements
Degree in Business with an emphasis in Computer Technology/Information Systems, OR a degree in Business/Operations with a strong interest in IT; or an equivalent combination of education and experience.
10+ years of experience in data engineering, ETL development, or enterprise data architecture.
Deep expertise in Azure cloud data services, including Azure Data Factory, Synapse Analytics, Microsoft Fabric, and Databricks.
Familiarity with DevOps, CI/CD (Azure DevOps), containerization (Docker, Kubernetes), and Terraform for data infrastructure automation.
Working knowledge of BI/Reporting Tools (Power BI, Tableau, Looker) and API-based data integrations.
Strong proficiency in Data Models, SQL, and Python with a focus on performance tuning and distributed computing.
Extensive experience building and optimizing ETL/ELT pipelines for large-scale data processing.
Strong understanding of data modeling (star schema, snowflake schema) and database optimization techniques.
Experience with real-time and batch data processing architectures, including Kafka, Spark Streaming, or Azure Event Hub.
Hands-on experience with data security, governance, and compliance frameworks.
Ability to translate complex business requirements into scalable data solutions, collaborating with both technical and non-technical stakeholders.
Familiarity with version control systems such as Git.
Understanding of SOX compliance and audit procedures preferred.
Willingness to be a team player and to handle tasks of varying complexity.
Strong attention to detail, with project management experience and good organizational skills.
Ability to adapt to changing priorities.
Self-starter with strong organizational skills and the ability to manage multiple tasks.
Preferred Qualifications
Experience in real estate, financial services, or investment industries.
Familiarity with serverless computing (AWS Lambda, Azure Functions) for event-driven data processing.
Experience with data governance and cataloging tools such as Microsoft Purview, including metadata management, data lineage tracking, and access control to ensure compliance and data discoverability.
Summary of Responsibilities
The core responsibilities of this position include, but are not limited to the following:
Collaborate with other departments and technical teams to develop requirements and scope for reports and data projects.
Design, develop, and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources into an Azure environment, with a focus on Microsoft Fabric.
Automate and optimize data workflows for scalability and efficiency.
Connect disparate data sources, such as databases, APIs, cloud platforms, and on-premises systems, into a unified architecture.
Collaborate with internal teams to integrate data into analytical tools such as Power BI.
Build and maintain data warehouses, data lakes, or other storage solutions (e.g., Microsoft Fabric Lakehouse, Microsoft Fabric Warehouse).
Ensure the accuracy, integrity, and consistency of data across all systems.
Implement and maintain data security, compliance, and privacy protocols.
Establish and enforce best practices for data management, quality, and documentation.
Provide support for data-related issues and optimize performance for reporting.
Monitor and improve the performance of databases and data pipelines.
Troubleshoot bottlenecks and resolve issues in data processing workflows.
Stay current on emerging technologies and best practices in data engineering.
Propose and implement innovative solutions to improve the organization's data infrastructure.
Architect and build large-scale ETL/ELT pipelines using Azure Data Factory, Microsoft Fabric Dataflows, and Databricks notebooks.
Develop advanced data models for data warehouses, data marts, and data lakes, ensuring scalability and efficiency.
Develop and maintain scalable CI/CD pipelines and infrastructure-as-code (IaC) methodologies.
Integrate data across disparate sources, including APIs, databases, streaming platforms, cloud services, and on-premises systems.
Develop real-time and batch data processing solutions, enabling advanced AI, machine learning, and analytics applications.
Ensure high availability and reliability of enterprise data platforms, proactively addressing latency and performance bottlenecks.
What we offer
At Kilroy, base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within the role. The base pay range for this role is between $164,000 and $195,000, and your base pay will depend on your skills, experience and training, knowledge, licensure and certifications, and other business and organizational needs. It is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions are dependent on the facts and circumstances of each case. This role is also eligible for an annual discretionary bonus.
Our comprehensive group health benefits program is built around your total health and provides employees and their families with care and coverage designed to help you thrive. Our health and wellness program offerings include medical, dental, and vision coverage with FSA and HSA options, Group Life & Disability, LTD coverage, and much more. Ancillary programs include a retirement savings plan with a competitive employer match and employee support programs such as our parental leave coaching program, wellness, and commuter benefits, just to name a few. We invite you to visit our website at www.kilroyrealty.com to learn more.
Equal Opportunity Employer
This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.