AHU Technologies, Inc.

Big Data Architect IT Consultant with Databricks (16 Years)

AHU Technologies, Inc., Washington, District of Columbia, US, 20022


Role: Big Data Architect IT Consultant Master
Client: State of DC
Location: Washington, D.C.

Job description

This role will provide expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations for the District of Columbia government, including the Internet of Things (IoT) / Smart City projects, the enterprise data warehouse, the open data portal, and data science applications. This is an exciting opportunity to work as part of a collaborative senior data team supporting DC's Chief Data Officer. This architecture includes Databricks, Microsoft Azure platform tools (including Data Lake and Synapse), Apache platform tools (including Hadoop, Hive, Impala, Spark, Sedona, and Airflow), and data pipeline/ETL development tools (including StreamSets, Apache NiFi, and Azure Data Factory). The platform will be designed for District-wide use and for integration with other OCTO Enterprise Data tools such as Esri, Tableau, MicroStrategy, API Gateways, and Oracle databases and integration tools.

Responsibilities:

1. Coordinates IT project management, engineering, maintenance, QA, and risk management.
2. Plans, coordinates, and monitors project activities.
3. Develops technical applications to support users.
4. Develops, implements, maintains, and enforces documented standards and procedures for the design, development, installation, modification, and documentation of assigned systems.
5. Provides training for system products and procedures.
6. Performs application upgrades.
7. Performs monitoring, maintenance, or reporting on real-time databases, real-time network and serial data communications, and real-time graphics and logic applications.
8. Troubleshoots problems.
9. Ensures the project life cycle is in compliance with District standards and procedures.

Minimum Education/Certification Requirements: Bachelor's degree in Information Technology or a related field, or equivalent experience

Skills

Experience implementing Big Data storage and analytics platforms such as Databricks and Data Lakes

Knowledge of Big Data and Data Architecture and Implementation best practices - 5 Years

Knowledge of architecture and implementation of networking, security, and storage on cloud platforms such as Microsoft Azure - 5 Years

Experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure - 5 Years

Knowledge of data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, Oracle - 10 Years

Experience querying structured and unstructured data sources, including SQL and NoSQL databases - 5 Years

Experience modeling and ingesting data into and between various data systems through the use of data pipelines - 5 Years

Experience implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, Impala - 5 Years

Experience with API / Web Services (REST/SOAP) - 3 Years

Experience with complex event processing and real-time streaming data - 3 Years

Experience with deployment and management of data science tools and modules such as JupyterHub - 3 Years

Experience with ETL, data processing, and analytics using languages such as Python, Java, or R - 3 Years

Experience with Cloudera Data Platform - 3 Years

16+ years planning, coordinating, and monitoring project activities - 16 Years

16+ years leading projects, ensuring they are in compliance with established standards/procedures - 16 Years

Bachelor's degree in IT or a related field, or equivalent experience - Required


Compensation: $75.00 - $80.00 per hour

About Us

AHU Technologies INC is an IT consulting and permanent staffing firm that meets and exceeds the evolving IT service needs of leading corporations within the United States. We have been providing IT solutions to customers across different industry sectors, helping them control costs and free internal resources to focus on strategic issues.

AHU Technologies INC was co-founded by visionary techno-commercial entrepreneurs who remain our principal consultants. Maintaining working relationships with a cadre of other highly skilled independent consultants, we have a growing number of resources available for development projects. We are currently delivering projects in media and entertainment, ERP solutions, data warehousing, web applications, telecommunications, and medical domains for clients all over the world.