Intellect Solutions LLC
Clearance: Active Secret or above
Requirements Gathering
- Collaborate with business analysts, data architects, and stakeholders to gather requirements and ensure data integrity throughout the migration lifecycle.
- Ability to create data mapping documents (source to target) by understanding business rules.

Data Understanding/Profiling
- Perform data profiling, cleansing, and quality checks to ensure accuracy and completeness of migrated data.
- Analyze and map source Oracle database schemas to target Dataverse tables/entities, defining transformation and cleansing logic as required.

Technical Requirements for Development
- Write and optimize SQL queries, stored procedures, and scripts for Oracle databases to support data migration and validation.
- Implement data transformation processes to ensure data compatibility with Dataverse data types, relationships, and constraints.
- Experience with Dataverse data types and CRUD operations.
- Experience configuring data pipelines in ADF (extract, transform, load).
- Experience using .csv files as sources of data for ADF pipelines.
- Experience using Oracle CDC (change data capture) as a source of data for ADF pipelines.
- Experience logging errors (audit framework) from an ADF pipeline to a Dataverse table.
- Experience pushing data to Dataverse tables, including the transformations needed to match target field types (e.g., Dataverse lookups, choices, date/time); see the payload sketch after this list.
- Experience configuring pipeline settings to improve concurrency/performance (e.g., ADF cores, memory).
- Experience with additional Azure services (e.g., Logic Apps, Data Lake, Azure Functions, Azure SQL).
- Experience/exposure in Python/shell scripting is a plus.

Data Validation (Post-Migration)
- Ability to generate a reconciliation report of the migrated data using SQL or scripts; see the reconciliation sketch below.

Code Migration
- Must be familiar with Git (or another source code repository) and CI/CD processes.
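The Dataverse Web API expects lookups, choices, and date/time values in specific shapes, which is the kind of transformation the development items above describe. Below is a minimal Python sketch of building such a payload, assuming hypothetical source fields and target columns (the contoso_* names, the accounts entity set, and the status mapping are illustrative only, not part of the posting).

```python
from datetime import datetime, timezone

def to_dataverse_payload(src_row: dict, account_guid: str) -> dict:
    """Map a source Oracle/CSV row to a Dataverse Web API payload.

    Illustrative sketch only: the contoso_* columns, the 'accounts'
    entity set, and the status-code mapping are hypothetical.
    """
    # Choice (picklist) columns take the integer option value.
    status_map = {"ACTIVE": 1, "INACTIVE": 2}

    # Date/time columns take ISO 8601 strings (UTC recommended).
    created = datetime.strptime(src_row["CREATED_DT"], "%d-%b-%Y")
    created_iso = created.replace(tzinfo=timezone.utc).isoformat()

    return {
        "contoso_name": src_row["CUST_NAME"].strip(),
        "contoso_status": status_map[src_row["STATUS"].upper()],
        "contoso_createdon": created_iso,
        # Lookup columns are set with @odata.bind to the related row.
        "contoso_parentaccount@odata.bind": f"/accounts({account_guid})",
    }

# Example: payload ready to POST to .../api/data/v9.2/<target entity set>
payload = to_dataverse_payload(
    {"CUST_NAME": " Acme ", "STATUS": "Active", "CREATED_DT": "05-Mar-2024"},
    account_guid="00000000-0000-0000-0000-000000000000",
)
```

The same mapping logic can live inside an ADF data flow or a pre-sink transformation; the point is only that lookups bind by reference, choices are integers, and date/time values are ISO 8601.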
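For the post-migration reconciliation report, one lightweight approach is to compare row counts, key coverage, and per-column null counts between a source extract and a target extract. A minimal Python sketch is below, assuming both sides have been exported to CSV; the file names and key column are hypothetical.

```python
import csv
from collections import Counter

def profile(path: str, key: str) -> tuple[int, set, Counter]:
    """Return (row count, set of key values, per-column null counter)."""
    keys, nulls, rows = set(), Counter(), 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rows += 1
            keys.add(row[key])
            for col, val in row.items():
                if val is None or val.strip() == "":
                    nulls[col] += 1
    return rows, keys, nulls

# Hypothetical extracts: Oracle source vs. data exported back out of Dataverse.
src_rows, src_keys, src_nulls = profile("oracle_customers.csv", key="CUST_ID")
tgt_rows, tgt_keys, tgt_nulls = profile("dataverse_customers.csv", key="CUST_ID")

print(f"Source rows: {src_rows}  Target rows: {tgt_rows}")
print(f"Missing in target: {sorted(src_keys - tgt_keys)[:20]}")
print(f"Unexpected in target: {sorted(tgt_keys - src_keys)[:20]}")
for col in src_nulls.keys() | tgt_nulls.keys():
    if src_nulls[col] != tgt_nulls[col]:
        print(f"Null-count mismatch in {col}: {src_nulls[col]} vs {tgt_nulls[col]}")
```

An equivalent report can be produced entirely in SQL (for example, count and checksum queries against the source schema and a staging copy of the target); the CSV-based version is just the smallest self-contained illustration.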