Southern Arkansas University
Minimum 6 years of development experience in data technologies.
Minimum 5 years' experience in Python or Java for building Big Data pipelines for data ingestion (structured/unstructured raw data) and streaming (Kafka).
Experience with data wrangling and preparation for use within data science, business intelligence, or similar analytical functions required.
Experience in Hadoop and PySpark.
Understanding of BIAN and data modelling.
Experience in analytics and reporting (Tableau).
Experience with common RDBMS (Oracle, SQL Server, etc.) and NoSQL (MongoDB, etc.) databases, Neo4j graph database, Hadoop, MapReduce, Spark, Sqoop, and Hive.
DW/Data Lake knowledge; ETL (Informatica); data ingestion/streaming tools (Apache Kafka); reporting/analytics (Dremio, Tableau).
Experience in the banking and finance industry.

Skill set:
Hadoop / Big Data: 5 years
RDBMS (Oracle/Teradata) / Hive - Impala / MongoDB: 4 years
Python: 5 years
Spark: 4 years
Database modelling: 5 years
NoSQL: 4 years
Neo4j: 2 years