TikTok
Senior Data Engineer, Forensics - Global Security Organization
TikTok, San Jose, California, United States, 95199
The data engineer for forensics builds and maintains robust data pipelines that ensure data is collected, processed, and produced in a forensically sound and legally defensible manner.

Responsibilities
- Big data management: Architect and manage large-scale data storage solutions, such as data lakes and warehouses, to handle petabytes of data from servers, network devices, and cloud environments.
- Data integrity and chain of custody: Ensure data integrity is maintained throughout the entire e-discovery lifecycle, documenting and maintaining a secure chain of custody so the data is admissible in court (see the hashing sketch after this list).
- Data management and loading: Ingest, integrate, and organize processed data into review platforms for attorneys to analyze, handling various native file formats and metadata.
- Workflow automation: Build and maintain data pipelines to automate and orchestrate e-discovery processes, including data ingestion, processing, and production, to increase efficiency and reduce manual errors.
- Processing and culling: Use specialized e-discovery software (e.g., Relativity, Nuix) to process massive datasets by filtering, de-duplicating, and preparing the data for review; apply advanced search techniques to cull data for legal review.
- Create requirements and work with stakeholders to design, implement, and deploy security tools that improve the efficiency of forensics.
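As an illustration of the integrity responsibility above, here is a minimal Python sketch of a forensically minded intake step: hash each evidence file on ingest and append a chain-of-custody record that later pipeline stages can verify against. The log path, file layout, and field names are hypothetical, not this team's actual tooling.

# Minimal sketch (hypothetical paths and fields): hash evidence on intake
# and append a custody record so later stages can verify integrity.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

CUSTODY_LOG = Path("custody_log.jsonl")  # hypothetical append-only log

def sha256_of(path: Path) -> str:
    """Stream the file in chunks so very large inputs stay memory-safe."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def ingest(evidence: Path, custodian: str) -> dict:
    """Record who handled the item, when, and its hash at time of intake."""
    record = {
        "file": str(evidence),
        "sha256": sha256_of(evidence),
        "custodian": custodian,
        "acquired_at": datetime.now(timezone.utc).isoformat(),
        "action": "ingest",
    }
    with CUSTODY_LOG.open("a") as log:
        log.write(json.dumps(record) + "\n")
    return record

def verify(evidence: Path, expected_sha256: str) -> bool:
    """Re-hash downstream; a mismatch signals a broken chain of custody."""
    return sha256_of(evidence) == expected_sha256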
Qualifications

Minimum Qualifications
- Programming: Strong proficiency in Python for building automated data pipelines and scripts.
- Database expertise: In-depth knowledge of SQL as well as NoSQL databases, and experience with database architecture and data modeling.
- Big data frameworks: Experience with tools like Apache Spark and Kafka for processing and streaming large volumes of data (see the de-duplication sketch after this list).
- Cloud computing: Proficiency with cloud platforms such as AWS, Azure, and Google Cloud, which are central to large-scale data investigations.
- Forensic tools: Knowledge of traditional forensic tools (e.g., FTK, EnCase, Autopsy, Relativity, Nuix) and how to integrate their outputs into a larger data processing system.
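To make the Spark expectation concrete, here is a minimal PySpark sketch of the hash-based de-duplication step named under "Processing and culling." The input and output paths and the column names (doc_id, text) are hypothetical placeholders.

# Minimal PySpark sketch (hypothetical paths and columns): de-duplicate
# extracted documents by content hash before loading them for review.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ediscovery-dedup").getOrCreate()

# Each row is assumed to carry a document id and its extracted text.
docs = spark.read.parquet("s3://example-bucket/extracted-docs/")  # hypothetical

deduped = (
    docs
    .withColumn("content_sha256", F.sha2(F.col("text"), 256))  # hash the body
    .dropDuplicates(["content_sha256"])                         # keep one copy
)

deduped.write.mode("overwrite").parquet("s3://example-bucket/review-ready/")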
Preferred Qualifications

- 5+ years of applicable experience in data engineering, programming, or related fields.
- Digital forensics process: Understanding of the acquisition, preservation, analysis, and reporting phases.
- Legal and ethical compliance: Knowledge of relevant laws and regulations (e.g., GDPR, CCPA) and of privacy considerations in digital evidence collection.
- Chain of custody: Understanding of the strict procedural requirements for maintaining the integrity and legal admissibility of digital evidence.