
Digital Forensics Best Practices: From Data Acquisition to Analysis

  • 0:00 - 0:02
    Hello, everyone, and welcome to today's
  • 0:02 - 0:06
    session on digital forensics: best practices
  • 0:06 - 0:09
    from data acquisition to analysis. I'm
  • 0:09 - 0:11
    Shilpa Goswami, and I'll be your host
  • 0:11 - 0:13
    for the day. Before we get
  • 0:13 - 0:16
    started, we would like to go over a few
  • 0:16 - 0:18
    house rules for our attendees. The
  • 0:18 - 0:20
    session will be in listen-only mode and
  • 0:20 - 0:23
    will last for an hour, of which the
  • 0:23 - 0:26
    last 15 minutes will be dedicated to Q&A.
  • 0:26 - 0:28
    If you have any questions during the
  • 0:28 - 0:31
    webinar, for our organizers or
  • 0:31 - 0:34
    speakers, please use the Q&A window. Also, if you
  • 0:34 - 0:36
    face any audio or video challenges, please
  • 0:36 - 0:38
    check your internet connection or you
  • 0:38 - 0:41
    may log out and log in again. An
  • 0:41 - 0:44
    important announcement for our audience:
  • 0:44 - 0:46
    we have initiated CPE credit
  • 0:46 - 0:49
    certificates for our participants. To
  • 0:49 - 0:51
    qualify for one, attendees are required
  • 0:51 - 0:54
    to attend the entire webinar and then
  • 0:54 - 0:59
    send an email to cybertalks@eccouncil.org,
  • 0:59 - 1:01
    after which our team will
  • 1:01 - 1:04
    issue the CPE certificate. Also, we would
  • 1:04 - 1:06
    like to inform our audience about the
  • 1:06 - 1:09
    special handouts. Take a screenshot of
  • 1:09 - 1:11
    the running webinar and post it on your
  • 1:11 - 1:15
    social media, LinkedIn or Twitter, tagging
  • 1:15 - 1:18
    EC Council and Cyber Talks. We will
  • 1:18 - 1:21
    share free handouts with the first 15
  • 1:21 - 1:24
    attendees. As a commitment to closing the
  • 1:24 - 1:27
    cybersecurity workforce gap by creating
  • 1:27 - 1:30
    multi-domain cyber technicians, EC Council
  • 1:30 - 1:35
    pledges $3,500,000 towards CCT
  • 1:35 - 1:37
    Education and Certification Scholarships
  • 1:37 - 1:40
    to certify approximately 10,000 cyber
  • 1:40 - 1:43
    professionals ready to contribute to the
  • 1:43 - 1:45
    industry. Did you know that you can be
  • 1:45 - 1:46
    part of the lucrative cybersecurity
  • 1:46 - 1:50
    industry? Even top companies like Google,
  • 1:50 - 1:54
    Microsoft, Amazon, IBM, Facebook, and Dell
  • 1:54 - 1:56
    all hire cybersecurity professionals.
  • 1:56 - 1:59
    The cybersecurity industry has a 0%
  • 1:59 - 2:00
    unemployment rate. The average salary
  • 2:00 - 2:02
    for an entry-level cybersecurity job is
  • 2:02 - 2:05
    about $100,000 per year in the United
  • 2:05 - 2:07
    States. Furthermore, you don't need to
  • 2:07 - 2:10
    know coding, and you can learn from home, and
  • 2:10 - 2:11
    you get a scholarship to kick-start your
  • 2:11 - 2:15
    career. Apply now. EC Council is pledging
  • 2:15 - 2:19
    a $3,500,000 CCT scholarship for cybersecurity
  • 2:19 - 2:21
    career starters. Scan the QR
  • 2:21 - 2:22
    code on the screen to apply for the
  • 2:22 - 2:25
    scholarship. Fill out the form.
  • 2:32 - 2:34
    Now, about our
  • 2:34 - 2:38
    speaker Dr. Luis. Dr. Luis Noguerol is the
  • 2:38 - 2:40
    Information Systems Security Officer for
  • 2:40 - 2:44
    the U.S. Department of Commerce, NOAA,
  • 2:44 - 2:45
    where he oversees the cybersecurity
  • 2:45 - 2:47
    operation for six states in the
  • 2:47 - 2:50
    Southeast Region. Dr. Luis is also the
  • 2:50 - 2:52
    President and CEO of the Advanced
  • 2:52 - 2:54
    Division of Informatics and
  • 2:54 - 2:58
    Technology, Inc., a company that focuses on
  • 2:58 - 3:01
    data recovery, digital forensics, and
  • 3:01 - 3:03
    penetration testing. He is a world-renowned
  • 3:03 - 3:06
    expert in data recovery, digital
  • 3:06 - 3:08
    forensics, and penetration testing. He
  • 3:08 - 3:11
    holds multiple globally recognized
  • 3:11 - 3:13
    information technology and cybersecurity
  • 3:13 - 3:15
    certifications and accreditations
  • 3:15 - 3:17
    and is the recipient of multiple awards
  • 3:17 - 3:19
    in technology, cybersecurity, and
  • 3:19 - 3:23
    mathematics. He currently serves pro bono as
  • 3:23 - 3:25
    an editorial board member and reviewer for the
  • 3:25 - 3:27
    American Journal of Information Science
  • 3:27 - 3:30
    and Technology, and is a member of the
  • 3:30 - 3:32
    prestigious visiting professor program for
  • 3:32 - 3:34
    undergraduate and graduate programs at
  • 3:34 - 3:37
    multiple universities in the U.S., and serves as a
  • 3:37 - 3:39
    reviewer for the doctoral program at the
  • 3:39 - 3:42
    University of Karachi in Pakistan. He is
  • 3:42 - 3:44
    the author of multiple cybersecurity
  • 3:44 - 3:48
    publications and articles, including Cybersecurity
  • 3:48 - 3:50
    Issues in Blockchain: Challenges and
  • 3:50 - 3:52
    Possible Solutions. He is also one of
  • 3:52 - 3:54
    the co-authors and reviewers of the
  • 3:54 - 3:57
    worldwide acclaimed book, Intrusion
  • 3:57 - 3:59
    Detection Guide.
  • 3:59 - 4:01
    Prior to obtaining his doctorate
  • 4:01 - 4:03
    degree in Information Systems and
  • 4:03 - 4:05
    Technologies from the University of
  • 4:05 - 4:08
    Phoenix, Dr. Luis earned a Bachelor of
  • 4:08 - 4:12
    Science in Radio Technical and
  • 4:12 - 4:14
    Electronic Engineering, a
  • 4:14 - 4:15
    Bachelor of Science in
  • 4:15 - 4:18
    Telecommunications and Networking, and a
  • 4:18 - 4:20
    Master of Science in Mathematics and
  • 4:20 - 4:21
    Computer Science.
  • 4:21 - 4:23
    Without any further delay, I will
  • 4:23 - 4:26
    hand over the session to you, Dr. Luis.
  • 4:26 - 4:29
    Thank you very much. Thanks. Okay.
  • 4:30 - 4:33
    Good morning, everybody. Good afternoon, and
  • 4:33 - 4:35
    good night, depending on the specific
  • 4:35 - 4:38
    area in which you reside. We are going to
  • 4:38 - 4:40
    have an interesting conversation today
  • 4:40 - 4:42
    about digital forensic best practices
  • 4:42 - 4:44
    from data acquisition to analysis. This
  • 4:44 - 4:47
    is the title of the presentation or
  • 4:47 - 4:51
    subject, and I’m more than happy to be
  • 4:51 - 4:53
    here with you all and share some of
  • 4:53 - 4:58
    my expertise. So, let's go ahead and start the conference,
  • 4:58 - 5:01
    okay? She already mentioned
  • 5:01 - 5:03
    some of my credentials.
  • 5:03 - 5:06
    I have been working in cybersecurity
  • 5:06 - 5:09
    at this point for over 41 years.
  • 5:09 - 5:12
    This is in my DNA; there is no
  • 5:12 - 5:14
    other topic in my life that I like
  • 5:14 - 5:17
    and respect as much.
  • 5:17 - 5:21
    Before we go, I have here a statement that
  • 5:21 - 5:24
    I put together for you, okay? Digital
  • 5:24 - 5:26
    forensic best practices. Well,
  • 5:26 - 5:29
    consideration number one: just to break
  • 5:29 - 5:31
    the ice in the labyrinth of
  • 5:31 - 5:35
    cyberspace, where shadows dance through encased
  • 5:35 - 5:38
    passages and data whispers its secrets, the
  • 5:38 - 5:42
    digital detective emerges. This is us, the
  • 5:42 - 5:44
    digital forensic experts. Clad in lines of
  • 5:44 - 5:48
    code and armed with algorithms, we seek
  • 5:48 - 5:52
    the hidden treasures of truth,
  • 5:52 - 5:55
    solving enigmatic cybercrimes. With a virtual
  • 5:55 - 5:58
    magnifying glass, this is what we do: we
  • 5:58 - 6:01
    dissect the digital tapestry,
  • 6:01 - 6:04
    unveiling the footprints of elusive
  • 6:04 - 6:08
    cyber culprits. This is what cyber forensics, or
  • 6:08 - 6:11
    digital forensics, is about. Each keystroke and
  • 6:11 - 6:14
    pixel holds a clue, something that we can
  • 6:14 - 6:18
    use in our favor. And in this mesmerizing
  • 6:18 - 6:23
    world of the digital era, ones and zeros,
  • 6:23 - 6:26
    the art of digital forensics is about
  • 6:26 - 6:29
    finding the secret of the digital reality. Digital
  • 6:29 - 6:34
    forensics is about finding evidence
  • 6:34 - 6:36
    that can lead to a particular process. It
  • 6:36 - 6:39
    can be a legal process, or it can be any
  • 6:39 - 6:41
    other kind of process. But what is
  • 6:41 - 6:44
    digital forensics from my point of view?
  • 6:44 - 6:47
    Well, I mentioned earlier that I've
  • 6:47 - 6:50
    been working in cybersecurity for 41 years.
  • 6:50 - 6:53
    My specialties are in penetration
  • 6:53 - 6:55
    testing, data recovery, and digital forensics.
  • 6:55 - 6:57
    I’ve been working for the
  • 6:57 - 6:59
    police department in multiple places
  • 6:59 - 7:03
    doing digital forensics for them. So I try to
  • 7:03 - 7:06
    put together an easy definition for you from my
  • 7:06 - 7:08
    standpoint about what digital forensics
  • 7:08 - 7:12
    is. Digital forensics investigates digital
  • 7:12 - 7:15
    devices and electronic data to use as
  • 7:15 - 7:18
    evidence. Please note that I don’t say
  • 7:18 - 7:21
    electronic information; I use the word "data"
  • 7:21 - 7:24
    intentionally to understand digital events
  • 7:24 - 7:28
    and trace illicit activities. This is a key
  • 7:28 - 7:31
    component of digital forensics. Normally
  • 7:31 - 7:34
    speaking, digital forensics happens, of
  • 7:34 - 7:37
    course, after the fact, and the idea of
  • 7:37 - 7:41
    digital forensics is identifying traces,
  • 7:41 - 7:44
    okay, that lead to particular data that
  • 7:44 - 7:46
    we can gather together and make a
  • 7:46 - 7:49
    conclusion. It involves the systematic
  • 7:49 - 7:52
    collection, preservation, analysis, and
  • 7:52 - 7:54
    presentation of digital evidence in
  • 7:54 - 7:57
    legal proceedings. This is key
  • 7:57 - 7:59
    today because we are technology-dependent,
  • 7:59 - 8:02
    and there are multiple states,
  • 8:02 - 8:05
    at least in the USA and some other countries,
  • 8:05 - 8:07
    where digital forensics is still in
  • 8:07 - 8:10
    limbo because it's not accepted in the
  • 8:10 - 8:13
    court of law. Okay. So, this is very
  • 8:13 - 8:16
    important to keep in mind. What are we
  • 8:16 - 8:18
    going to do from the digital forensics
  • 8:18 - 8:21
    standpoint, the data collection process,
  • 8:21 - 8:23
    and the analysis? Digital forensics
  • 8:23 - 8:26
    experts use specialized techniques and
  • 8:26 - 8:29
    tools to extract data from computers,
  • 8:29 - 8:32
    smartphones, networks, and digital storage
  • 8:32 - 8:35
    media to support investigations and
  • 8:35 - 8:38
    resolve legal matters. So this is
  • 8:38 - 8:41
    basically what digital forensics is
  • 8:41 - 8:43
    about. Let's go ahead and start with the
  • 8:43 - 8:46
    technical part, which is the topic I like
  • 8:46 - 8:49
    most. Okay, let's talk about those
  • 8:49 - 8:52
    30 best practices that I’ve put
  • 8:52 - 8:54
    together for you. At the end of the
  • 8:54 - 8:55
    presentation, you will have the
  • 8:55 - 8:58
    opportunity to ask as many questions as
  • 8:58 - 9:01
    you like. 1. You have to
  • 9:01 - 9:04
    follow the legal and ethical standards:
  • 9:04 - 9:06
    For this particular first point, I am not
  • 9:06 - 9:09
    going to make any comment. I believe that
  • 9:09 - 9:12
    ethics is a key component
  • 9:12 - 9:15
    of cybersecurity. We always
  • 9:15 - 9:18
    have to follow the rules. We must always
  • 9:18 - 9:21
    follow the legal procedures in the
  • 9:21 - 9:24
    places in which we operate because every
  • 9:24 - 9:27
    single place is different.
  • 9:27 - 9:31
    2. Preserve the original evidence:
  • 9:31 - 9:33
    This is key. Okay. You always have to
  • 9:33 - 9:35
    maintain the integrity of the original
  • 9:35 - 9:38
    evidence to ensure it is admissible in
  • 9:38 - 9:42
    court. Any kind of manipulation
  • 9:42 - 9:46
    or modification will result in
  • 9:46 - 9:49
    disqualification from the court system.
  • 9:49 - 9:51
    Document everything: This is something
  • 9:51 - 9:53
    that technical people like me don’t
  • 9:53 - 9:56
    like too much, but when it comes to
  • 9:56 - 9:59
    digital forensics, we have to document
  • 9:59 - 10:01
    every single step we take. We have to
  • 10:01 - 10:04
    record all the steps we
  • 10:04 - 10:07
    follow, and we want to make sure that
  • 10:07 - 10:10
    everything is documented and recorded in
  • 10:10 - 10:13
    a specific chronological order. This is
  • 10:13 - 10:16
    a key component for digital
  • 10:16 - 10:19
    forensics or investigations to be accepted
  • 10:19 - 10:23
    in the court of law. Secure the scene:
  • 10:23 - 10:26
    It’s not just physical
  • 10:26 - 10:28
    crime scenes that need to be secured to prevent
  • 10:28 - 10:30
    contamination or tampering.
  • 10:30 - 10:33
    If you present anything in court and
  • 10:33 - 10:35
    the opposing party
  • 10:35 - 10:38
    has the ability to prove that
  • 10:38 - 10:40
    something was not preserved, the
  • 10:40 - 10:43
    conversation is over. Chain of custody:
  • 10:43 - 10:45
    I’m going to repeat this more than
  • 10:45 - 10:48
    once during the presentation. Sorry.
  • 10:48 - 10:52
    Chain of custody refers to how you
  • 10:52 - 10:53
    establish and maintain a documented record
  • 10:53 - 10:56
    of the evidence and the process
  • 10:56 - 10:59
    that tracks who handled it, when,
  • 10:59 - 11:02
    and how. Use write-
  • 11:02 - 11:04
    blocking tools: This is another key
  • 11:04 - 11:07
    component of digital forensics. It means
  • 11:07 - 11:10
    that you have to use the appropriate
  • 11:10 - 11:12
    hardware and software that allow for
  • 11:12 - 11:14
    write blockers when you are collecting
  • 11:14 - 11:18
    data to prevent alteration. There are a
  • 11:18 - 11:20
    set of tools you can use, and at the end
  • 11:20 - 11:22
    of the presentation, I’m going to provide
  • 11:22 - 11:26
    you with a specific set
  • 11:26 - 11:30
    of tools you can use as write-blocking
  • 11:30 - 11:33
    tools. Verify hashing or hash
  • 11:33 - 11:36
    values. This is how you calculate and compare
  • 11:36 - 11:39
    hash values to verify the data's integrity.
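    The hash-and-verify step described here can be sketched in a few lines of Python. This is only an illustration of the workflow, not a tool from the talk; the function names are my own.

    ```python
    import hashlib

    def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
        """Hash a file in chunks so large evidence images never need to fit in RAM."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_integrity(path: str, acquisition_hash: str) -> bool:
        """Re-hash the evidence and compare it with the hash recorded at acquisition."""
        return sha256_of_file(path) == acquisition_hash
    ```

    The hash computed at acquisition time is recorded in the case documentation; any later party can recompute it to show the data was not modified.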
  • 11:39 - 11:41
    There is often confusion about integrity,
  • 11:41 - 11:44
    confidentiality, and availability. In
  • 11:44 - 11:47
    digital forensics, the most important
  • 11:47 - 11:50
    component is integrity. It means that we
  • 11:50 - 11:53
    must make every effort to
  • 11:53 - 11:55
    ensure that the data is not modified in
  • 11:55 - 11:58
    any possible way, from the time we
  • 11:58 - 12:00
    arrive at the scene
  • 12:00 - 12:02
    to the time we present the evidence
  • 12:02 - 12:06
    in court and even after that as well. The
  • 12:06 - 12:09
    other component is Collect volatile data
  • 12:09 - 12:13
    first. Okay, this obviously makes perfect
  • 12:13 - 12:16
    sense. You have to prioritize this
  • 12:16 - 12:18
    type of data collection as it can be
  • 12:18 - 12:20
    lost or modified when the system is
  • 12:20 - 12:23
    powered down. For many of you, what I’m
  • 12:23 - 12:25
    going to tell you may
  • 12:25 - 12:28
    sound inappropriate, and this is the
  • 12:28 - 12:30
    following assessment:
  • 12:30 - 12:34
    we've been told from the time we
  • 12:34 - 12:37
    arrived at school and even at work
  • 12:37 - 12:45
    that information or data in random access memory (RAM) disappears when the
  • 12:45 - 12:51
    computer is shut down. Back in 2019,
  • 12:51 - 12:53
    I made a presentation similar to
  • 12:53 - 12:55
    this one, in
  • 12:55 - 12:58
    which I proved that the data in RAM
  • 12:58 - 13:01
    can be recovered. Okay. So, what we have been
  • 13:01 - 13:04
    learning in multiple places, and what you can
  • 13:04 - 13:07
    easily find on Google, that data in RAM
  • 13:07 - 13:09
    is lost when
  • 13:09 - 13:12
    computers are powered down, is not
  • 13:12 - 13:15
    exactly correct. The other component is
  • 13:15 - 13:17
    Forensic image. You have to create a
  • 13:17 - 13:20
    forensic image of storage devices to
  • 13:20 - 13:23
    work with copies. You must always
  • 13:23 - 13:25
    present the original evidence. This is a
  • 13:25 - 13:30
    requirement in the court of law. You must
  • 13:30 - 13:33
    present the original evidence every single
  • 13:33 - 13:35
    time. The other component is the Data
  • 13:35 - 13:39
    recovery. Data recovery is closely
  • 13:39 - 13:42
    associated with digital forensics for
  • 13:42 - 13:44
    obvious reasons. Okay. You have to
  • 13:44 - 13:47
    employ specialized tools to recover
  • 13:47 - 13:51
    deleted or hidden data. This is also
  • 13:51 - 13:54
    something to keep in mind. At the end,
  • 13:54 - 13:56
    I'm going to provide some specific
  • 13:56 - 13:58
    applications you can use to do data
  • 13:58 - 14:00
    recovery.
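    In practice, imaging is done with hardware write blockers and dedicated imagers (dd, FTK Imager, and similar tools); as a minimal sketch of the copy-then-verify idea behind working with copies, here is an illustrative Python fragment (names are my own assumptions, not from the talk):

    ```python
    import hashlib
    import shutil

    def file_hash(path: str) -> str:
        # Chunked SHA-256 so large images don't have to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def make_working_copy(original: str, copy_path: str) -> str:
        """Copy the evidence, then prove the copy is bit-identical to the original."""
        source_hash = file_hash(original)
        shutil.copyfile(original, copy_path)
        if file_hash(copy_path) != source_hash:
            raise RuntimeError("working copy does not match the original image")
        return source_hash  # record this in the chain-of-custody documentation
    ```

    All analysis then happens on the copy; the original, with its recorded hash, stays untouched for court.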
  • 14:00 - 14:03
    Timeline analysis: You must construct
  • 14:03 - 14:06
    and analyze timelines to understand the
  • 14:06 - 14:09
    sequence of events. What happened first? The
  • 14:09 - 14:13
    chronological order is a mandatory
  • 14:13 - 14:15
    requirement in the court of law. You
  • 14:15 - 14:17
    cannot present evidence in court
  • 14:17 - 14:20
    in a random manner. You have to
  • 14:20 - 14:22
    follow the specific chronological order.
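    The chronological ordering described above can be sketched as follows. The events here are invented examples; in a real case they would come from file-system metadata, application logs, registry hives, and so on.

    ```python
    from datetime import datetime, timezone

    # Each event: (timestamp, source, description) -- illustrative data only.
    events = [
        (datetime(2024, 5, 1, 14, 3, tzinfo=timezone.utc), "firewall", "outbound connection"),
        (datetime(2024, 5, 1, 13, 58, tzinfo=timezone.utc), "filesystem", "archive created"),
        (datetime(2024, 5, 1, 14, 10, tzinfo=timezone.utc), "auth log", "user logged off"),
    ]

    def build_timeline(evts):
        """Return the events in strict chronological order."""
        return sorted(evts, key=lambda e: e[0])
    ```

    Normalizing every source to a single time zone (UTC here) before sorting is what keeps the merged timeline defensible.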
  • 14:22 - 14:25
    The other consideration is Preserving
  • 14:25 - 14:28
    the metadata. Ensure metadata integrity
  • 14:28 - 14:31
    to verify results, timing, and
  • 14:31 - 14:34
    authenticity of the digital artifacts you
  • 14:34 - 14:36
    are going to present in the court of law.
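    Capturing metadata before any analysis touches a file can be illustrated with a small Python sketch (a minimal example of the idea, not a tool named in the talk):

    ```python
    import os
    from datetime import datetime, timezone

    def capture_metadata(path: str) -> dict:
        """Record size and MAC times BEFORE any analysis touches the file."""
        st = os.stat(path)
        to_iso = lambda ts: datetime.fromtimestamp(ts, timezone.utc).isoformat()
        return {
            "size_bytes": st.st_size,
            "modified": to_iso(st.st_mtime),   # content last changed
            "accessed": to_iso(st.st_atime),   # last read
            "changed": to_iso(st.st_ctime),    # inode/metadata change on Unix
        }
    ```

    The captured record is stored with the case documentation so the artifact's timing and authenticity can be verified later, even if the live file's access times drift.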
  • 14:36 - 14:40
    Use known good reference data: This
  • 14:40 - 14:42
    means you have to compare the
  • 14:42 - 14:45
    collected data with known
  • 14:45 - 14:47
    good reference data to identify
  • 14:47 - 14:51
    anomalies, specific patterns, and
  • 14:51 - 14:54
    statistical processes. Many times, you have
  • 14:54 - 14:57
    to do this as well. Antiforensic
  • 14:57 - 15:00
    awareness: You have to be aware of the
  • 15:00 - 15:03
    antiforensic techniques in use.
  • 15:03 - 15:06
    There are multiple applications
  • 15:06 - 15:09
    that work against digital forensics. So,
  • 15:09 - 15:12
    you must be aware of that. Before
  • 15:12 - 15:15
    you start digital forensics analysis,
  • 15:15 - 15:22
    while working on the data collection
  • 15:22 - 15:24
    process, you want to make sure you
  • 15:24 - 15:27
    don't have any anti-forensic
  • 15:27 - 15:30
    tools or applications installed on the
  • 15:30 - 15:33
    particular host or hosts in which you are
  • 15:33 - 15:36
    going to conduct the investigation. Another
  • 15:36 - 15:39
    very important component is Cross-validation.
  • 15:39 - 15:41
    This is what brings actual
  • 15:41 - 15:45
    credibility and respect to the data you
  • 15:45 - 15:49
    are presenting in the court of law. Okay?
  • 15:49 - 15:51
    So the standard operating procedures are a
  • 15:51 - 15:55
    very important component that is oftentimes
  • 15:55 - 15:56
    overlooked, and it's about
  • 15:56 - 15:59
    developing and following SOPs that
  • 15:59 - 16:02
    maintain consistency. This is
  • 16:02 - 16:05
    why documentation is key, and it was
  • 16:05 - 16:08
    presented in slide number one. Training
  • 16:08 - 16:11
    and certification are also important components, and
  • 16:11 - 16:13
    this is relevant. The reason why it's
  • 16:13 - 16:15
    relevant is that I understand you can learn
  • 16:15 - 16:19
    many things by yourself. This is becoming
  • 16:19 - 16:22
    more popular as we become more
  • 16:22 - 16:25
    technology-dependent. This is normal
  • 16:25 - 16:28
    and expected, but certifications still
  • 16:28 - 16:31
    hold particular value. There are
  • 16:31 - 16:33
    multiple questions in certification
  • 16:33 - 16:38
    exams, in general terms, not only in EC-Council
  • 16:38 - 16:40
    certifications or others, in which,
  • 16:40 - 16:42
    most likely, if you don't go through the
  • 16:42 - 16:45
    certification process, you will never
  • 16:45 - 16:47
    find out. And this is what
  • 16:47 - 16:50
    some people say: "Well, this is
  • 16:50 - 16:53
    theoretical information." Digital forensics
  • 16:53 - 16:56
    involves a lot of theoretical information--
  • 16:56 - 16:58
    A LOT. Remember that we are doing the
  • 16:58 - 17:01
    analysis at a low
  • 17:01 - 17:05
    level, from the technical standpoint. So
  • 17:05 - 17:07
    theory is extremely important and
  • 17:07 - 17:11
    relevant when we do forensic
  • 17:11 - 17:13
    investigations--digital forensics. The same
  • 17:13 - 17:16
    happens with medical doctors. When
  • 17:16 - 17:18
    medical doctors do a forensic
  • 17:18 - 17:20
    analysis of a body of someone who
  • 17:20 - 17:23
    passed away, they also employ a lot of
  • 17:23 - 17:25
    theoretical knowledge they have been
  • 17:25 - 17:28
    accumulating. Digital forensics is no
  • 17:28 - 17:29
    different.
  • 17:29 - 17:32
    The other consideration is expert
  • 17:32 - 17:35
    testimony. Okay? I, for example, live
  • 17:35 - 17:39
    in Miami, Florida, in the USA, and I am one of the
  • 17:39 - 17:43
    11 experts certified by the legal system
  • 17:43 - 17:48
    in the 11th District. This means that when you
  • 17:48 - 17:50
    go to court, you have to be
  • 17:50 - 17:53
    classified as an expert in order to
  • 17:53 - 17:58
    provide comments and evidence. Otherwise, you will
  • 17:58 - 18:03
    probably not be able to speak in court,
  • 18:03 - 18:04
    as what we say
  • 18:04 - 18:07
    in court is relevant for the case.
  • 18:07 - 18:10
    And with our wording or statement,
  • 18:10 - 18:13
    along with the evidence we provide, we have
  • 18:13 - 18:16
    the ability to put somebody in jail or
  • 18:16 - 18:19
    release this person from jail.
  • 18:19 - 18:23
    So, this is extremely important. Okay? So,
  • 18:23 - 18:26
    evidence storage is one of the most
  • 18:26 - 18:28
    important components. Your opponent in
  • 18:28 - 18:31
    court or in your company will try
  • 18:31 - 18:34
    their best to challenge what you
  • 18:34 - 18:36
    are presenting. So, you have to safely
  • 18:36 - 18:39
    store and protect evidence to maintain
  • 18:39 - 18:42
    its integrity. Integrity is the most
  • 18:42 - 18:45
    important characteristic or
  • 18:45 - 18:48
    consideration in digital forensics--
  • 18:48 - 18:52
    without any other factor coming close. So, integrity
  • 18:52 - 18:55
    is everything in digital forensics. Okay?
  • 18:55 - 18:58
    Data encryption: There are multiple cases
  • 18:58 - 19:00
    in which you will do digital
  • 19:00 - 19:04
    forensics on encrypted storage devices,
  • 19:04 - 19:07
    encrypted data, or encrypted
  • 19:07 - 19:11
    applications. You need to develop the
  • 19:11 - 19:14
    ability to handle encrypted data
  • 19:14 - 19:17
    and understand the encryption methods.
  • 19:17 - 19:19
    Among the publications I have, I have
  • 19:19 - 19:22
    over 25 publications on different
  • 19:22 - 19:25
    topics and concepts within security. A
  • 19:25 - 19:28
    few of them, probably five or six, are
  • 19:28 - 19:31
    specifically about encryption. If we want
  • 19:31 - 19:35
    to do digital forensics, we must become
  • 19:35 - 19:39
    data encryption experts. There is no other
  • 19:39 - 19:41
    way. I understand that many people
  • 19:41 - 19:46
    don’t like math, statistics, physics, etc.,
  • 19:46 - 19:48
    but this is a requirement for doing an
  • 19:48 - 19:50
    appropriate digital forensic assessment.
  • 19:50 - 19:54
    It’s a necessity today. Okay? The other
  • 19:54 - 19:56
    consideration, and this is for the people
  • 19:56 - 19:59
    who love technology like me attending
  • 19:59 - 20:02
    or watching this conference, is network. I
  • 20:02 - 20:04
    am a big fan of networks. I have been
  • 20:04 - 20:08
    working in networking for 41 years.
  • 20:08 - 20:10
    My doctoral degree is in
  • 20:10 - 20:13
    telecommunications and cybersecurity. So,
  • 20:13 - 20:17
    networking is in my DNA. I love networking more than
  • 20:17 - 20:20
    any other topic in information
  • 20:20 - 20:23
    technology. Network analysis is the
  • 20:23 - 20:25
    ability to analyze network
  • 20:25 - 20:29
    traffic logs and data to trace digital
  • 20:29 - 20:31
    footprints. I’m pretty sure
  • 20:31 - 20:34
    everyone has a tool in mind, and, of course,
  • 20:34 - 20:38
    this tool is most likely part of the
  • 20:38 - 20:40
    tools I’m going to
  • 20:40 - 20:42
    provide in the last slide for you.
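    As a toy illustration of tracing digital footprints in traffic logs (a deliberately simplified sketch; real work uses packet captures and dedicated analyzers, and this log format is invented), counting connections per destination might look like:

    ```python
    from collections import Counter

    def top_talkers(log_lines):
        """Count connections per destination from simple 'src -> dst' log lines."""
        counts = Counter()
        for line in log_lines:
            if "->" in line:
                # Destination is the first token after the arrow.
                dst = line.split("->", 1)[1].strip().split()[0]
                counts[dst] += 1
        return counts.most_common()
    ```

    Even this crude frequency count surfaces the hosts an attacker contacted most, which is often the first thread an investigator pulls.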
  • 20:42 - 20:45
    But network analysis today, from a
  • 20:45 - 20:47
    digital forensics standpoint, is
  • 20:47 - 20:50
    everything. Everything is network-related in
  • 20:50 - 20:53
    one or another way. Malware analysis: We
  • 20:53 - 20:56
    need to develop the ability to
  • 20:56 - 20:59
    understand malware behavior
  • 20:59 - 21:03
    and how malware impacts systems.
  • 21:03 - 21:05
    This needs to be incorporated as part of
  • 21:05 - 21:08
    the cybersecurity analysis when
  • 21:08 - 21:11
    performing digital forensics today. Cloud
  • 21:11 - 21:14
    forensics: I don’t have to highlight how
  • 21:14 - 21:17
    important cloud operations are. Okay? We are
  • 21:17 - 21:20
    moving operations to the cloud, and
  • 21:20 - 21:22
    for those still
  • 21:22 - 21:25
    running operations on-premises, there is
  • 21:25 - 21:27
    a high expectation that sooner rather than
  • 21:27 - 21:29
    later, you will move operations to the cloud for
  • 21:29 - 21:31
    multiple conveniences. However, the
  • 21:31 - 21:33
    configuration at this point does not fully
  • 21:33 - 21:37
    benefit all aspects of the cloud. From
  • 21:37 - 21:40
    a forensic standpoint, when you do
  • 21:40 - 21:42
    cloud forensics, the situation is a little
  • 21:42 - 21:45
    different from
  • 21:45 - 21:48
    on-premises investigations. So, you have to
  • 21:48 - 21:51
    adapt methodologies for investigating
  • 21:51 - 21:53
    data in the cloud, regardless of the
  • 21:53 - 21:56
    cloud provider, whether it is
  • 21:56 - 22:00
    AWS, Google, Azure, or anyone else.
  • 22:00 - 22:03
    The operation in the cloud is somehow
  • 22:03 - 22:05
    different from a digital forensics
  • 22:05 - 22:07
    standpoint, starting with how you
  • 22:07 - 22:08
    access the data.
  • 22:08 - 22:13
    Remote forensics: Remote forensics is the opportunity
  • 22:13 - 22:16
    to develop skills for collecting and
  • 22:16 - 22:19
    analyzing data from a remote location.
  • 22:19 - 22:22
    This is happening more frequently now as
  • 22:22 - 22:26
    we become more telework-dependent.
  • 22:26 - 22:29
    In my own company, for example (besides my
  • 22:29 - 22:31
    job with the government, I own my own
  • 22:31 - 22:34
    company), I have been doing more remote
  • 22:34 - 22:36
    digital forensics in the last
  • 22:38 - 22:40
    two or three years
  • 22:40 - 22:42
    than probably ever before in my life. So, this
  • 22:42 - 22:45
    is an important skill to develop as well.
  • 22:45 - 22:48
    Case management: This is how we use
  • 22:48 - 22:50
    digital forensics case management to
  • 22:50 - 22:53
    organize and track investigations. I mentioned to
  • 22:53 - 22:56
    you that I go to court very often--more
  • 22:56 - 23:00
    often than I want, very, very often.
  • 23:00 - 23:04
    Okay. And they scrutinize every
  • 23:04 - 23:06
    single protocol you present, every single
  • 23:06 - 23:09
    artifact, every single document, and the
  • 23:09 - 23:11
    specific chronological order. This is a
  • 23:11 - 23:15
    complex process. It’s not just collecting
  • 23:15 - 23:18
    the data, performing the digital forensics
  • 23:18 - 23:20
    analysis, and going to court to testify.
  • 23:20 - 23:23
    Okay? The process is much more
  • 23:23 - 23:25
    complex than this.
  • 23:25 - 23:27
    Collaboration: Collaborate with other
  • 23:27 - 23:29
    experts, and there's one point in the middle
  • 23:29 - 23:32
    that I'm going to highlight in a moment.
  • 23:32 - 23:34
    Collaborate with other experts, law
  • 23:34 - 23:37
    enforcement, or organizations for complex
  • 23:37 - 23:40
    cases. Cases are different from one another.
  • 23:40 - 23:42
    Of course, this is okay, and I know you
  • 23:42 - 23:45
    know that. Okay? But you have some cases
  • 23:45 - 23:47
    sometimes in which the forensic analysis
  • 23:47 - 23:50
    becomes very complex. In those particular
  • 23:50 - 23:53
    cases, my advice is to collaborate with
  • 23:53 - 23:56
    others. Okay? You do better when you work
  • 23:56 - 23:58
    as part of a team and not when you work
  • 23:58 - 24:01
    independently. I’ll skip the data
  • 24:01 - 24:04
    privacy compliance for a minute because
  • 24:04 - 24:08
    this is relevant. Every single state,
  • 24:08 - 24:09
    every single one, no
  • 24:09 - 24:14
    exception: each state court operates under
  • 24:14 - 24:16
    different requirements. So, you want to
  • 24:16 - 24:19
    make sure that you follow the privacy
  • 24:19 - 24:23
    regulations in your specific place. Okay?
  • 24:23 - 24:25
    And by the way, I'm going to ask you a
  • 24:25 - 24:27
    question. I'm not expecting any response.
  • 24:27 - 24:30
    But the question is: by any chance, do you
  • 24:30 - 24:33
    know the specific digital forensic
  • 24:33 - 24:36
    regulations in the place you live? Ask
  • 24:36 - 24:39
    yourself this question, and probably some
  • 24:39 - 24:42
    of you are going to respond "no." This is a
  • 24:42 - 24:45
    critical thing. Continuous learning: You
  • 24:45 - 24:48
    need to keep learning about what we do. Okay?
  • 24:48 - 24:52
    Cybersecurity is a specialization of IT. From
  • 24:52 - 24:55
    my point of view, it's the most fascinating
  • 24:55 - 24:57
    topic in the world. This is
  • 24:57 - 25:00
    the only topic I can talk about
  • 25:00 - 25:04
    for 25 hours without drinking water.
  • 25:04 - 25:08
    This is my life. I dedicate multiple
  • 25:08 - 25:10
    hours every single day, seven days a week,
  • 25:10 - 25:13
    even when it creates some personal
  • 25:13 - 25:16
    problems with my family, etc. This is in
  • 25:16 - 25:20
    my DNA. I encourage each of you, if you
  • 25:20 - 25:24
    are not doing so, to dedicate your life to
  • 25:24 - 25:27
    become a digital forensics expert. Digital
  • 25:27 - 25:30
    forensics is one of the most fascinating
  • 25:30 - 25:33
    topics on the planet. Okay. And you want
  • 25:33 - 25:37
    to be attentive to these types of things.
  • 25:37 - 25:39
    Report and presentation: When you go to
  • 25:39 - 25:41
    the court or when you present your
  • 25:41 - 25:44
    outcomes of all the digital forensic
  • 25:44 - 25:47
    work to your organization, you want
  • 25:47 - 25:48
    to make sure that you use clear
  • 25:48 - 25:52
    language, you are concise, and you are
  • 25:52 - 25:55
    ready for the presentation's questions and
  • 25:55 - 25:57
    answers. You never want to go to the
  • 25:57 - 25:59
    court unprepared. Okay? Never in your
  • 25:59 - 26:01
    life. This is not appropriate because, at
  • 26:01 - 26:04
    the end of your assessment, you have the
  • 26:04 - 26:08
    possibility to put somebody in jail or
  • 26:08 - 26:09
    somebody will be fired from the
  • 26:09 - 26:12
    organization or not. So what we say is
  • 26:12 - 26:16
    relevant. Our wording has a huge impact
  • 26:16 - 26:19
    on other people's lives. It's important
  • 26:19 - 26:21
    to be attentive to that. One of the most
  • 26:21 - 26:25
    relevant topics that I have been using in
  • 26:25 - 26:28
    my practice is the use of artificial
  • 26:28 - 26:31
    intelligence in digital forensics. I have
  • 26:31 - 26:36
    been using it since 2017, yet it is still
  • 26:36 - 26:39
    not a well-known topic. This is the reason
  • 26:39 - 26:42
    why I really want to share my
  • 26:42 - 26:45
    practical experience with you guys:
  • 26:45 - 26:48
    digital evidence analysis, how artificial
  • 26:48 - 26:52
    intelligence can help us. Well, everybody
  • 26:52 - 26:55
    knows that we have multiple applications
  • 26:55 - 26:58
    that we can use in order to analyze
  • 26:58 - 27:00
    the different kinds of media that can be
  • 27:00 - 27:03
    generated--for example, text, images, and
  • 27:03 - 27:06
    videos. Artificial intelligence systems
  • 27:06 - 27:09
    have the ability to detect and flag
  • 27:09 - 27:11
    potentially relevant content for
  • 27:11 - 27:13
    investigations, especially from the
  • 27:13 - 27:17
    timing standpoint. Digital forensics is
  • 27:17 - 27:20
    extremely time-consuming, very, very
  • 27:20 - 27:23
    time-consuming, and complex. This is
  • 27:23 - 27:27
    probably, along with data recovery, the
  • 27:27 - 27:29
    most complex specialization in
  • 27:29 - 27:33
    cybersecurity. So the use of artificial
  • 27:33 - 27:36
    intelligence, in our favor, is very
  • 27:36 - 27:38
    convenient. And at the end--actually, I have
  • 27:38 - 27:41
    already included it in the list--there is
  • 27:41 - 27:44
    a particular artificial
  • 27:44 - 27:46
    intelligence tool that you can use in
  • 27:46 - 27:49
    your favor. The other use of artificial
  • 27:49 - 27:52
    intelligence is pattern
  • 27:52 - 27:54
    recognition. Artificial intelligence can
  • 27:54 - 27:57
    identify patterns in data, helping
  • 27:57 - 28:00
    investigators recognize anomalies or
  • 28:00 - 28:03
    correlations in digital artifacts that
  • 28:03 - 28:06
    may indicate criminal activity.
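To make the pattern-recognition idea concrete, here is a minimal sketch (not from the talk; the data and function names are hypothetical). It flags hours whose event count deviates sharply from the mean, the simplest form of the anomaly detection the speaker describes:

```python
from statistics import mean, stdev

def flag_anomalies(counts, threshold=2.0):
    """Return indices of counts that deviate from the mean
    by more than `threshold` sample standard deviations."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts)
            if sigma and abs(c - mu) / sigma > threshold]

# Hypothetical hourly login counts parsed from a server log:
hourly_logins = [12, 9, 11, 10, 13, 250, 12, 11]
print(flag_anomalies(hourly_logins))  # → [5]: the hour with 250 logins
```

Production tools use far richer models, but the correlation step the speaker emphasizes reduces to this kind of statistical comparison across artifacts.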
  • 28:06 - 28:08
    Out of the whole sentence, the most
  • 28:08 - 28:12
    important question is: "What is the key word?" The key word is
  • 28:12 - 28:15
    correlation. How do we correlate data by
  • 28:15 - 28:17
    using artificial intelligence? The
  • 28:17 - 28:19
    process is going to be simplified
  • 28:19 - 28:22
    dramatically--I am speaking from my
  • 28:22 - 28:25
    personal experience. The other component is
  • 28:25 - 28:28
    NLP. This can be used to analyze
  • 28:28 - 28:31
    text-based evidence, including logs
  • 28:31 - 28:34
    and emails, to uncover communication
  • 28:34 - 28:37
    patterns or hidden meanings. A lot of
  • 28:37 - 28:40
    evidence that we collect, about
  • 28:40 - 28:44
    65%, is included in emails, chats,
  • 28:44 - 28:48
    documents, etc., so this is where NLP plays
  • 28:48 - 28:50
    a predominant role in artificial intelligence-
  • 28:50 - 28:52
    assisted digital forensic analysis. For image
  • 28:52 - 28:55
    and video analysis, it provides
  • 28:55 - 28:58
    incredible benefits. Okay? You have the
  • 28:58 - 29:00
    ability to analyze multimedia
  • 29:00 - 29:03
    content to identify objects, people, and
  • 29:03 - 29:05
    potentially illegal or
  • 29:05 - 29:08
    sensitive content. I’m sure a word
  • 29:08 - 29:11
    is coming to your mind right now, steganography.
  • 29:11 - 29:14
    Yes, this is related to steganography, but it's
  • 29:14 - 29:18
    not the same as analyzing steganography with a
  • 29:18 - 29:20
    particular application. When you
  • 29:20 - 29:23
    employ artificial intelligence tools
  • 29:23 - 29:25
    that are dedicated exclusively to
  • 29:25 - 29:28
    digital forensics, the benefit is really
  • 29:28 - 29:31
    awesome. Predictive analysis: Machine
  • 29:31 - 29:34
    learning models can predict potential
  • 29:34 - 29:37
    areas of interest in an investigation,
  • 29:37 - 29:40
    guiding forensic experts to focus on
  • 29:40 - 29:42
    critical evidence. Imagine that you are
  • 29:42 - 29:45
    analyzing a one-terabyte hard drive
  • 29:45 - 29:49
    that holds a lot of
  • 29:49 - 29:53
    documents, videos, pictures, sounds, etc. You
  • 29:53 - 29:55
    know that, right? If you are
  • 29:55 - 29:57
    attending this conference, it’s because you
  • 29:57 - 29:59
    are very familiar with information
  • 29:59 - 30:03
    technology, cybersecurity, and digital forensics.
  • 30:03 - 30:07
    Well, how do you find the specific data you
  • 30:07 - 30:09
    need to prove something in a court of
  • 30:09 - 30:12
    law? You have to be very careful
  • 30:12 - 30:15
    about the pieces of data you pick for
  • 30:15 - 30:18
    the analysis, otherwise, your
  • 30:18 - 30:20
    assessment is not appropriate. And again,
  • 30:20 - 30:23
    every single word we say in a court
  • 30:23 - 30:26
    of law or in the organization we
  • 30:26 - 30:30
    are working for is relevant. It implies
  • 30:30 - 30:32
    that probably somebody will be in jail
  • 30:32 - 30:35
    for 30 years, or probably somebody, if we’re
  • 30:35 - 30:38
    talking about a huge crime like an
  • 30:38 - 30:42
    assassination or child pornography abuse,
  • 30:42 - 30:45
    will face consequences like death. Our
  • 30:45 - 30:49
    assessment is critical. Okay? We become
  • 30:49 - 30:52
    the main players when
  • 30:52 - 30:54
    digital forensics is involved. We have to
  • 30:54 - 30:56
    be very careful about the way we do it.
  • 30:56 - 30:59
    This is not a joke; it's very serious. Okay?
  • 30:59 - 31:01
    Predictive analysis: Machine learning
  • 31:01 - 31:04
    models and artificial intelligence are
  • 31:04 - 31:06
    closely related here and can predict
  • 31:06 - 31:08
    potential areas of interest in an
  • 31:08 - 31:11
    investigation. But we also talk about
  • 31:11 - 31:13
    detection. Artificial intelligence-
  • 31:13 - 31:16
    driven security tools can identify
  • 31:16 - 31:18
    cyber threats and potential cybercrime
  • 31:18 - 31:21
    activities, helping law enforcement and cybersecurity
  • 31:21 - 31:24
    teams respond effectively and
  • 31:24 - 31:27
    proactively. More importantly, the
  • 31:27 - 31:30
    majority of us have multiple tools that
  • 31:30 - 31:31
    we call proactive
  • 31:31 - 31:35
    in our place of work. Okay? We
  • 31:35 - 31:38
    have different kinds of monitors, etc. But
  • 31:38 - 31:40
    the possibility to do something in a
  • 31:40 - 31:43
    proactive mode is really what we want.
  • 31:43 - 31:46
    Evidence authentication: Artificial
  • 31:46 - 31:47
    intelligence can assist in the
  • 31:47 - 31:49
    authentication of digital evidence,
  • 31:49 - 31:51
    ensuring its integrity and the
  • 31:51 - 31:54
    possibility of this data being admitted
  • 31:54 - 31:57
    in court. Data recovery: Artificial
  • 31:57 - 32:00
    intelligence helps with the recovery of
  • 32:00 - 32:02
    data that has been deleted
  • 32:02 - 32:05
    intentionally or unintentionally. It
  • 32:05 - 32:07
    doesn't matter. When we do digital
  • 32:07 - 32:11
    forensics, we want to have as much data as
  • 32:11 - 32:15
    we can to make a case
  • 32:15 - 32:18
    against a particular party. From the
  • 32:18 - 32:20
    malware analysis standpoint,
  • 32:20 - 32:23
    artificial intelligence brings a lot of
  • 32:23 - 32:26
    speed, and this is needed because, again,
  • 32:26 - 32:29
    you are looking for a needle in a ton of
  • 32:29 - 32:33
    water or in a ton of sand, and this
  • 32:33 - 32:36
    is very complex. From the network
  • 32:36 - 32:38
    forensic standpoint, we are accustomed to
  • 32:38 - 32:41
    using tools such as Wireshark, which everybody
  • 32:41 - 32:44
    knows, well, anyway,
  • 32:44 - 32:47
    there are now specific artificial
  • 32:47 - 32:49
    intelligence tools for network forensic
  • 32:49 - 32:53
    analysis. I have included two of
  • 32:53 - 32:56
    those tools in the list on the last
  • 32:56 - 32:59
    slide. Automated triage: This is one of the
  • 32:59 - 33:02
    most important things for you to
  • 33:02 - 33:04
    consider with artificial intelligence in
  • 33:04 - 33:08
    digital forensics. Speed is key. It’s basically
  • 33:08 - 33:11
    the ability to do
  • 33:11 - 33:16
    correlation between large data sets. Case
  • 33:16 - 33:18
    priority: Artificial intelligence can
  • 33:18 - 33:20
    assist investigators in
  • 33:20 - 33:24
    prioritizing cases based on factors like
  • 33:24 - 33:26
    severity, potential impact, or resource
  • 33:26 - 33:29
    allocation, meaning timing.
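The case-prioritization idea above can be sketched as a simple scoring function (this is an illustration, not from the talk; the weights and case fields are hypothetical):

```python
def priority_score(case, weights=None):
    """Combine severity, impact, and resource cost into one score.
    Higher score means the case should be handled first."""
    w = weights or {"severity": 0.5, "impact": 0.4, "resources": 0.1}
    return (w["severity"] * case["severity"]
            + w["impact"] * case["impact"]
            - w["resources"] * case["resources"])  # costlier cases rank lower

cases = [
    {"id": "A", "severity": 9, "impact": 8, "resources": 2},
    {"id": "B", "severity": 4, "impact": 6, "resources": 5},
    {"id": "C", "severity": 7, "impact": 9, "resources": 8},
]
# Sort the queue so the highest-scoring case comes first.
queue = sorted(cases, key=priority_score, reverse=True)
print([c["id"] for c in queue])  # → ['A', 'C', 'B']
```

Real AI-assisted triage would learn these weights from historical case data rather than hard-code them.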
  • 33:29 - 33:32
    Predictive policing: This is super important
  • 33:32 - 33:35
    because, until today, digital forensics has
  • 33:35 - 33:38
    always been reactive. We react to
  • 33:38 - 33:41
    something that happened. The possibility to
  • 33:41 - 33:44
    make predictions in digital forensics is
  • 33:44 - 33:47
    fantastic. It has never happened before.
  • 33:47 - 33:49
    This is new, at least for me. I started
  • 33:49 - 33:52
    using artificial intelligence back in my own
  • 33:52 - 33:55
    company in 2017, and I have been able to
  • 33:55 - 33:56
    do that in
  • 33:56 - 33:59
    multiple cases for the police department
  • 33:59 - 34:03
    in Miami and in two other cities in
  • 34:03 - 34:07
    Florida: Tampa and St. Petersburg. The
  • 34:07 - 34:09
    results have been amazing. Document
  • 34:09 - 34:12
    analysis: You know that NLP can extract
  • 34:12 - 34:15
    information from documents and analyze
  • 34:15 - 34:17
    textual content for investigations.
  • 34:17 - 34:19
    Artificial intelligence dramatically minimizes
  • 34:19 - 34:21
    the time needed for that.
  • 34:21 - 34:25
    Emotion recognition: Everybody
  • 34:25 - 34:28
    knows what happened with the DSP
  • 34:28 - 34:32
    algorithms. Okay? So we can use artificial
  • 34:32 - 34:34
    intelligence to analyze videos,
  • 34:34 - 34:38
    which is awesome because our eyes, our
  • 34:38 - 34:40
    muscles in our eyes, don't have the
  • 34:40 - 34:43
    ability to lie. We can lie when we speak,
  • 34:43 - 34:46
    or we can try, but our eyes’ reactions
  • 34:46 - 34:49
    to a particular stimulus cannot be hidden
  • 34:49 - 34:52
    or cannot be modified. So this is unique.
  • 34:52 - 34:54
    From the data privacy and compliance standpoint, you
  • 34:54 - 34:57
    also have the ability to
  • 34:57 - 35:03
    automate the specific data you want to
  • 35:03 - 35:07
    include as part of your report. Okay? Now,
  • 35:07 - 35:09
    digital forensic data acquisition steps:
  • 35:09 - 35:12
    From my standpoint, after 41 years of experience,
  • 35:12 - 35:15
    preservation--we already talked about this.
  • 35:15 - 35:18
    Documentation: Preservation is integrity.
  • 35:18 - 35:21
    Okay? This is the most important
  • 35:21 - 35:24
    consideration, categorically speaking, in
  • 35:24 - 35:26
    any kind of digital forensic
  • 35:26 - 35:28
    investigation. You have to preserve the
  • 35:28 - 35:31
    data as it is. And remember, you never use
  • 35:31 - 35:33
    the original data for your forensic
  • 35:33 - 35:37
    analysis--never. You always use a copy. And
  • 35:37 - 35:40
    to do copies, you have to use bit-by-bit
  • 35:40 - 35:43
    applications. Bit by bit--you cannot just
  • 35:43 - 35:47
    copy bytes, and you cannot copy the data
  • 35:47 - 35:49
    and forget about the information. So,
  • 35:49 - 35:52
    preservation is the most important thing.
  • 35:52 - 35:55
    Documentation: We already know that
  • 35:55 - 35:57
    everything needs to be documented, okay?
  • 35:57 - 36:00
    From the crime scene to the
  • 36:00 - 36:03
    last point. Chain of custody: One more
  • 36:03 - 36:05
    time, and I guess I’m going to
  • 36:05 - 36:07
    mention this one more time because chain
  • 36:07 - 36:10
    of custody opens the door for
  • 36:10 - 36:13
    you to present a case in the court of
  • 36:13 - 36:17
    law or to prove, in
  • 36:17 - 36:20
    your organization, that what you
  • 36:20 - 36:23
    are presenting is appropriate. You have
  • 36:23 - 36:26
    to plan how you are going to collect the
  • 36:26 - 36:29
    data. You have to plan in advance:
  • 36:29 - 36:32
    the specific tools you are going to use
  • 36:32 - 36:35
    and the methods you are going to consider
  • 36:35 - 36:37
    in your data collection process. This is
  • 36:37 - 36:40
    relevant, and you always have to consider it.
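As an illustration of the chain-of-custody record-keeping discussed here (a minimal sketch, not from the talk; the field names are hypothetical), each handling event can be captured as a timestamped entry that is itself hashed, so later tampering with the log is detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

def custody_entry(evidence_id, action, handler, item_sha256):
    """Build one timestamped chain-of-custody record and hash the
    record itself, so any later edit to the entry is detectable."""
    entry = {
        "evidence_id": evidence_id,
        "action": action,            # e.g. "acquired", "transferred"
        "handler": handler,
        "item_sha256": item_sha256,  # hash produced by the imaging step
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Serialize deterministically, then fingerprint the whole entry.
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["entry_sha256"] = hashlib.sha256(payload).hexdigest()
    return entry

record = custody_entry("HDD-001", "acquired", "J. Doe", "ab12cd34")
```

Jurisdiction-specific forms still govern what must be recorded; this only shows how the records can be made tamper-evident.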
  • 36:40 - 36:44
    The cons. The cons are probably more important
  • 36:44 - 36:48
    than the pros when you select or decide to
  • 36:48 - 36:51
    use a particular application for
  • 36:51 - 36:54
    data acquisition. You always want to
  • 36:54 - 36:57
    focus on the negative. People usually
  • 36:57 - 37:00
    tend to talk about the positive--oh, I
  • 37:00 - 37:02
    like Wireshark because of this and that.
  • 37:02 - 37:04
    It's better that you focus on the
  • 37:04 - 37:07
    negative. In Information Technology,
  • 37:07 - 37:10
    everything has pros and cons; no
  • 37:10 - 37:13
    exceptions. Exceptions do not exist. There
  • 37:13 - 37:17
    is not one exception. Everything positive
  • 37:17 - 37:19
    has something negative in Information
  • 37:19 - 37:21
    Technology, and this is what you want to
  • 37:21 - 37:25
    focus on to avoid problems in the end.
  • 37:25 - 37:28
    Okay, so...
  • 37:28 - 37:30
    How about the verification process? You
  • 37:30 - 37:34
    have to verify, before you work with the
  • 37:34 - 37:37
    real data, that the tools and methods you
  • 37:37 - 37:40
    selected work. Okay? You never want to
  • 37:40 - 37:43
    mess with the original data. You need to work
  • 37:43 - 37:45
    with a copy. You want to test in a test
  • 37:45 - 37:48
    environment: your tools, your methods, your
  • 37:48 - 37:50
    approach. The steps you are going to
  • 37:50 - 37:53
    follow are very time-consuming, but,
  • 37:53 - 37:57
    by the way, the work is also very well-paid. It's
  • 37:57 - 37:59
    very well-paid. The only thing I can tell
  • 37:59 - 38:01
    you is that it's very well-paid. You have no
  • 38:01 - 38:04
    idea. If you become a cybersecurity
  • 38:04 - 38:07
    expert and specialize in digital
  • 38:07 - 38:11
    forensics, this is where the money is, and
  • 38:11 - 38:13
    trust me, this is where the money is. Okay,
  • 38:13 - 38:18
    I'm telling you firsthand. Duplication--
  • 38:18 - 38:21
    we've talked about that already. The only way
  • 38:21 - 38:24
    to do that is by creating a bit-for-bit
  • 38:24 - 38:27
    image. There are no other ways. Okay, this
  • 38:27 - 38:30
    is why you want to use write-blocking
  • 38:30 - 38:32
    devices, both software and hardware. I
  • 38:32 - 38:35
    mentioned that before. Checksums and
  • 38:35 - 38:37
    hashing--different concepts that some
  • 38:37 - 38:40
    people are still confused about. Okay?
  • 38:40 - 38:42
    There is a huge difference between the
  • 38:42 - 38:46
    two. The main one is that hashing is a
  • 38:46 - 38:50
    one-way function. You go from the left to
  • 38:50 - 38:52
    the right, and usually, you don't have the
  • 38:52 - 38:54
    ability to come back to replicate the
  • 38:54 - 38:57
    process. Of course, if you have the
  • 38:57 - 38:59
    algorithms on hand, then you can do
  • 38:59 - 39:02
    reverse engineering. This is obvious, but
  • 39:02 - 39:04
    this is not what happens under regular
  • 39:04 - 39:07
    conditions. Okay, so checksum and
  • 39:07 - 39:10
    hashing both minimize the possibility
  • 39:10 - 39:13
    that you make a mistake in your digital
  • 39:13 - 39:16
    forensic analysis.
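The hash-verification step described here can be sketched in a few lines (an illustration, not from the talk; function names are hypothetical). The image file is streamed in chunks so even a large disk image never has to fit in memory:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a (possibly huge) image file through SHA-256."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_image(path, expected_hex):
    """True only when the digests match exactly--no tolerance at all."""
    return sha256_of(path) == expected_hex.lower()
</ ```

Because hashing is a one-way function, matching digests show the copy is identical to what was originally fingerprinted, which is exactly the property courts rely on.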
  • 39:17 - 39:18
    The other component is
  • 39:18 - 39:22
    acquisition. Okay, so how are you going to
  • 39:22 - 39:24
    collect the data? What particular tools
  • 39:24 - 39:26
    are you going to use? You always have to
  • 39:26 - 39:29
    maintain strict read-only access to the
  • 39:29 - 39:32
    source. If you have the ability to
  • 39:32 - 39:35
    manipulate the data in the source, you
  • 39:35 - 39:38
    have the ability to tamper with it. Actually,
  • 39:38 - 39:40
    the most important consideration out of
  • 39:40 - 39:44
    the CIA triad here is integrity. If the
  • 39:44 - 39:47
    opposing party--in your organization or
  • 39:47 - 39:50
    in court, the defendant, in other
  • 39:50 - 39:54
    words--has the ability to prove that
  • 39:54 - 39:57
    the original data or source could have been
  • 39:57 - 39:59
    manipulated in any way, the conversation
  • 39:59 - 40:02
    is 100% over, and the case will be
  • 40:02 - 40:04
    dismissed. Categorically speaking, there's no
  • 40:04 - 40:08
    more conversation. So this is a humongous
  • 40:08 - 40:10
    responsibility when it comes to data
  • 40:10 - 40:13
    acquisition. What protocols you use, what
  • 40:13 - 40:15
    the specific tools are, how you plan it, and
  • 40:15 - 40:17
    how you document it is a very painful
  • 40:17 - 40:21
    process, in other words. Okay, now data
  • 40:21 - 40:24
    recovery--we already talked about the
  • 40:24 - 40:27
    complexity of finding a needle in a ton
  • 40:27 - 40:30
    of sand. This is super complex, okay? But it's
  • 40:30 - 40:34
    doable. The only thing you have to do is use
  • 40:34 - 40:36
    the appropriate tools, and you need
  • 40:36 - 40:38
    to have a specific plan because every
  • 40:38 - 40:42
    single case is 100% different. Digital
  • 40:42 - 40:45
    signatures: Sign the acquired data and
  • 40:45 - 40:48
    its hash with a digital signature for
  • 40:48 - 40:50
    authentication. There are multiple cases
  • 40:50 - 40:54
    today in which handwritten signatures are not
  • 40:54 - 40:57
    accepted anymore. In the government, I
  • 40:57 - 40:59
    am a federal officer for the U.S.
  • 40:59 - 41:02
    Department of Commerce in the USA. In the
  • 41:02 - 41:05
    government, we have not been allowed to sign
  • 41:02 - 41:05
    anything by hand for many years now.
  • 41:05 - 41:08
    Many years, okay? Digital signatures have
  • 41:12 - 41:16
    a specific component that minimizes,
  • 41:16 - 41:18
    dramatically speaking, the possibility of
  • 41:18 - 41:21
    replication, and this is why this is
  • 41:21 - 41:23
    accepted in a court of law.
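The sign-then-verify workflow the speaker describes can be sketched as follows. Note the loud caveat: real digital signatures use asymmetric keys (for example RSA or Ed25519 via a library such as `cryptography`); the standard library's HMAC is used here only as a symmetric stand-in to show the idea of binding an acquired image's hash to a key the examiner controls:

```python
import hashlib
import hmac

def sign_digest(image_digest, key):
    """Bind the image digest to the examiner's key (HMAC stand-in)."""
    return hmac.new(key, image_digest.encode(), hashlib.sha256).hexdigest()

def verify_signature(image_digest, key, tag):
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(sign_digest(image_digest, key), tag)

key = b"examiner-key"   # hypothetical key material
digest = "ab12cd34"     # hypothetical hash of the acquired image
tag = sign_digest(digest, key)
```

Any change to the digest invalidates the tag, which is the replication-resistance property that makes real signatures admissible.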
  • 41:23 - 41:26
    Verification: Verify the integrity of
  • 41:26 - 41:29
    the acquired image by comparing hash values
  • 41:29 - 41:32
    with those calculated before. The hash
  • 41:32 - 41:36
    values must be exact. No difference, not
  • 41:36 - 41:39
    even by
  • 41:39 - 41:43
    0.001%. They must match 100%,
  • 41:43 - 41:47
    categorically speaking. Otherwise, the
  • 41:47 - 41:49
    court is going to dismiss the case as
  • 41:49 - 41:52
    well, or the organization probably isn't
  • 41:52 - 41:55
    going to take the appropriate action against
  • 41:55 - 41:59
    a particular individual or problem or
  • 41:59 - 42:03
    process. Okay, logs and notes--we already talked
  • 42:03 - 42:06
    about documentation at the beginning. You
  • 42:06 - 42:09
    have to actually make sure that
  • 42:09 - 42:12
    everything is timestamped. As I mentioned
  • 42:12 - 42:15
    before at the beginning, digital evidence
  • 42:15 - 42:18
    must be collected in a particular order,
  • 42:18 - 42:21
    analyzed in a similar manner, and
  • 42:21 - 42:25
    presented in the report in the specific
  • 42:25 - 42:28
    order in which the process was done.
  • 42:28 - 42:31
    Otherwise, the process is going to be
  • 42:31 - 42:34
    disqualified, and this is exclusively at
  • 42:34 - 42:37
    this point our responsibility, and
  • 42:37 - 42:42
    nobody else’s. Okay, the storage--we already
  • 42:42 - 42:45
    know that chain of custody is one of the
  • 42:45 - 42:47
    most important components. There are
  • 42:47 - 42:49
    multiple forms, depending on the state in
  • 42:49 - 42:52
    which you live and the country as well,
  • 42:52 - 42:55
    that you have to follow. If you
  • 42:55 - 42:58
    miss a check mark or put a check
  • 42:58 - 43:00
    mark in the wrong place on those forms, you are
  • 43:00 - 43:04
    basically dismissing the case. The
  • 43:04 - 43:07
    court doesn't work in
  • 43:07 - 43:10
    the way many of us believe. Okay, we have
  • 43:10 - 43:12
    the possibility to put somebody in the
  • 43:12 - 43:16
    electric chair or to release
  • 43:16 - 43:19
    that particular individual or
  • 43:19 - 43:22
    organization, so what we say is relevant.
  • 43:22 - 43:24
    Okay? This is very important. The brief--
  • 43:24 - 43:26
    you always have to be in
  • 43:26 - 43:30
    communication with all parties, both the
  • 43:30 - 43:32
    one presenting or ruling on the digital
  • 43:32 - 43:35
    forensic process and the other party as
  • 43:35 - 43:40
    well. You cannot hide anything--zero--from
  • 43:40 - 43:42
    your opponents in the court of law or
  • 43:42 - 43:45
    for the defendant's part. Never in your
  • 43:45 - 43:48
    life. This is why the first bullet in the
  • 43:48 - 43:50
    whole presentation was, as you may
  • 43:50 - 43:54
    remember, ethics. Okay, in digital forensics,
  • 43:54 - 43:57
    we provide what we know to the other
  • 43:57 - 44:00
    parties as well, even to the defendant, to
  • 44:00 - 44:03
    the opponents, every single time. No
  • 44:03 - 44:07
    exception. And we provide every single
  • 44:07 - 44:10
    artifact with the most clear possible
  • 44:10 - 44:12
    explanation to the opponents. This is how
  • 44:12 - 44:15
    the digital forensic process works.
  • 44:15 - 44:18
    Otherwise, it will be dismissed as well
  • 44:18 - 44:21
    in court. Storage: You have to make
  • 44:21 - 44:24
    sure that every single piece of digital
  • 44:24 - 44:27
    evidence is
  • 44:27 - 44:31
    properly stored and that you follow the
  • 44:31 - 44:33
    process by the book. Again, if you skip
  • 44:33 - 44:37
    one step, just one out of 100 or 200 steps,
  • 44:37 - 44:40
    depending on the case, the case is going
  • 44:40 - 44:43
    to be dismissed. No exceptions. The court
  • 44:43 - 44:46
    goes by the book, as you can imagine, and
  • 44:46 - 44:48
    your opponent is going to be very
  • 44:48 - 44:50
    attentive to the smallest possible
  • 44:50 - 44:54
    failure to dismiss the case. Okay, so how do
  • 44:54 - 44:56
    you transport the data from one place to
  • 44:56 - 44:59
    the other place? Chain of custody--this is
  • 44:59 - 45:03
    the key component. Chain of custody, data
  • 45:03 - 45:06
    encryption--you have to make sure that
  • 45:06 - 45:10
    you prevent--proactively prevent--
  • 45:10 - 45:13
    integrity manipulation, and you always
  • 45:13 - 45:16
    want to ensure the confidentiality of the
  • 45:16 - 45:19
    data (CIA). We already talked about the
  • 45:19 - 45:22
    components: confidentiality, integrity, and
  • 45:22 - 45:23
    availability. From the digital forensic
  • 45:23 - 45:26
    standpoint, the most important--no
  • 45:26 - 45:30
    exception--is integrity, and also
  • 45:30 - 45:32
    confidentiality. Okay, so from the
  • 45:32 - 45:35
    recovery image standpoint, you always
  • 45:35 - 45:38
    want to have a duplicate for validation
  • 45:38 - 45:41
    and reanalysis. Remember that you
  • 45:41 - 45:44
    always want to work with a copy of the
  • 45:44 - 45:48
    digital evidence 100% of the time, no exceptions.
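The bit-for-bit duplication described above can be sketched like this (an illustration, not from the talk; in practice dedicated imaging tools and hardware write blockers do this work). The source is hashed while copying, and the copy is re-read from disk so the two digests prove the duplicate is truly identical:

```python
import hashlib

def bit_for_bit_copy(src_path, dst_path, chunk_size=1 << 20):
    """Copy the source byte for byte, then re-read the copy from disk
    and confirm its SHA-256 matches the source before returning it."""
    src_digest = hashlib.sha256()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        for chunk in iter(lambda: src.read(chunk_size), b""):
            src_digest.update(chunk)
            dst.write(chunk)
    # Hash the destination as written to disk, not the in-memory chunks.
    dst_digest = hashlib.sha256()
    with open(dst_path, "rb") as dst:
        for chunk in iter(lambda: dst.read(chunk_size), b""):
            dst_digest.update(chunk)
    if src_digest.hexdigest() != dst_digest.hexdigest():
        raise IOError("working copy does not match the source")
    return src_digest.hexdigest()
```

All subsequent analysis then happens on `dst_path`, never on the original.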
  • 45:48 - 45:51
    You have to preserve the original
  • 45:51 - 45:53
    evidence. This is part of our
  • 45:53 - 45:58
    responsibility, and this is why we do bit-by-bit
  • 45:58 - 46:00
    analysis and bit-by-bit copy. It's
  • 46:00 - 46:04
    complex. Okay, now a specific step in
  • 46:04 - 46:06
    digital forensics is to analyze the
  • 46:06 - 46:09
    collected data. At this point, you already
  • 46:09 - 46:11
    went through multiple processes and spent
  • 46:11 - 46:14
    a lot of time. How do you analyze the
  • 46:14 - 46:16
    data you have? Because you are going to
  • 46:16 - 46:19
    have probably terabytes of data. Okay,
  • 46:19 - 46:24
    well, you have to make sure that hashing
  • 46:24 - 46:27
    and digital signatures and the chain
  • 46:27 - 46:31
    of custody have been followed. Data
  • 46:31 - 46:34
    prioritization--what happens and what is
  • 46:34 - 46:36
    more relevant? You cannot present in the
  • 46:36 - 46:39
    court two terabytes of data or 2,000
  • 46:39 - 46:42
    pages. This is irrelevant for the case.
  • 46:42 - 46:44
    Okay, you have to make sure that you use
  • 46:44 - 46:47
    keywords in order to provide a solid
  • 46:47 - 46:50
    report to the court for this particular
  • 46:50 - 46:53
    case. For the keywords, artificial
  • 46:53 - 46:56
    intelligence has been proven to be
  • 46:56 - 46:59
    of huge help. File carving--you have to
  • 46:59 - 47:02
    use a specialized tool to recover files
  • 47:02 - 47:05
    that may have been deleted or are
  • 47:05 - 47:09
    intentionally hidden. Timeline analysis--
  • 47:09 - 47:11
    we talked about it. You have to do everything
  • 47:11 - 47:14
    by following a particular sequence of
  • 47:14 - 47:17
    activities. In other words, you have to
  • 47:17 - 47:19
    present and do the analysis in
  • 47:19 - 47:21
    chronological order, in the way that you
  • 47:21 - 47:24
    collect the data. This is the exact way
  • 47:24 - 47:26
    you do the analysis, and later you do
  • 47:26 - 47:28
    correlation. Okay, but you have to follow
  • 47:28 - 47:31
    a particular chronological order. Data
  • 47:31 - 47:33
    recovery--you have to do your best to
  • 47:33 - 47:36
    reconstruct the data that has been
  • 47:36 - 47:39
    deleted or probably damaged, even by
  • 47:39 - 47:41
    physical or electronic conditions in the
  • 47:41 - 47:44
    storage media. The metadata analysis is
  • 47:44 - 47:46
    also complex. Okay, this is the next
  • 47:46 - 47:49
    component after the timeline
  • 47:49 - 47:52
    analysis. Metadata includes multiple kinds
  • 47:52 - 47:55
    of data, so this part of the analysis is
  • 47:55 - 47:58
    going to be more complex and more time-consuming
  • 47:58 - 48:00
    than the data collection, and
  • 48:00 - 48:03
    the data collection is already very time-consuming.
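A minimal sketch of the metadata-collection step just described (illustrative only; real cases also extract embedded metadata such as EXIF or Office document properties with specialized tools):

```python
import os
from datetime import datetime, timezone

def file_metadata(path):
    """Collect basic filesystem metadata (MAC times, size) for one file."""
    st = os.stat(path)
    iso = lambda t: datetime.fromtimestamp(t, timezone.utc).isoformat()
    return {
        "path": path,
        "size_bytes": st.st_size,
        "modified": iso(st.st_mtime),
        "accessed": iso(st.st_atime),
        "changed": iso(st.st_ctime),  # inode-change time on Unix
    }
```

Timestamps are normalized to UTC so entries from different machines can be placed on one timeline.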
  • 48:03 - 48:05
    Content analysis--you have to
  • 48:05 - 48:06
    be very careful because this is
  • 48:06 - 48:09
    basically what the forensic analysis is.
  • 48:09 - 48:12
    Pattern recognition--how you
  • 48:12 - 48:16
    can match one bit of data with another
  • 48:16 - 48:19
    bit. Okay? Is there any association
  • 48:19 - 48:23
    between bits, between bytes, between data,
  • 48:23 - 48:27
    between words? This is a key
  • 48:27 - 48:29
    component. Communication analysis--again,
  • 48:29 - 48:31
    you want to make sure that you include
  • 48:31 - 48:35
    everything. Emails today are probably the
  • 48:35 - 48:38
    most relevant component of digital
  • 48:38 - 48:40
    forensics analysis. You want to make sure
  • 48:40 - 48:43
    that you master email analysis as well.
  • 48:43 - 48:46
    Data encryption--you always have to keep
  • 48:46 - 48:48
    in mind the confidentiality, and when we
  • 48:48 - 48:51
    are talking about the recovery or the
  • 48:51 - 48:53
    recovery image, I mentioned that as well,
  • 48:53 - 48:56
    similar to the chain of custody before,
  • 48:56 - 48:58
    because you always have to preserve
  • 48:58 - 49:01
    the original data. Evidence
  • 49:01 - 49:03
    examination--you want to make sure that
  • 49:03 - 49:06
    you verify the integrity of the data you
  • 49:06 - 49:09
    have been acquiring, including hash values,
  • 49:09 - 49:11
    digital signatures, and the chain of
  • 49:11 - 49:14
    custody. We talked about this already.
  • 49:14 - 49:17
    This is a repeat of the slide, by the way.
  • 49:17 - 49:20
    Okay, so database examination--you're
  • 49:20 - 49:24
    looking at a duplicate slide, so this slide
  • 49:24 - 49:28
    is the same as the previous one. My apologies
  • 49:28 - 49:31
    for that, it's my fault. Database
  • 49:31 - 49:33
    examination--investigate databases for
  • 49:33 - 49:35
    valuable information, including
  • 49:35 - 49:39
    structured data, logs, entries, etc.
  • 49:39 - 49:41
    Media analysis--this is a very complex
  • 49:41 - 49:44
    process because it usually involves
  • 49:44 - 49:47
    steganography, and this is about
  • 49:47 - 49:50
    images, videos, audios, geolocation, and
  • 49:50 - 49:52
    digital signatures. Network traffic
  • 49:52 - 49:56
    analysis uses tools such as Wireshark--but my
  • 49:56 - 49:59
    suggestion is that you use all the tools
  • 49:59 - 50:02
    that are part of the artificial
  • 50:02 - 50:05
    intelligence applications we can use
  • 50:05 - 50:07
    today and are available in the
  • 50:07 - 50:11
    market. Steganography is always complex, okay?
  • 50:11 - 50:14
    Because it includes not only images but,
  • 50:14 - 50:17
    in many cases, audio as well, and this is
  • 50:17 - 50:20
    very complex and time-consuming. You always
  • 50:20 - 50:22
    want to make sure that you use the
  • 50:22 - 50:24
    appropriate steganography analysis techniques,
  • 50:24 - 50:27
    and there are multiple specific techniques.
  • 50:27 - 50:30
    Volatile analysis: As I mentioned before,
  • 50:30 - 50:33
    there are multiple ways to do
  • 50:33 - 50:38
    data acquisition from RAM memory. When we
  • 50:38 - 50:41
    turn off the computer, all the data in
  • 50:41 - 50:44
    RAM is gone--this is what
  • 50:44 - 50:47
    everybody says. This is what Google says.
  • 50:47 - 50:49
    This is what people who have never done
  • 50:49 - 50:52
    forensic investigations repeat. This is
  • 50:52 - 50:55
    not accurate if you know how to do it.
  • 50:55 - 50:57
    Again, I made a presentation for EC-
  • 50:57 - 51:00
    Council in 2019. If you Google my name and
  • 51:00 - 51:03
    this presentation, you will be able to
  • 51:03 - 51:06
    find a particular video in which I was
  • 51:06 - 51:08
    able to recover data from RAM memory
  • 51:08 - 51:12
    after the computer was taken
  • 51:12 - 51:15
    down. Believe it or not, look for that other
  • 51:15 - 51:17
    presentation in the EC-Council
  • 51:17 - 51:19
    database, and you will be able to see the
  • 51:19 - 51:22
    video. Okay, comparison--you have to
  • 51:22 - 51:24
    cross-reference every single time to
  • 51:24 - 51:27
    make sure that the data you identify is
  • 51:27 - 51:30
    appropriate, and you always identify
  • 51:30 - 51:34
    deviations and inconsistencies
  • 51:34 - 51:35
    before you do the final
  • 51:35 - 51:38
    report. I told you already, when you
  • 51:38 - 51:41
    present the report in the court of law,
  • 51:41 - 51:44
    any minor mistake, anything minimal,
  • 51:44 - 51:47
    will disqualify the case. For
  • 51:47 - 51:50
    example, in this presentation, I included
  • 51:50 - 51:53
    IED by mistake. This slide and this slide.
  • 51:53 - 51:56
    If I do that in the court of law,
  • 51:56 - 51:58
    it's dismissed.
  • 51:58 - 52:00
    Okay, that's it. There's no more
  • 52:00 - 52:02
    conversation. The emotion analysis--we've
  • 52:02 - 52:05
    talked about that. We are talking
  • 52:05 - 52:08
    about persons. Digital evidence is always
  • 52:08 - 52:12
    related to people and processes,
  • 52:12 - 52:15
    applications, hardware, and software. So we
  • 52:15 - 52:18
    want to make sure that what we present
  • 52:18 - 52:20
    is accurate. And regarding documentation,
  • 52:20 - 52:23
    which at one point was the second point in
  • 52:23 - 52:25
    the presentation: we have to document
  • 52:25 - 52:28
    everything. Reporting is about compiling
  • 52:28 - 52:32
    in a clear and comprehensive manner,
  • 52:32 - 52:34
    including summaries, methodologies, and
  • 52:34 - 52:36
    supporting evidence. You have to include,
  • 52:36 - 52:39
    or at least in my case, I always include
  • 52:39 - 52:42
    the recordings of everything I do.
  • 52:42 - 52:44
    Everything means even if I open my
  • 52:44 - 52:46
    personal email, or if a notification comes
  • 52:46 - 52:49
    to my computer and I open something in
  • 52:49 - 52:53
    my WhatsApp, for example, this is
  • 52:53 - 52:56
    part of the recording as well. Okay? So,
  • 52:56 - 52:58
    you have to make sure that you provide
  • 52:58 - 53:01
    an expert testimony. In order to do that,
  • 53:01 - 53:02
    you have to be an expert in digital forensics.
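The "record everything you do" practice above can be made concrete with a hash-anchored action log. This is a minimal sketch of my own, not a tool from the talk, and the examiner name and actions are hypothetical placeholders:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_action(log: list, examiner: str, action: str, evidence: bytes) -> dict:
    """Append a timestamped record of one examiner action, anchored to the
    SHA-256 of the evidence so later tampering is detectable."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "examiner": examiner,
        "action": action,
        "evidence_sha256": hashlib.sha256(evidence).hexdigest(),
    }
    log.append(entry)
    return entry

# Hypothetical usage: two actions against the same (toy) evidence bytes.
custody_log = []
log_action(custody_log, "J. Doe", "acquired disk image", b"raw image bytes")
log_action(custody_log, "J. Doe", "opened image read-only", b"raw image bytes")
print(json.dumps(custody_log, indent=2))
```

Because both entries hash the same bytes, their `evidence_sha256` values match, which is exactly the property a court will ask you to demonstrate.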
  • 53:02 - 53:06
    Peer review--consult with others,
  • 53:06 - 53:08
    with your partners, with the opposing side,
  • 53:08 - 53:11
    with the defense, before you
  • 53:11 - 53:12
    present. It's not that you are going to
  • 53:12 - 53:15
    modify the report because the defendant
  • 53:15 - 53:17
    doesn't like it. This is not what I'm
  • 53:17 - 53:19
    telling you; it's just that you are going
  • 53:19 - 53:21
    to provide the report. By the way, you
  • 53:21 - 53:24
    must provide the report to the defendant
  • 53:24 - 53:27
    before you go to court. By the time
  • 53:27 - 53:28
    you stand up in court, everything
  • 53:28 - 53:30
    needs to be done. The other party needs to
  • 53:30 - 53:33
    know exactly what you are going to
  • 53:33 - 53:35
    present. This is how the legal systems
  • 53:35 - 53:38
    work. Okay, with the exception of very few
  • 53:38 - 53:41
    countries, but in the world, this is how
  • 53:41 - 53:44
    it works. So, the quality assurance is just
  • 53:44 - 53:46
    making sure that what you present is
  • 53:46 - 53:49
    appropriate. The case management is how
  • 53:49 - 53:51
    you use the digital forensic
  • 53:51 - 53:54
    system to track everything in the analysis
  • 53:54 - 53:56
    process. And regarding data privacy
  • 53:56 - 53:59
    compliance: I told you already that every
  • 53:59 - 54:00
    single place, every single city, and every
  • 54:00 - 54:03
    single state operates under different
  • 54:03 - 54:05
    conditions. Popular tools for digital
  • 54:05 - 54:09
    forensics: a few of those are EnCase,
  • 54:09 - 54:12
    Autopsy, AccessData, and everybody knows how to use
  • 54:12 - 54:15
    their Forensic Toolkit (FTK). X-Ways Forensics,
  • 54:15 - 54:18
    Cellebrite, Volatility, Wireshark,
  • 54:18 - 54:21
    which everybody most likely knows, Oxygen
  • 54:21 - 54:23
    Forensic Detective, and the Digital
  • 54:23 - 54:25
    Evidence & Forensics Toolkit (DEFT). Some
  • 54:25 - 54:28
    of those are included in Kali, others are
  • 54:28 - 54:31
    not. Some are open source, and others are
  • 54:31 - 54:34
    extremely expensive. For example, EnCase,
  • 54:34 - 54:37
    which is very, very expensive. Some
  • 54:37 - 54:39
    relevant references about digital
  • 54:39 - 54:43
    forensics: I prefer to use keywords and
  • 54:43 - 54:46
    not particular references or books
  • 54:46 - 54:49
    because I don't recommend any specific
  • 54:49 - 54:52
    book. Instead, the combination of content,
  • 54:52 - 54:54
    knowledge, and expertise is key. But some
  • 54:54 - 54:56
    words or keywords you can use if you
  • 54:56 - 54:59
    want to expand more in digital forensics
  • 54:59 - 55:02
    are: digital forensics best practices,
  • 55:02 - 55:05
    challenges, mobile digital forensics,
  • 55:05 - 55:07
    network forensic techniques, cloud
  • 55:07 - 55:10
    forensic investigations, Internet of
  • 55:10 - 55:13
    Things forensics, memory forensic analysis,
  • 55:13 - 55:15
    because you want to stop repeating what
  • 55:15 - 55:17
    you have been learning for years. When
  • 55:17 - 55:19
    you take down the computer, when the
  • 55:19 - 55:21
    computer is turned
  • 55:21 - 55:24
    off, there is a lot of data that
  • 55:24 - 55:27
    remains in RAM memory for a particular
  • 55:27 - 55:31
    amount of time, of course. Okay, so try to
  • 55:31 - 55:33
    expand on this topic: malware analysis in
  • 55:33 - 55:35
    digital forensics, and cybersecurity and
  • 55:35 - 55:38
    digital forensics trends. Those are
  • 55:38 - 55:41
    keywords that will help you
  • 55:41 - 55:44
    expand your digital
  • 55:44 - 55:48
    forensics knowledge. Other
  • 55:48 - 55:51
    considerations are some particular
  • 55:51 - 55:54
    journals. Okay, in this case, I’m going
  • 55:54 - 55:57
    to risk and recommend Digital
  • 55:57 - 56:00
    Investigation, which is published by Elsevier. It
  • 56:00 - 56:02
    is one of the top journals in the world. The other
  • 56:02 - 56:05
    one is the Journal of Digital Forensics,
  • 56:05 - 56:08
    Security and Law, and Forensic Science
  • 56:08 - 56:10
    International: Digital Investigation.
  • 56:13 - 56:16
    I'm open to any questions you may
  • 56:16 - 56:19
    have, and one more time,
  • 56:19 - 56:22
    I want to sincerely thank
  • 56:22 - 56:25
    EC-Council for another opportunity
  • 56:25 - 56:28
    to talk about this fascinating topic.
  • 56:28 - 56:30
    Thank you very much to all the staff at
  • 56:30 - 56:34
    EC-Council that worked tirelessly to make
  • 56:34 - 56:37
    this presentation possible. And
  • 56:37 - 56:39
    thank you so much as well to you guys
  • 56:39 - 56:41
    for attending the conference and
  • 56:41 - 56:44
    for the questions that you may ask.
  • 56:45 - 56:48
    Thank you very much, Dr. Luis, for
  • 56:48 - 56:49
    such an insightful and informative
  • 56:49 - 56:51
    session. That was really a very
  • 56:51 - 56:53
    interesting webinar, and we hope it was
  • 56:53 - 56:55
    worth your time too. Now, before we
  • 56:55 - 56:57
    begin with the Q&A, I would like to
  • 56:57 - 57:01
    inform all the attendees that EC-Council's
  • 57:01 - 57:03
    CHFI maps to the roles of forensic
  • 57:03 - 57:05
    investigator and consultant in digital
  • 57:05 - 57:08
    forensics. Anyone with the CHFI
  • 57:08 - 57:10
    certification is eligible for 4,000+
  • 57:10 - 57:12
    job vacancies globally, with an average
  • 57:12 - 57:14
    salary of $95,000.
  • 57:14 - 57:15
    If you're interested in learning
  • 57:15 - 57:17
    more, kindly take part in the poll that's
  • 57:17 - 57:19
    going to be conducted now. Let us know
  • 57:19 - 57:20
    your preferred mode of training, and we
  • 57:20 - 57:22
    will reach out to you soon.
  • 57:27 - 57:29
    Dr. Luis, shall we start with the Q&A?
  • 57:29 - 57:32
    Yes, I'm ready.
  • 57:32 - 57:35
    Okay, our first question is: How do you
  • 57:35 - 57:39
    prove in a court of law that the collected
  • 57:39 - 57:41
    evidence is from the same object and not
  • 57:41 - 57:43
    collected from any other object?
  • 57:43 - 57:46
    This is a very important question.
  • 57:46 - 57:49
    I really appreciate the clarification on
  • 57:49 - 57:52
    this topic. As I said, we have to be very
  • 57:52 - 57:54
    careful about the way we collect the
  • 57:54 - 57:56
    data. When we are talking about objects,
  • 57:56 - 58:00
    objects are associated with bits, not just
  • 58:00 - 58:02
    bytes, but bits. And as I mentioned
  • 58:02 - 58:06
    multiple times, when we do the copy of
  • 58:06 - 58:09
    the original data, we want to make sure
  • 58:09 - 58:12
    that we always do it bit by bit. When you do it
  • 58:12 - 58:17
    bit by bit, and not byte by byte (a bit
  • 58:17 - 58:22
    implies up to 3.4 volts of electricity),
  • 58:22 - 58:24
    we are eliminating the possibility of
  • 58:24 - 58:28
    mistakes. Objects are bigger; a bit does not
  • 58:28 - 58:31
    constitute an object. Objects are formed
  • 58:31 - 58:34
    by multiple bits. This is why we have to
  • 58:34 - 58:37
    do the analysis bit by bit, and I have
  • 58:37 - 58:39
    mentioned that multiple times.
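In practice, the "exact copy" requirement above is usually proven by hashing: the acquired image must produce the same digest as the source. Here is a minimal sketch of that verification step, my own illustration rather than the speaker's tooling; real acquisitions use hardware write blockers and imagers such as dd or FTK Imager:

```python
import hashlib
import io

def image_and_verify(src, dst, chunk_size=1 << 20):
    """Copy a source stream to dst chunk by chunk, then re-read the copy and
    compare SHA-256 digests of both sides to prove the image is exact."""
    h_src = hashlib.sha256()
    while True:
        block = src.read(chunk_size)
        if not block:
            break
        h_src.update(block)
        dst.write(block)
    dst.seek(0)
    h_dst = hashlib.sha256()
    while True:
        block = dst.read(chunk_size)
        if not block:
            break
        h_dst.update(block)
    return h_src.hexdigest(), h_dst.hexdigest()

# Toy example: in-memory streams stand in for a source device and an image file.
source = io.BytesIO(b"\x00\xff" * 1024)
image = io.BytesIO()
src_hash, img_hash = image_and_verify(source, image)
assert src_hash == img_hash  # acquisition verified: the copy matches the original
```

Matching digests are what lets you testify that the evidence analyzed is the same object that was collected, which is the point of the question above.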
  • 58:42 - 58:44
    Thank you for answering that
  • 58:44 - 58:47
    question. Our next question is: What kind
  • 58:47 - 58:49
    of forensic data can we obtain from the
  • 58:49 - 58:51
    encrypted data where the key is not
  • 58:51 - 58:54
    available to decrypt the data?
  • 58:54 - 58:57
    Could you please repeat the question?
  • 58:59 - 59:02
    What kind of forensic data can
  • 59:02 - 59:04
    be obtained from the encrypted data
  • 59:04 - 59:06
    where the key is not available to
  • 59:06 - 59:07
    decrypt the data?
  • 59:09 - 59:13
    You encrypt data...
  • 59:13 - 59:16
    I'll just paste the question to you
  • 59:16 - 59:19
    in the chat, Dr. Luis.
  • 59:19 - 59:23
    I’m not watching the chat right now.
  • 59:23 - 59:25
    Something happened.
  • 59:28 - 59:30
    I'm not watching the chat.
  • 59:30 - 59:35
    Sorry. Hello?
  • 59:35 - 59:36
    Can you hear me?
  • 59:36 - 59:40
    Yes, I can hear you. Yes, I have posted
  • 59:40 - 59:43
    the question in the chat, Dr. Luis. Okay.
  • 59:43 - 59:47
    Okay, please. Yes, I have already pasted it.
  • 59:47 - 59:50
    Okay, let me check here.
  • 59:56 - 60:00
    Okay, give me a second. Okay, what kind of
  • 60:00 - 60:01
    forensic data can be obtained from
  • 60:01 - 60:05
    encrypted data? Oh, okay. Well, this is
  • 60:05 - 60:07
    another misperception. Everybody
  • 60:07 - 60:10
    knows that when the data is encrypted, we
  • 60:10 - 60:13
    cannot open the data or the particular
  • 60:13 - 60:16
    file, document, video, any kind of digital
  • 60:16 - 60:19
    forensics data. Let me tell you something:
  • 60:19 - 60:21
    There are multiple forensic tools that
  • 60:21 - 60:24
    have the ability to decrypt the data
  • 60:24 - 60:26
    even when we don’t have the key. This is, and
  • 60:26 - 60:29
    I understand the key component, and I
  • 60:29 - 60:30
    understand the two types of
  • 60:30 - 60:33
    encryption--symmetric and asymmetric.
  • 60:33 - 60:35
    As I said, I have multiple publications
  • 60:35 - 60:36
    about encryption.
  • 60:36 - 60:40
    But there is most likely
  • 60:40 - 60:44
    always the possibility to decrypt data
  • 60:44 - 60:47
    without having the encryption key. I
  • 60:47 - 60:50
    understand that it doesn’t sound
  • 60:50 - 60:52
    popular; and it’s not what we hear every
  • 60:52 - 60:55
    single time, but when we specialize
  • 60:55 - 60:59
    in digital forensics, we usually have the
  • 60:59 - 61:02
    tools we need to decrypt the data,
  • 61:02 - 61:04
    especially if you are using artificial
  • 61:04 - 61:07
    intelligence. Also, in the government, at
  • 61:07 - 61:09
    least in the U.S. government, in
  • 61:09 - 61:12
    my operation, in the operation I direct, I
  • 61:12 - 61:15
    handle, and I supervise, we have been using
  • 61:15 - 61:16
    artificial intelligence for multiple
  • 61:16 - 61:20
    things in cybersecurity since 2017.
  • 61:20 - 61:22
    We are also using Quantum
  • 61:22 - 61:25
    Computing. Quantum Computing is not
  • 61:25 - 61:29
    coming; quantum computers have been in use in the
  • 61:29 - 61:32
    U.S. government for years now. So, we have been
  • 61:32 - 61:35
    using Quantum Computing for years. There
  • 61:35 - 61:37
    are multiple ways to decrypt the data
  • 61:37 - 61:41
    when the encryption key is not available.
  • 61:41 - 61:43
    Multiple ways, multiple applications as
  • 61:43 - 61:45
    well, help with the process. It's
  • 61:45 - 61:48
    very time-consuming, but there is a
  • 61:48 - 61:51
    possibility for that. And this is a great
  • 61:51 - 61:53
    question because the question is, "Okay,
  • 61:53 - 61:56
    what if the hard drive is encrypted?
  • 61:56 - 61:58
    There's nothing I can do, right?" No,
  • 61:58 - 62:00
    this is not like that. There are always
  • 62:00 - 62:02
    ways to decrypt the data. Always. It
  • 62:02 - 62:05
    doesn't matter how strong the encryption
  • 62:05 - 62:07
    is, but you need to have the appropriate
  • 62:07 - 62:10
    tools in place. For example, I'm going to
  • 62:10 - 62:13
    mention just one: EnCase. When I presented
  • 62:13 - 62:17
    some tools that I suggest earlier, I
  • 62:17 - 62:21
    said that EnCase is very expensive.
  • 62:21 - 62:24
    EnCase does "magic" (in quotation marks).
  • 62:24 - 62:26
    EnCase does multiple things that we don't
  • 62:26 - 62:29
    learn in school. Okay.
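Even when decryption is out of reach, an examiner can still characterize encrypted material; a common first step is measuring byte entropy, since strongly encrypted or compressed data looks nearly uniform. A minimal sketch of that check (my illustration, not a tool from the talk):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; values near 8.0 suggest
    encrypted or compressed content, low values suggest plaintext."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    ent = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return abs(ent)  # guard against -0.0 from the sum

print(shannon_entropy(b"AAAA" * 64))           # 0.0: a single repeated byte
print(shannon_entropy(bytes(range(256)) * 4))  # 8.0: uniformly distributed bytes
```

Entropy alone won't recover plaintext, but it tells you whether a region is worth attacking and can help locate unencrypted remnants (headers, slack space) around an encrypted volume.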
  • 62:29 - 62:32
    So, I can see the other question
  • 62:32 - 62:34
    here: How to adapt to investigation in
  • 62:34 - 62:36
    the cloud, since the cloud providers do
  • 62:36 - 62:38
    not allow most important operations to
  • 62:38 - 62:42
    access media? When you have to do a case
  • 62:42 - 62:45
    or conduct digital forensics in the cloud,
  • 62:45 - 62:49
    the cloud providers, 99% of the time (I
  • 62:49 - 62:51
    don’t want to say 100% because I
  • 62:51 - 62:53
    don’t want to risk it),
  • 62:53 - 62:56
    Cloud providers include in the SLA
  • 62:56 - 62:59
    (Service Level Agreement) what is
  • 62:59 - 63:02
    going to happen if a digital forensic or
  • 63:02 - 63:04
    any kind of investigation
  • 63:04 - 63:08
    needs to be performed in the cloud space.
  • 63:08 - 63:11
    So, most likely, the cloud operator is
  • 63:11 - 63:14
    going to facilitate access to everything
  • 63:14 - 63:16
    you need. Sometimes, you have to move and
  • 63:16 - 63:19
    go physically to the place where the
  • 63:19 - 63:21
    data is hosted.
  • 63:21 - 63:23
    Don't believe that the cloud
  • 63:23 - 63:26
    provider doesn’t know where the data is
  • 63:26 - 63:29
    hosted. We know where the data is hosted.
  • 63:29 - 63:31
    Specifically, I have been in San Diego,
  • 63:31 - 63:34
    California, and in other states, and in Hawaii
  • 63:34 - 63:36
    back in 2019
  • 63:36 - 63:38
    as well, doing forensic
  • 63:38 - 63:41
    investigations in a cloud environment. It
  • 63:41 - 63:44
    was actually for something government-related,
  • 63:44 - 63:46
    and I was given the permission I
  • 63:46 - 63:49
    needed to do any kind of investigation. So,
  • 63:49 - 63:52
    cloud providers facilitate forensic
  • 63:52 - 63:55
    analysis because forensic analysis is
  • 63:55 - 63:58
    usually related to legal cases. There are
  • 63:58 - 64:01
    multiple cases in which, in the USA, we don’t
  • 64:01 - 64:03
    have access to this data and I'm going
  • 64:03 - 64:07
    to mention an example: TikTok.
  • 64:07 - 64:09
    The problem between the U.S. government
  • 64:09 - 64:12
    and TikTok is that when TikTok got the
  • 64:12 - 64:15
    authorization to operate in the USA, the
  • 64:15 - 64:19
    government was one step behind.
  • 64:19 - 64:21
    Okay, and we don’t regulate TikTok at
  • 64:21 - 64:25
    this point. TikTok has the ability to
  • 64:25 - 64:28
    prevent forensic investigations on the
  • 64:28 - 64:31
    TikTok platform for the U.S. government or
  • 64:31 - 64:35
    legal system. Okay, but
  • 64:35 - 64:38
    again, usually, cloud providers facilitate
  • 64:38 - 64:41
    investigations in the cloud 100%. They
  • 64:41 - 64:43
    cooperate in every single manner to
  • 64:43 - 64:47
    facilitate forensic investigations.
  • 64:50 - 64:52
    Thank you for answering
  • 64:52 - 64:54
    that question. We'll take the last
  • 64:54 - 64:57
    question for the day. What are the best
  • 64:57 - 65:00
    open-source free tools for social media
  • 65:00 - 65:04
    forensics? There is no "best" open-source
  • 65:04 - 65:06
    tool. It is a combination of tools.
  • 65:06 - 65:09
    Number one: digital forensics cannot be
  • 65:09 - 65:11
    performed categorically with
  • 65:11 - 65:15
    one or two tools. This is a complex, time-consuming,
  • 65:15 - 65:18
    and expensive process. I made
  • 65:18 - 65:21
    some suggestions; they're included in the
  • 65:21 - 65:25
    slide. Let me see a slide…
  • 65:27 - 65:31
    slide number 16.
  • 65:31 - 65:34
    Okay, this is the slide in which I
  • 65:34 - 65:37
    include EnCase, Autopsy, and some of
  • 65:37 - 65:41
    them are open-source,
  • 65:41 - 65:43
    as I mentioned before, but there
  • 65:43 - 65:46
    is not a particular tool or two or three
  • 65:46 - 65:48
    tools that I will recommend. Because, on
  • 65:48 - 65:52
    top of that, every single forensic investigation
  • 65:52 - 65:55
    is about a different
  • 65:55 - 65:57
    process. You cannot use the same tools.
  • 65:57 - 66:01
    This is why there are very, at least in the
  • 66:01 - 66:04
    USA, a very small number of organizations or
  • 66:04 - 66:07
    companies that specialize in digital
  • 66:07 - 66:10
    forensics, as my company does. The reason
  • 66:10 - 66:14
    is, among many other things, a
  • 66:14 - 66:16
    lack of expertise and expense.
  • 66:16 - 66:19
    Okay, so I do not recommend a
  • 66:19 - 66:22
    particular tool. Instead, I recommend the combination
  • 66:22 - 66:24
    of tools. There are multiple open-source tools.
  • 66:24 - 66:28
    I mention a few in slide number 16 of
  • 66:28 - 66:31
    my PowerPoint presentation, but again,
  • 66:31 - 66:33
    those are not sufficient. Those are the
  • 66:33 - 66:36
    most popular, strongest, and
  • 66:36 - 66:39
    most accurate tools that
  • 66:39 - 66:42
    you can use for digital forensics, but a
  • 66:42 - 66:44
    particular tool, one or two, to do a
  • 66:44 - 66:47
    forensic investigation--it doesn’t exist.
  • 66:47 - 66:49
    It’s impossible. Doesn't exist.
  • 66:52 - 66:54
    Thank you again to our wonderful
  • 66:54 - 66:56
    speaker, Dr. Luis, for answering those
  • 66:56 - 66:58
    questions and for the great presentation
  • 66:58 - 67:00
    and knowledge shared with our global
  • 67:00 - 67:02
    audiences. It was a pleasure to have you
  • 67:02 - 67:04
    with us, and we are looking forward to more and
  • 67:04 - 67:05
    more sessions with you. Before we
  • 67:05 - 67:07
    conclude the webinar, Dr. Luis, would you
  • 67:07 - 67:08
    like to give a small message to our
  • 67:08 - 67:10
    audiences? Please.
  • 67:11 - 67:14
    Well, no, I just want to thank
  • 67:14 - 67:17
    everybody again, the ones that worked
  • 67:17 - 67:21
    tirelessly behind the presentation to you
  • 67:21 - 67:24
    in EC-Council. As always, thank you very
  • 67:24 - 67:25
    much for the support. For all the
  • 67:25 - 67:28
    attendees, I hope you learned something new.
  • 67:28 - 67:32
    Let me clarify that all of the content,
  • 67:32 - 67:34
    wording, words, etc., that I have been
  • 67:34 - 67:37
    presenting to you is my original
  • 67:37 - 67:40
    creation, not 99.99%
  • 67:40 - 67:43
    but 100%, categorically speaking.
  • 67:43 - 67:45
    I put together those notes and
  • 67:45 - 67:48
    reflections for you guys with the hope
  • 67:48 - 67:49
    that you can come back to your
  • 67:49 - 67:52
    organization and serve better, that you can
  • 67:52 - 67:55
    become a public servant
  • 67:55 - 67:57
    and go to court and testify in
  • 67:57 - 68:01
    favor of the party that deserves your support.
  • 68:01 - 68:04
    I sincerely thank you for
  • 68:04 - 68:06
    the opportunity to share my expertise
  • 68:06 - 68:09
    with you guys. Have a nice weekend, okay?
  • 68:09 - 68:10
    Thank you very much for the time and the
  • 68:10 - 68:12
    questions. Thank you so much.
  • 68:14 - 68:17
    Thank you so much, Dr. Luis, for your
  • 68:17 - 68:19
    message. Before we end the session, I
  • 68:19 - 68:21
    would like to announce the next Cyber Talk
  • 68:21 - 68:23
    session: "Why Are Strong Foundational
  • 68:23 - 68:25
    Cybersecurity Skills Essential for
  • 68:25 - 68:27
    Every IT Professional?" which is scheduled
  • 68:27 - 68:29
    for November 8, 2023. This session is an
  • 68:29 - 68:31
    expert presentation by Roger Smith,
  • 68:31 - 68:34
    Director, Managed IT Industry Fellow
  • 68:34 - 68:37
    at the Australian Defense Force Academy. To
  • 68:37 - 68:38
    register for this session, please do
  • 68:38 - 68:39
    visit our website:
  • 68:39 - 68:43
    www.eccouncil.org/cybertalks. The link is
  • 68:43 - 68:45
    given in the chat section. Hope to see
  • 68:45 - 68:48
    you all on November 8th. With this,
  • 68:48 - 68:50
    you may disconnect
  • 68:50 - 68:52
    your lines. Thank you. Thank you so much,
  • 68:52 - 68:55
    Dr. Luis. It was a pleasure having you.
  • 68:55 - 68:57
    Likewise, thank you very much for the opportunity.
  • 68:58 - 69:00
    Thank you. Have a good day.
Title:
Digital Forensics Best Practices: From Data Acquisition to Analysis
Video Language:
English
Duration:
01:09:00