
Intermediate Searching in Splunk

Hello everyone, we are getting started here on our August Lunch and Learn session, presented by Kinney Group's Atlas Customer Experience team. My name is Alice Deane; I am the engineering manager for the Atlas Customer Experience team, and I'm excited to be presenting this month's session on intermediate-level Splunk searching. Thank you all for attending. I hope you get some good ideas out of this, and I certainly encourage engagement through the chat. I'll have some information at the end on following up and speaking with my team directly about any issues or interests you have around the types of concepts we're going to cover today.

Jumping into an intermediate-level session, I do want to say that we have previously done a basic-level searching session, so we are really progressing from that, picking up right where we left off. We've done that session with quite a few of our customers individually, and if you're interested in doing that session, or this one, with a larger team, we're happy to discuss and coordinate that.

So, getting started, we're going to take a look at the final search from our basic search session, and we're going to walk through it and understand some of the concepts. Then we're going to take a step back and look a little more generally at SPL operations, understanding how different commands apply to data, and really that next level of understanding for how you can write more complex searches and know when to use certain types of commands. And of course, in the session we're going to have a series of demos using a few specific commands, highlighting the different SPL command types that we discuss in the second portion. We'll see all of that on the tutorial data, which you can also use in your own environment, or in a test environment, very simply.
I will always encourage, especially with search content, that you look into the additional resources I have listed here. The Search Reference documentation is one of my favorite bookmarks; I use it frequently in my own environments and when working in customer environments. It is really the best quick resource for syntax and examples of any search command, and is always a great resource to have. The Search Manual is a little more conceptual, but as you're learning about different types of search operations, it's very helpful to be able to review that documentation and have reference material you can come back to as you study and work on writing more complex search content.

I have also linked here the documentation on how to use the Splunk tutorial data. If you've not done that before, it's a very simple process; Splunk provides consistently updated download files that you can upload directly into any Splunk environment, and that's what I'm going to be using today. Given that you search over the appropriate time windows for when you downloaded the tutorial data set, these searches will work on the tutorial data as well. So I highly encourage you, after the fact, to go through and test out some of the content. You'll be able to access a recording, as well as the slides I'm presenting from today if you'd like them, which I highly encourage because there are a lot of useful links in here; reach out to my team, and right at the end of the slides we'll have that info.
Looking at our overview of basic search, I just want to cover conceptually the two categories we discuss in that session. Those two are the statistical and charting functions, which in those demos consist of aggregate and time functions, and the evaluation functions. Aggregate functions are your commonly used statistical functions meant for summarization. Time functions actually use the timestamp field _time, or any other time you've extracted from the data, and look at earliest and latest relative time values in a summative fashion. The evaluation functions are the separate type, where we discuss comparison and conditional statements, using your if and case functions in evals; date and time functions that apply operations to events uniquely, so not necessarily summarization, but interacting with the time values themselves, maybe changing the time format; and multivalue eval functions. We touch on those very lightly, and they are more conceptual in basic search, so today, as part of our demo, we're going to dive in and look at multivalue eval functions later in the presentation.
So on this slide I have highlighted in gray the search that we end basic search with. It is broken up into three segments, where the first line is a filter to a data set. This is very simply how you source most of your data in most of your searches in Splunk, and we always want to be as specific as possible. The logical way to do that, most often, is by identifying an index and a source type, and possibly some specific values of given fields in that data, before you start applying other operations. In our case we want to work with the whole data set, and then we move into applying our eval statements.

In the evals, the purpose is to create some new fields to work with, and we have two operations here. You can see that on the first line we're starting with an error check field. These are web access logs, so we're looking at the HTTP status codes in the status field, and we have a logical condition here: where status is greater than or equal to 400, we want to return "error". It's a very simple example, made as easy as possible. If you want specifics on your 200s and your 300s, it's the exact same type of logic; you'd likely apply a case statement to get additional conditions and more unique output in an error check field, or some sort of field indicating what you want to see out of your status code. In this case it's simply "error", or the value "non-error" if we have, say, a 200.

We're also using a time function to create a second field called day. You may be familiar with some of the fields you get by default for most any event in Splunk that are breakdowns of the timestamp; you have day, month, and many others. In this case I want a specific format for day, so we use the strftime function with a time format variable, applied to the actual extracted timestamp field. So, coming out of the second line, we've accessed our data and created two new fields to use, and then we actually perform charting with a statistical function. That is using timechart, and we can see here that we are counting the events that have the "error" value for our created error check field.
  • 8:39 - 8:41
    and we're going to look at this search
  • 8:41 - 8:43
    and I have commented out uh most of the
  • 8:43 - 8:46
    logic we'll step back through it uh we
  • 8:46 - 8:49
    are looking at our web access log events
  • 8:49 - 8:53
    here uh and we want to then apply our
  • 8:53 - 8:58
    eval and so by applying the eval we can
  • 8:58 - 9:01
    get our error check field that provides
  • 9:01 - 9:03
    error or non-error we're seeing that we
  • 9:03 - 9:05
    have mostly non-error
  • 9:05 - 9:10
    events uh and then we have the day field
  • 9:10 - 9:12
    and so day is actually providing the
  • 9:12 - 9:14
    full name of day for the time stamp for
  • 9:14 - 9:18
    all these events so with our time chart
  • 9:18 - 9:22
    this is the summarization uh with a
  • 9:22 - 9:24
    condition actually that we're spanning
  • 9:24 - 9:28
    by default over a single day so this may
  • 9:28 - 9:32
    not be a very logical use of a split by
  • 9:32 - 9:34
    day when we are already using a time
  • 9:34 - 9:37
    chart command that is dividing our
  • 9:37 - 9:41
    results by the time bin uh effectively a
  • 9:41 - 9:46
    span of one day but what we can do uh is
  • 9:46 - 9:50
    change our split by field to host and
  • 9:50 - 9:53
    get a little bit more of a reasonable
  • 9:53 - 9:55
    presentation we were able to see with
  • 9:55 - 9:58
    the counts in the individual days not
  • 9:58 - 10:00
    only split through the time chart but by
  • 10:00 - 10:02
    the day field that we only had values
  • 10:02 - 10:05
    where our Matrix matched up for the
  • 10:05 - 10:10
    actual day so here we have our uh hosts
  • 10:10 - 10:13
    one two and three and then across days
  • 10:13 - 10:16
    counts of the error events that we
  • 10:16 - 10:20
    observe so that is uh the search that we
  • 10:20 - 10:22
    end on in basic search the concepts
  • 10:22 - 10:25
    there being accessing our data uh
  • 10:25 - 10:27
    searching in a descriptive manner using
  • 10:27 - 10:29
    our metadata Fields the index and the
  • 10:29 - 10:32
    Source type uh the evaluation functions
  • 10:32 - 10:34
    where we're creating new Fields
  • 10:34 - 10:38
    manipulating data uh and then we have a
  • 10:38 - 10:40
    time chart function uh that is providing
  • 10:40 - 10:43
    some uh summarized statistics here based
  • 10:43 - 10:44
    on the time
  • 10:44 - 10:49
    range so we will pivot back and we're
We will pivot back now and take a step out of the SPL for a second, just to talk about these different kinds of search operations we just performed. You'll hear these terms if you are diving deeper into the actual operation of Splunk searching, and you can get very detailed regarding the optimization of searches around these types of commands and the order in which you choose to execute SPL. Today I'm going to focus on how these operations actually apply to the data, and on helping you make better decisions about which commands are best for the scenario you have or the output you want to see. In future sessions we will discuss the actual optimization of searches, through this optimal order of functions and some other means; just a caveat that today we're going to talk pretty specifically about these individually, how they work with data, and then how you see them in combination.

So, our types of SPL command. The top three, in bold, we'll focus on in our examples. The first is streaming operations, which are executed on individual events as they are returned by a search. You can think of this like your eval, which is going to do something to every single event, modifying fields when they're available. We do have generating functions, which are used situationally, where you're sourcing data from non-indexed data sets. You would see that with either inputlookup commands or maybe tstats pulling information from the tsidx files, generating statistical output based on the data available there. Transforming commands you will see about as often as streaming commands, generally speaking, and more often than generating commands. Transforming is intended to order results into a data table, and I often think of it much like how we discuss the statistical functions in basic search as summarization functions, where you're looking to condense your overall data set into manageable, consumable results. The operations that apply that summarization are transforming.

We do have two additional types of SPL commands. The first is orchestrating; you can read about these, and I will not discuss them in great detail. They are used to manipulate how searches, or commands, are actually processed, and they don't directly affect the results of a search the way we think about, say, applying a stats or an eval to a data set. If you're interested, definitely check it out; the linked documentation has details. Data set processing commands are seen much more often, and you do have some conditional scenarios where commands can act as data set processing. The distinction for data set processing is that you are operating in bulk on a single, completed data set at one time. We'll look at an example of that.
I want to pivot back to our main three, which we're going to be focusing on, and I have mentioned some of these examples already. The eval functions we've been talking about so far are perfect examples of streaming commands: where we are creating new fields for each entry or log event, where we are modifying values for all of the results that are available, that is where we are streaming with the search functions. inputlookup is possibly one of the most common generating commands that I see, because someone intends to source a data set stored in a CSV file or a KV Store collection, and you're able to bring that back as a report and use that logic in your queries. It does not require indexed data, any indexed data at all, to return the results you want to see. And we've talked about stats very generally, with a lot of unique functions you can apply there; it provides a tabular output and serves that purpose of summarization, so we're really reformatting the data into a tabular report.

We see in this example search that we often combine these different types of search operations. In this example I have data that already exists in a CSV file. We are applying a streaming command, where we evaluate each line to see if it matches a condition and then return the results based on that evaluation. And then we're applying a transforming command at the end, which is that stats summarization, getting the maximum values for the count of errors and the host that is associated with that.
  • 16:54 - 16:56
    example so I'm just going to grab my
  • 16:56 - 16:59
    search here and I pre- commented out
  • 16:59 - 17:04
    uh the specific uh lines following input
  • 17:04 - 17:06
    lookup just to see that this generating
  • 17:06 - 17:08
    command here is not looking for any
  • 17:08 - 17:10
    specific index data uh we're pulling
  • 17:10 - 17:13
    directly the results that I have in a
  • 17:13 - 17:18
    CSV file uh here into this output and so
  • 17:18 - 17:21
    we have a count of Errors observed
  • 17:21 - 17:25
    across multiple hosts our where command
  • 17:25 - 17:29
    uh you might think is reformatting data
  • 17:29 - 17:31
    in this sense it it is transforming the
  • 17:31 - 17:34
    results but the evaluation of a wear
  • 17:34 - 17:37
    function does apply effectively to every
  • 17:37 - 17:42
    event that is returned uh so it is u a
  • 17:42 - 17:44
    streaming command that is going to
  • 17:44 - 17:47
    filter down our result set based on our
  • 17:47 - 17:49
    condition that the error count is less
  • 17:49 - 17:51
    than
  • 17:51 - 17:55
    200 so the following line is our
  • 17:55 - 17:57
    transforming command where we have two
  • 17:57 - 18:02
    results left uh 187 for host 3 we want
  • 18:02 - 18:06
    to see our maximum values here of 187 on
  • 18:06 - 18:10
    host 3 so our scenario here has really
  • 18:10 - 18:13
    uh covered where you may have uh hosts
  • 18:13 - 18:16
    that are trending toward a negative
  • 18:16 - 18:19
    State you're aware that uh the second
  • 18:19 - 18:22
    host had already exceeded its uh
  • 18:22 - 18:25
    threshold value for errors but host 3
  • 18:25 - 18:27
    also appears to be trending toward this
  • 18:27 - 18:30
    threshold uh so being able to combine
  • 18:30 - 18:33
    these types of commands uh understand
  • 18:33 - 18:35
    the logical condition that you're
  • 18:35 - 18:38
    searching for uh and then also providing
  • 18:38 - 18:41
    that consumable output uh so combining
  • 18:41 - 18:44
    all three of our types of commands
  • 18:45 - 18:49
    here so uh I'm going to jump to an SPL
  • 18:49 - 18:53
    demo and as I go through these different
  • 18:53 - 18:56
    commands uh I'm going to be referencing
  • 18:56 - 18:58
    back to the different command types that
  • 18:58 - 19:00
    we're working with I'm going to
  • 19:00 - 19:02
    introduce in a lot of these searches uh
  • 19:02 - 19:05
    a lot of small commands uh that I won't
  • 19:05 - 19:07
    talk about in great detail and that
  • 19:07 - 19:09
    really is the purpose of using your
  • 19:09 - 19:12
    search manual uh using your search
  • 19:12 - 19:15
    reference documentation uh so I will
  • 19:15 - 19:17
    glance over the use case uh talk about
  • 19:17 - 19:20
    how it's meant to be applied and then
  • 19:20 - 19:22
    using in your own scenarios uh where you
  • 19:22 - 19:24
    have problem you need to solve uh
  • 19:24 - 19:27
    referencing the docs to find out where
  • 19:27 - 19:30
    you can apply uh similar functions to
  • 19:30 - 19:33
    what we observe in the the demonstration
  • 19:33 - 19:37
    here so the First Command I'm going to
  • 19:37 - 19:41
    focus on is the Rex command so Rex is a
  • 19:41 - 19:43
    streaming command that you often see
  • 19:43 - 19:47
    applied to data sets that do not fully
  • 19:47 - 19:50
    have data extracted in the format that
  • 19:50 - 19:53
    you want to be using um in your
  • 19:53 - 19:57
    reporting or in your logic uh and so
  • 19:57 - 20:00
    this could very well be handled actually
  • 20:00 - 20:03
    in the uh configuration of props and
  • 20:03 - 20:06
    transforms and extracting fields at the
  • 20:06 - 20:08
    right times and indexing data but as
  • 20:08 - 20:10
    your bringing new data sources you need
  • 20:10 - 20:12
    to understand what's available for use
  • 20:12 - 20:14
    in spunk a lot of times you'll find
  • 20:14 - 20:17
    yourself needing to extract new fields
  • 20:17 - 20:19
    in line in your searches uh and be able
  • 20:19 - 20:22
    to use those in your search Logic Rex
  • 20:22 - 20:28
    also has uh a said mode that I also see
  • 20:28 - 20:32
    testing done for masking of data in line
  • 20:32 - 20:34
    prior to actually putting that into
  • 20:34 - 20:35
    indexing
  • 20:35 - 20:38
    configurations um so Rex you would
  • 20:38 - 20:41
    generally see used um when you don't
  • 20:41 - 20:43
    have those fields available you need to
  • 20:43 - 20:46
    use them at that time uh and then we're
  • 20:46 - 20:47
    going to take a look at an example of
  • 20:47 - 20:50
    masking data as well uh to test your
  • 20:50 - 20:53
    Syntax for a said style replace uh in
  • 20:53 - 21:01
    config files so we will jump back over
I'm going to start with a search on an index and source type, my tutorial data, and this is actual Linux secure logging, so these are going to be OS security logs, and we're looking at all of the web hosts we've been focusing on previously. In our events you can see that we have, first, an event that has "failed password for invalid user" in it, in which we're provided a source IP and a source port, but when we go to see the fields that are extracted, that's not being done for us automatically. So, just to start testing our logic to see if we can get the results we want, we're going to use the rex command, and in doing so we are applying this operation across every event, again a streaming command. We are looking at the _raw field, so we're actually looking at the raw text of each of these log events, and then the rex syntax is simply to provide, in double quotes, a regex match, and we're using named groups for the field extractions. So for every event where we see "failed password for invalid user", we are extracting a user field, a source IP field, and a source port field. For the sake of simplicity I tried to keep the regex simple; you can make this as complex as you need for your data. In our extracted fields, which I've pre-selected, we can see our user is now available, and this applies to the events where the regex was actually valid, matching on the "failed password for invalid user" string.
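A minimal sketch of that extraction, assuming the tutorial's linux_secure source type; the exact regex and the field names user, src_ip, and src_port are illustrative:

    index=tutorial sourcetype=linux_secure "Failed password for invalid user"
    | rex field=_raw "Failed password for invalid user (?<user>\S+) from (?<src_ip>\S+) port (?<src_port>\d+)"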
Now that we have our fields extracted, we can actually use them, and we want to do a stats count as failed_login. Any time you see an operation, then "as" and a unique name, that is just a rename through the transformation function. It's an easier way to keep consistency when referencing your fields, and it saves you from renaming later on with additional SPL; otherwise, in this case, you'd have to reference the default distinct-count name. It's just a way to keep things clean and easy to use in further lines of SPL. So we are counting our failed logins, we're looking at the distinct count of the source IP values we have, and then we're splitting that by the host and the user. You can see here that this tutorial data is actually pretty flat across most of the sources, so we're not going to have any outliers or spikes in our stats, but you can see the resulting presentation. On line four we do have a sort command, and this is an example of a data set processing command, where we are evaluating a full, completed data set and reordering it; given the logic here, we want to sort descending on these numeric values. Keep in mind, as you operate on different fields, it's going to be the same basic numeric or lexicographical ordering that you typically see in Splunk.
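Putting the extraction, the transforming stats, and the data set processing sort together, a sketch with illustrative names:

    index=tutorial sourcetype=linux_secure "Failed password for invalid user"
    | rex field=_raw "Failed password for invalid user (?<user>\S+) from (?<src_ip>\S+) port (?<src_port>\d+)"
    | stats count AS failed_logins dc(src_ip) AS unique_sources BY host user
    | sort - failed_logins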
We do have a second example, with the sed-style replace. You can see in my events that we are searching the tutorial vendor sales index and source type, and I've gone ahead and applied one operation, which is going to be helpful for understanding exactly what we are replacing and how to get a consistent operation on these fields. In this case we are creating an ID length field, because we are going to choose to mask the value of the account ID in our rex command, and we want to know that it's a consistent number of characters through all of our data. It's very simple to spot-check, but just to be certain we want to apply this check to all of our data; in this case it's a streaming command. Through this eval we are changing the type of the data, because account ID is actually numeric; we make it a string value so that we can look at the length. These are common functions in most programming languages, and the syntax in SPL is quite simple. Just to get that contextual feel: we can see we have 16 characters for 100% of our events in the account IDs.

So, actually applying our rex command, we now specify a unique field, not just _raw. We are applying the sed mode, and this is a sed-syntax replacement, with a capture group for the first 12 digits, which we replace with a series of 12 X's. You can see in our first event that the account ID is now masked; we only have the remaining four digits to identify it. So if our data was indexed, appropriately so, in Splunk with the full account IDs, but for the sake of reporting we want to mask them for the audience, we're able to use the sed replace.

Then, to finalize a report, this is just an example of the top command, which does a few operations together and makes for a good shorthand report: taking all the unique values of the provided field, giving you a count of those values, and then showing the percentage of the total data set that each unique value accounts for. So, again, pretty flat in this tutorial data; we're seeing a very consistent 0.3% across these different account IDs.
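A sketch of that masking pipeline, assuming the vendor_sales tutorial source type and its AcctID field; the length check and the 12-digit mask follow the walkthrough, but the exact syntax is illustrative:

    index=tutorial sourcetype=vendor_sales
    | eval id_length=len(tostring(AcctID))
    | rex field=AcctID mode=sed "s/\d{12}/XXXXXXXXXXXX/"
    | top AcctID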
So we have looked at a few examples with the rex command, which is, again, streaming. We're going to look at another streaming command, which is going to be a set of multivalue eval functions. Again, if you're going to have a bookmark for search documentation, multivalue eval functions are a great one to have, because when you encounter these fields it really takes some time to figure out how to actually operate on the data. The multivalue functions are really just a collection from which, depending on your use case, you're able to determine the best one to apply. You see them often used with JSON and XML, data formats that naturally provide multivalue fields, where you have repeated tags or keys across unique events as they're extracted. And you often see, for example in Windows event logs, that you actually have repeated keys where the values are different and the position in the event is specific to a condition, so you may need to extract or interact with one of those unique values to get a reasonable outcome from your data. So we're going to use multivalue eval functions when we want a change to the presentation of the data and we're able to do so with multivalue fields; I would say this often occurs when you have multivalue data and you want to change the format of the multivalue fields. And then we're also going to look at a quick example of actually using multivalue evaluation as a logical condition.

For the first example, we're going to start with a simple table looking at our web access logs, and we're just going to pull in our status and referer domain fields. You can see we've got an HTTP status code, and we've got the format of a protocol, subdomain, domain, and TLD. Our scenario is that, for simplicity of reporting, we just want to work with this referer domain field and be able to simplify it. In actually splitting out the field, in this case split on the referer domain, choosing the period character as our point to split the data, we're creating a multivalue field from what was previously just a single-value field. Using this, we can create a new field by using the index of the multivalue field; in this case we're looking at indexes 0, 1, 2. The multivalue index function lets us target a specific field and then choose a starting and ending index to extract the given values. There are a number of ways to do this; in our case, where we have three entries, it's quite simple just to give a start and end of range that are two entries apart. So we are working to recreate our domain, and for this new domain field we have buttercupgames.com from what was previously http://www.buttercupgames.com. We can now use those fields in a transformation function, in this case a simple stats count by status and the domain.
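A sketch of that reshaping, assuming the tutorial access log's referer_domain field; the mvjoin call to glue the kept pieces back together is my assumption about the rebuild step:

    index=tutorial sourcetype=access_combined_wcookie
    | eval rd_parts=split(referer_domain, ".")
    | eval domain=mvjoin(mvindex(rd_parts, 1, 2), ".")
    | stats count BY status domain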
I do want to look at another example that is similar, but where we use a multivalue function to actually test a condition. In this case I'll be searching the same data, and we're going to start with a stats command: a stats count, as well as a values of status. The values function is going to provide all the unique values of a given field, based on the split-by, and that produces a multivalue field. Here, in the case of status, we have quite a few results that have multiple status codes, and since we're interested in pulling those out, we can use an mvcount function to evaluate and filter our data set to those specific results. A very simple operation here, just looking at what has more than a single value for status, but very useful as you're applying this in reporting, especially in combination with other, more complex conditions. So that is our set of multivalue eval functions, as streaming commands.
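A sketch of that condition test; the clientip split-by field is an assumption, since the walkthrough doesn't name it:

    index=tutorial sourcetype=access_combined_wcookie
    | stats count values(status) AS status BY clientip
    | where mvcount(status) > 1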
For the final section of the demo, I want to talk about a concept that is not so much a set of functions, but that really enables more complex and interesting searching and can allow us to use a few different types of commands in our SPL. The concept of sub-searching, for both filtering and enrichment, is taking secondary search results and using them to affect a primary search. A subsearch is executed, the results are returned, and, depending on how it's used, they are processed in the original search. We will look at an example where that is filtering: based on the results, we effectively get "value equals X OR value equals Y" for one of the fields we're looking at in the subsearch. And then we're also going to look at an enrichment example; you see this often when you have a data set, maybe saved in a lookup table, or you just have a simple reference, and you want to bring in more context, maybe descriptions of event codes, things like that.

For the first one, I'm going to run my search, and we're going to pivot over to a subsearch tab, where you can see our subsearch looking at the secure logs; we're actually just pulling the subsearch out on its own to see what results are going to be returned from it. We're applying the same rex we had before to extract our fields, and we're applying a where, a streaming command, looking for anything that's not null for user; we observed that about 60% of our events were going to be null, based on not having a user field. Then, looking at that total data set, we're just going to count by our source IP. This is often a quick way to get a list of the unique values of any given field and then operate on that to return just the list of values; there are a few different ways to do it, and you see stats count pretty often. In this case we then table out, keeping just our source IP field, and rename it to client IP. The resulting data set is a single-column table with 182 results, and the field name is client IP.

When returned to the original search, where we run this as a subsearch, the effective result is "client IP equals my first value OR client IP equals my second value", and so on through the full data set. Looking at our search here, we're applying this to the access logs: we had a field named source IP in the secure logs, and we renamed it to client IP so that we could apply it to the access logs, where client IP is the actual field name for the source IP data. In this case we are filtering our web access logs to the client IPs relevant in the secure logs.
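A sketch of the subsearch filter, assuming the tutorial source types and the rex from earlier; the rename makes the subsearch output match the access log's clientip field:

    index=tutorial sourcetype=access_combined_wcookie
        [ search index=tutorial sourcetype=linux_secure
          | rex field=_raw "Failed password for invalid user (?<user>\S+) from (?<src_ip>\S+) port (?<src_port>\d+)"
          | where isnotnull(user)
          | stats count BY src_ip
          | table src_ip
          | rename src_ip AS clientip ]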
Uncommenting here, we have a series of operations, and I'm just going to run them all at once and talk through them. We are counting the events by status and client IP, for the client IPs that were relevant to authentication failures in the secure logs. We are then creating a status count field, just by combining our status and count fields with a colon between them, and then we are doing a second stats statement to combine all of our newly created fields together into a more condensed report. So: a transforming command, then streaming to create our new field, another transforming command, and then our sort, for data set processing, gives us the results here for a given client IP.
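A sketch of that pipeline on top of the same subsearch; status_count follows the description, while the shape of the second stats and the sort key are my assumptions about the condensing step:

    index=tutorial sourcetype=access_combined_wcookie
        [ search index=tutorial sourcetype=linux_secure
          | rex field=_raw "Failed password for invalid user (?<user>\S+) from (?<src_ip>\S+) port (?<src_port>\d+)"
          | table src_ip | rename src_ip AS clientip ]
    | stats count BY status clientip
    | eval status_count=status.":".count
    | stats values(status_count) AS status_counts sum(count) AS events BY clientip
    | sort - events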
We are in this case looking for the scenario where client IPs involved in authentication failures to the web servers, in this case all over SSH, might also be interacting with the website we're hosting. Seeing a high number of failed values, and looking at actions as well, is a use case for bringing in that context and seeing whether there's any sort of relationship between the data. This is often discussed as correlation of logs. I'm usually careful about using the term correlation when talking about Splunk queries, especially with Enterprise Security and its correlation searches, where I typically think of correlation searches as overarching concepts that cover data from multiple data sources. In this case, correlating events means looking at unique data types that are potentially related and finding the logical connection for the condition; that's a little more up to the user, and not quite as easy as, say, pointing to a specific data model.

So we are going to look at one more subsearch, and this case is going to apply the join command. I talked about using lookup files, or other data returned by subsearches, to enrich, to bring more data in rather than filter. Looking at the first part of the command, this is actually just a simple stats report based on the rex that keeps coming through the SPL to give us those user and source IP fields; our result is authentication failures for all these web hosts, similar to what we had previously returned. Then we'll take a look at the results of the subsearch; I've actually split this up so we can see it. In the first two lines we're looking at our web access logs for purchase actions, and then we are taking a stats count for errors and a stats count for successes. We have pretty limited status codes in this data, so this is viable for the data present to observe our errors and successes. Then we are creating a new field based on the statistics we're generating, looking at our transaction errors, so where we have high or low numbers of failed purchase actions, and summarizing that. The final command, another transforming command, is a table, just to reduce this to a small data set to use in the subsearch; in this case we have our host value and then the transaction error rate that we observe from the web access logs.

Over in our other search, we are going to perform a left join based on this host field. You can see in our secure logs we still have the same host values, and the join is used to add the transaction error rates in for each host. So as we observe increased authentication failures, if there's a scenario where a breach and some sort of interruption is affecting the ability to serve or perform these purchase actions, the intended operation of the web servers, we can see that here. Of course, with our tutorial data there's not really much jumping out or showing any correlation between the two, but the purpose of the join is to bring in that extra data set, to give us the context to investigate further.
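A sketch of that enrichment join, assuming the tutorial source types; the error and success counts and the rate calculation are my assumptions about how the subsearch's transaction error rate was built:

    index=tutorial sourcetype=linux_secure
    | rex field=_raw "Failed password for invalid user (?<user>\S+) from (?<src_ip>\S+) port (?<src_port>\d+)"
    | stats count AS auth_failures BY host
    | join type=left host
        [ search index=tutorial sourcetype=access_combined_wcookie action=purchase
          | stats count(eval(status>=400)) AS errors count(eval(status<400)) AS successes BY host
          | eval transaction_error_rate=round(errors / (errors + successes), 3)
          | table host transaction_error_rate ]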
So that is the final portion of the SPL demo. For any questions, I'm going to take a look at the chat, and I'll do my best to answer. If you have any other questions afterward, please feel free to reach out to my team at support@kinneygroup.com, and we'll be happy to get back to you and help. I am taking a look through now.

Okay, I'm seeing some questions on the performance of the rex and sed-mode rex commands. Off the top of my head I'm not sure about a direct performance comparison of the individual commands; I definitely want to look into that, and definitely follow up if you'd like to explain a more detailed scenario, or look at some SPL that we can apply and observe those changes.

On the question about getting the data set: that is what I mentioned at the beginning. Reach out to us for the slides, or just about the link, and you can actually search for the Splunk tutorial data as well; there's documentation on how to use the tutorial data, and one of the first links takes you to a page that has the tutorial data zip file and instructions on how to ingest it. It's just an upload in your specific environment: Add Data, then Upload, two clicks, and upload your file. That is freely available for anyone, and that package is dynamically updated as well, so your timestamps are pretty close to current as of when you download it; it depends a bit on the timing of the update cycle, but searching over all time you won't have any issues there. And then, again, on receiving the slides, reach out to my team and we're happy to provide those and discuss further. We'll have the recording available for this session; after the recording processes when the session ends, you should be able to use the same link and watch this recording without having to sign up or transfer the file.

Okay, Chris, seeing your comment there: let me know if you want to reach out to me directly, anyone as well, and we can discuss which slides and presentation you had attended. I'm not sure I have the attendance report for what you'd seen previously, so I'm happy to get those for you.

All right, and thanks, Brett. You'll see Brett Woodruff commenting in the chat; he's a systems engineer on the Expertise on Demand team, a very knowledgeable guy, and he's going to be presenting next month's session. That session is going to take this concept we talked about, the subsearch, as a general search topic; he's going to go specifically into data enrichment using joins and lookup commands, and how we see that used in the wild. Definitely excited for that one, and I encourage you to register for that event.

All right, I'm not seeing any more questions. With that, I am stopping my share. I'm going to hang around for a few minutes, but thank you all for attending, and we'll see you at the next session.