Group by in Splunk

Splunk Employee. 08-20-2014 02:10 PM. There is no difference between the two: chart something OVER a BY b and chart something BY a b. In both cases a becomes the vertical column (the rows) and b becomes the horizontal columns.
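For example, assuming web access events with host and status fields (an illustration, not taken from the original answer), either form produces one row per host and one column per status value:
index=web sourcetype=access_combined | chart count over host by status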

Group by in Splunk. 1. You want to create a field which is the URL minus the UserId part, so that stats will group by which URL is called. You can do this by using split(url,"/") to make a multivalue field from the URL, and then take out the UserId in one of two ways depending on the URLs. Mvfilter, e.g.: mvfilter(x!=userId)
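A minimal sketch of that approach (the url and userId field names carry over from the question; the rest is an assumption):
... | eval parts=split(url,"/") | eval parts=mvfilter(parts!=userId) | eval url_pattern=mvjoin(parts,"/") | stats count by url_pattern
Rejoining the remaining segments means calls to the same endpoint group together regardless of which user made them.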

1 Answer. Splunk can only compute the difference between timestamps when they're in epoch (integer) form. Fortunately, _time is already in epoch form …
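For instance, a sketch of computing a duration once a string timestamp has been converted to epoch with strptime() (the start_time field and its format are assumptions):
... | eval start_epoch=strptime(start_time,"%Y-%m-%d %H:%M:%S") | eval duration=_time-start_epoch | stats avg(duration) by host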

Solution. somesoni2. SplunkTrust. 01-09-2017 03:39 PM. Give this a try:
base search | stats count by myfield | eventstats sum(count) as totalCount | eval percentage=(count/totalCount)
or:
base search | top limit=0 count by myfield showperc=t | eventstats sum(count) as totalCount

first i filter all the fields that are interesting to me (the a_* fields), then via sum(*) as * a sum is built over every field in the result set, with the name of the field as the column, hence the as * part:
index=foo | fields + a_* | stats sum(*) as *
this leaves us with a result in the form:
a_foo  a_bar  a_baz
16     8      24

The spath command enables you to extract information from the structured data formats XML and JSON. The command stores this information in one or more fields. The command also highlights the syntax in the displayed events list. You can also use the spath() function with the eval command. For more information, see the evaluation functions.

18-Oct-2023 ... stats, Provides statistics, grouped optionally by fields; mstats, Similar to stats but used on metrics instead of events; table, Displays data ...

avg(<value>) This function returns the average, or mean, of the values in a field. Usage: you can use this function with the stats, eventstats, streamstats, and timechart commands. Example: the following returns the average of the values in the size field for each distinct value in the host field: ... | stats avg(size) BY host

Essentially I want to pull all the duration values for a process that executes multiple times a day and group them based upon performance falling within multiple windows, i.e. "Fastest" would be duration < 5 seconds.
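One way to implement that duration-window grouping is an eval case() followed by stats (a sketch; the duration field comes from the question, while the other window labels and thresholds are assumptions):
... | eval window=case(duration<5,"Fastest", duration<30,"Normal", true(),"Slow") | stats count by window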

01-Jan-2017 ... Make sure you split data using the SplitJson processor in NiFi before putting it into Splunk. The reason is the syslog receiver may bundle incoming ...

In the above query I want to sort the data based on the group-by results in descending order. When I try | sort 0 -Totals, the Totals row appears first in the table: | query | chart count by x y | addtotals col=true labelfield=x label="Totals" | sort 0 -Total. Any inputs here would really help me.

Jul 11, 2020 · 07-11-2020 11:56 AM. @thl8490123 based on the screenshot and SPL provided in the question, you are better off running a tstats query, which will perform far better. Please try out the following SPL and confirm:
| tstats count where index=main source IN ("wineventlog:application","wineventlog:System","wineventlog:security") by host _time source ...

1 Answer. Before fields can be used they must first be extracted. There are a number of ways to do that, one of which uses the extract command:
index=app_name_foo sourcetype=app "Payment request to myApp for brand" | extract kvdelim=":" pairdelim="," | rename Payment_request_to_app_name_foo_for_brand as brand | chart count over ...

A Splunk search retrieves indexed data and can perform transforming and reporting operations. Results from one search can be "piped", or transferred, from command to command, to filter, modify, reorder, and group your results. Search results can be thought of as a database view, a dynamically generated table of rows ...

Jan 11, 2022 · 10. Bucket count by index. Follow the query below to get the count of buckets available for each index using SPL. See also the comparison and conditional function CIDRMATCH and the dbinspect command:
|dbinspect index=* | chart dc(bucketId) over splunk_server by index

Now I want to know the counts of various response codes over time with a sample rate defined by the user. I am using a form to accept the sample rate from the user. To convert time into different intervals, I am using:
eval inSec = startTime/(1000*60*sampleR) | eval inSec=floor(inSec) | eval inSec=inSec*60*sampleR | fieldformat inSec ...
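For the response-code question above, an alternative to the manual epoch arithmetic is to let timechart do the time bucketing (a sketch; the status field name and the 5m span are assumptions, and in a dashboard the span could be fed from the sample-rate token):
... | timechart span=5m count by status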

Splunk Cloud Platform: to change the check_for_invalid_time setting, request help from Splunk Support. If you have a support contract, file a new case using the Splunk Support Portal at Support and Services. Otherwise, contact Splunk Customer Support. Splunk Enterprise: to change the check_for_invalid_time setting, follow these steps. Prerequisites ...

Sure. What the regex is doing: find a forward slash but don't capture it (it needs to be escaped): \/ Start a capturing group (parentheses with the label customer_name). Find one or many characters (plus symbol) different (^) from forward slash or question mark (escaping needed again): [^\?\/]+ Then find a question mark but do not capture this in your token ...

Oct 23, 2023 · Specifying time spans. Some commands include an argument where you can specify a time span, which is used to organize the search results by time increments. The GROUP BY clause in the from command, and the bin, stats, and timechart commands, include a span argument. The time span can contain two elements, a time unit and timescale.

You could use stats and group by _time and user: index="_audit" action=edit_user NOT search | stats values(object) as object, values(operation) as operation by user, _time. If you have events that happen at roughly the same time but not the exact same time, and you want to group them together anyway, you could use bucket to ...
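A sketch of that bucket (bin) approach, rounding _time into five-minute buckets before grouping (the span value is an assumption):
index="_audit" action=edit_user NOT search | bin _time span=5m | stats values(object) as object values(operation) as operation by user, _time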

This application is built for integration of Threat Intelligence with Splunk SIEM to consume TI feeds. To use the integration, please make sure you have an active Group-IB Threat Intelligence license and access to the interface.

Off the top of my head you could try two things: you could mvexpand the values(user) field, giving you one copied event per user along with the counts... or you could indeed try to mvjoin() the users with a \n newline character... if that doesn't work, try joining them with an HTML <br> tag, provided Splunk isn't smart and replaces that with ...
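A sketch of the mvexpand route (the host grouping and the user field are assumptions about the original search):
... | stats count values(user) as user by host | mvexpand user
After mvexpand, each result row carries a single user value, so any further grouping or formatting works one user at a time.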

This is what I have somewhere already -- the field Mnemonic (singular), specific to every event, is grouped into Mnemonics (plural), which is then passed to a multi-value join.

I am having a search in my view code and displaying results in the form of a table. Small example result:
custid  Eventid
10001   200
10001   300
10002   200
10002   100
10002   300
...

Solved: We need to group hosts by naming convention in search results, so for example hostnames: x80* = env1, y20* = prod, L* = test, etc. Also, can this ...

sort command examples. The following are examples for using the SPL2 sort command. To learn more about the sort command, see How the sort command works. 1. Specify different sort orders for each field. This example sorts the results first by the lastname field in ascending order and then by the firstname field in descending order. ...

This application is built for integration of Threat Intelligence with Splunk SOAR to ingest incidents and IOCs from Group-IB Threat Intelligence (TI). To use the integration, please make sure you have an active Group-IB Threat Intelligence license and access to the interface. Supported Actions ...

Method 1. Click the icon. On the Apps page, click Find More Apps. On the Browse More Apps page, search for Alibaba Cloud Log Service Add-on for Splunk, and click Install. After the add-on is installed, restart Splunk as prompted. Method 2. Click the icon. On the Apps page, click Install app from file. On the Upload app page, select the ...

I am struggling quite a bit with a simple task: to group events by host, then severity, and include the count of each severity. I have gotten the closest with this: | stats values(severity) as Severity, count(severity) by severity, host. This comes close, but there are two things I need to change: 1) the output includes a duplicate column of ...

I am attempting to create sub tables from a main table, progressively removing columns and grouping rows. I have created the following sub table, but would now like to remove "Process" and group by "Phase" while summing "Process duration" to get "Phase duration": index=fp_dev_tsv "BO Type" = "assess...

volga is a named capturing group. I want to do a group by on volga without adding /abc/def, /c/d, /j/h in the regular expression so that I would know the number of expressions in there instead of hard coding.
There are other expressions I would not know to add, so I want to group by the next 2 words split by / after "net" and do a group by, and also ignore ...
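A sketch of extracting that group with rex and then grouping on it (the uri field name is an assumption; "net" and the volga capture name come from the question):
... | rex field=uri "net\/(?<volga>[^\/]+\/[^\/]+)" | stats count by volga
The named capture (?<volga>...) grabs the next two path segments after "net/" without hard-coding their values, so new expressions are grouped automatically.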

Splunk Group By. By Naveen. 1.4K views, 24 min read, updated on August 9, 2023. In this section of the Splunk tutorial, you will learn how to group events in Splunk, use the transaction command, unify field names, find incomplete transactions, calculate times with transactions, find the latest events, and more.
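A minimal transaction example (the JSESSIONID field and the span limits are assumptions, not taken from the tutorial):
sourcetype=access_combined | transaction JSESSIONID maxspan=30m maxpause=5m | stats count avg(duration) avg(eventcount)
transaction adds duration and eventcount fields to each grouped transaction, which can then be summarized like any other field.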

@jw44250, your questions/requirements seem to be changing. Since you have different types of URIs, I still expect that you should perform a match on URI with values like messages, comments, employees for you to come up with counts, etc. (you need to come up with cases based on your data).

Count Events, Group by date field. 11-22-2013 09:08 AM. I have data that looks like this that I'm pulling from a db. Each row is pulling in as one event. When I do something like this below, I'm getting the results in minutes, but they are grouped by the time at which they were indexed.

Hello, I am very new to Splunk. I am wondering how to split these two values into separate rows. The "API_Name" values are grouped but I need them separated by date. Any assistance is appreciated! SPL: index=... | fields source, timestamp, a_timestamp, transaction_id, a_session_id, a_api_name, ...

With the where command, you must use the like function. Use the percent ( % ) symbol as a wildcard for matching multiple characters. Use the underscore ( _ ) character as a wildcard to match a single character. In this example, the where command returns search results for values in the ipaddress field that start with 198.

To create a group from the Groups tab: In Splunk IAI, select the Browse view. Click the Groups tab. Click + Group. Type a Name for your group. Click Add. Splunk IAI lists your new group on the Groups tab. Click Add Assets. In the Add Assets dialog, filter or navigate to the assets that you want to add to the group.

I'm trying to group IP address results in CIDR format. Most likely I'll be grouping in /24 ranges. Is there an easy way to do this? Maybe some regex? For example, if I have two IP addresses like 10.10.3.5 and 10.10.3.50, I want them to be counted in the 10.10.3.0/24 range, and then see how many IPs are in each range.

3) error=the user xxxx already exists (more users like this) 4) error=we were unable to process your request {xx=cvb,xx=asdf,} 5) Exception message: no such user: Unable to locate user: {xx=cvb,xx=asdf,}} The result should be: errormessage, total. For example, "Unable to find element with path" with the total count of similar messages beside it.

Grouping Results. The transaction command groups related events. For more details refer to our blog on Grouping Events in Splunk. The transaction command groups events that meet various constraints into transactions—collections of events, possibly from multiple sources. Events are grouped together if all transaction ...

Dec 19, 2018 · Engager. 12-19-2018 05:18 AM. Hello, I am trying to find a solution to paint a timechart grouped by 2 fields. I have a stats table like:
Time                 Group   Status   Count
2018-12-18 21:00:00  Group1  Success  15
2018-12-18 21:00:00  Group1  Failure  5
2018-12-18 21:00:00  Group2  Success  1544
2018-12-18 21:00:00  Group2  Failure  44
2018-12-18 22:00:00  Group1  ...
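One common way to get a timechart split by two fields is to combine them into a single series field before charting (a sketch applied to the raw events rather than to the summarized table above):
... | eval series=Group."_".Status | timechart span=1h count by series
Each Group/Status pair (Group1_Success, Group1_Failure, ...) then becomes its own series.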

1 Solution. somesoni2, SplunkTrust, 02-28-2017 11:29 AM. Give this a try (your base search giving fields Location, Book and Count):
| stats sum(Count) as Count by Location Book | stats list(Book) as Book list(Count) as Count by Location

1 Answer. Once you have the DepId and EmpName fields extracted, grouping them is done using the stats command: | stats values(EmpName) as Names by DepId. Let us know if you need help extracting the fields.

First, create the regex - in sed mode - to remove the date piece:
... | rex field=Field1 mode=sed "s/\d{4}-\d{2}-\d{2}//"
Now, that should remove the first piece that looks like a date from Field1. NOTE: if you need to use this full date field later in this search, you won't be able to do it this way.

Splunk Tutorial: Getting Started Using Splunk. By Stephen Watts, July 01, 2022. Whether you are new to Splunk or just needing a refresh, this article can guide you to some of the best resources on the web for using Splunk. We've gathered, in a single place, the tutorials, guides, links and even books to help you get started with Splunk.

I'm not sure if the two-level grouping is possible (group by Date and group by num, a kind of Excel-style merging/grouping). You may be able to achieve this:
Dates  ID    Names  Count  total
Date1  num1  ABC    10     100
             DEF    90
Date1  num2  XYZ    20     50
             PQR    30
If you can post your current query, I can update it to provide the above format.

Group the results by a field. This example takes the incoming result set and calculates the sum of the bytes field and groups the sums by the values in the host field: ... | stats sum(bytes) BY host. The results contain as many rows as there are distinct host values. There are two columns returned: host and sum(bytes).

if this is your need, you should try to use the dc function in the stats command, so to find the exceptions you could run something like this: index="main_idx" app="student_svc" | stats dc(browser_id) AS browser_id_count dc(guid) AS guid_count dc(x_id) AS x_id_count BY student_id | where browser_id_count>1 OR guid_count>1 OR x_id_count>1. See my ...

The above query fetches the service count grouped by status. How do I further transform it into counts of status 429 and not 429, like below?
service    count_of_429  count_of_not_429
my-bag     1             3
my-basket  1             2
my-cart    1             1
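A sketch of that pivot using eval inside the stats aggregation (the status and service field names come from the question; if status is extracted as a string, compare against "429" instead):
... | stats sum(eval(if(status==429,1,0))) as count_of_429 sum(eval(if(status!=429,1,0))) as count_of_not_429 by service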

Group by and sum. 06-28-2020 03:51 PM. Hello - I am a Splunk newbie. I want to get the sum of all counts of all machines (src_machine_name) for every month and put that in a bar chart with the name of the month and the count of src_machine_name in that month. So in January 2020 the total count of src_machine_name was 3, in February it was 3. This is what I started with.

May 29, 2014 · Once you convert the duration field to a number (of seconds?), you can easily calculate the total duration with something like stats sum(duration) AS total_time by Username. I have a query which runs over a month period which lists all users connected via VPN and the duration of each connection.

Grouping URLs by their path variable pattern. 07-15-2021 01:44 PM. I need to do an analysis on API calls using logs, like avg, min, max, percentile95 and percentile99 response time, and also hits per second. Expectation: I want them to be grouped as per their API pattern. These path variables (like {id}) can be ...

Group by in Splunk, where those uri's are grouped by: [whatever is between the 3rd and 4th slash that doesn't contain numbers] and [whatever is between the 4th and 5th slash]. So in the output above, there would only be an average execution time for: for-sale-adverts.json (this is the only "uri" that would be captured by my first grouping), adverts.json, forrent.json.

Hi, I need help in grouping the data by month. I have found the total count of the hosts and objects for three months; now I want to display it in a table for the three months separately. Now the data is like below: count 300. I want the results like:
mar  apr  may
100  100  100
How do I bring this data into the search?
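For the month-grouping questions above, bin (or timechart) can round _time to the month before aggregating (a sketch; the field names come from the first question in this group):
... | bin _time span=1mon | stats dc(src_machine_name) as machine_count by _time
or directly as a chart: ... | timechart span=1mon dc(src_machine_name)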
However, I would like to present it grouped by priorities: P0; p1 -> compliant and non-compliant; p2 -> compliant and non-compliant; p3 -> compliant and non-compliant; p4 -> compliant and non-compliant, in a graphic where there are two bars for one value, i.e. seeing the compliant and non-compliant bars together for the same priority.

Apr 1, 2017 · Splunk Employee. 04-01-2017 07:50 AM. I believe you are looking for something like this: * | stats values(dest) by src. Do your search to get the data reduced to what you want and then do a stats command by the name of the field in the first column, but then do a values around the second column to get all the test1, test2, test3 values.

How do you group by day without grouping your other columns? kazooless, Explorer, 05-01-2018 11:27 AM. I am trying to produce a report that spans a week and groups the results by each day. I want the results to be per user per category. I have been able to produce a table with the information I want, with the exception of the _time ...

There is a good reference for Functions for stats in the docs. Depending on your ultimate goal and what your input data looks like, if you're only interested in the last event for each host, you could also make use of the dedup command instead. Something like: | dedup host.

2 Answers. To get the two (or 'N') most recent events by a certain field, first sort by time then use the dedup command to select the first N results. While @RichG's dedup option may work, here's one that uses stats and mvindex, using mvindex in its range form instead of selecting merely the last item.

Group results by a timespan. To group search results by a timespan, use the span statistical function. Group results by a multivalue field: when grouping by a multivalue field, the stats command produces one row for each value in the field. For example, suppose the incoming result set is this: ...

This example uses the sample data from the Search Tutorial but should work with any format of Apache web access log. To try this example on your own Splunk instance, you must download the sample data and follow the instructions to get the tutorial data into Splunk. Use the time range All time when you run the search.

Search for transactions using the transaction command either in Splunk Web or at the CLI. The transaction command yields groupings of events which can be used in reports. To use transaction, either call a transaction type (that you configured via transactiontypes.conf), or define transaction constraints in your search by setting the search ...

where I would like to group the values of the field total_time in groups of 0-2 / 3-5 / 6-10 / 11-20 / > 20 and show the count in a timechart.
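A sketch for that range grouping, bucketing total_time with case() and charting the counts over time (the bucket labels come from the question; the 1h span is an assumption):
... | eval range=case(total_time<=2,"0-2", total_time<=5,"3-5", total_time<=10,"6-10", total_time<=20,"11-20", true(),">20") | timechart span=1h count by range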
Solved: Hello! I analyze DNS logs. I can get stats count by Domain: | stats count by Domain. And I can get a list of domains per minute: index=main3 ...

Group my data per week. 03-14-2018 10:06 PM. I am currently having trouble grouping my data per week. My search is currently configured to be in a relative time range (3 months ago), connected to ServiceNow, and the date that I use is in the field opened_at. Only data that has a date in its opened_at within 3 months ago should be fetched.

Hi there, I have a dashboard which splits the results by day of the week, to see for example the amount of events by days (Monday, Tuesday, ...). My request is like this: myrequest | convert timeformat="%A" ctime(_time) AS Day | chart count by Day | rename count as "SENT" | eval wd=lower(Day) | eval ...

2 Answers. Here is a complete example using the _internal index: index=_internal | stats list(log_level) list(component) by sourcetype source | streamstats count as sno by sourcetype | eval sourcetype=if(sno=1,sourcetype,"") | fields - sno. For your use case I think this should work.

03-16-2012 07:17 AM. I am trying to find a way to turn an IP address into CIDR format to group by in reports. Ideally, I'd be able to do something like: eval ip_sub=ciderize(ip,25). So, for instance, an address of 172.20.66.54 in the formula above would return 172.20.66.0/25, while 172.30.66.195 would return a value of 172.20.66.128/25.

Solution. jluo_splunk, Splunk Employee, 09-21-2017 11:29 AM. So it sounds like you have something like this: | stats count by group, flag | appendpipe [stats sum(count) by group]. Instead, try this: | chart count by group, flag | addtotals row=t col=f.

Path Finder. 06-24-2013 03:12 PM. I would like to create a table of count metrics based on hour of the day. So average hits at 1AM, 2AM, etc.: stats min by date_hour, avg by date_hour, max by date_hour. I cannot figure out why this does not work. Here is the matrix I am trying to return.
Assume 30 days of log data, so 30 samples per each date ...

At its start, it gets a TransactionID. The interface system takes the TransactionID and adds a SubID for the subsystems. Each step gets a transaction time. One transaction can have multiple SubIDs, which in turn can have several Actions: 1 -> A -> Ac1, 1 -> B -> Ac2, 1 -> B -> Ac3. It's no problem to do the coalesce based on the ID and ...

One thing to keep in mind is that extracting the field via a regex is a totally separate step from grouping an aggregated result. index="something" sourcetype=blah OR meh "def" | rex field=uri "POST\s+\/user ... The regex Splunk comes up with may be a bit more cryptic than the one I'm using because it doesn't really have any ...
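A sketch of that two-step pattern, extract first and then aggregate (the uri field carries over from the snippet above; the endpoint capture and the assumed "POST /user/<something>" shape of the values are illustrative only):
index="something" sourcetype=blah OR meh "def" | rex field=uri "POST\s+\/user\/(?<endpoint>[^\/\s]+)" | stats count by endpoint
The rex stage only creates the endpoint field; the stats stage is what actually groups the results.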