Security

Query to find failed logins by domain admins

swamysanjanaput
Explorer

I was using the search query below to find failed logins by domain admins, but I was asked not to use a lookup file because the client wants real-time data.

`macro` EventCode=4625
| stats count, values(Workstation_Name) AS Workstation_Name, values(Source_Network_Address) AS Source_IP_Address, values(host) AS Host by Account_Name
| where count > 5
| search [| inputlookup xxxxxxx.csv | rex field=user "(?<domain>\w+)\W(?<Account_Name>\w+)" | table Account_Name]

So now I have two searches that I need to combine to get real-time data, but I have not been able to do so and am not getting the correct results.
Query 1: To find failed logins for all users

`macro` EventCode=4625
| stats count, values(Workstation_Name) AS Workstation_Name, values(Source_Network_Address) AS Source_IP_Address, values(host) AS Host by Account_Name
| where count > 5

Query 2: To fetch all domain admin details (FYI: we are not using the AD add-on in our environment to read AD directly)

index=xxxxxxxx
| rex field=distinguishedName "DC=(?<user_domain>\w+)"
| eval user_domain=upper(user_domain)
| eval user=user_domain + "\\" + sAMAccountName
| eval user=upper(user)
| eval lastLogonTimestamp=strptime(lastLogonTimestamp, "%Y-%m-%dT%H:%M:%S.%6N")
| stats latest(lastLogonTimestamp) as lastlogon latest(timeCollected) as timecollected values(memberOf{}) as memberof by user
| fields - lastlogon timecollected
| search memberof IN ("CN=xxxx*", "CN=xxxx,CN=xxxx*")

I now need to combine the Query 1 and Query 2 searches to get real-time data (i.e. failed logins by domain admins, in real time). Could anyone help me merge these two queries? Thanks in advance.


dmarling
Builder

How often is the data in query 2 written to those logs? Is it being written at the same time as the data that is being ingested by query 1? If so, the two requests can be combined into a single query that uses stats to join the data together by Account_Name. If not, we can still do that, but we will have to get a little more creative, and I'll need to know how frequently that data is ingested so we know how far back to look when building the query.
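As a starting point, here is a minimal sketch of that approach using query 2 as a subsearch to filter query 1, assuming Account_Name in the 4625 events matches sAMAccountName in your AD index (field names and the memberOf filter values are illustrative and will need adjusting to your environment; also keep in mind the default subsearch limits of 10,000 results and 60 seconds of runtime):

`macro` EventCode=4625
| stats count, values(Workstation_Name) AS Workstation_Name, values(Source_Network_Address) AS Source_IP_Address, values(host) AS Host by Account_Name
| where count > 5
| search
    [ search index=xxxxxxxx
    | stats values(memberOf{}) AS memberof by sAMAccountName
    | search memberof IN ("CN=xxxx*", "CN=xxxx,CN=xxxx*")
    | rename sAMAccountName AS Account_Name
    | table Account_Name ]

The subsearch returns the list of domain admin account names, which Splunk expands into an OR of Account_Name=... terms that filter the stats output from the outer search. If the case of the account names differs between the two sources, you may need an eval upper() on both sides before the filter.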

If this comment/answer was helpful, please up vote it. Thank you.