
Oracle Logs - Permission Issue while forwarding.

kamal_jagga
Contributor


I am monitoring various files on an Oracle DB host. The owner of these files is the 'oracle' user.
Issue: the permission on these files is 640, and my splunk user is not in the 'oracle' group, so it cannot read them. I have to manually update the permissions to start the ingestion.

My Splunk instance is actually running as root but still can't read these files through the 'oracle' group, and my Linux admin is not letting me add the splunk user to the 'oracle' group.
I am looking for a solution other than the two mentioned below:
1. Add the splunk user to the oracle group.
2. Create a script to regularly update the permissions.

Would anyone be able to advise a solution?

Kindly advise.


ddrillic
Ultra Champion

What about giving read access to 'other'?
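
In other words, something like this on the monitored files (the path here is just a placeholder for wherever your Oracle logs actually live):

    # world-readable: any local account, including splunk, can now read the file
    chmod o+r /u01/app/oracle/diag/rdbms/ORCL/ORCL/trace/alert_ORCL.log

That only works if the logs aren't considered too sensitive to expose to every other local account, though.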


DalJeanis
Legend

Preliminary Analysis

Here are the organizational and technical considerations in play.

1) Some people who need to see/analyze/be alerted on information from these logs have access to Splunk, or can receive alerts and reports from Splunk, but do not and cannot have direct access to the logs.

2) Therefore it is a business requirement that the Oracle logs need to be monitored by Splunk.

3) Splunk does not have a business need for access to Oracle itself.

4) Since the Oracle group has access to much more than the logs, security rules preclude giving Splunk the excess authority.

5) Therefore the Linux admin refuses to add Splunk to the Oracle group.

6) Splunk, as root, has access to update the log permissions and expose the logs, which it MUST ingest.

7) The logs contain other sensitive information that means the logs cannot be left exposed to the world, so Splunk must "unexpose" them after each time it ingests the logs.


Preliminary Response

If I were over the security group, I would just tell my people, "Give me the simplest secure way to give Splunk access to these logs without giving them the keys to the car."

They are currently having you hot-wire the car, pull it out onto the street and leave it running every time you have to check the mileage.... because they don't want to give you access to the car's odometer while the car is in the garage.

In any case, the solution you are looking for goes right through the desks of your Unix security folks, so bring donuts down there and throw yourself on their mercy.

No, they don't have any mercy; it's just a saying. That's why you bring donuts.

Tell them you are consulting with them to make sure your solution is as secure as it is allowed to be, then discuss the stuff above, and tell them some ((cough)) some guy on the internet suggested the stuff below.


Suggestions

Okay, you have a choice between (A) using the root superpowers to temporarily expose the logs at will, or (B) creating a security subgroup UNDER THE ORACLE GROUP that would allow Splunk access to the logs, and only the logs.

Potentially, there might be other workable ways of describing option B. For example, without touching the Oracle group at all, a separate group could be created with read access to the Oracle log DIRECTORIES; the logs themselves could then be left world-readable inside those directories, since the rest of the world can't get into them anyway. But I'm not a Unix admin, so I may be mistaking vague theory or internet rumor for fact. I also don't know whether you are using AD or PAM or another method, and those details are probably important in identifying the right tweak. Thus, donuts. Or, worst case scenario, pizza.
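
To make option B a bit more concrete: one way I could imagine the Unix folks doing it is a dedicated read-only group plus POSIX ACLs, assuming the filesystem supports ACLs at all. Every group name and path below is made up for illustration, not gospel:

    # create a read-only group for the log consumers and put splunk in it
    groupadd oralogread
    usermod -aG oralogread splunk

    # give that group read on the existing logs and traverse on the directories
    # (the diag path is a placeholder for your real Oracle log location)
    setfacl -R -m g:oralogread:rX /u01/app/oracle/diag

    # default ACL so files Oracle creates later inherit the group read access
    setfacl -R -d -m g:oralogread:rX /u01/app/oracle/diag

The nice part is that nothing about the oracle group itself changes, which is exactly the objection your admin raised.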

Option B (in some flavor) seems technologically feasible, better, more secure, and a one-shot pain in the rear for the security admins.

Option A is your current manual process or a script. Doable, within your power, and probably what you will have to do, even though it leaves the logs exposed during the ingestion process whenever it runs.

If you have to do this, then carefully analyze the nature of the log reporting and alerting to determine optimum frequency and timing for ingestion. If the reporting does not have to be near-real-time, and the logs are not huge, then it might be optimum to ingest them nightly before any maintenance windows that are likely to move/truncate/alter them.
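
If you do end up scripting it, the cron side can be as boring as something like this; the schedule, the two-hour ingest window, and the path are all assumptions for you to adjust:

    # /etc/cron.d/splunk-oracle-logs  (illustrative only)
    # 01:00 - expose the last day's logs so the nightly ingest can read them
    0 1 * * * root find /u01/app/oracle/diag -name '*.log' -mtime -1 -exec chmod o+r {} +
    # 03:00 - after the ingest window, take the world-read access back off
    0 3 * * * root find /u01/app/oracle/diag -name '*.log' -exec chmod o-r {} +

That keeps the exposure window as small as you can make it, per point 7 above.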


And then there is Option (C): make Oracle (or Unix) responsible for securely sending the logs for ingestion on a periodic basis. This is a completely different methodology that falls under the "script" side of your idontwannas, but there's no reason your organization couldn't have a cron job that copies the logs from their current location to an ingestion location that Splunk has security access to, every N minutes.

The difference here is that they are responsible for maintaining it, since it is inside their security covers. The other difference is that the entire file is being copied, not just the tail, so there is more bandwidth involved. And the security for the new ingestion location(s) has to be set up. And both sides need to monitor when the last log was copied, to know if the cron job has gone AWOL.
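
A minimal sketch of their side of option C, with every path, interval, and name below being a placeholder rather than a recommendation:

    # /etc/cron.d/oracle-logs-for-splunk  (illustrative only)
    # every 15 minutes, copy just the .log files into a staging area splunk can read;
    # the staging directory must already exist, writable by oracle, readable by splunk
    */15 * * * * oracle rsync -a --chmod=D755,F644 --include='*/' --include='*.log' --exclude='*' /u01/app/oracle/diag/ /var/splunk_stage/oracle_logs/

Splunk then monitors the staging directory, and an alert on "no new files in the last N minutes" from either side covers the cron-job-gone-AWOL case.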
