Splunk Docker container error "cannot create /opt/container_artifact/splunk-container.state: Permission denied"?

Graham_Hanningt
Builder

I had been successfully using a custom Dockerfile to create a Docker container based on the Splunk-provided Docker image for Splunk 7.2.0:

FROM splunk/splunk:7.2.0

USER root

# Do custom stuff...

USER ${SPLUNK_USER}

# Do more custom stuff...

(With apologies for being coy about the "custom stuff".)

I wanted to upgrade to Splunk 7.3.0, so I updated the FROM instruction to refer to the 7.3.0 tag.

That introduced the following error:

sh: 1: cannot create /opt/container_artifact/splunk-container.state: Permission denied

What changed between 7.2.0 and 7.3.0 to cause this error?


Graham_Hanningt
Builder

Answering my own question, in case it helps someone else...

The error was caused by the introduction of an ansible user into the base Splunk Docker image.

My custom Dockerfile was setting the user to splunk (or rather, to the user specified by the corresponding environment variable). That caused a problem, because an updated shell script in the base Docker image subsequently attempted to write to a file that the ansible user could write to, but the splunk user couldn't.

I fixed the problem by changing the following line in my Dockerfile:

USER ${SPLUNK_USER}

to:

USER ${ANSIBLE_USER}

I might have been able to solve the problem instead by removing all USER instructions from my Dockerfile and inserting sudo in front of the apt-get command in the "custom stuff". However, after reading, but not fully understanding, some topics on the web that recommended against using sudo in a Dockerfile (I'm neither a Unix expert nor a Docker expert), I decided against that approach.
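For reference, the updated Dockerfile looks roughly like this. This is a sketch: the "custom stuff" comments stand in for the original steps, which I haven't shared, and it assumes the ANSIBLE_USER environment variable is defined in the 7.3.0 base image (as it was in my case):

```dockerfile
# Base on the 7.3.0 image, which introduced the ansible user
FROM splunk/splunk:7.3.0

# Switch to root for privileged setup steps (e.g. apt-get)
USER root

# Do custom stuff...

# Switch to the ansible user rather than ${SPLUNK_USER}, so the image's
# startup script can write /opt/container_artifact/splunk-container.state
USER ${ANSIBLE_USER}

# Do more custom stuff...
```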

