Splunk Dev

Addon to upload data using python script

beqanaveriani
New Member

Hello, I have written a Python script which parses information from a website, and I want to make an add-on which will upload the data to Splunk automatically after parsing is done.

from selenium import webdriver
import time
from bs4 import BeautifulSoup

# Load the tenders page in a browser-driven session
driver = webdriver.Chrome()
url = "https://tenders.procurement.gov.ge/public/?lang=ge"
driver.maximize_window()
driver.get(url)

# Give the page time to render before grabbing its source
time.sleep(5)
content = driver.page_source.encode('utf-8').strip()
soup = BeautifulSoup(content, "html.parser")
tables = soup.find_all("table", {"id": "lastevents"})

# Append one line per tender event to data.log
with open('data.log', 'a') as datafile:
    for entry in tables:
        for cell in entry.find_all('td'):
            event_time = cell.find('span', {'class': 'color-1'}).text
            number, acquirer = [k.text for k in cell.find_all('strong')]
            category = cell.find('span', {'class': 'color-2'}).text
            datafile.write(f"Time: {event_time} ; OperationNumber: {number} ; Acquirer: {acquirer} ; CategoryID: {category}\n")

driver.quit()

This is my script, help me please. 🙂



starcher
Influencer

If you don't want to build a modular input with the Splunk Add-on Builder, you can also send the data in via the Splunk HTTP Event Collector (HEC) from outside Splunk; see the sketch after the links below.

http://docs.splunk.com/Documentation/Splunk/7.0.3/Data/UsetheHTTPEventCollector
https://github.com/georgestarcher/Splunk-Class-httpevent
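
For example, here is a minimal sketch of how each parsed row could be posted to HEC with the standard requests library instead of being written to data.log. The HEC host, token, index, and sourcetype below are placeholders; replace them with your own values.

import json
import requests

# Placeholder HEC settings -- replace with your own endpoint and token
HEC_URL = "https://your-splunk-host:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def send_to_hec(event_time, number, acquirer, category):
    """POST one parsed tender row to the HTTP Event Collector."""
    payload = {
        "event": {
            "Time": event_time,
            "OperationNumber": number,
            "Acquirer": acquirer,
            "CategoryID": category,
        },
        "sourcetype": "tenders:web",  # assumed sourcetype name
        "index": "main",              # assumed index
    }
    response = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=json.dumps(payload),
        verify=False,  # only if your HEC endpoint uses a self-signed cert
    )
    response.raise_for_status()

You would call send_to_hec(...) inside the inner loop of your script in place of datafile.write(...). The Splunk-Class-httpevent library linked above wraps this same endpoint and also handles batching for you.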
