REST modular input JSON custom handler for AWS Pricing Data

pgreer_splunk
Splunk Employee

Having a bit of a struggle. AWS has a pricing API available at:

AWS JSON Pricing API URL

Because of how the JSON is formatted, it looks like a custom handler is needed. A snippet of the JSON is:

{
  "formatVersion" : "v1.0",
  "disclaimer" : "This pricing list is for informational purposes only. All prices are subject to the additional terms included in the pricing pages on http://aws.amazon.com. All Free Tier prices are also subject to the terms included at https://aws.amazon.com/free/",
  "offerCode" : "AmazonEC2",
  "version" : "20170921013650",
  "publicationDate" : "2017-09-21T01:36:50Z",
  "products" : {
    "76V3SF2FJC3ZR3GH" : {
      "sku" : "76V3SF2FJC3ZR3GH",
      "productFamily" : "Compute Instance",
      "attributes" : {
        "servicecode" : "AmazonEC2",
        "location" : "Asia Pacific (Mumbai)",
        "locationType" : "AWS Region",
        "instanceType" : "d2.4xlarge",
        "currentGeneration" : "Yes",
        "instanceFamily" : "Storage optimized",
        "vcpu" : "16",
        "physicalProcessor" : "Intel Xeon E5-2676v3 (Haswell)",
        "clockSpeed" : "2.4 GHz",
        "memory" : "122 GiB",
        "storage" : "12 x 2000 HDD",
        "networkPerformance" : "High",
        "processorArchitecture" : "64-bit",
        "tenancy" : "Host",
        "operatingSystem" : "Windows",
        "licenseModel" : "No License required",
        "usagetype" : "APS3-HostBoxUsage:d2.4xlarge",
        "operation" : "RunInstances:0002",
        "ecu" : "56",
        "enhancedNetworkingSupported" : "Yes",
        "normalizationSizeFactor" : "32",
        "preInstalledSw" : "NA",
        "processorFeatures" : "Intel AVX; Intel AVX2; Intel Turbo",
        "servicename" : "Amazon Elastic Compute Cloud"
      }
    },
    "G2N9F3PVUVK8ZTGP" : {
      "sku" : "G2N9F3PVUVK8ZTGP",
      "productFamily" : "Compute Instance",
      "attributes" : {
        "servicecode" : "AmazonEC2",
        "location" : "Asia Pacific (Seoul)",
        "locationType" : "AWS Region",
        "instanceType" : "i2.xlarge",
        "currentGeneration" : "No",
        "instanceFamily" : "Storage optimized",
        "vcpu" : "4",
        "physicalProcessor" : "Intel Xeon E5-2670 v2 (Ivy Bridge)",
        "clockSpeed" : "2.5 GHz",
        "memory" : "30.5 GiB",
        "storage" : "1 x 800 SSD",
        "networkPerformance" : "Moderate",
        "processorArchitecture" : "64-bit",
        "tenancy" : "Host",
        "operatingSystem" : "Windows",
        "licenseModel" : "No License required",
        "usagetype" : "APN2-HostBoxUsage:i2.xlarge",
        "operation" : "RunInstances:0102",
        "ecu" : "14",
        "enhancedNetworkingSupported" : "Yes",
        "normalizationSizeFactor" : "8",
        "preInstalledSw" : "SQL Ent",
        "processorFeatures" : "Intel AVX; Intel Turbo",
        "servicename" : "Amazon Elastic Compute Cloud"
      }
....
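
For reference, the important structural detail is that "products" is a JSON object keyed by SKU rather than an array, so iterating over it in Python yields the SKU strings, not the product records. A minimal standalone sketch (assuming a locally downloaded copy of the offer file saved as index.json, a hypothetical filename) that shows the difference:

import json

# Load a locally saved copy of the offer file (hypothetical filename).
with open("index.json") as f:
    offer = json.load(f)

products = offer["products"]

# Iterating the dict directly yields only the SKU keys.
for sku in products:
    print(sku)                          # e.g. 76V3SF2FJC3ZR3GH

# .items() (or .values()) yields the full product records.
for sku, product in products.items():
    print(json.dumps(product))          # {"sku": "76V3SF2FJC3ZR3GH", "productFamily": ...}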

I added a handler to responsehandlers.py as:

# handler for AWS pricing API call, split and print out all product stanzas
class AWSPriceHandler:

    def __init__(self, **args):
        pass

    def __call__(self, response_object, raw_response_output, response_type, req_args, endpoint):
        if response_type == "json":
            output = json.loads(raw_response_output)
            for product in output["products"]:
                print_xml_stream(json.dumps(product))
        else:
            print_xml_stream(raw_response_output)

But that isn't working for me: with the handler configured, no events are ingested at all. If I take the handler out, the input ingests a single event that gets truncated at the line-length limit.

Any suggestions?
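
For comparison, a variant of the handler that iterates output["products"].items() so that each full product record (rather than just its SKU key) is emitted as an event. This is a sketch only, assuming json and print_xml_stream are already in scope in responsehandlers.py:

# Sketch only: split the "products" object and emit each product record as its own event.
class AWSPriceHandler:

    def __init__(self, **args):
        pass

    def __call__(self, response_object, raw_response_output, response_type, req_args, endpoint):
        if response_type == "json":
            output = json.loads(raw_response_output)
            # .items() yields (sku, product) pairs; dumping product keeps the full record.
            for sku, product in output["products"].items():
                print_xml_stream(json.dumps(product))
        else:
            print_xml_stream(raw_response_output)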


Damien_Dallimore
Ultra Champion

What does your configuration look like? Post a screenshot or copy/paste the inputs.conf entry.
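
For reference, a rest:// stanza that wires in a custom handler generally looks something like the sketch below. The stanza name, sourcetype, index, and polling interval are placeholders, and the exact setting names should be confirmed against the REST modular input's own inputs.conf.spec:

[rest://aws_ec2_pricing]
endpoint = <AWS JSON Pricing API URL>
http_method = GET
response_type = json
response_handler = AWSPriceHandler
polling_interval = 86400
sourcetype = aws:pricing
index = main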
