
Solsys’ Azure EventHub Application

Solsys is a technology service organization delivering business and technology services to Canadian market leaders.

Solsys delivers agile leadership and technical expertise through professional services and product development across the following practices.

  • Digital Identity
  • Enterprise API
  • Agile Cloud

To effectively leverage our expertise and experience, we provide recommended workshops and agile approaches to delivery and value measurement.

  • Agile canvassing to identify immediate and strategic opportunities for success.
  • Align and inform leveraging Agile Cloud solution templates for discovery and delivery. 
  • Agile team to deliver results prioritized by business and measured in weekly sprints. 

Data Intelligence

Solsys has identified common Cloud use cases and solution templates for Cloud Service intelligence.

Azure Event Hub Application

Included in these patterns is the recently released Solsys “Azure Event Hub Application”.

This Solsys Splunk application brings security and reliability features not available on the market today for streaming Azure events (notably SIEM events) to Splunk.

This is important because it addresses reliability and scalability issues in Azure event ingestion for stakeholders such as:

  • CISO (Azure SIEM, Audit, Compliance, Fraud Events)
  • VP Cloud Operations (Azure Billing, Health & Utilization Events)
  • VP Services (Cloud and Mobile Application Health, IoT Intelligence)

Measuring Value

Reliably collect and process enterprise-scale volumes of Azure event data with low latency and at low cost. Ensure support for high availability and the necessary recovery scenarios. Increase velocity and simplify technical operations (Event Hubs) for service data onboarding. Provide a flexible authentication framework that is extensible to enterprise rigours such as OAuth tokens.


  • The Splunk Azure Monitor App is not supported by Microsoft or Splunk, and is not sufficient to meet the reliability, security-policy, and performance needs of a large financial institution.


  • Leverage Python 3, the Azure async libraries, and Event Processor Hosts, which massively speed up reads.
  • Leverage Event Processor Hosts to allow scalable checkpoint storage using abstract backends. Our reads always resume from the last known checkpoint and are therefore fault tolerant up to the decay time of the Event Hub partition data.
  • Deliver flexible, appropriate support for authorization and authentication policies at the Event Hub partition level.
  • Deliver a supported Splunk connector with SLA appropriate for a large financial institution.
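The checkpoint-resume behaviour described above can be sketched as an abstract backend. This is a minimal illustration in plain Python; the class and method names are our invention, not the connector's actual API, and a production backend would persist to blob storage or a database rather than memory:

```python
from abc import ABC, abstractmethod
from typing import Optional

class CheckpointStore(ABC):
    """Abstract checkpoint backend (names here are illustrative)."""

    @abstractmethod
    def save(self, partition_id: str, offset: str) -> None:
        """Persist the last-read offset for a partition."""

    @abstractmethod
    def load(self, partition_id: str) -> Optional[str]:
        """Return the last saved offset, or None if the partition is new."""

class InMemoryCheckpointStore(CheckpointStore):
    """Toy backend; a real store would use durable, shared storage."""

    def __init__(self) -> None:
        self._offsets = {}

    def save(self, partition_id: str, offset: str) -> None:
        self._offsets[partition_id] = offset

    def load(self, partition_id: str) -> Optional[str]:
        return self._offsets.get(partition_id)

def starting_position(store: CheckpointStore, partition_id: str) -> str:
    # Resume from the last checkpoint; "-1" conventionally means the
    # start of the partition's retained stream.
    return store.load(partition_id) or "-1"
```

Because the starting position is always derived from the store, a restarted reader picks up where the last one left off, which is what makes the reads fault tolerant up to the partition's retention window.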

Installing the Solsys Splunk Event Hub Connector

The Azure Event Hub Connector developed by Solsys Corp pulls raw data from Event Hub partitions and ingests it into your Splunk environment. The use case driving this is often the need to correlate data from different cloud environments in Splunk.

Azure’s Event Hub is a real-time data ingestion tool that allows events to be streamed and stored into partitions. According to Microsoft documentation, an Event Hub “represents the front door for an event pipeline, often called an event ingestor in solution architectures” (Microsoft, 2018). From a data engineer’s perspective, it is a treasure trove of data that you can utilize with an existing Splunk environment for insights, analytics and correlation with other data. As any Splunkie would ask: “Why don’t you Splunk the data?”

The Event Hub Connector was installed on a Heavy Forwarder according to Splunk’s best practices. In order to utilize the Connector, there were a few prerequisites to ensure it runs smoothly.

  • Installation of Python 3.6 or above: This was needed to utilize the Azure Event Hub SDK for Python.
  • Installation of the Azure Event Hub Software Development Kit (SDK) for Python: This was required to communicate with the Azure portal and Event Hub to retrieve the data, and allows us to stay up to date with Microsoft updates on the Azure side.
  • Installation of the Splunk SDK for Python: This allows the connector to interact with Splunk through Python.
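A small script along these lines can sanity-check the prerequisites before installation. The module names are assumptions on our part: the azure-eventhub package installs as `azure.eventhub`, and the splunk-sdk package installs as `splunklib`:

```python
import importlib.util
import sys

def check_prerequisites():
    """Return a list of human-readable problems; an empty list means all good."""
    problems = []
    if sys.version_info < (3, 6):
        problems.append("Python 3.6+ is required")
    for module in ("azure.eventhub", "splunklib"):
        try:
            found = importlib.util.find_spec(module) is not None
        except ModuleNotFoundError:  # parent package (e.g. `azure`) absent
            found = False
        if not found:
            problems.append("missing module: " + module)
    return problems

if __name__ == "__main__":
    for problem in check_prerequisites():
        print(problem)
```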

Once the prerequisites were installed, we proceeded to install the Azure Event Hub Connector. The logic diagram below shows the software components that were needed on the Heavy Forwarder. Ignore the “Splunk – Azure EventHub TA” for now, as that is a custom Technical Add-on that we developed to assist with sourcetypes and parsing the data.

Once the three prerequisites were installed, we installed the Event Hub Connector using its setup script. The SDK installation ensures all the necessary Python libraries and packages are available and work in conjunction to retrieve data from your Event Hub partition.

[Diagram: Solsys Event Hub Connector software components on the Heavy Forwarder]

The next step after the Connector installation is to invoke the wrapper script contained in the Connector directory. The wrapper script was copied from the Connector into the $SPLUNK_HOME/bin directory. To invoke the wrapper, a scripted input was created by updating inputs.conf in Splunk.
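The original screenshot of our inputs.conf is not reproduced here, but the scripted-input stanza takes roughly this shape. The script name, interval, index, and sourcetype below are illustrative placeholders, not the connector's actual names:

```ini
[script://$SPLUNK_HOME/bin/eventhub_wrapper.py]
interval = 60
index = azure
sourcetype = azure:eventhub
disabled = 0
```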


We restarted Splunk once the input was defined and observed the data being written to disk. Woot! You can see a snippet of raw data from our ‘Operational Insights Logs’ Event Hub below. More information and steps to export Azure Activity Logs to Event Hubs can be found at this link. In our lab we exported Azure Monitoring Logs to the operational-insights-logs Event Hub, informing us when objects were changed in each of the services Microsoft Azure provides.

If I deleted a virtual machine, the activity log entry would be stored as “/MICROSOFT.COMPUTE/VIRTUALMACHINE/DELETE” in the raw logs. This is the red highlighted text in the screenshot below. We will see later in the blog how this data can be displayed in a dashboard.


Our next step involved creating an input to monitor the files written to disk by the wrapper script, while ensuring that the data is onboarded with the correct sourcetype for time extraction.
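A monitor stanza along these lines pairs with the scripted input. The path, index, and sourcetype are illustrative placeholders, and in practice the sourcetype should match a props.conf definition with the correct time extraction:

```ini
[monitor:///opt/splunk/var/log/eventhub/*.log]
index = azure
sourcetype = azure:eventhub
disabled = 0
```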

Hopefully, these steps have not lost you. To streamline this process of onboarding data for a user, we created an Azure Event Hub TA which contains an inputs.conf with sample scripted-input and monitoring stanzas that one can use once the Connector is installed. The TA can be installed from
