PortX Log Parsing gateway released – Parsing as a service for log data with IT data governance
XPLG releases PortX Parsing Gateway, a log parser that teams can use as a self-service to parse and optimize log data before it is shipped to logging, monitoring, observability, and security services.
The PortX product suite is a data pipeline and optimization solution that helps organizations take control of log data and IT/machine data. The growing complexity of, and demand around, security, compliance, and observability introduce challenges for organizations. Although log analysis solutions have been highly popular for over a decade, the major problem remains extracting value and insights from the data and serving the growing number of use cases.
Logging and security products and services are highly expensive and require vast resources to manage. In addition, these solutions require large volumes of compute and storage.
PortX Log Parsing Gateway – stream log events as pipelines
Log Parsing challenges
Every system, application, or service generates many types of logs and log events. Some log events differ from others, and constant effort is invested in parsing them.
Log Parsing bottlenecks
There are many tools available to manage and parse logs. The biggest issue is that endless rules, agents, and patterns must be applied to log events across application logs, systems, cloud services, and more.
Applying bad rules can crash the logging system and cause a denial of service for critical operations. To prevent that, infrastructure teams must retain control of the agents, parsing rules, and logging infrastructure.
Application and IT groups struggle to get configuration and parsing services from the infrastructure team that operates tools like Splunk, Logstash, Fluentd, and others.
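The kind of parsing rule described above can be sketched in a few lines. The following Python example is illustrative only: the log format, pattern, and field names are assumptions, not PortX's actual rule syntax. It shows why every distinct log format needs its own rule, which is the maintenance burden at the heart of the bottleneck.

```python
import re

# Illustrative pattern for one syslog-like format; real deployments need
# a rule like this for every log format they ingest.
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+"
    r"(?P<level>[A-Z]+)\s+"
    r"(?P<service>[\w.-]+)\s+"
    r"(?P<message>.*)"
)

def parse_line(line: str):
    """Turn one raw log line into a structured event, or None if no rule matches."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

event = parse_line("2024-05-01T12:00:00 ERROR checkout-api payment timed out")
# event == {"timestamp": "2024-05-01T12:00:00", "level": "ERROR",
#           "service": "checkout-api", "message": "payment timed out"}
```

A line from any other format falls through the pattern and returns `None`, which is exactly the gap that forces teams to keep writing and maintaining new rules.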
PortX Log parsing gateway to the rescue
PortX Log Parsing Gateway is a simple yet powerful concept that allows every group to control the parsing, schema, security, and structure of its log data. PortX provides a web interface, security policies, and auditing that allow each group to manage and ship its own log data to other tools such as ELK, Splunk, SIEM, etc.
The infrastructure team provides a federated log parsing gateway and pipeline control that helps each team control, parse, and optimize the log data sent to the central infrastructure. Application groups clean noise, reduce volume, and forward optimized events to the log repository.
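The "clean noise, reduce volume" step can be sketched as a simple filter an application group runs before forwarding. This is a hypothetical Python illustration, not PortX's implementation; the noise definition and event shape are assumptions.

```python
# Assumed definition of "noise": levels an application group chooses to drop.
NOISY_LEVELS = {"DEBUG", "TRACE"}

def optimize(events):
    """Drop noisy events and collapse consecutive duplicates before forwarding."""
    last_seen = None
    for event in events:
        if event.get("level") in NOISY_LEVELS:
            continue  # noise reduction: drop the event entirely
        key = (event.get("level"), event.get("message"))
        if key == last_seen:
            continue  # volume reduction: skip a consecutive duplicate
        last_seen = key
        yield event

events = [
    {"level": "DEBUG", "message": "cache miss"},
    {"level": "ERROR", "message": "db down"},
    {"level": "ERROR", "message": "db down"},
]
forwarded = list(optimize(events))
# forwarded == [{"level": "ERROR", "message": "db down"}]
```

Filtering at the edge like this is what lets the central repository receive fewer, cleaner events instead of raw firehose output.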
Data governance and log data policy
PortX allows the architecture and infrastructure teams to enforce policies and rules that add additional control over the logging data.
Here’s a bit more detail about each:
- Mask data fields – applies data masking to sensitive or PII data.
- Control log event schema – verifies that each log event contains a timestamp, normalized timezone, priority, security, and other required fields.
- Limit log event size – caps the size of each log event.
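The three policies above can be sketched together in one function. This is a minimal illustration under assumed names: the required-field set, size cap, and PII pattern are examples, not PortX's actual policy engine.

```python
import re

REQUIRED_FIELDS = {"timestamp", "timezone", "priority"}  # assumed schema
MAX_MESSAGE_CHARS = 4096                                 # assumed size cap
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")           # illustrative PII pattern

def apply_policies(event: dict) -> dict:
    """Apply masking, schema, and size policies to one log event."""
    # 1. Mask data fields: redact email addresses found in the message.
    event["message"] = EMAIL.sub("***", event.get("message", ""))
    # 2. Control log event schema: reject events missing required fields.
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event missing required fields: {sorted(missing)}")
    # 3. Limit log event size: truncate oversized messages.
    if len(event["message"]) > MAX_MESSAGE_CHARS:
        event["message"] = event["message"][:MAX_MESSAGE_CHARS]
    return event
```

For example, an event whose message contains `user bob@example.com failed` comes out with the address replaced by `***`, while an event lacking a `timestamp` field is rejected before it ever reaches the central repository.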
Reduce cost and time to execute
By enabling application groups to control their own data pipelines, the organization applies a systematic data optimization workflow. By collecting once, parsing once, and reducing bottlenecks, data is optimized, cleaned, and highly customized to each stakeholder's needs, reducing volumes, EPS, time to parse, and more.
How to start?
Reach out to the XPLG PortX team and request a live demo today.