**Question**

I'm new to Splunk, and I'm trying to calculate the elapsed time between two events, 'STARTED' and 'FINISHED', by event_type and by context_event (the extracted field status_event is either STARTED or FINISHED). The problem I have is that the timestamp is an extracted field and not the _time given by Splunk. I've tried various different ways using the support portal but have failed miserably. My search so far:

```
sourcetype=eventstore | transaction source_event startswith="STARTED" endswith="FINISHED" | eval elapsed=timestamp_event-timestamp_event
```

**Answer**

So I guess that these are the two events that make up the transaction, right?

```
03-01-2014 06:55:30, EventLoggerListener, Event=id='3241388266', message='Report1', timestamp=03-01-2014 06:55:30.535 GMT, type='ENRICHMENT', source='IAS', status='STARTED'
03-01-2014 06:55:30, EventLoggerListener, Event=id='1670471136', message='Report1', timestamp=03-01-2014 06:55:30.544 GMT, type='ENRICHMENT', source='IAS', status='FINISHED'
```

There are two options:

1) Convert the (second) timestamp in each event into a format that is suitable for mathematical operations like subtraction.

2) Configure Splunk to use the second timestamp (instead of the first) when extracting the _time field.

If the second timestamp (timestamp_event, as you call it) is always going to be very close to the 'regular' timestamp at the beginning of each event, you should consider option 2: it's a simpler configuration, and it also lets the transaction command calculate the duration automatically. If there can be a significant difference between the first and second timestamp in the events, and if this is important, you should go for option 1. Also, I'm not 100% sure that the duration field calculated by transaction keeps subsecond precision, so you might have to go for the longer version of option 1 below if your transactions are very short.

For option 1, you would do it like this (assuming that the second timestamp is extracted as timestamp):

```
your search | … (code markup lost in rendering)
```

A shorter version, which does not preserve the original _time (but only for the search; the events are not changed in any way on disk):

```
your search | … (code markup lost in rendering)
```

For option 2, you need to edit which timestamp Splunk uses when indexing an event; this is done in props.conf. You can then create your searches in the shortest way possible:

```
your search | transaction source_event startswith="STARTED" endswith="FINISHED"
```

Now duration is calculated automatically by transaction. The last table is for illustration purposes.

EDIT: there seems to be some limit on how many code markups there can be in one post, which screws up the rendering of the page. Tried changing some of them to bold.

Use a subsearch to narrow down relevant events.
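Option 2 is an index-time change. Purely as an illustration of the kind of props.conf stanza involved (the sourcetype name comes from the question's search; the prefix, format string, and lookahead value are untested guesses derived from the sample events):

```ini
# props.conf — point _time at the second, embedded timestamp.
[eventstore]
# The embedded timestamp follows "timestamp=" in each event.
TIME_PREFIX = timestamp=
# e.g. 03-01-2014 06:55:30.535 GMT (assuming day-month-year order)
TIME_FORMAT = %d-%m-%Y %H:%M:%S.%3N %Z
MAX_TIMESTAMP_LOOKAHEAD = 30
```

Note that index-time settings only affect events indexed after the change; events already on disk keep their existing _time.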
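The arithmetic behind option 1, stripped of SPL, is just parse-and-subtract. Here is a minimal Python sketch (not the answer's SPL; the `elapsed_seconds` helper, the DD-MM-YYYY reading of the ambiguous date, and the stripped "GMT" suffix are all assumptions) using the timestamps from the sample events:

```python
from datetime import datetime

def elapsed_seconds(start_ts: str, end_ts: str) -> float:
    """Parse two embedded event timestamps and return the difference in
    seconds. Assumes DD-MM-YYYY dates and no trailing zone marker."""
    fmt = "%d-%m-%Y %H:%M:%S.%f"
    start = datetime.strptime(start_ts, fmt)
    end = datetime.strptime(end_ts, fmt)
    # timedelta.total_seconds() keeps subsecond (microsecond) precision.
    return (end - start).total_seconds()

# Timestamps taken from the sample STARTED/FINISHED events:
print(elapsed_seconds("03-01-2014 06:55:30.535", "03-01-2014 06:55:30.544"))  # 0.009
```

On the Splunk side, eval's strptime() function plays the same role, turning the extracted string into an epoch number that can be subtracted.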