Sid, can you please upload a video demonstrating how to extract fields using props.conf, so we can see the fields while searching?
@splunk_ml 5 years ago
Yes, that video is in the pipeline... I will post it soon.
@santhoshig7784 5 years ago
Hi Sir, I have a log file which has two different timestamp formats. How do I write TIME_FORMAT in that case? Is it possible to specify two different timestamp formats?
@splunk_ml 5 years ago
In this case it's better to use two different source types.
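To illustrate the two-source-type approach, here is a minimal props.conf sketch. The source type names and timestamp formats below are assumptions for illustration, not taken from the video:

```ini
# props.conf -- one stanza per source type, each with its own timestamp format.
# Source type names and formats here are hypothetical examples.

[app_log_ts1]
# For events whose timestamps look like: 2019-05-14 10:23:45
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19

[app_log_ts2]
# For events whose timestamps look like: [14/May/2019:10:23:45 +0000]
TIME_PREFIX = \[
TIME_FORMAT = %d/%b/%Y:%H:%M:%S %z
MAX_TIMESTAMP_LOOKAHEAD = 26
```

Each log source would then be assigned its own source type in inputs.conf, so Splunk applies the matching TIME_FORMAT to each file.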
@happyBongGirl 5 years ago
Hi Siddhartha, can you please make a video on the timezone normalization topic?
@splunk_ml 5 years ago
Hi Pritha, sure... I will create a video for that. Sid
@mamathapanabaka9685 4 years ago
If we are adding data via a Universal Forwarder, how can we do these extractions?
@splunk_ml 4 years ago
The timestamp is extracted where the log data is parsed. Since a Universal Forwarder does not do full parsing, you may need a Heavy Forwarder (HF).
@prateekpatro5673 3 years ago
Data goes through various stages before getting ingested into Splunk. In this case you must remember that data goes through the 'parsing' stage before the 'indexing' stage. In the parsing stage you can break your events and extract fields, timestamps, etc. Parsing can be handled on the indexers as well, but that may impact their performance, so it is better to use a HF.
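As a sketch of the setup described above: the Universal Forwarder only needs an inputs.conf stanza, while the parsing-stage settings live in props.conf on the Heavy Forwarder (or the indexers). Paths and the source type name are hypothetical:

```ini
# inputs.conf on the Universal Forwarder -- the UF just reads and forwards raw data.
[monitor:///var/log/myapp/app.log]
sourcetype = myapp_log

# props.conf on the Heavy Forwarder / indexer -- parsing-stage settings.
# Event breaking and timestamp extraction happen here, not on the UF.
[myapp_log]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```

Placing these props.conf settings on the UF would have no effect on timestamping, since the UF forwards the data before the parsing pipeline runs.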
@nishadt 5 years ago
Hi Siddhartha, I have 3 CSV files that I am monitoring. One has the timestamp field job_finished and another has the time field end_time, and my inputs.conf uses a single stanza [/*//*.csv]. How can I define the timestamp fields for multiple fields from multiple CSVs? Do I need to use transforms.conf?
@splunk_ml 5 years ago
Hi Nishad, yes, you need to set up props.conf and transforms.conf for this. Basically, for each CSV you need a separate transforms.conf stanza.
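A sketch of what those props.conf and transforms.conf stanzas could look like, assuming the CSVs can be told apart by file name. The source type names and file-name patterns are hypothetical; note that for structured files, TIMESTAMP_FIELDS must be applied wherever INDEXED_EXTRACTIONS is processed:

```ini
# props.conf -- reassign the source type per CSV via transforms,
# then set the timestamp field per source type.
[source::...*.csv]
TRANSFORMS-set_st = set_jobs_csv, set_endtime_csv

[jobs_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = job_finished

[endtime_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = end_time

# transforms.conf -- route by source path (file-name patterns are illustrative).
[set_jobs_csv]
SOURCE_KEY = MetaData:Source
REGEX = jobs.*\.csv
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::jobs_csv

[set_endtime_csv]
SOURCE_KEY = MetaData:Source
REGEX = endtimes.*\.csv
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::endtime_csv
```

If the monitored paths can be separated, a simpler alternative is one inputs.conf monitor stanza per CSV, each assigning its own source type directly, which avoids the transforms step entirely.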
@nishadt 5 years ago
@@splunk_ml Do you have a video on transforms.conf? With an example it would be great!
@splunk_ml 5 years ago
@@nishadt You can refer to the videos below:
kzbin.info/www/bejne/q2bKlImBrtKqras
kzbin.info/www/bejne/sHrNlnaPlst_eac
kzbin.info/www/bejne/g3rVZamuptSkj5Y
kzbin.info/www/bejne/aKbXlGlojd5mh8U