Hello everyone. I've been challenged with providing an FME Server workspace that parses all or some of the Summary Stats info from a job's log file in order to post it in Slack and email notifications.
My specific question is how can I ensure that the correct job_id log file is called once the workspace completes on the server?
There are several server workspaces that run independently of one another from the same repository to update a PostGIS table with new records.
I currently have a workspace that successfully parses the Features Written and Number Records Added summaries, but it's being run against an older log file from Desktop. I'd like to have Server recognize a new job ID log file, call that file into this published workspace, parse the necessary info, write it to an MS Word template or templated email message, and send it out as an update notification.
*I understand that Python is probably the best practice to perform this workflow, but outside of Python and using only my FME Workbench, how can I achieve this using what's available in Desktop and published to my Server (FME Cloud) instance?
**My versions of both are 2018.1 Build 18520.
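As a sketch of the parsing step: the snippet below scans a job log for a summary section and collects the counts that follow it. The header text, the pipe-delimited prefix columns, and the assumption that each entry line ends in a column-aligned integer count are guesses at the log layout, which varies between FME versions.

```python
import re

def parse_summary(log_text, header="Features Written Summary"):
    """Collect 'name  count' pairs that follow a summary header in an FME job
    log. Assumes each entry line ends in an integer count separated by two or
    more spaces (the exact layout varies between FME versions)."""
    stats = {}
    in_section = False
    for line in log_text.splitlines():
        # Drop the pipe-delimited timestamp/severity columns, if present.
        text = line.split("|")[-1].strip()
        if header in text:
            in_section = True
            continue
        if in_section:
            match = re.match(r"(.+?)\s{2,}(\d+)$", text)
            if match:
                stats[match.group(1).strip()] = int(match.group(2))
            elif not text and stats:
                break  # a blank line ends the summary block
    return stats
```

The resulting dictionary can then be formatted into the body of the Slack or email notification.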
It would be great if FME could monitor the log files of one database and, when a transaction occurs, relay that change to another type of database in real time.
Provide better query functionality within the FME Server logs, perhaps via Elasticsearch.
In my workspace I have an FMEServerJobSubmitter that submits several tables for processing, and an FMEServerLogFileRetriever that collects the log to be sent through an FMEServerEmailGenerator (the PythonCaller just condenses the log content).
My problem is that only the last log file gets sent, evidently because the log_attribute, into which the retriever copies the content, gets emptied at that stage.
How can I collect all the log messages so they are sent together when the workspace finishes its work?
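One way around the attribute being overwritten is to buffer every retrieved log inside a PythonCaller and emit a single combined feature only at the end. This is a minimal sketch of that idea, not a drop-in solution: `log_attribute`, `job_id`, and `all_logs` are assumed attribute names, and `pyoutput` is supplied by the PythonCaller at runtime.

```python
class LogCollector(object):
    """PythonCaller sketch: buffer every log it sees, emit one combined
    feature in close(). Attribute names are assumptions."""

    def __init__(self):
        self.logs = []
        self.last_feature = None

    def input(self, feature):
        # Hold features back instead of emitting them one by one.
        content = feature.getAttribute('log_attribute')
        if content:
            job_id = feature.getAttribute('job_id') or 'unknown job'
            self.logs.append('--- job %s ---\n%s' % (job_id, content))
        self.last_feature = feature

    def close(self):
        # Emit a single feature carrying the concatenated logs, so the
        # downstream FMEServerEmailGenerator fires only once.
        if self.last_feature is not None:
            self.last_feature.setAttribute('all_logs', '\n\n'.join(self.logs))
            self.pyoutput(self.last_feature)
```

The email transformer would then read `all_logs` instead of `log_attribute`.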
It would be nice to be able to filter on just the messages from my Loggers. I think this would be a great addition to FME 2018's Log window filter. It would also save space in my log files and cut searching time (I could turn off INFORM).
This idea has been created to track interest from the FME Community in displaying line numbers for the log on the Job Details section of the Jobs page (pictured below).
For example, the line numbers could appear when the 'Show Time Stamps' option is toggled on.
Please add any additional comments or suggestions!
First off, is there a location where I can download the dashboard pages? Our dashboards seem to have gone missing.
Secondly, I'm looking for a way to get what comes out of the JobHistoryReader, but including the FME_SERVER_REQUEST_PARAMETERS and FME_SERVER_REQUEST_HEADERS arguments with an FME job, so I can run stats and filter them by remote address or by an application ID that is passed through as a parameter. Has anyone done this or have any idea of how to do it?
Thanks in advance!
I was talking to an FME Server friend and they asked how others are dealing with log monitoring, management, and analysis for FME Server. They mentioned tools such as Splunk, SolarWinds, and Liberato.
What tools are people using? What processes are in place? What gotchas are there? Best practices? Tips and tricks? What are the main reasons for undertaking log monitoring, management, and analysis?
I am wondering if someone can share their experience of pulling certain FME Server logs into an Elasticsearch database and using Kibana to monitor and analyse the activity in the logs.
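For what it's worth, one common approach is to turn each pipe-delimited log line into a small JSON-like document and index those, then build the Kibana dashboards on top. A minimal sketch, assuming the timestamp is the first column and the severity the second-to-last (the column layout varies between FME versions):

```python
def log_line_to_doc(line, job_id):
    """Turn one pipe-delimited FME job log line into a dict ready for
    indexing. Column positions are assumptions about the log layout."""
    parts = [p.strip() for p in line.split("|")]
    if len(parts) < 3:
        return None  # not a regular log line (e.g. a banner or blank line)
    return {
        "job_id": job_id,
        "timestamp": parts[0],
        "severity": parts[-2],
        "message": parts[-1],
    }

# Each dict could then be sent to an index, e.g. with the official client:
#   from elasticsearch import Elasticsearch
#   Elasticsearch("http://localhost:9200").index(index="fme-logs", document=doc)
```

With `severity` and `job_id` as fields, Kibana can filter on warnings/errors per job out of the box.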
Log files can be really helpful when debugging FME Server. FME Server creates log files for every workspace it runs and for all of its components.
Log files can be found and viewed in two locations: the FME Server System Share location (specified at installation) or the FME Server web interface (Resources > Logs).
Log files are grouped in subdirectories relating to the different components of FME Server, in either a current or an old directory. Log files are auto-archived from the current to the old directory when either of the following occurs:
You can control the way log files are auto-archived, and other properties of log files, by editing the applicable messagelogger.properties file. For more information, see Message Logger Properties.
FME Server Logs are split into 4 folders:
Core:
This contains logs about the core functionality and configuration of FME Server. It also contains logs for the Publishers and Subscribers.
Engine:
Contains job logs,as well as logs for every engine.
Service:
Contains log files relating to FME Server services.
Tomcat:
Contains 5 log files generated by the FME Web Application Server (Tomcat).
Can I set a custom log file location?
Where are Python errors logged in FME Server?
Can I change the logging level for jobs on FME Server?
Where can I find the sub-workspace logs?
Why do my FME Server job logs have errors relating to a factory?
Why are my job logs being deleted daily?
Is it possible to email the job log or job statistics from FME Server?
Can I turn off the logging info on FME Server?
Do any log files show which Python DLL is being used?
Can I write the FME Server log files to a database?
How are child workspaces logged in FME Server 2017?
Can I download FME Server log files in FME Workbench?
Has anyone used Kibana to monitor FME Log files?
For more information about individual log files, please see here.
Are you still experiencing issues?
Please consider posting to the FME Community Q&A if you are still experiencing issues that are not addressed in this article. There are also different support channels available.
Have ideas on how to improve this?
You can add ideas or product suggestions to our Ideas Exchange.
It would be helpful to have a job history summary page that groups by job name and gives the average run time for that job and the number of times it ran. Having the capability to change the date range would be an added bonus (e.g. show me a summary of all jobs in the last 24 hours).
Currently I am achieving this by directly querying the FME Server Postgres fme_job_history table and sending the info via email. A cleaner approach would be the ability to do so in the FME Server interface, and even drill down to specific jobs to find failures and view logs.
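The grouping described above boils down to a single aggregate query. The sketch below uses an in-memory SQLite database as a stand-in for the Server's Postgres database, and the column names (workspace, seconds_elapsed, status) are illustrative guesses rather than the real fme_job_history schema:

```python
import sqlite3

# SQLite stands in for FME Server's Postgres database here; the columns
# are illustrative, not the real fme_job_history schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fme_job_history (workspace TEXT, seconds_elapsed REAL, status TEXT)"
)
conn.executemany(
    "INSERT INTO fme_job_history VALUES (?, ?, ?)",
    [
        ("load_parcels.fmw", 12.0, "SUCCESS"),
        ("load_parcels.fmw", 18.0, "SUCCESS"),
        ("update_roads.fmw", 5.0, "JOB_FAILURE"),
    ],
)

# Runs and average duration per job name: the summary the idea asks for.
summary = conn.execute(
    """SELECT workspace, COUNT(*) AS runs, AVG(seconds_elapsed) AS avg_seconds
       FROM fme_job_history
       GROUP BY workspace
       ORDER BY workspace"""
).fetchall()
```

Adding a `WHERE` clause on a start-time column would give the "last 24 hours" date-range filter.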
Good morning,
I have an FME Server job that runs a PythonCaller. Part of this PythonCaller is to run some SQL in a Postgres database. When I look in my FME job log I can see the step being called, but I don't see any logging from the SQL.
Is it possible to add the logging from the Postgres log into the FME log?
I am already familiar with https://knowledge.safe.com/questions/4717/fme-server-email-entire-log.html, but it isn't what I want. I want to know how to send the job stats at the bottom of the log file to a user, like this:
I would like to argue for more consistency in the log messages that readers generate.Specifically the message about which source file is being read.
For example the Shape reader says:Opened Shape File 'C:\FMEData2016\Data\ElevationModel\Contours\J11.shp' for input
The MITAB reader says:Opened native MapInfo file `C:\FMEData2016\Data\Parks\\Parks.tab'
The PostGIS reader says:Reading POSTGIS table: 'public.steden'...
The ACAD reader says: Successfully opened the'Release2013' AutoCAD file 'C:/FMEData2016/Data/Transportation/Roads.dwg'
Note that the wording is slightly different but at least these readers all report which file they're reading.The GML reader doesn't though...it shows quite a lot of information about how it's reading the file,except the filename.So when I was doing a bulk load of a large dataset,90+ GML files of 300+ Mb each into 15 PostGIS tables,which failed about halfway through I had no easy way of finding out which files had processed properly and thus could be skipped in the next load.
I ran into this issue when using the GML reader (FME 2016.1.2.1),but with 360+ readers there might be more that have this shortcoming.
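Until the messages are made consistent, the filenames can still be scraped from a log with one pattern per message style. This sketch covers only the four examples quoted above and would need extending for other readers:

```python
import re

# One pattern per "opened" message style; each captures the dataset path/name.
PATTERNS = [
    re.compile(r"Opened Shape File '([^']+)' for input"),
    re.compile(r"Opened native MapInfo file `([^']+)'"),
    re.compile(r"Reading POSTGIS table: '([^']+)'"),
    re.compile(r"AutoCAD file '([^']+)'"),
]

def opened_dataset(message):
    """Return the dataset named in a reader's 'opened' message, or None."""
    for pattern in PATTERNS:
        match = pattern.search(message)
        if match:
            return match.group(1)
    return None
```

Running each log line through `opened_dataset` and keeping the non-None results gives the list of files a failed bulk load actually reached.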
It is currently possible to download individual (or multiple selected) files from the Resources page of the FME Server Web UI. Please vote for this idea if you would like FME Server to support downloading entire folders.
One example would be to download the entire Logs folder.