Improve Alma Job Reports for Failed Records
It would be extremely helpful if more details were given when there are failed records, or records with exceptions, in the jobs we run in Alma. Many of the job reports only give the number of failed records. It would be great if the report provided MMS ID information so that we know which records failed.
We are investigating options for enhancing the job reports as part of a general enhancement to the jobs UX, which will allow us to gradually deploy report enhancements to the different jobs in Alma.
-
Jane Daniels commented
I don't have any votes left but wholeheartedly support this!
-
François Renaville commented
This would indeed save time and make things clearer for customers. Thanks for submitting this idea.
-
Daniela Nastasie commented
University of South Australia Library would be interested in accessing error reports for the jobs we run, at the time we run them.
We are aware that we can check import errors as described in "Resolving Import Issues", but for other types of jobs we cannot check error reports.
As an example, running a filtering job on a set provides us with a report that includes only brief information on the number of records that "failed", as in this message:
"Of the 67201 records processed, 4 records failed. For more information view the report details (or contact Support using the process ID)." I attached a file with full details of this report.
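Until the reports themselves improve, the counts buried in that summary message can at least be pulled out programmatically. Below is a minimal sketch in Python that parses the wording shown above; the exact phrasing is an assumption and may differ between Alma job types, and the function name is hypothetical:

```python
import re

def parse_job_summary(message):
    """Extract processed/failed counts from an Alma job summary message.

    Assumes the phrasing "Of the N records processed, M records failed"
    as seen in the report quoted above; returns None if it doesn't match.
    """
    match = re.search(r"Of the (\d+) records processed, (\d+) records failed", message)
    if match is None:
        return None
    processed, failed = (int(g) for g in match.groups())
    return {"processed": processed, "failed": failed, "succeeded": processed - failed}

summary = parse_job_summary(
    "Of the 67201 records processed, 4 records failed. "
    "For more information view the report details (or contact Support using the process ID)."
)
# summary -> {'processed': 67201, 'failed': 4, 'succeeded': 67197}
```

This only recovers the counts, of course; the actual identifiers of the failed records still have to come from the report itself, which is exactly what this idea asks for.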
We would very much appreciate having access to all error reports for the jobs we run, at the time we run them, without having to create an incident case asking Support for error reports each time we have problem records.
We believe that allowing us to check the problem records before creating an incident report would save time for everyone, as we could fix certain problems without creating incidents.
Thank you for your consideration,
Daniela Nastasie
-
Pat Kohl commented
I'd give this all 20 of my votes if it would let me. I just had a failed Authorities-Preferred Term Correction job over the weekend. The error says "Of the 223 records processed, 100 records failed." I downloaded the report and got MMS ID, Vocabulary code, Field number, Old value, and New value -- no indication of whether these are the changes made or the ones that failed. I figured I should be able to tell, since there would be either 100 lines (failures) or 123 lines (successes). Nope -- 138 lines. So now what do I do?
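The puzzle in the comment above can be stated as a quick consistency check (numbers taken directly from the comment): if the report listed only failures or only successes, its line count would match one of the two derived totals, but it matches neither, so the report is ambiguous.

```python
# Figures from the Authorities-Preferred Term Correction report above.
processed = 223
failed = 100
succeeded = processed - failed  # 123

report_lines = 138

# A failures-only report would have 100 lines; a successes-only report
# would have 123. 138 matches neither, so the reader cannot tell what
# the report is actually listing.
ambiguous = report_lines not in (failed, succeeded)
```

This is precisely the kind of ambiguity a per-record status column in the report would eliminate.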
-
JMC commented
Yes, please. The "bug fix" implemented last week does not truly fix this, just obfuscates the problem by simply not reporting back duplicate or blank records.
Background:
We filed a case (#00180926) to get the details about the failed items (per the message in the report to contact Support using the process ID for more information) on an "Add Members to Set" job. We were told it was a bug planned to be fixed by Q3 of 2016.
On 11/30/16 we received the following update from Ex Libris: "It seems that the issue was with the reporting, and these records did not actually fail. In the report of the Add Member to Set job, the count included duplicate and empty values. This was fixed, and now duplicate and empty values are skipped and won't appear, both in the report and in the count."
We did reply with a request that the duplicate and blank records be counted and reported as such, rather than ignored or reported as failed records. Ex Libris responded very quickly, but explained that as this would require a redesign of the job, it needs to go through the enhancement process and/or the idea exchange.
SO ...
I agree that more useful details would be fabulous.
For most jobs, we compare gross numbers as a first check to make sure everything ran as expected. If the new reports now discount duplicate and empty lines, the gross numbers may not match and much time will be spent trying to figure out why Alma thinks the file had "x" number of records, when the file clearly has "y".
Similarly, when importing electronic records and creating portfolios, it is vital to know whether the file contained duplicate records that went unnoticed. In files of sometimes tens of thousands of records, those things are easy to miss. Sometimes they are true duplicates, but sometimes they are different volumes, parts, or segments of a resource, each put by the vendor on a separate copy of the bib record with the volume/part/segment-specific URL in a single 856 (rather than a single bib with multiple 856s). If the duplicates didn't load, we need to be certain the portfolios attached properly and, if not, add portfolios for the additional URLs. Again, having the report spell out that there were duplicate records is more helpful than simply stating there were failed records, and far more helpful than silently removing the duplicates from the total count of records handled.