Improve Alma Job Reports for Failed Records
It would be extremely helpful if more detail were given when the jobs we run in Alma produce failed records or records with exceptions. Many job reports state only the number of failed records. It would be great if the report provided MMS ID information so that we know which records failed.
Salihin M A commented
For troubleshooting failed discovery import profile jobs, we need to know which harvested records caused the errors, not simply what the error was.
Veronica Wang commented
I would add that the list of failure reasons needs to be updated. For example, with the April release, Alma can now maintain separate bib records for electronic and physical inventory types, but this is not reflected correctly in the report.
I have given my votes here to combine it with my suggestion: it would be good to add the reason "Mismatch inventory type" for when two records of the same title but different inventory types are not matched.
Stacey van Groll commented
As mentioned by others, can Ex Libris please merge these two submissions into one, so that we can see a true picture of the desire for this functionality and don't split our votes?
Right now there are 85 votes on one and 78 on another, putting this underlying idea at 163 votes. From a quick skim of the Top ideas, this would put it in the Top 30.
Brenda Norton commented
This is also an issue for us. We run regular 'publish to OCLC' jobs from Alma.
Each job history returns a report of:
- new records added
- existing records deleted
- updated records
- not published records (record content did not change)
But this report does not identify WHICH records have been actioned, and this severely curtails our troubleshooting efforts.
Jane Daniels commented
I don't have any votes left, I'm afraid, but the ability to identify failed records is absolutely essential now that so many of us export records to other discovery layers to expose collections. In the UK, for example, many libraries export records to the National Bibliographic Knowledgebase. Setting up the feed to this service was simple (one of the best features of Alma!), but now I need to know which records have failed (not simply that they have) so that I can check for metadata errors and rectify them ready for the next export.
Pat Kohl commented
I'd give this all 20 of my votes if it would let me. I just had a failed Authorities - Preferred Term Correction job over the weekend. The error says, "Of the 223 records processed, 100 records failed." I downloaded the report and got MMS ID, vocabulary code, field number, old value, and new value, with no indication of whether these rows are the changes that were made or the ones that failed. I figured I should be able to tell, since there would be either 100 lines (failures) or 123 lines (successes). Nope: 138 lines. So now what do I do?
Yes, please. The "bug fix" implemented last week does not truly fix this; it just obfuscates the problem by not reporting back duplicate or blank records at all.
We filed a case (#00180926) to get the details about the failed items (per the message in the report to contact Support using the process ID for more information) on an "Add Members to Set" job. We were told it was a bug planned to be fixed by Q3 of 2016.
On 11/30/16 we received the following update from Ex Libris: "It seems that the issue was with the reporting, and these records did not actually fail. In the report of the Add Member to Set job, the count included duplicate and empty values. This was fixed, and now duplicate and empty values are skipped and won't appear, both in the report and in the count."
We did reply with a request that the duplicate and blank records be counted and reported as such, rather than ignored or reported as failed records. Ex Libris responded very quickly, but explained that as this would require a redesign of the job, it needs to go through the enhancement process and/or the idea exchange.
I agree that more useful details would be fabulous.
For most jobs, we compare gross numbers as a first check to make sure everything ran as expected. If the new reports now discount duplicate and empty lines, the gross numbers may not match, and much time will be spent trying to figure out why Alma thinks the file had "x" records when the file clearly has "y".
Similarly, when importing electronic records and creating portfolios, it is vital to know whether there were duplicate records in the file that had gone unnoticed; in files of sometimes tens of thousands of records, such things are easy to miss. Sometimes they are true duplicates, but sometimes they are different volumes, parts, or segments of a resource, each put by the vendor on a separate copy of the bib record with the volume/part/segment-specific URL in a single 856 (rather than a single bib with multiple 856s). If the duplicates didn't load, we need to be certain the portfolios attached properly and, if not, add portfolios for the additional URLs. Again, having the report spell out that there were duplicate records is more helpful than simply stating there were failed records, and far more helpful than just removing the duplicate records from the total count of records handled.
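Until the reports spell this out, one workaround is to pre-check the vendor file for duplicates before loading it. A minimal sketch, assuming the match-point identifiers (e.g. 035 or OCLC numbers, extracted by whatever MARC tooling you already use) are available as a plain list; the identifiers shown are made up for illustration:

```python
from collections import Counter

def find_duplicates(record_ids):
    """Return {identifier: count} for every identifier seen more than once."""
    counts = Counter(record_ids)
    return {rid: n for rid, n in counts.items() if n > 1}

# Hypothetical 035 values pulled from a vendor file
ids = ["ocm123", "ocm456", "ocm123", "ocm789", "ocm456", "ocm456"]
print(find_duplicates(ids))  # {'ocm123': 2, 'ocm456': 3}
```

Running this before the import gives you the list of repeated identifiers up front, so if the job report later under-counts, you already know which titles to check for missing portfolios.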