This is Part 4 of a Case File series that describes how real auditors tried to apply questionable methods to auditing and data profiling. See Part 1, Part 2, Part 3.
“Does the Process X team provide metrics around their process?” I asked.
“Yes,” the most senior auditor replied, showing me the web page where the Process X metrics were displayed.
After reviewing the page briefly, I said, “I see they report metrics by month. You have a year’s data; are you planning to understand how they prepare their metrics and re-calculate them to see if you get the same numbers?”
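Re-calculating a published metric independently is a standard way to test it. A minimal sketch in Python, using stdlib only — the ticket data, field names, and published figures below are illustrative assumptions, not the actual Process X schema:

```python
from collections import defaultdict
from datetime import date

# Hypothetical ticket data: (close_date, status).
tickets = [
    (date(2023, 1, 5), "Complete"),
    (date(2023, 1, 9), "Open"),
    (date(2023, 2, 2), "Complete"),
    (date(2023, 2, 20), "Complete"),
]

# Independently recalculate the monthly count of completed tickets.
recalc = defaultdict(int)
for close_date, status in tickets:
    if status == "Complete":
        recalc[close_date.strftime("%Y-%m")] += 1

# Hypothetical numbers copied from the team's metrics web page.
published = {"2023-01": 1, "2023-02": 2}

# Any month where the two sources disagree is a finding to investigate.
differences = {
    month: (published.get(month), recalc.get(month))
    for month in set(published) | set(recalc)
    if published.get(month) != recalc.get(month)
}
print(differences)
```

If `differences` comes back empty, the metrics tie out; if not, the next step is to understand the preparer’s method, not to assume either side is wrong.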
This is Part 3 of a Case File series that describes how real auditors tried to apply questionable methods to auditing and data profiling. See Part 1 and Part 2.
I looked at the third page of the handout and asked, “What is this?”
“A list of Active Directory (AD) groups and the user IDs in each group. I searched AD for any group containing the system name,” the junior auditor said, “and identified these 6 groups. I then downloaded all the members of these groups from AD into Excel.”
This is Part 2 of a Case File series that describes how real auditors tried to apply questionable methods to auditing and data profiling. See Part 1.
I picked one of the fields and said, “Please show me how you profiled the Status field, for example.”
The auditor proudly projected his Excel spreadsheet on the conference room screen. He said, “I filtered the Status field to display only records containing ‘Complete’, noted the number of filtered records in the lower left corner, and recorded the value and the number of records in the document.”
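Filtering for one value at a time only shows what you already expected to find. A frequency count of every distinct value in the field is more reliable, because unexpected variants surface instead of being silently excluded by the filter. A minimal sketch, with hypothetical Status values:

```python
from collections import Counter

# Hypothetical Status values exported from the system under audit.
statuses = ["Complete", "Complete", "complete", "In Progress", "COMPLETE ", ""]

# Counter tallies every distinct value in one pass, so case variants,
# trailing spaces, and blanks all show up in the profile.
profile = Counter(statuses)
for value, count in profile.most_common():
    print(repr(value), count)
```

Printing with `repr` matters here: it makes trailing spaces and empty strings visible, which a plain `print` would hide.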
Some auditors struggle with basic auditing. So when these auditors try to do data analysis, well, you can imagine how that goes.
I recently met with a team of auditors to give them input on what data profiling would be appropriate to perform and what analytics might be insightful.
This is Part 1 of a 4-part Case File series that describes how real auditors tried to apply questionable methods to auditing and data profiling. Do not try these methods at home or work. Don’t even dream about them, awake or asleep.
Whether you script your projects or use menu commands, you need to review your ACL log carefully.
Good analysts review their results and the log as they work in ACL and again after they think they are done, and they have others review the log before the ACL project is relied upon.
(You can’t imagine the dumb mistakes my team and I found that saved us a lot of embarrassment later.)
Before you analyze data, you must first validate it.
Otherwise, your analysis may not be accurate, and you may miss some important insights or errors.
This post is part of the Excel: Basic Data Analytic series.
Before analyzing your data, you need to check the following:
- Duplicate transactions do not exist.
- Required fields/key fields do not contain blanks, spaces, zeroes, unprintable characters, or other invalid data.
- Date fields contain real dates, and the range of dates is appropriate.
- Amount fields don’t contain inappropriate zero, positive, or negative amounts, and the range of values is appropriate.
- Each field is stored in the correct format. This prevents data from being converted on the fly into something else unexpectedly (e.g., user ID JUL15 becomes 15-Jul).
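The checks above can be sketched in a few lines of Python. This is a minimal illustration using stdlib only — the transaction layout, key-field pattern, and acceptable date/amount ranges are hypothetical assumptions you would replace with your own data rules:

```python
import re
from datetime import date

# Hypothetical transactions: (txn_id, txn_date, amount).
rows = [
    ("T001", date(2023, 3, 1), 250.00),
    ("T002", date(2023, 3, 2), 0.00),      # zero amount
    ("T001", date(2023, 3, 1), 250.00),    # duplicate transaction
    ("  ",   date(2030, 1, 1), -75.00),    # blank key, out-of-range date
]

issues = []

# 1. Duplicate transactions do not exist.
seen = set()
for row in rows:
    if row in seen:
        issues.append(("duplicate", row))
    seen.add(row)

# 2. Key fields contain no blanks/spaces/invalid data
#    (assumed pattern: uppercase letters and digits only).
for row in rows:
    key = row[0].strip()
    if not key or not re.fullmatch(r"[A-Z0-9]+", key):
        issues.append(("bad key", row))

# 3. Dates fall in an appropriate range (assumed: calendar year 2023).
lo, hi = date(2023, 1, 1), date(2023, 12, 31)
for row in rows:
    if not (lo <= row[1] <= hi):
        issues.append(("date out of range", row))

# 4. Amounts are appropriate (assumed rule: must be positive).
for row in rows:
    if row[2] <= 0:
        issues.append(("suspect amount", row))

for issue in issues:
    print(issue)
```

An empty `issues` list does not prove the data is clean, but a non-empty one is an immediate signal to stop and resolve the problems before any analysis.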
Recently, I ran a script to import a delimited file into ACL, but the last 10 fields were not imported. And I didn’t know it right away, because I received no error message.
In addition (or should I say, in subtraction), the log did not indicate anything was wrong.
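One defensive check against silently dropped fields is to count the fields in the source file yourself and compare that count to what was imported. A minimal sketch, assuming a comma-delimited file with a header row — the file contents and expected count below are illustrative:

```python
import csv
import io

# Simulated delimited source file (contents are hypothetical).
raw = "id,name,amount,status\n1,Acme,100.00,Complete\n2,Beta,50.00,Open\n"

reader = csv.reader(io.StringIO(raw))
header = next(reader)

# The field count you expect from the file layout / record spec.
expected_field_count = 4
assert len(header) == expected_field_count, (
    f"expected {expected_field_count} fields, header has {len(header)}"
)

# Confirm every data row has the same number of fields as the header,
# which catches truncated rows an import tool might drop without warning.
for line_no, row in enumerate(reader, start=2):
    assert len(row) == len(header), f"line {line_no} has {len(row)} fields"

print("field-count check passed")
```

Running this against the raw file before (or after) the import gives you an independent count to reconcile against the table ACL actually built, so a shortfall like ten missing fields can’t hide behind a clean log.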