Some of these posts are oldies, but they're still pulling in plenty of traffic. Check out the list and see if you missed any, especially if you're a new reader.
Too often, audits are performed on one process, one category, or one system: Earning Commissions, Windows Servers, or Wire Transfer. Each one of those is a separate silo (one for oats, one for corn, one for rice).
I remember seeing the notice that the software was being uninstalled and replaced by another package.
I could have removed the leftover components myself (I am an admin on the server), but I wanted to see if they would ever be removed. Did the Windows server team forget about this, or did the team not concern itself with such things? Maybe the procedures don’t include a process to ensure all components are removed.
I waited about 2 months, but the components were not removed.
And the Command Cancelled message (see the end of this post regarding that message).
Usually, it means I did something stupid, and I can figure out what, and fix it pretty fast.
Sometimes I have to scratch my head for quite some time before I figure it out.
I wrote a post in 2017 called Deleting ACL Table Covers A Multitude of Sins. This post expands on that one, but focuses mainly on the “Table Already Open” error.
These tricks are the kind that they don’t teach you in class or in tutorials (at least I’ve never learned any of them there; maybe I was in the bathroom during that session); I either figured them out on my own or had someone say, “Let me show you something.”
The Command Line
When I train someone in ACL, the command line is one of the first bonus items to which I draw their attention. The command line allows you to run individual ACL commands without using the ACL menu or scripts.
To open the command line: in the menu bar, click Window, Command Line. This will appear:
You can run most ACL commands from the command line, such as OPEN a table, ASSIGN a variable value, and lots more (the commands can be entered in lower/upper/camel case, but I use uppercase in this post to help them stand out).
My 2 most frequently used commands are listed below.
DISPLAY – list the fields in a table, along with their start position, length, and more.
To run this command, 1) open the table you want to run it against, and 2) enter the command shown in yellow in the command line and press Enter.
Note that the last line shows you a computed field and the formula behind it.
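For example, here's a minimal sketch (the table name is hypothetical, and the COMMENT lines are script-style annotations; in the command line you'd enter one command at a time):

```
COMMENT Open the table first, then list its field layout.
OPEN PcardTransactions
DISPLAY
```

DISPLAY then prints each field's name, start position, length, and type for the open table.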
DISPLAY VARIABLES – list all currently active variables, their type/format, and their values.
To run the following command, just enter it in the command line, and press Enter.
Note that user-defined variables (v_record and v_table) are shown, along with system variables (OUTPUTFOLDER and WRITE1). If you’re not familiar with ACL system variables, look them up in the ACL help file (it will be worth your time).
Note that 2 of the variables are character (C) type and 2 are numeric (N).
This command is extremely helpful when you are troubleshooting variables.
Bonus: Instead of DISPLAY, you can type DIS; instead of DISPLAY VARIABLES, you can type DIS VAR. Much shorter!
Bonus #2: You can also use the command line to set variable values. For example, if a NOTIFY command at the end of a script sends an email when v_Run_Notify = “Y”, you can enter v_Run_Notify = “N” in the command line and press Enter to change the value and prevent the NOTIFY command from running while you test changes to your script.
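A minimal sketch of that testing workflow, entered one line at a time (v_Run_Notify is the hypothetical variable from the example above):

```
COMMENT Override the flag so NOTIFY stays quiet during testing.
v_Run_Notify = "N"
COMMENT Confirm the new value took effect.
DIS VAR
```

DIS VAR should now show v_Run_Notify as a character (C) variable with the value “N”.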
Open a Table You Can’t Find
Sometimes I can’t find a table because I don’t remember (or know) which ACL folder it is hiding in (the folder in your project, not a Windows folder on your hard drive).
If you know the name of the table, you can just type OPEN <tablename> and press Enter (where <tablename> is the name of the table you want to open). When I don’t remember the table name or I’m too lazy to type it out, I copy the name from the ACL log or from a script that uses it, and paste it into the command line.
When the table opens, you can then see what folder the table was hiding in (the folder is not shown in screenshot below).
Clear the Command Line
When you use the command line a lot, you have to clear it before entering another command. Instead of backspacing and deleting the text, or highlighting and deleting the text, just click the X at the far right.
Likewise, instead of pressing Enter after entering a command, you can click the checkmark.
When you’re working on a big project that contains many different tables, sometimes it’s hard to remember how a particular table was created. Or maybe you haven’t opened the ACL project in a while, or you have to troubleshoot or review a project someone else created.
So what table(s) were used to create that table, and what filters/joins were used to create it? How many records did the original table contain?
I used to hunt through the ACL log or the scripts to find all that info, but for the most part, it’s all in the table history.
To access a table’s history, 1) open the table you’re interested in, and 2) from the menu bar, select Tools, Table History. You’ll see something like this:
The first line shows the original table (PcardTransactions) and the FILTER used. The second line shows the filtered data (all fields) was extracted to a new table (PCardUSA).
The third line shows the number of records in the original table (Input), and the fourth line shows the number of resulting records (Output) in the extracted table.
If a JOIN was used, the table history would list the primary and secondary tables as well as the JOIN command parameters used.
The other nice thing is that you can take a screenshot of the table history and use it for documentation or evidence.
Bonus: Instead of selecting Tools, Table History from the menu, you can type DIS HIS in the command line and press Enter. Same results!
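Putting the shortcuts together, here's the whole lookup from the command line (PCardUSA is the table from the example above):

```
COMMENT Open the table whose origins you want to trace.
OPEN PCardUSA
COMMENT Short form of DISPLAY HISTORY: show how the table was created.
DIS HIS
```

This is often faster than hunting through the log, especially in a project you didn't build yourself.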
If you have some ACL tricks up your sleeve, let me know.
- 2 changes that were submitted by developers on her behalf.
- 2 changes she didn’t know anything about, so she didn’t consider them her problem.
This post is a Quote of the Weak post. For more info on these types of posts, see the Quote of the Weak topic under About.
About a month ago, I received a letter saying that I could save a lot of money on my 15-year mortgage. It gave my current rate, the rate I could get if I refinanced, and the amount of the new payment.
I recently posted about 4 common AI fallacies or myths regarding artificial intelligence (AI). I wanted to dive a little deeper into some of these myths, and discuss why AI will NOT take over the world.
First of all, it is easy to fear what we don’t really understand, especially when some people push the narrative of computers becoming ‘aware’, which would result in them dominating the human race.
An article posted on MachineLearningTimes.com discusses 4 common fallacies or myths regarding artificial intelligence (AI). These misconceptions lead to many misunderstandings and fear regarding AI.
Wikipedia defines AI as “intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality.”
I like Investopedia’s definition better: “the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions.”
In the post, Melanie Mitchell, Davis Professor of Complexity at the Santa Fe Institute and author of Artificial Intelligence: A Guide For Thinking Humans, lists the 4 most common fallacies that I would summarize as follows:
- Narrow intelligence (being really good at one task) leads to general intelligence (being good at many things, the way humans are). In other words, computers will become super-smart and take over the world.
- Easy tasks are hard to automate/hard tasks are easy to automate.
- AI works like the human mind. This comes from using ‘human-y’ terms like learn, understand, read, and think, which leads some to believe AI can achieve humanness.
- Intelligence is all in the AI brain. In other words, “the right algorithms and data…can create AI that lives in servers and matches human intelligence.”
As an auditor, I am told all the time by the business that “we have a current project plan that is addressing that risk”, which implies that I shouldn’t waste everyone’s time writing up an audit issue regarding the problem.
It means that the risk isn’t as big as it appears.
The other day I was in a meeting to discuss a new analytics project and discovered the team had no end goal.
When the discussion started with the software to be used, I knew they were already off track.
But in data science, you can generate the experience you need yourself.
You might have seen one of my earlier posts, How to get an IT Audit job with little or no experience. Let me say from the beginning that getting an IT audit job with no experience is easier than getting a data science job with no experience. But according to an article from KDnuggets, it can be done. And like everything else, it takes hard work.
The article defines data science as “an interdisciplinary field that focuses on solving problems and gathering information.”
If you read my last post about auditor judgment, you know I’m struggling with some of the junior auditors that I’m working with.
But I’m also struggling with quite a few of the senior auditors that I work with, those that are my peers (which means they peer at what I’m doing and how I’m doing it and then continue on their merry paths).
I came to this opinion based on most of the auditors I’ve met over the years, across many companies, small and big, and across sectors, including public service. It’s also based on the many articles calling for the profession to do more critical thinking, and yes, it is needed.
But let’s start with plain old thinking (walk before run).
Can you imagine if companies didn’t have a computer help desk and each department had to figure out its own computer issues? If each department had to find, load, configure, and troubleshoot its own hardware and software?
But isn’t that how most companies operate when it comes to data and data projects?
I’ve written before how some periodic reviews provide management with little assurance, but management doesn’t realize how little.
My previous post focused mostly on server access. In this post, I want to look at normal user access.
For example, let’s assume your company has a policy that states that all IDs must be assigned within an Active Directory group. In other words, IDs are assigned to groups, and groups are assigned to assets; IDs should not be assigned directly to an asset.
Assume the control you are testing states that user access is reviewed annually.
A looooooong time ago, Leeann asked me to write a post about blogging about internal audit, so here goes. Most of this post applies to blogging on any subject, too.
First of all, there is a dearth of good internal audit blogs, and even fewer good IT audit blogs. So if you’re thinking about it, we sure could use you in the blogosphere!
Writing a blog is hard work, and you often get tired of it. Life finds a way to get in the way. This is my 11th year of the blog (see the first post here), which, ironically, was written by skyyler. Fortunately, we’ve gotten better since that first year.
Blogging about internal audit is like a moon shining in a dark place… here’s my 10 tips…
It was a great experience for me.
Well, sort of. No one likes being audited (ahem). But it gave me a fresh perspective of how others feel when I audit them.
This is the first of 3 posts: this one contains some background on the project that was audited, the second discusses the audit and the results, and in the third I describe my perspective on the whole thing, plus some takeaways.
Have you ever wondered why I selected the picture above to represent my blog?
This picture illustrates so many aspects and nuances of this blog’s theme.
Here’s your chance to put on your thinking cap, and based on what skyyler and I have written about over the years, tell me what YOU think it represents.
As the comments roll in, we’ll comment on them.
Then, after a few weeks, I’ll peel back my brain and give you a peek inside as to what my reasons were.
Not sure how many of you will take me up on the challenge, but here goes…
And I’m not talking about the blog posts (those are good too).
Whether you’re a new reader or you’ve been around since the beginning (2009!), when you find a post you like, don’t forget to do the following after you read it:
- Look in the upper right corner of the website for my Quick Links. This will take you to multiple posts on these subjects.
- Use the Search Box to search for key words.
- When you read a post, check out the Comments. We respond to a lot of questions and provide information that isn’t in the blog posts.
- Leave a question of your own in Comments. We will respond.
“Does the Process X team provide metrics around their process?” I asked.
“Yes,” the most senior auditor replied, showing me the web page where the Process X metrics were displayed.
After reviewing the page briefly, I said, “I see they do metrics by month. You have a year’s data; are you planning to understand how they prepare their metrics and re-calculate them to see if you get the same numbers?”
I looked at the third page of the handout and asked, “What is this?”
“A list of Active Directory (AD) groups and the user IDs in each group. I searched AD for any group containing the system name,” the junior auditor said, “and identified these 6 groups. I then downloaded all the members of these groups from AD into Excel.”
I recently met with a team of auditors to give them input on what data profiling would be appropriate to perform. And what analytics might be insightful.
This is Part 1 of a 4-part Case File series that describes how real auditors tried to apply questionable methods to auditing and data profiling. Do not try these methods at home or work. Don’t even dream about them, awake or asleep.
The best things about xLookup: 1) it fixes some of the limitations of vLookup, 2) it is easy to understand and use, and 3) it replaces hLookup also.
Also, vLookup and hLookup are not going away, so if any of your colleagues struggle to learn new things, they can continue to use them as is.
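For anyone who hasn't tried it yet, here's a minimal sketch of xLookup (the cell ranges and lookup value are hypothetical):

```
=XLOOKUP("A123", A2:A100, C2:C100, "Not found")
```

This looks up "A123" in A2:A100 and returns the value from the same row of C2:C100. The optional fourth argument supplies a not-found result, one of the conveniences vLookup lacks (you'd normally wrap vLookup in IFERROR to get the same effect). And because the lookup and return ranges are separate, it can return values to the left of the lookup column and work across rows, which is how it replaces hLookup too.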
When auditors need to identify and understand IT controls, they search the company intranet, review policies, look for Github repositories, review inventories, schedule meetings, and analyze IT asset data.
I stumbled on a better way to get insight into the IT controls in my company, and I didn’t have to email anyone, do any research, or, frankly, do much of anything. The IT controls came after me.
Fortunately, the IT controls were blind to the fact that I am an IT auditor. To them, I was just an ordinary bloke. But that didn’t last long (more on that later).
It Began a Few Years Back
It all started a couple years ago when I was building the infrastructure required to support our data analytic efforts in internal audit.
This post assumes, of course, that you’ve already accomplished some of the hardest tasks: figuring out what data you need, where to get it, and actually getting the data. Good luck with that. :)
This post is part of the Excel: Basic Data Analytic series.
Otherwise, your analysis may be too broad or too narrow, or you may miss some important insights or errors.
Data profiling is developing a profile of your data, just as facial profiles of a person, taken from various angles, help you size up the person’s nose, see whether the chin is sagging, and judge how far apart the eyes are.
It’s official: ACL is changing its name AND its spots.
I’ve claimed several times that ACL has left its first love (analytics) and doesn’t put enough work into their flagship product, ACL Analytics.
Correction: their FORMER flagship product.
At least they are finally admitting it publicly: they are NO LONGER an ANALYTICS company!
That is, according to an ACL user who attended the 2018 ACL Connections conference.
Let me share a recent experience with you….
A young IT auditor texted me at work and asked for some Active Directory user account data that I capture automatically every week, using some scheduled ACL scripts.
Test how much you know about automation technologies by taking the job automation quiz at Financial Management magazine.
Contrary to what ACL has been touting as their new ‘robotics’ feature, it is NOT robotics process automation (RPA).
[The ‘robotics’ feature is due out later in 2018. It appears to be ACL’s latest attempt to get you to use their GRC software.]
ACL, via John Verver, defines the term this way in his RPA article: “The idea is a relatively simple one: get computers to perform tasks normally performed by humans, and cut resource and time requirements for many repetitive activities.”
When you rename an ACL table, be careful to also rename the associated .fil file.
Otherwise, you (or your ACL script) might get confused. You might delete the wrong table or .fil file, and create a head-scratching problem.
I know because I confused myself.
Recently, a large U.S. bank was found to have created unauthorized accounts; a similar bank closed one of my accounts, but doesn’t know why it happened.
More than a decade ago, I opened a safety deposit box at a local bank (a very large U.S. bank that all U.S. residents would have heard of). This wasn’t my regular bank, as my bank didn’t have such boxes; I only went to this other bank when I needed to access my safety deposit box, which was not often.
If you’re not familiar with agile methods, check out the first 5 topics listed here (just click Next at the bottom of each page; the topics are quick, to the point, and full of pictures).
Briefly, agile projects are performed in cycles, or iterations, rather than in a long, linear-waterfall fashion, which is: do all planning, then field work, then reporting. Each iteration of the project creates some value and includes feedback, which is used in the next iteration to increase the value of the project.
It started with his reading my Excel: Basic Data Analytics post, where I list a number of procedures that anyone can do in Excel.
Kyle said he was expecting some “super sophisticated process & methodology that works like magic.”
In the previous post, Create a Team for Audit Analytics? Part 2, I explored the pros and cons of expecting all auditors to develop a level of data and analytic proficiency.
These auditors would continue to do audit testing that involves analytics as well as testing that does not involve analytics. In addition to keeping up their business skills, they would be learning and upgrading their data analytic skills.
In the first post of this series, I reviewed some of the pluses and minuses of creating a dedicated analytics team.
However, a third option exists, which is sort of a hybrid between having dedicated analytic auditors doing all the analytic work and requiring everyone to increase and develop their data and analytic skills.
Let’s explore the hybrid method in this post, and wrap up the series with a few final thoughts.
This is the third post of a 3-part series…
In the previous post, Create a Team for Audit Analytics? Part 1, I explored the pros and cons of developing an analytics team.
This team consists of analytic auditors who are dedicated to analytic projects; they would NOT typically manage audits or testing that did not include analytics.
In this post, let’s explore another option for managing and growing analytics in an audit department — expecting all auditors to develop a level of data and analytic proficiency.
This is the second post of a 3-part series…
Or should we expect all auditors to develop some levels of analytics proficiency?
Of course, this question often comes a bit further down the trail on the analytics journey, but I think the sooner it is decided, the better.
This is the first post of a 3-part series…
If you’ve ever wondered what Audit Command Language (ACL) is, here’s a quick way to find out.
ACL has provided a quick, one-page introduction to ACL. And I mean quick.
It doesn’t explain a lot, but it gives you a quick peek at the basic user interface.
You could call it the ACL Overview for Dummies.
At least on one major point, anyway. And it’s a big one.
As the tombstone reads, this point is D.O.A. (dead on arrival, or more specifically, dead on analytics).
The article, Building a data analytics program, requires IIA membership to view, and is located at https://iaonline.theiia.org/2017/Pages/Building-a-Data-Analytics-Program.aspx (that’s actually good, as it means a lot fewer people will ever read it).
A debate on this blog over analytics and the future of internal audit is heating up.
A few readers, including our colleague across the sea, AuditMonkey, have dived in, and skyyler and I have responded in kind.
Well, not exactly. AuditMonkey has been more kind, to his credit. But I digress.
Now I understand the purpose of SharePoint and company intranets is to share data, but even then, some data should be restricted to a limited number of people.
So I decided to check (before doing things like this, you better know How to Stay Out of Jail).
If YOUR audit department doesn’t embrace data, analytics, and automation eventually, your audit department will NOT exist.
No data, no analytics. No analytics, no automation. Eventually, no audit department.
Editor’s Note: This post really applies to all departments in a company, but I’m mainly addressing auditors; you might want to read between the business lines….
By embrace, I don’t mean have one or two auditors working on this. I mean the entire department.
Before you cite all the regulatory requirements mandating that companies have an audit department, consider that an audit department in name only won’t cut it.
Having an inept audit department will not be acceptable to regulators, and it shouldn’t be acceptable to company management either. Or Audit Committees!
Companies need skilled and efficient auditors that can do the heavy lifting, and this need will only increase.