Another Nail in ACL’s Coffin

Diligent’s acquisition of Galvanize (ACL) is another nail in the ACL analytics coffin.

First, ACL acquired another company and created Galvanize. And we were told governance, risk, and compliance (GRC) would never be the same.

And I told you that ACL analytics would never be the same. In fact, I predicted that this acquisition meant that ACL analytics was dying (when I say ACL analytics, I’m referring to the Windows desktop version that they built the original company on).

For more on this, see ACL Officially Changes Name & Spots and Is ACL Analytics Dying?

Well, the latest merger is just another nail in the coffin that was carved out (pardon the audit pun) a while back. Again, the press releases emphasized GRC, GRC, GRC.

Was really anything mentioned about ACL analytics? Not really. And when was the last time you got an email promoting a seminar or a video about ACL analytics?

It’s all GRC and (fake) robotics. For my analysis of robotics, see ACL Robotics is NOT Robotics.

My other concern about robotics is whether you can keep your data on premise. ACL techs have told me you can, but the diagrams they showed me weren’t convincing. And if I wasn’t convinced, how do I convince my audit peers and the security team? If any of you have robotics and keep your data on premise, and you have TESTED that it doesn’t get uploaded to ACL, please post in the comments. I’m also open to hearing from ACL staff on this. That’s my challenge, so please convince me!

The other thing that is killing ACL analytics is the inability to deal with huge data files. I don’t know if the so-called robotics configuration adds more processing power. If you have run huge data files through robotics successfully, please post in the comments. By huge I mean AT LEAST 25 million records containing at least 25 columns.

Anyone who has tried to process huge data files in ACL analytics desktop knows that the software just chokes.

The larger the company becomes, the more likely management will start looking for cost cutting. Standalone ACL on the desktop will be on the chopping block. Why maintain a standalone version apart from ACL robotics? Especially if ACL sales and renewals are decreasing?

Like I’ve said before, if you haven’t started looking for other tools to replace ACL yet, you better start looking. The more you have running on ACL, the sooner you need to get started.

As always, I’m interested in what others think.

Are you still using ACL on the desktop? Do you have any automated ACL scripts running? Or is your group still using ACL predominantly in menu-driven mode without any scripting?

Do you think ACL desktop is dying? Why or why not?

 

 


Filed under ACL, Audit, Data Analytics, Scripting (ACL), Technology, Written by Skyyler

25 responses to “Another Nail in ACL’s Coffin”

  1. Grant Brodie here. Full disclosure: I am a vendor, representing Arbutus Software (www.ArbutusAnalytics.com), but I was also one of the co-founders of ACL, and in that regard responsible for most of the cool desktop technology you love.

    I disagree that ACL Analytics is dead. Certainly the ACL company is going in a different direction, but ACL Analytics can be considered an ecosystem, not just the company that bears the name. Arbutus supports that ecosystem. Arbutus supports exactly the same desktop architecture, your existing projects will auto-convert (literally just open them with Arbutus Analyzer), the UI is almost the same, and at least 95% of scripts will run out of the box. Not only that, but we have a wealth of additional features, and we’re faster.

    The conversion is just a click, so I ask you not to give up on the ACL ecosystem before checking out the alternatives within the same space, as switching to another platform will inevitably be a real pain.

    Grant


    • Good to see you back, Grant. Always appreciate your input. Appreciate your disclosure, which is pretty classy in this day and age.

      skyyler was going to check out your product, and I’ve been waiting for him to do it, and see what he thinks.

      I know he has concerns that Arbutus can’t handle the huge data files he processes, but again, he hasn’t tried your software yet. I’ll give him another nudge, as I’d really like to know also.

      So for the record, Arbutus can handle files with 25 million records and 25 columns? What’s the upper limit? I’ve talked with one of your reps and was told there’s no limit, but I find that hard to believe, especially for a desktop product.

      If your architecture is anything like ACL’s (text based), I find that hard to believe. But then I haven’t tested it either…

      But you bring up a good point, which is converting to Arbutus would be much easier than a total rewrite in power bi, nifi, SASeg, and a host of other products, some of which would NOT be able to do all that ACL does.

      The one concern I have with Arbutus is one of the big problems skyyler and I have had with ACL: it’s not an enterprise tool like power bi or SAS which my company already uses. Management does not like one-offs in either of the groups we belong to. That not only means that I can’t get support on it internally, but more importantly, I can’t share any scripts or logic built in the tool with others in the enterprise (so they can plug them into their own tools).

      One other question: How can you automate and schedule Arbutus scripts? Can you create a .bat file like you can with ACL? Any other methods available?

      Once we do a test run of Arbutus, we’ll do some writeups. I don’t see any in-depth reviews of your product. Can you point me to any? The ones I’ve found are rather limited.

      Finally, I find your disagreement that ACL is dying interesting. While you may not want to diss a competitor (again, classy), you stand to profit if ACL dies and companies want a product where you don’t have to rewrite everything or convert to their pseudo-robotics. Makes me wonder if you know something I don’t know…. :)

      We’ll see what skyyler thinks…


      • Regarding being back, I -am- following your posts here, I guess I didn’t want to be a pain, given my obvious self-interest. That said, the recent events at Galvanize are causing concerns for many, and I felt compelled to ensure that everyone at least knew there’s a plug-compatible alternative that makes for a painless switch, before they go off and make a rash and expensive decision about switching.

        As to your questions, you may or may not know this, but when I say I was a co-founder of ACL, I was the technical half. My partner was the marketing half. While there (and subsequently at Arbutus), I was all about performance, and all about analytics, so it shouldn’t be a surprise that performance is king.

        For the record, 25 Million records with 25 columns would be absolutely no problem. In fact, 10 times that would be no problem. There really are no limits on records or columns. As long as the file fits on disk we will process it, typically at a million or more records/second. Beyond 2 billion records there are a few visual glitches, but it still processes and produces the correct result with any size file.

        I can’t speak for ACL’s current architecture, but we are not typically text-based. We do read text-based data, but then we read pretty much any other data as well.

        As for the compatibility issue you bring up (can’t share with others), there are a couple of dimensions to that discussion. While Galvanize may have been primarily a -desktop- tool (and more recently a cloud tool), Arbutus is architected as an -enterprise- tool. We offer a server-centric solution (that supports shared analytics, scheduling, results management, browser access, universal data access and more), but also continue to support the desktop model. You are using the term ‘enterprise’ in the sense that every department uses it. Unfortunately, it’s very hard for any company to occupy that niche (Galvanize didn’t). Only the very largest companies get there. But that doesn’t mean our architecture isn’t ready for that, when the time comes.

        As for sharing scripts, Arbutus does share a scripting language with ACL. This means the scripts themselves wouldn’t be easily portable to other tools. The only viable candidate for such script sharing would be SQL, and I think that language is inadequate for the kinds of analytics we do (without getting super arcane). Instead of making the -scripts- portable, we make the -data- portable, through a tool called “ConnectPlus”. This is an ODBC driver that allows any organizational user to share the “results” of any of your work with any other Windows user. Every table in your project is exposed, including those on other platforms like the mainframe. We also offer built-in technology that will allow these ODBC users to launch queries (without actually seeing the script). Not the same, I get it, but short of crippling the language, I can’t see an alternative.

        Another (somewhat related) topic is that with our next major release we will be offering visual workflow development (kind of like Alteryx and other similar tools). You will be able to create workflows in a flowchart-like style, using drag and drop. This hides the actual scripting language, and relies instead on a very familiar visual interface. This still won’t be portable, but then I don’t think any of the offerings in this space are.

        As for scripts in general, we still rely heavily on the model that ACL desktop uses. Our script extensions are .pro (for procedure) instead of .bat, but otherwise are pretty much the same. They are so similar that, as I mentioned in my last post, any existing scripts are auto-converted when you open the ACL project.

        When it comes specifically to automating scripts, we differ significantly from ACL. Rather than the AX model requiring you to upload a script into a different product and convert it into an analytic there, our scheduling architecture uses the same script, and it is directly accessible from the Analyzer desktop (just right-click/schedule from the Overview). You do require an Arbutus Hub server to be available, but this just ensures there is a machine up at 2AM (or whenever) to run the job.

        As for write-ups, there used to be a number of comparisons available on-line, but they were just about all pulled, I believe due to pressure from ACL, but that’s just a guess. We did a comparison ourselves a few years back (https://www.arbutussoftware.com/performance-comparison-arbutus-vs-competition), but as it’s produced by Arbutus, it is less than independent (though every effort was made to be so). With all the goings on at Galvanize right now we are getting a lot more interaction with ACL influencers (most of the names you would know), so I expect more such reports in the near future. By far the best approach is to try it out for yourself.

        As for diss-ing ACL, I don’t know anything you don’t (really). I just find that it’s not good business to slag a competitor. ACL is still a great product, compared to many alternatives, although their current emphasis on GRC -is- causing some concern. As I said, I believe the environment we find ourselves in to have evolved into an ecosystem, rather than just a product from a single company. Kind of like SQL evolved from a product only offered by IBM (as DB2) to what we see today. Not quite as many suppliers, but certainly two. Regardless of what the future holds for Galvanize, we intend to continue supporting the ecosystem, and I want the community to know this.

        I look forward to any reviews you care to share, and am available at any time should you have further questions.

        Grant


        • I am not in an audit role at the moment, but I always wanted to try Arbutus; it was easily the top of my list of platforms to check out if/when the department I was in moved away from ACL. I am excited to see what you all can do in that space.


  2. Louis Entsi

    Grant Brodie, the link you shared isn’t available (‘SORRY! PAGE NOT FOUND’). Any reason for that?


    • Louis, I’m sorry but we are in the process of updating our website and this page had been de-activated in the process. With ACL’s well-documented move away from analytics, we tend to focus on them less these days, and so some of the “vs. ACL” content was removed. That said, information like this is clearly important to existing ACL users, so I have had the page re-activated. The link now works again.

      Grant


      • Devin Self

        Grant, the link to view the commands isn’t working on that page. I’m intrigued by just how much faster Arbutus is compared to “the competition” and would like to see what commands you issued.


        • Grant

          The extra page link showing the actual commands has now been restored.

          Sorry for any inconvenience.

          Grant


  3. I just want to thank you guys for your discussion on this subject.
    It echoes both my worries and hopes regarding the future of ACL.
    Very informative, thanks!


    • Xavier,
      I’d be interested to hear your opinion. Are you using ACL? What else are you looking at?


      • Xavier Theoret

        Sorry for the late reply.

        I have worked with both Excel and ACL since 2010 at two different companies. My typical process is to get the data in ACL, do the bulk of the work in it, and then export/present it in Excel for auditors to work with. I was skeptical of ACL the first year, but then it quickly became a Swiss Army knife, do-it-all tool. For example, it helped me work on a table that had one billion rows… or some with EBCDIC data. Contrary to others, I’m very comfortable with ACL “relate”. I love the fact it lets me do a JOIN on the fly without having to JOIN anything until it’s needed.

        But… name change, aging software… Because of that I’m having internal discussions for next year’s renewal. There are fears on the management side that ACL is slowly fading away, and I cannot blame them. We are not yet mature, so there is a will for establishing a more ‘long term’ solution. And the current trend is that audit needs to get closer to the IT teams instead of going solo (and it makes business sense).

        I already found the solution for small tables: I’m having good results with ‘Power Query’ and I love its integration with Excel. I recently had to import 20 different small CSV files and asked myself, ACL or Power Query? I chose the latter. It was twice as fast in Power Query. The main advantage of Power Query is that it doesn’t bother defining a ‘size’ for each field. It’s a date, a text, a number (and their variations), but that’s it. And once you have done it, the recipe is done for you. In ACL, if you import a CSV the wizard will determine the field length by looking at the longest string, but there is no guarantee that the next CSV won’t need an even longer size for some field. I once created an Excel sheet for the purpose of generating the ACL import script tailored to the sizes found in a metadata table like ‘All_Columns’ (roughly the idea sketched below).
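        In Python terms, that width-scanning idea looks something like this (just a sketch; the folder name and the +5 padding are made up, and the widths would still have to be pasted into whatever import definition your tool expects):

        import csv
        import glob
        from collections import defaultdict

        # Track the longest value seen for each column across all the CSV files.
        max_widths = defaultdict(int)
        for path in glob.glob("extracts/*.csv"):   # hypothetical folder of extracts
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    for col, value in row.items():
                        max_widths[col] = max(max_widths[col], len(value or ""))

        # Report each width plus a margin; these become the field sizes in the import script.
        for col, width in sorted(max_widths.items()):
            print(f"{col}: {width + 5}")   # +5 is an arbitrary safety margin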

        But what if I need to join multi-million-row tables together? Power Query will likely crash because it is designed to move everything into RAM, while ACL (and likely Arbutus) are designed to keep most of it on disk.

        SQL is great, but I’m not sure I have the DBA expertise to correctly set up the system while ensuring the data is safe (not as easy as protecting a few .FIL files). And on more than one occasion the IT teams have sent me incomplete tables because of an incorrect SQL statement (often linked to the default ‘inner join’). Will I make the same mistakes? (The example below shows the kind of silent row loss I mean.)
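        A small, hedged illustration of that default-join problem, using pandas with made-up vendor data (the same thing happens in SQL when the JOIN type is left at the default INNER):

        import pandas as pd

        invoices = pd.DataFrame({"vendor_id": [1, 2, 3],
                                 "amount": [100.0, 250.0, 75.0]})
        vendors = pd.DataFrame({"vendor_id": [1, 2],
                                "name": ["Acme", "Globex"]})   # vendor 3 is missing

        # The default 'inner' join silently drops the invoice for vendor 3.
        inner = invoices.merge(vendors, on="vendor_id", how="inner")

        # A left join keeps every invoice and exposes the missing vendor as NaN.
        left = invoices.merge(vendors, on="vendor_id", how="left")

        print(len(invoices), len(inner), len(left))   # prints: 3 2 3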

        There are a few contenders out there. Alteryx looked interesting (but a tad expensive), and there are ACL’s old rivals IDEA and Arbutus. There are a few cloud solutions as well, but I’m still not comfortable going that way yet ;-)

        Maybe Grant has a point: going Arbutus would leverage our previous script knowledge and still have the graphical flow that makes other software faster to code. On the other hand, I’m fluent in VBA, so IDEA’s solution would likely be interesting as well. It’s not a clear choice by far… and ACL is not even dead yet. ;-)


        • Xavier,
          I’m really finding a billion rows in ACL unbelievable. Are you exaggerating? ACL slows way down on around 5-7 million records (35 columns wide), although I have loaded a 120 million record, 10 column table and analyzed it. It was slow, but it did it (I scripted it).

          I’d like to hear more about your billion record file….

          Yes, Power Query and even Power BI will choke eventually. That’s when I turn to SQL. I had someone else create the database, but I create and manage all the tables. I’m getting better at writing SQL queries, but usually work with an expert when I need a complicated query written. I don’t find the basics of managing tables and writing queries too difficult, so I’d suggest you consider SQL. I’d suggest the free site, https://www.w3schools.com/sql

          I did do a cartesian join once and brought down our production server, so you have to be careful. When I’m not sure, I just ask someone to review my query.

          Personally, I don’t like the graphical systems like Alteryx. I don’t find the graphics that helpful–more like a table of contents in a book. I don’t like clicking an icon (like load data) and then having to click several times to see each option and setting. I prefer command line where I can see all the settings in one sweep.

          We are maintaining our ACL scripted processes (10+), but are moving new analyses and automation to Power BI or Python, and will probably use ACL for the adhoc stuff.

          I looked into Arbutus, but it sounds too good to be true. One of these days I should really get a copy and port one of my big ACL processes to it and see how it handles the 5, 7, 10 million record analyses. But I probably never will because my mgmt already wants to move on to tools used by others in the company. The last thing they want is another one-off tool that only audit uses. It’s hard to disagree with that, as using those tools increases our understanding of the business systems we audit.

          No, ACL is not dead yet, but it is bleeding profusely. See my posts about why I think so. Search ‘ACL dying’.

          Thanks for your comments!


        • Grant Brodie

          I sat on the sidelines with Xavier’s post, due to my obvious biases, but your most recent post pressed so many buttons I just can’t sit on the sidelines. I’ll address your comments in the order you wrote them:

          – A billion records shouldn’t be unbelievable (for ACL or Arbutus). It might take longer with ACL, but Arbutus is up to the task, and ACL was too (the last time I was involved with it). If you try Arbutus and it chokes at 5-7 million records then our support would be happy to suggest the issue and offer a solution.

          – If a tool requires you to work with an expert, then I’d suggest it’s possibly not the right tool. This is especially true if a query will bring down a production server.

          – I get your preference for command line over a graphical system like Alteryx, but you have to look at the bigger picture. Sure, the command line gives you the power you need to do your analytics, but what happens to ongoing processes when you move on (or give your analytic to an associate)? The likelihood that your successor/associate will be as good an ACL expert as you is getting slimmer every day. If you aren’t there then it will likely be all but useless.

          Low-code/no-code solutions offer two big advantages: they don’t require intimate knowledge of a language to work with, and they offer a way to visually see the entire process at a glance, enhancing comprehension. That is why they are so compelling, and becoming the way of the future for analytics.

          Even Arbutus is about to release a Workflow tool (a’la Alteryx) to engage less technical users. This combines all the benefits of no-code analytics, while retaining continuity with the past. When using Arbutus Workflows you edit using the same standard command dialogs, and you always have the ability to use the product interactively or write scripts as you wish.

          – I do wish you luck with Power BI and Python, but in my experience they don’t offer the number-crunching capabilities of even ACL, let alone Arbutus. Also, to use Python well, you’ll end up being even more of a programmer than you are right now. Not something you should aspire to, in my opinion.

          – It’s hard to ‘look into’ Arbutus without actually trying it. It may seem too good to be true, but (in my biased opinion) that’s only for lack of trying (it). I’d suggest making ‘one of these days’ tomorrow. You can try it for free. It will make a copy of any ACL projects you point it at (automatically transferring just about everything). And if you like it, the annual subscription is a fraction of what ACL users are typically asked to pay these days. Even if your organization switches to another tool, there is almost no down-side. It’s not -another tool-, just a -different tool-, that can be just as easily tossed if the organization goes in a different direction.

          On the issue of standardizing on tools used by others in the organization, I can make a couple of points:
          – typically these are graphical tools like Alteryx, that you have already expressed an opinion on.
          – it is very shallow thinking that one tool fits every job, despite the superficially compelling argument about training and familiarity. If that were true, all analysts would use Excel. No, tools are built to do specific jobs better. Just like in carpentry, there are a variety of tools to get the job done. Why not just a hammer and a saw? Because a wider variety of tools does a better job. In the case of audit analytics, ACL (and now Arbutus) are specifically designed to address the shortcomings of existing tools for auditors. Like the ability to access and work with a wide range of disparate data sources and types, or the ability to process millions of records quickly (we really should talk off-line about your speed issues). If there is a better tool for -your- job then the organization is best served if you use it. The alternative is to compromise the quality of the work you do. Tools are cheap, but people are (relatively) expensive. Why buy a calculator, when we were taught arithmetic in grade school? Because it’s better and cheaper than the alternative (by hand). To extend the argument, why not standardize on one type of calculator across the organization? Because engineers need different functionality than programmers, or accountants. The same is true for auditors and their tools.

          I hope these comments give you something to ponder.

          Grant Brodie


  4. Pingback: Master List of ACL Articles and Tips | ITauditSecurity

  5. Hi,
    > I’m really finding a billion rows in ACL unbelievable. Are you exaggerating?

    It really worked, although at that point the treatment of the files was basic. The transaction file was initially 850 million lines. Then it slowly grew every quarter to a billion after something like 3-4 years. It was then a 270 GB file, split into large chunks of about 90 GB each. ACL would combine these files, then apply a filter and keep something like 200 million to process (I cannot quite remember, but I’m pretty sure it was over 100M). ACL would just chug along; it took about 2-3 hours per big operation and the whole script finished in about 14 hours. I did not write the script initially, but I’ve maintained it. I had a backup plan if ACL choked at the billion-line mark: I would have applied the filter to each chunk and THEN merged the results (roughly the approach sketched below)… but it never failed, so I’ve left things as is. :-)
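    For what it’s worth, that filter-each-chunk-then-merge fallback would look something like this in Python (a sketch only; the file names, chunk size, and filter are made up, and the real process was pure ACL):

    import pandas as pd

    chunk_paths = ["trans_part1.csv", "trans_part2.csv", "trans_part3.csv"]   # hypothetical chunks
    kept = []

    for path in chunk_paths:
        # Stream each chunk in pieces so nothing huge ever sits in RAM at once.
        for piece in pd.read_csv(path, chunksize=1_000_000):
            kept.append(piece[piece["amount"] > 10_000])   # made-up filter

    # Merge only the filtered survivors, then continue the analysis on the result.
    filtered = pd.concat(kept, ignore_index=True)
    filtered.to_csv("filtered_transactions.csv", index=False)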

    Grant: Just want to say I stole your comment from another post for my manager. I told my management that even though ACL may fail at some point that we can always fall back on Arbutus (maybe sooner than later?). ;-)

    I’m not against SQL per se, or Python. But at the same time I’m looking for a solution that:
    A) Like Grant mentioned, is tailored to audit (which includes automatic logging capability and preserves data integrity)
    B) Is quick to program, even if it comes at the expense of a bit of performance.

    Languages come in different forms, low level to higher level, and performance is usually the tradeoff. I came to the conclusion that I would work faster if I keep to a higher-level language. It would be slower on a large data set, but if you run the batch at night it doesn’t make much of a difference if it runs in 8 hours instead of 6 hours. Why then program 4 lines when only one will do in a higher-level language? :-)

    Sorry for the long answer! :-) The way I see it, Grant’s idea was to push the information I/O to disk as much as he could instead of trying to do everything in RAM. Not sure if it is still the same with Arbutus (I heard there are optimizations), but the .FIL structure is clever. All rows having the same length enables ACL to locate any row and any field by simply multiplying the record length. For a field, it is an offset equal to the lengths of the preceding fields, and from there you just add the record length again to get the next value of the same field. These are just regular disk “SEEK” functions in programming that do not require RAM (see the rough sketch below). And that’s how you can get exaggeratedly big files in ACL ;-)

    (Lol, there are a lot of assumptions and I can be wrong; that’s just the way I interpreted all of this.)
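    In Python terms, the seek arithmetic would look roughly like this (purely illustrative; I haven’t reverse-engineered the real .FIL layout, so the record length and offsets here are made up):

    RECORD_LEN = 100      # fixed bytes per row (made up)
    FIELD_OFFSET = 20     # total width of the fields that come before the one we want
    FIELD_LEN = 10        # width of the field we want

    def read_field(path, row_number):
        """Fetch one field from one row without loading the file into RAM."""
        with open(path, "rb") as f:
            f.seek(row_number * RECORD_LEN + FIELD_OFFSET)   # jump straight to the value
            return f.read(FIELD_LEN).decode("ascii", errors="replace").strip()

    # Example: read the field from row 5,000,000 of a hypothetical flat file.
    # print(read_field("big_table.dat", 5_000_000))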


    • Regarding the speed issue, I dug up an old Arbutus/ACL speed comparison I did quite a number of years ago for our website.

      While the comparison is old (ACL was on V10 at the time), I think the comparisons are still valid. I did the testing at that time with a 125M record (10GB) file. Typical commands took about 70ish seconds to complete. I would expect faster results for Arbutus today, but I can’t speak for ACL.

      I remain surprised that ACL lets you down with so few records.

      I can’t attach the comparison PDF to this post, but if anyone is interested please contact support at support@ArbutusSoftware.com and they can forward a copy to you.

      Grant


  6. Great comments, thanks!
    I’m still trying to figure out why ACL chokes like it does. Tried it on several machines and a Windows server (ACL gives you an ‘unsupported’ message, but lets you install it, and it runs fine). ACL choked on multiple files over the years. I just don’t get it.
    Grant, glad I got you back out in the open. I hope to push your buttons again in the future. :)
    Regarding your excellent points about command line/graphical, I don’t agree with you, except on one point: low code allows people who really don’t know what they are doing to do bad work faster. Personally, I don’t expect everyone to have my skills, but they should understand the basics and grow that understanding over time. I don’t think the best future is people selecting dropdowns and options, as I’ve found they don’t typically understand what the output is and whether it is accurate or even close. Yes, low code allows some people to do great work, and there’s a place for that. But I’ve seen too much “oh, all you have to do is press a button” work that’s worthless. The IIA requires ‘audit management’ over the work, and too many team leads, managers, and directors don’t know when work is quality or not, but IT IS DONE. YAY! If you manage auditors or analysts and can’t tell whether the work is right when done low code (or command line), then you have a problem and aren’t providing oversight, which is bad for the audit or analytics profession. So I’ll admit it comes down to management/knowledge ability, not how the software delivers its product. But the easy route is often taken and mgmt doesn’t know better.
    I disagree with your comment that “Power BI and Python…don’t offer the number-crunching capabilities of even ACL, let alone Arbutus.” This really surprises me, coming from you. Power BI certainly can crunch numbers and so can Python. We are using Python scripts today instead of ACL and it works great, as many Python libraries can do the work of scripts I wrote by hand. I agree PBI and Python are more technical, and certainly Python makes you more of a programmer. And yes, it is less likely that other auditors will work in Python when they are unable to work in ACL, which is much simpler. [I want to address using Python libraries (which could be called ‘low code’ as I didn’t code them, but took them off the shelf, written by someone else). Some would say it’s the same as a graphical interface in a sense, but the difference is that I know how to determine whether the output is correct, and I do extensive testing to ensure it is (see the example below), just like I do with an ACL script that I wrote.]
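    As a trivial example of what I mean by testing a library before trusting it (made-up invoice data; pandas’ duplicate flagging stands in for the off-the-shelf routine):

    import pandas as pd

    df = pd.DataFrame({
        "invoice": ["A100", "A101", "A100", "A102", "A101"],
        "amount":  [250.0, 75.5, 250.0, 10.0, 75.5],
    })

    # Library result: rows flagged as duplicates of an earlier row.
    lib_flags = df.duplicated(subset=["invoice", "amount"], keep="first")

    # Manual result: track what we have already seen and flag repeats ourselves.
    seen = set()
    manual_flags = []
    for key in zip(df["invoice"], df["amount"]):
        manual_flags.append(key in seen)
        seen.add(key)

    # The two approaches must agree before the library call gets trusted.
    assert list(lib_flags) == manual_flags
    print(df[lib_flags])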
    I disagree that my thinking is shallow. I’m not trying to get one tool (ACL) to do it all. Lots of times I do work in Excel simply because it’s faster. The only thing I’ve ever needed to do that ACL could not do is load and analyze huge files.
    Re: Arbutus, does it store data in files like ACL (.fil) does? Can it write the output to a database instead of a file?
    Thanks for all your comments, Grant, and yours too, Xavier!


  7. You have indeed got me back out in the open. I am reluctant to post, especially given my obvious self-interest, but when I see comments about choking on just a few million records I can’t sit on the sidelines.

    Again, I’ll address the comments in the order you wrote them.

    – Can you give me a sense of what you are trying to do that is causing ACL to choke? Specifically, what commands are you running, and where are you reading the data from? That would give me a better understanding of your issue. Not that I’m trying to improve your use of ACL (I still hope you’ll give us a try), but I may be able to make a suggestion.

    – You are absolutely right that low code solutions can just enable people who don’t know what they are doing. That said, if the alternative is no analytics at all then I’ll take the risk. In my experience many (perhaps even most) users don’t have the ability to “grow that understanding over time”. They just don’t have the analytic bent that you or I have.

    – You seem to characterize low code as somehow less reviewable. I don’t think this is the case. Certainly, as you point out, the appropriate level of management review is often not applied to the results, but that is not a problem with the technology used, but with the management of it. A properly implemented low code solution (as I think we have done) is just as reviewable as a script. I would argue more so, as it is much easier to see the flow of a visual workflow diagram than a script. There should be nothing hidden in a workflow solution, other than the code itself. All the specs are there for the reviewer to see, just in dialog form instead of text. You trust an EXTRACT command in a script, so why wouldn’t you trust that same command in a workflow? In fact, the Arbutus approach to workflows actually converts them into scripts behind the scenes. When you run a workflow you still get the same log (time and date stamped) as with any other Analyzer run. Your results are just as reviewable as in the past.

    – When I was referring to PBI and Python crunching numbers, I was referring to speed, not the results. These products will clearly deliver the results. You may not be experiencing the same speed differences as I because of the other ‘choking’ issues that are still to be resolved. I just haven’t seen either of these products process millions of records per second, as I see every day with Arbutus. I’d love to do a benchmark myself, so if you could publish a typical Python script I can try to sub-in some appropriate data. If I’m wrong (about speed) then I clearly want to know this, as saying so affects my credibility.

    – Regarding ‘shallow thinking’, I certainly didn’t mean to imply that your thinking was shallow. Instead, I was addressing the corporate drive to standardize on a single product. I was specifically referring to your earlier comment that “my mgmt already wants to move on to tools used by others in the company”. We are seeing more and more of this with our customers, and the auditors don’t have the arguments to debate the merits. I think a ‘one product fits all’ solution does a disservice to those areas that would benefit from the capabilities of a more specialized tool. Worse, I think it can affect the quality (and reviewability) of the work performed, such as the absence of an audit trail.

    – Regarding your last question, Arbutus normally works in a .fil model, like ACL, but we also support writing (and reading) directly to and from the database of your choice. There is a performance penalty to do so (because I’ve never seen a DB as speedy as either ACL or Analyzer), but speed isn’t always the main consideration. When reading from a DB you have the choice to either flatten the data to a .fil (like ACL), or read the data directly. You can also embed any SQL SELECT statement in your script, to off-load the heavy lifting to the DB.

    This is an interesting conversation, and it’s a pleasure to be a part of it.

    Grant


    • Grant,
      I hope you had a great Christmas and your New Year is happy!
      The pleasure of this conversation is all mine (and that of our readers. We all appreciate the time you put into this).

      ACL Choking: I don’t recall the issues other than ACL would not load the file. Next time it happens, I’ll post it. The biggest problem my mgmt has with ACL is that it doesn’t write to a database; they hate files on the LAN. I tried to help them understand that other products do the same (Power BI, Python, SASeg, etc.); yes, some of the products can write to databases, but the way they are typically used across the company, files still get dropped on the LAN (like Power BI’s .pbix). (For the Python side of this, see the small sketch below.)
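      Pushing results straight into a database table instead of leaving a file on the LAN is not much code in Python, for what it’s worth (a sketch only; the connection string, table name, and input file are made up, and your DBAs will have opinions about all three):

      import pandas as pd
      from sqlalchemy import create_engine

      # Whatever the analytic produced (hypothetical output file).
      results = pd.read_csv("duplicate_payments.csv")

      # Connection string is a placeholder; use whatever your environment provides.
      engine = create_engine("postgresql+psycopg2://audit_user:CHANGEME@dbhost/auditdb")

      # Write (or replace) a results table instead of dropping a flat file on the LAN.
      results.to_sql("duplicate_payments", engine, if_exists="replace", index=False)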

      It irks me because our LAN drives are encrypted at rest and in transit, and all across the company, most people store all their sensitive files on the LAN. I have tried to educate mgmt; the sensitive files that ACL drops on our LAN are much larger than the average file, so they have a point, but I don’t think it is as problematic as they make it. Users can make access mistakes with databases also, and databases also get hacked.

      Low code: You are correct that my concern about low code environments is a management issue, not a design/product issue. Because my mgmt is weak and doesn’t provide a quality management review, low code will only increase our quality problems while mgmt thinks we are succeeding.

      I didn’t mean graphical workflows can’t be reviewed, I just find it harder to review because you have to look at a workflow and then click a bunch to see the details. In my dept, people get the details wrong often, so no, I won’t trust what I see in a workflow as the workflows are usually higher level and hide the details. Again, not a design flaw, but a personal preference to review all the details in a script without going back and forth between graphical objects and details.

      Given the choice of using no code or having no analytics, I’d choose no analytics, especially with many of the auditors I have worked with (in my company and others). We have trained them, held their hand, beat them (exaggeration), etc., to no avail. My mgmt refuses to make getting better in analytics a requirement or tying those skills to promotions, raises, or even hiring. If we go no code, I’d just spend more time unraveling the mess before the audit is completed, while at the same time, mgmt expects me to do more advanced analytics. Yes, I’m looking for a new job…

      Speaking of analytic bent, another thing I can’t get my mgmt to understand is that some people don’t have the bent. And I’m not the only one trying to help them understand these things, so it’s not just me and my methods of influencing mgmt. Some managers have been told to leave because they keep trying to educate managers. Yes, it’s a bad situation. But I’ve been in audit departments across multiple companies and I know this is NOT unusual. That makes finding a new job harder.

      As I learn to code in Python, it seems to require less code in many instances than ACL. True, I’m doing more programming than I needed to do in ACL, but having all the Python libraries available really helps. While I know ACL & Arbutus have ready-made scripts to use also, I feel that I have to review those scripts carefully before using them. At this point, it doesn’t seem like I can really review Python libraries; you just call them and use them and trust them implicitly. Surely easier, but more risky, but I’m assuming others have used them and reviewed them a lot more than any ready-made ACL/Arbutus scripts. I’d like your input on that.

      So, my understanding is that ACL’s Achilles heel is the input/output of text. Power BI and Python load everything into memory, so it seems we will run up against issues there (I have already hit that in Power BI). So is Arbutus limited only by memory?

      Regarding standardizing on a single product. That’s not what is happening. Mgmt just wants us to use tools (multiple tools, not singular) that are already used in the company rather than be the only users of a tool in our dept. I think this makes sense on several levels:

      1) When I leave for another job or get run over by a taxi, my department, like many other audit departments, will be in trouble. I have mentioned this many times to no avail. Sure, another ACL champion who has my skills can be found, but it isn’t easy. If I could find another job easily that met my requirements, I would have left long ago. Also, I have many other skills that enable me to be successful, which most IT auditors don’t even have.

      2) All the applications that I have created in ACL cannot be easily ported to another application across the company. We have one ACL application that another department wants to take over. Due to its complexity, the other department has to totally re-create the app in another language, and they can’t find the time or money to do it. In other words, I’m building business logic in ACL (or Arbutus if I moved to that) that is hidden from the rest of the company and can’t be easily transferred. If my code was Python, Power BI, or SQL, it would be much easier. And if I left, it would be easier to find others in the company to continue to run the applications if I built them in standard company tools.

      3) If audit uses the same tools the company uses to run their analytics, auditors would better understand how to audit the applications and processes built/bought by other departments; we’d be better auditors because we would know a lot more about the standard types of mistakes that developers make, because we made them ourselves during our own development.
      When I started using SQL to do my own analytics, I got better at understanding the queries that other departments used when providing me data. I find SQL mistakes more readily. The same thing happened with Power BI. When I understood how Power BI works (and doesn’t) and the infrastructure used to support it and how access rights are granted, etc., I became a better Power BI auditor.

      While Arbutus intrigues me because it seems better than ACL and would leverage my ACL knowledge, if I brought it inhouse, I’d have to first master it, convert my code, and then teach it to my ACL auditors. Like ACL, I’d be the go-to guy for troubleshooting, architecting, etc. It seems like the only thing I’d gain is being able to access larger files and write to databases. I hope you really bite on this one. :)

      But if I continue my journeys in SQL, Power BI, Python, and R, I will become more well-rounded and more marketable. Yes, the path will be more difficult, but I think not only I will benefit, but so will my dept and company. Also, I already have others in the company who can help me with these tools. I’ve never liked being the top ACL dude at my company as I have no one nearby that I can learn from.

      My goal is to keep ACL running as is, build a few additional apps with it until I and others master other tools, and then build more in the new tools, while continuing to use ACL for one-time audits and adhoc projects where the ACL auditors can crank out something fast. Use the best tool for the job (and I don’t think moving to Arbutus will help with that goal).

      As always, I’d love to hear your response to all of this. I don’t mind if you put your Arbutus hat on either; we all know your bias, but at the same time, you have a unique perspective of knowing both ACL and Arbutus, and I assume, other tools. I am confident that you will answer as a developer first and an Arbutus developer second. But either way, the floor is yours my friend!


      • A truly long post, that deserves a reply, even though it would seem that you have cast your lot to stick it out with ACL. As always, I will respond in the order of your post.

        ACL Choking/write to a database: As I have previously mentioned, all your problems go away with Arbutus: not only are we backward compatible with ACL, but there are no more files on the LAN, if you choose.

        Low code: I get weak management, but that’s not a controllable factor. The tool you choose is. All you need to do is enforce documentation standards on your workflows that make them reviewable. We certainly offer such a capability.

        No analytics: I’ll have to disagree with your choice of no analytics over graphical analytics. I’m not even sure that’s your real position, but I feel your level of frustration. You’ve had trouble training staff, but that would be in a script-based environment. I get that they don’t always get it, but your only choices are to abandon analytics, hire better people, or switch to something that is easier for them to understand and work with. You’d be surprised how effective and compelling a graphical environment is, and as I said, the reviewability is just a matter of standards. Further, if our existing product doesn’t meet your needs then we’ll fix it: that’s what we do.

        No bent for analytics: This is absolutely a thing, but it’s hard to find rocket scientists. In the same way that an automatic transmission in a car allows people to drive who wouldn’t otherwise be able to (using a stick), graphical tools allow otherwise incapable people to perform quality analytics. It’s hard for people to change, but much easier to give them tools to help. I find most analysts are bright, and can come to correct conclusions and analyses, but this is different from being able to write a script. Providing an automatic transmission just allows you to get more value out of that resource.

        Python: I find it interesting that you would subject Python to less scrutiny than ACL or Arbutus, trusting the libraries implicitly. You are certainly correct that it is impractical to review those Python libraries (relying instead on the community’s use of them), just as it is to review ACL or Arbutus’ Sample command. But at least with an audit-specific tool, the community you are relying on is also auditors, who generally have similar uses and objectives to yourself. I’ve been in this business for over 30 years, and in my opinion, you should always strive for as little code as possible. Every line of code you write has a chance of error. The more lines, the more errors. My entire career has been spent adding features that make analysis easier to do. Why add a Total command when I am sure you could write a GROUP that added up each amount manually? Silly example, but my point is that making products easier to use is not a failing, and graphical interfaces are just the next evolution. I agree that you shouldn’t trust a vehicle like ACL’s script hub, as there is no control over the quality therein. I’ve seen some of that content and that’s a big reason we don’t offer that. The scripts we offer are supplied by Arbutus and subjected to the same rigor as anything else we produce.

        Text I/O: Arbutus is not limited by memory. It makes good use of memory, but fundamentally relies on external storage. We have customers (in China) with billions of records; far more than you’d ever host in memory. Speed is just a matter of architecture. If you start with a slow product you can to some extent compensate by storing data in memory, but you accept the consequences. Arbutus is already a fast product, and doesn’t need the crutch of in-memory processing. We do use in-memory processing for workflows, when we can, but that is for other reasons.

        Standardization: you seem to be falling back on the same (I think, flawed) logic as last time. When you refer to ‘other departments’ I expect that you are not referring to other audit departments. If you follow that argument, then the accounting department would be using Excel for the company books, rather than SAP (or whatever they use), as no other department uses that package.

        1, Run over by a taxi: You are absolutely correct. This is why all tools, not just Arbutus, are trying to make themselves as easy to use and get up to speed on as possible. The market has said (and I agree) that a graphical interface is a good step in that direction. Nobody knew Alteryx before they came into existence, but they have become the poster child for just such a graphical interface. Many other tools, including Arbutus, are adopting that style, because people occasionally get hit by taxis, or find another job. If you insist on experience in ‘that package’, rather than ‘that interface’ you are dooming yourself to limited options, a poorer audit fit, and higher costs.

        2, transferability: In my experience, there is almost never transferability between applications. The exception is when there are multiple market leaders, when a tool may offer conversion from a competitor, as we do with ACL projects. If this is a requirement then accounting should be done in Excel. The answer to this problem is to make sure this business logic is properly documented, so it is easier to re-code in the next tool. The question you might ask is why it is hard to duplicate the logic in the other tool? In many cases that is because the functionality offered by ACL and Arbutus isn’t available elsewhere.

        3, better understand how to audit other departments: I must confess I don’t get this argument at all. You wouldn’t know the standard types of mistakes made, as in most cases they’d be fixed before you saw them. Data is data, and is independent of the tool used to create (or analyze) it. Your SQL example is instructive, but sounds like on-the-job training to me.

        Bringing in Arbutus: You perhaps don’t realize how similar ACL and Arbutus are. Most users will be up to speed in a day, because we use all the same building blocks as ACL (formats, workspaces, indexes, views, scripts, …). They are so similar that we offer a conversion tool that converts every type of object from your existing ACL project. There are a few minor script differences, mostly relating to IMPORT syntax, but generally we are a super-set of the ACL language and environment. You will get more value out of our tool if you have training in our specific advantages, but should be able to work with Arbutus pretty much out of the box.

        Continue with SQL etc: If your goal is to make yourself more marketable then I can’t fault your logic. That said, I’m not sure the company will get as much benefit as you, since they will not get analyses from a specifically designed tool (back to my Excel example).

        Arbutus is essentially a plug-compatible replacement for ACL. In other situations, I’ve used the example of switching to a different flavor of SQL. You get tons of benefit, with little or no cost. You actually save money, depending on the deal you struck with ACL regarding pricing.

        I’m not trying to sell you, although I’m sure it sounds that way, only to make sure you are well informed. As I said a number of posts ago, I see audit departments being forced to make decisions that are detrimental to their effectiveness, and I’m doing what I can to shine a light on this.

        Grant


        • Grant,
          Thanks for your input; I knew you’d come through. I don’t feel like I’m being sold your product at all. You are highlighting differences in the 2 products and how you and I think differently.

          I have cast my lot based on my experience with my mgmt, not on your tool and what I think it could do for me.

          I don’t have anything against low code/graphical interfaces, but I know my mgmt and my auditors. I haven’t seen in them the desire or the capability to understand analytics of any sort, including what you can do in Excel. In another department with other people, I would go all out.

          My ‘no analytics’ choice reflects people who don’t get it and don’t do well even in Excel analysis. We do have some effective auditors, so I would continue to work with them and build their skills; they would keep producing, we wouldn’t have to constantly clean up messes from those who don’t, and the effective auditors would be more productive.

          I love your frankness and calling out what you believe is ‘flawed logic’. We need more of your kind.

          Regarding ‘other departments’, yes I meant other depts in the company. Let me explain it another way: While Arbutus may be able to make our ACL auditors more efficient and effective in analyzing data from the other depts, there’s no way Arbutus will help me understand other technology like the cloud, for example.

          Analyzing the data from a system can only tell you so much. It won’t help you audit the queries used to obtain and transform data, understand how S3 buckets work, how access in Power BI works, or where the control gaps are. Using the technology that helped build the process you are auditing makes all the difference in the world.

          Of course auditors aren’t expected to be experts in everything, but since my company uses Power BI and a few other mainstream technologies to analyze data, it makes sense for some of our auditors to use those same technologies to analyze our own data; as I said before, that’s the best way to learn.

          And no, not all the problems are fixed before we audit them. I have audited in several companies and I find the problems and flaws are getting worse, not better, even with all the tools available to find them. And on top of that, it seems less people are truly engaged with their work. When I put on my auditor hat, that tells me we have problems with tone at the top, but that’s another subject entirely.


        • I know I’m preaching to the unconvertable, but if you or any of your readers were interested in the dark side, Arbutus is presenting a low-code/no-code webinar on March 15th (10AM Pacific), presented by Michael Kano. You can register at https://register.gotowebinar.com/register/5855906991578090763 . It may be that seeing is believing.

          Grant


        • Grant,
          Please post a link to the recording after the seminar occurs. I saw this, but am unable to attend, but would like to catch it in “reruns”.


  8. Pingback: Most Popular Blog Posts of 2021 | ITauditSecurity

  9. As requested, here’s a link to our recent LC/NC webinar, presented by Michael Kano: https://www.arbutussoftware.com/low-code-no-code-functionality-in-mainstream-data-analytics-watchnow . This mostly addresses the issues around LC/NC, which seemed to be your main concern. Michael did an excellent job of not making it a sales pitch.

    I really do believe you should do a re-think with regards to your position on LC/NC. I stick to my assertion that any analytic is generally better than no analytic at all. One can find exceptions, but I believe that is exactly what they are: exceptions. By opening up the field to less technical users you have a chance of exposing them to the power of analytics. The rest you can sort out later.

    Grant

