While installing and configuring some new software on my Windows server, I noticed that the IT department forgot to remove some previous software components from my server.
I remember seeing the notice that the software was being uninstalled and replaced by another package.
I could have removed the leftover components myself (I am an admin on the server), but I wanted to see whether they would ever be removed. Did the Windows server team forget about this, or does the team not concern itself with such things? Maybe the procedures don’t include a step to ensure all components are removed.
I waited about 2 months, but the components were not removed.
One of my current clients is trying really hard to do periodic access reviews.
They know that mistakes are made in granting access, that users get access and eventually don’t need it anymore but don’t tell anyone, and that some users leave the company without their manager’s knowledge. (I’ve never understood how that happens, but it does; it has happened in every Fortune 500 company I’ve worked in.)
I recently ran into some unneighborly security. It happens all the time to those of us who know how to build, upgrade, secure, and troubleshoot hardware and software.
I’m over at my neighbor’s house and he says, “Hey, you work with computers, so can you take a look at mine?”
There goes the afternoon.
When checking system access, make sure you look at all the different items that affect the user’s access. For example, the user might need one or more of the following:
- Application ID
- Application role or group
- Membership in a local server group, Active Directory (AD) group, or UNIX group
- Access to the application’s share and/or folder on the server
- Database ID
- Database role, including access permissions (read/write)
- Other permissions (from home-grown application code or an enterprise identity management system)
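The checklist above can be turned into something you can actually run a review against. Here’s a minimal sketch, assuming a simple mapping of granted items to documented justifications; the item names and the example user are illustrative, not tied to any specific identity management product:

```python
# Hypothetical access-review checklist; item names mirror the list above.
ACCESS_ITEMS = [
    "application ID",
    "application role or group",
    "local server / AD / UNIX group membership",
    "share or folder access on the server",
    "database ID",
    "database role and permissions (read/write)",
    "other permissions (home-grown code or enterprise IdM)",
]

def review_user(user, granted):
    """Return findings for checklist items granted without a documented need.

    `granted` maps an item name to its justification string ("" if none).
    """
    findings = []
    for item in ACCESS_ITEMS:
        if item in granted and not granted[item]:
            findings.append(f"{user}: '{item}' granted with no documented need")
    return findings

# Example: one justified grant, one unjustified grant.
issues = review_user("jsmith", {
    "application ID": "ticket #1234",
    "database ID": "",
})
print(issues)
```

The point of the structure is that every item on the checklist gets looked at for every user, which is exactly what ad hoc reviews tend to miss.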
IT admins and IT auditors often don’t see eye to eye, and they rarely recognize that their goals are similar.
The IT auditor just has to work a little harder to convince the IT admin that they are. I’ve worn both hats, so I know it can be done.
Filed under Audit, Security
During an audit, a vendor provided me with access to data I shouldn’t have had, no questions asked. I didn’t ask for the access; I just needed some information for my audit.
The audit involved checking some vendor software to determine whether IT patched it on a regular basis. I obtained from IT a screenshot of the version number of the installed software, but I needed to know the last couple of versions released by the vendor. The admin was going to send me the URL because he said I probably wouldn’t find the info on the vendor’s site. After a couple of days of waiting for the URL, I took matters into my own hands and went to the vendor’s website.
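The core of this audit step is just a comparison: is the installed version at least the latest one the vendor has released? A minimal sketch of that check, with made-up version numbers (real audits would pull the latest version from the vendor’s release notes or site):

```python
def parse_version(v):
    """Split a dotted version string into a tuple of ints: '2.10.1' -> (2, 10, 1)."""
    return tuple(int(part) for part in v.split("."))

def is_current(installed, latest):
    """True if the installed version is at least the vendor's latest release."""
    return parse_version(installed) >= parse_version(latest)

# Hypothetical versions for illustration only.
print(is_current("4.2.1", "4.3.0"))  # installed lags the vendor's latest
```

Comparing tuples rather than raw strings avoids the classic trap where "4.10.0" sorts before "4.9.0" lexicographically.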
A lot of company data is lying around unprotected, making it very easy to steal. No, I’m not talking about picking up other people’s documents at the printer. Stealing printouts isn’t hard, but it can be risky, especially if the printer is a busy one. Besides, it has 2 other problems:
- Your chances of picking up confidential data are low at any given time.
- The person will look for the printout and wonder what happened to it.
There’s a much better way that is fast, easy, simple, raises no suspicion, and is basically impossible to detect, if you do it correctly. Can you think of what it is?
Minutes later, one of the security techs met me at Lynn’s cube with a box that we quickly filled with the contents of her desk: files, CDs, DVDs, notepads, books, etc. The other help desk analysts in adjacent cubes looked at us with silent questions on their faces.
I noticed that one of them was a new employee who had attended my security presentation in employee orientation the week before, so he knew who I was. That meant rumors would spread quickly. While I never enjoyed walkouts, they reminded the staff that security incidents have consequences.
This is a multi-part series. See Internal Attacker Detected: Part 1, Internal Attacker Detected: Part 2, and Internal Attacker Detected: Part 3.
Others on my team had already imaged the old computer and had started imaging the new one across the network as soon as my meeting with Lynn began (by design, she was not told of the meeting beforehand). Both images would be sent off to the Forensics team.
If you haven’t determined how server virtualization changes your audit plans, you better get moving. I’m not just talking about a virtualization audit (more on that later), but the audits that you typically do every year or on a multi-year cycle.
For example, if every year you do an audit on all networks, servers, applications, and databases that host your key financial reporting or PHI systems, you’re looking at policies and procedures, configuration management, security (including patching), user access, logging, and so on. But do you first consider whether those assets run on virtualized servers?
I read a blog post that quoted a security professional saying, “culture is defined as the beliefs we accept without question.” The blogger, also a security professional, went on to say that his goal is to generate a new security culture, a security culture that “everyone accepts and makes a natural part of their activities.”
That definitely got me going, so I left a comment that explained why I disagreed with that statement.
On my walk to work, I cross a lot of 1-way streets. I always look both ways. Sometimes, when a friend or colleague is walking with me, I get teased about this. I always reply with this question: Have you ever driven down a 1-way street the wrong way? For some reason, I never get a reply, and another subject surfaces.
When I crossed one of those streets the other day, I realized that some people look at audit/security/risk the same way. They only look one way because of the people or rules or controls or norms that govern the activity. They fail to think outside of the cubicle and look the other way: the path seldom traveled.