
IAM is a very strong tool for getting control of your accounts. With an IAM system for all standard users you will quickly protect all standard access and manage all access control. On top of that comes the protection of your privileged accounts, and that means more advanced solutions like PAW or ESAE. In essence, you cannot protect a privileged account if you allow it to log on to a workstation where you access the internet or check your mail, no matter what other protection you have.

Should IAM be used to provision privileged users? Yes, to some extent it could be useful, for example for Tier 1 system operators, database admins and the like. But for Tier 0 users (in essence Domain Administrators) you should not use IAM, both because you want to be in very tight control of your Domain Admins and because you don't want your IAM system to be part of Tier 0.

You have all heard about the layered security approach and probably understood it. Sometimes it just becomes very visible how it works. I recently visited a client in southern Europe where we are delivering a high-security project, and as part of that project we are working in a locked and secured room.

In the endeavor to secure the area, the client decided to replace the lock and key we were currently using with the central keycard solution, with logging, to make sure that we knew who was going in and out of the room. Of course, the owner of the solution was trusted to enter the room even before this. As a precaution I decided to review how the current solution was administered. This turned out to be a laugh for all of us, one of those laughs you have when you realise you should have stayed in bed instead. The administration terminal was a simple laptop standing wide open in the reception, with no screen lock and only a two-character password to log in to the system. Furthermore, the server was a desktop standing on the floor with no protection on the database, no backups and no physical protection whatsoever. The computers were never patched and were running old operating systems.

Two hours later the vendor was there and received an order to replace the system the day after.

I had a chat with a friend of mine, who is an enterprise architect and a damn good one as well, regarding integration architecture versus security architecture and where they intersect. While his standpoint is that integration architecture is imperative to understand how business units should work together, my viewpoint is that from an information perspective, adding a layer of information classification and collusion issues, I need to understand what information the different business units actually use in order to apply a correct classification to the information. Those of you who have followed me for some time know that my view on classification is that it is only an accelerator for what type of protection you should apply and the type of authentication that should be used.

I had a case a few years back regarding access control and how visitors were to be registered before being allowed into a secure building. It turned out that there was an automated approval flow that moved from classification level 3 to classification level 2 (a lower classification) without anyone understanding the consequences. In this specific case it enabled a combined social engineering and technical attack that in the end allowed me to enter the facility with a full-access card as a consultant.

I received a mail recently asking how many domain admins a company should aim to have. Of course, this is always dependent on the structure of your company and so on, but as a rule of thumb I aim for five domain administrators.

So, why five? It is actually quite easy to calculate. First of all: the domain admins should only do domain admin tasks, like managing domain admins, managing the domain controllers, managing the forest, and so on. The domain admin should not bother with creating users; that task should be delegated to someone else. This means that the number of users needed shrinks quite rapidly. If you then have a 24/7 availability need, we are looking at staffing for that, and that is normally five persons. It's as easy as that.
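The staffing arithmetic behind that rule of thumb can be sketched like this (the 40-hour work week is an illustrative assumption, not something fixed by the rule):

```python
# Rough staffing estimate for 24/7 domain admin coverage.
# The 40-hour full-time week is an illustrative assumption.
import math

HOURS_PER_WEEK = 24 * 7          # 168 hours to cover every week
WORK_HOURS_PER_PERSON = 40       # a normal full-time week

# Raw coverage need: 168 / 40 = 4.2 persons
raw = HOURS_PER_WEEK / WORK_HOURS_PER_PERSON

# Round up, since you cannot staff a fraction of a person;
# the margin also absorbs vacation and sickness.
staff_needed = math.ceil(raw)

print(staff_needed)  # 5
```

In practice the rounding margin is what makes the number work: 4.2 full-time equivalents leaves no slack at all for leave or sickness, while five does.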

Processors are vulnerable. Who knew? Most interesting is that this is a flaw in the architecture itself. Those types of errors tend to be harder to fix because they are part of the overall solution. This specific case will be interesting to follow. I don't expect it to be that much of a problem in reality. Most fixes will sort out most of the problems, and the solutions that actually are sensitive will be protected in other ways. Thankfully, we are no longer living in a monolith world when it comes to IT.

I would say that the bigger problems reside in older systems that are not adequately protected in the first place. But there are so many easier attacks to conduct that by the time attackers are finally at the level of using Meltdown as a way in, the hardware will have been changed to a secure version.

Tier 0 and GDPR

I love working with security, and I'm fully aware that there is always an expert who knows the details better than I do, and another expert who knows the whole field better than I do. I can only plod along and do my best. Sometimes, however, I'm baffled by how some people are blind to the obvious just because they have worked for too long in too narrow a field.

I recently joined a Facebook group that discusses security. Albeit on a very technical level, it was still interesting. I got into a quite interesting discussion regarding GDPR protection solutions, mainly encryption tools and network protection tools. After a few minutes I challenged the database security experts who wanted to implement a database encryption tool that, according to the salesperson, would sort out most of GDPR. Yes, it would give the database a lot more protection, but as it was dependent on authenticated users, and they had no control over who gave whom access, they were susceptible to a credential theft attack. After a few rounds they finally understood that they have to take a broader view of GDPR (not even mentioning all the organizational and legal things they need to manage as well).

However, halfway through the discussion a network security specialist gave chase and, in a very looking-down-the-nose tone, told us all that we were all wrong, that all protocols are broken and that it is impossible to protect a database as long as an attacker has control over the network. No matter what we came up with as suggestions, the answer was always, 'all protocols have been broken and you are all wrong because you don't understand security as I do'. I'm the first to admit that there are quite a few people who know security better than I do, but during my 20+ years I have managed to pick up a thing or two, and I know that there are encryption schemes that will not easily be broken in 20+ years, and that if you manage your networks with good account management and secure workstations, it is almost impossible to break in without resorting to physical threats or good old social engineering tactics.

Still, it was an interesting experience to try to argue with someone who has his thoughts locked in concrete. Most interesting was that whatever TLS encryption I came up with was flawed, but any other network encryption he came up with himself was brilliant.

In the end I decided to leave the group, because while I really like a good solution discussion, I dislike all mine-is-bigger-than-yours arguments. There tend to be many of those in the area of security. There are always different solutions to the same problem, and some fit the client, some don't. It is important to use the right one and not go with dogma.

Wohoo! The day has come when I turn 45, and I still have many years left to work in the most interesting of fields. There are so many new vulnerabilities still waiting to be found. I expect the next year to focus on lower-level attacks and also more stealthy attacks. At the level of operating systems and applications, the cost of finding a flaw has started to rise. It is absolutely not impossible, but it becomes harder by the day, with more and more companies adopting secure coding methodologies. Hence, the bad guys will try to find other, more persistent ways to attack a computer, as there is money to be made by owning computers.

I expect more attacks like Rowhammer or similar to surface during 2018, so-called silicon-based attacks. Probably a specific error in a processor, a GPU, or why not in the firmware of the hard drive? As those are dependent on the hardware, they will be costly to fix, if they ever are fixed. Such flaws could also make it possible to jump from a guest to a host in a cloud computing scenario.

I also expect more stealth, because ransomware has become harder to make money from. People are making backups to a larger degree, and as you never know if you will actually receive a decryption key if you pay, it is probably better to spend the money on recovering. I expect a shift from ransomware to cryptocurrency mining that hijacks processes and runs them at 30-40% of your processor's power, making them not directly visible or disturbing.

Why do I care about defining Tier 0, and why is it a problem to have a large Tier 0? It is all part of minimizing the attack surface. You want to minimize the number of places where it is possible to find a domain administrator account and exploit it. It is far easier to secure 20 computers than 200 or 2,000. But with GPOs you can manage 10 computers as easily as 10,000, so what's the big deal with Tier 0?

It comes down to the inherent problems of authentication and authorisation: we need to make it possible to work without having to type your password for every transaction. This is done by caching a hash calculated from your username, password, token, etc., which is reused for some time to validate your credentials. This makes the system easier to use but also creates a possible vulnerability. Sadly, the hashes can be hijacked and reused, and therefore the problem of credential theft exists.

The hashes are stored in memory and can be stolen with the right tools, meaning that we have to focus on reducing the number of places where domain administrators have logged in to as few as possible. This means that implementing privileged access workstations is imperative to minimize credential theft. Such workstations mean, in short, that you have one computer for mail and one for administrative tasks. This of course creates a cost if you have hundreds of administrators who each need two computers.
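The core of the pass-the-hash problem can be shown with a toy model. This is a deliberately simplified sketch, not the real NTLM or Kerberos protocol: it just illustrates that in hash-based challenge-response authentication, the cached hash alone is enough to authenticate, so stealing it is as good as stealing the password.

```python
# Toy model of hash-based challenge-response authentication.
# Simplified illustration only; not the actual NTLM/Kerberos protocol.
import hashlib
import hmac
import os

def credential_hash(password: str) -> bytes:
    # The system caches a hash of the password, not the password itself.
    return hashlib.sha256(password.encode()).digest()

def respond(cred_hash: bytes, challenge: bytes) -> bytes:
    # Answering the server's challenge requires only the hash...
    return hmac.new(cred_hash, challenge, hashlib.sha256).digest()

server_challenge = os.urandom(16)
legit = respond(credential_hash("S3cret!"), server_challenge)

# ...so an attacker who lifts the cached hash from a workstation's
# memory can authenticate without ever learning the password.
stolen_hash = credential_hash("S3cret!")  # "stolen" from memory
attacker = respond(stolen_hash, server_challenge)

print(attacker == legit)  # True: the hash alone is sufficient
```

This is exactly why reducing where admin hashes ever appear in memory matters more than password strength: the attacker never needs to crack anything.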

Defining Tier 0

Credential theft is a big problem today. Many of the attacks we see target accounts rather than individual computers. This is due to the cost of exploitation. As soon as you have a valid account, it is much easier to move around and try to find a domain admin account. As soon as you have domain admin, you have it all (this goes for root etc. as well).

One of the problems I'm challenged with is the definition of Tier 0. What is this, then? How do you define Tier 0? The simple definition is: every computer that either defines or manages domain administrator accounts, or can manage those computers in such a way that it has physical access or administrator access to them. Compare with the PCI DSS definition of system components.

This means, for example, that any computer a domain admin has recently logged in to, or where a service account runs with domain admin privileges, is also part of Tier 0. On a recent sales call, after a brief chat, we identified that two thirds of the client's computers were part of Tier 0.
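The definition above is mechanical enough to express as a simple rule. This is a hedged sketch of how you might inventory a fleet against it; the field names and the example machines are illustrative assumptions, not any product's API:

```python
# Sketch: classifying machines as Tier 0 from simple inventory facts.
# Field names and example data are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Computer:
    name: str
    manages_domain_admins: bool = False      # defines/manages DA accounts
    admin_access_to_tier0: bool = False      # admin/physical access to Tier 0 hosts
    recent_domain_admin_logon: bool = False  # a DA logged on recently
    da_service_account: bool = False         # runs a service as domain admin

def is_tier0(c: Computer) -> bool:
    # Any one of these conditions pulls the machine into Tier 0.
    return (c.manages_domain_admins
            or c.admin_access_to_tier0
            or c.recent_domain_admin_logon
            or c.da_service_account)

fleet = [
    Computer("dc01", manages_domain_admins=True),
    Computer("helpdesk-laptop", recent_domain_admin_logon=True),
    Computer("kiosk"),
]
tier0 = [c.name for c in fleet if is_tier0(c)]
print(tier0)  # ['dc01', 'helpdesk-laptop']
```

Note how the `recent_domain_admin_logon` rule is what inflates Tier 0 in practice: one careless admin logon is enough to pull an ordinary laptop into scope, which is how a client ends up with two thirds of its fleet in Tier 0.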

WPA2 breached

May you live in interesting times! Using WPA2 is apparently not a good idea anymore. This caught my interest, as it is a breach at the protocol level rather than of just a function, and many companies have moved to WiFi and rely on WPA2-Enterprise to secure the communication. There are so many WiFi units out there that need to be upgraded to keep companies secure. Not to mention all the POS systems that rely on WiFi for managing payments using credit cards.

We will most probably see a surge of attacks against payment systems to harvest credit card information, and a new tool for cracking corporate networks and looking for vulnerabilities to expose.

We should, however, keep in mind that no network is secure, so if you assume breach in the first place and have implemented protection in layers, this will not be one of your bigger problems.
