
I had a chat with a friend of mine, an enterprise architect and a damn good one at that, about integration architecture versus security architecture and where the two intersect. His standpoint is that integration architecture is imperative for understanding how business units should work together. My viewpoint is that, from an information perspective, adding a layer of information classification and collusion concerns, I need to understand what information the different business units actually use in order to apply a correct classification to it. Those of you who have followed me for some time know my view on classification: it is only an accelerator for deciding what type of protection you should apply and what type of authentication should be used.

I had a case a few years back regarding access control and how visitors were to be registered before being allowed into a secure building. It turned out that there was an automated approval flow that moved requests from classification level 3 to classification level 2 (a lower classification) without anyone understanding the consequences. In this specific case it enabled a social engineering and technical attack that, in the end, let me enter the facility with a full access card as a consultant.

I received a mail recently asking how many domain admins a company should aim to have. Of course, this always depends on the structure of your company, but as a rule of thumb I aim for five domain administrators.

So, why five? It is actually quite easy to calculate. First of all: the domain admins should only do domain admin tasks, such as managing domain admins, managing the domain controllers and managing the forest. A domain admin should not bother with creating users; that task should be delegated to someone else. This means that the number of accounts needed shrinks quite rapidly. If you then have a 24/7 availability requirement, you are looking at staffing for that, and that normally means five people. It's as easy as that.
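The arithmetic behind that rule of thumb can be sketched roughly like this (the 40-hour work week and the 15% overhead for vacation, sickness and training are my assumed figures; adjust them for your own organisation):

```python
import math

# Rough staffing estimate for 24/7 coverage of a single on-duty role.
HOURS_PER_WEEK = 24 * 7   # 168 hours of coverage needed per week
WORK_WEEK = 40            # assumed full-time hours per person
OVERHEAD = 0.15           # assumed allowance for vacation, sickness, training

raw_headcount = HOURS_PER_WEEK / WORK_WEEK             # 4.2 people
headcount = math.ceil(raw_headcount * (1 + OVERHEAD))  # round up with overhead

print(headcount)  # -> 5
```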

Processors are vulnerable. Who knew? Most interesting is that this is a flaw in the architecture itself. That type of error tends to be harder to fix because it is part of the overall solution. This specific case will be interesting to follow. I don't expect it to be that much of a problem in reality: most patches will sort out most of the problems, and the systems that actually are sensitive will be protected in other ways. Thankfully, we no longer live in a monolith world when it comes to IT.

I would say that the bigger problem resides in older systems that are not adequately protected in the first place. But there are so many easier attacks to conduct that by the time attackers are finally at the level of using Meltdown as a way in, the hardware will have been replaced with a secure version.

Tier 0 and GDPR

I love working with security, and I'm fully aware that there is always an expert who knows the details better than I do, and another expert who knows the whole field better than I do. I can only plod along and do my best. Sometimes, however, I'm baffled by how some people are blind to the obvious just because they have worked for too long in too narrow a field.

I recently joined a Facebook group that discusses security. It is on a very technical level, but still interesting. I got into a quite interesting discussion regarding GDPR protection solutions, mainly encryption tools and network protection tools. After a few minutes I challenged the database security experts who wanted to implement a database encryption tool that, according to the salesperson, would sort out most of GDPR. Yes, it would give the database a lot more protection, but as it depended on authenticated users, and they had no control over who gave whom access, they were still susceptible to a credential theft attack. After a few rounds they finally understood that they have to take a broader view of GDPR (not even mentioning all the organizational and legal things they need to manage as well).

However, halfway through the discussion a network security specialist gave chase and, in a very looking-down-the-nose tone, told us that we were all wrong, that all protocols are broken, and that it is impossible to protect a database as long as someone has control over the network. No matter what suggestions we came up with, the answer was always: 'all protocols have been broken and you are all wrong because you don't understand security as I do'. I'm the first to sign a paper saying there are quite a few people who know security better than me, but during my 20+ years I have managed to pick up a thing or two. I know that there are encryption schemes that will not easily be broken within the next 20 years, and that if you manage your networks with good account management and secure workstations, they are almost impossible to break without resorting to physical threats or good old social engineering tactics.

Still, it was an interesting experience to try to argue with someone whose thoughts are locked in concrete. Most interesting was that whatever TLS encryption I came up with was flawed, but any other network encryption he came up with was brilliant.

In the end I decided to leave the group, because while I really like a good solution discussion, I have a distaste for mine-is-bigger-than-yours arguments. There tend to be many of those in the area of security. There are always different solutions to the same problem; some fit the client, some don't. It is important to use the right one and not go with dogma.

Wohoo! The day has come when I turn 45, and I still have many years left to work in the most interesting of fields. There are so many new vulnerabilities still waiting to be found. I expect the next year to focus on lower-level attacks and also more stealthy attacks. At the level of operating systems and applications, the cost of finding a flaw has started to rise. It is absolutely not impossible, but it becomes harder by the day as more and more companies adopt secure coding methodologies. Hence, the bad guys will try to find other, more persistent, ways to attack a computer, as there is money to be made by owning computers.

I expect more attacks like Rowhammer, or similar, to surface during 2018: so-called silicon-based attacks. Probably a specific error in a processor, a GPU, or why not in the firmware of a hard drive? As those depend on the hardware, they will be costly to fix, if they are ever fixed. They could also make it possible to jump from a guest to a host in a cloud computing scenario.

I also expect more stealth, because ransomware has become harder to make money from. People are making backups to a larger degree, and as you never know whether you will actually receive a decryption key if you pay, it is probably better to spend the money on recovery. I expect a shift from ransomware to cryptocurrency mining that hijacks processes and runs them at 30-40% of your processor's power, making them not directly visible or disturbing.

Why do I care about defining Tier 0, and why is it a problem to have a large Tier 0? It is all part of minimizing the attack surface. You want to minimize the number of places where it is possible to find a domain administrator account and exploit it. It is far easier to secure 20 computers than 200 or 2,000. But with GPOs you can manage 10 computers as easily as 10,000, so what's the big deal with Tier 0?

It comes down to the inherent problems of authentication and authorisation: we need to make it possible to work without having to type your password for every transaction. This is done by caching a hash calculated from your username, password, token and so on, which is then reused for some time to validate your credentials. This makes the system easier to use but also creates a possible vulnerability. Sadly, the hashes can be hijacked and reused, and therefore the problem of credential theft exists.

The hashes are stored in memory and can be stolen with the right tools, which means we have to focus on reducing the number of places where domain administrators have logged in to as few as possible. This makes implementing privileged access workstations imperative to minimize credential theft. Such workstations mean, in short, that you have one computer for mail and one for administrative tasks. This of course creates a cost if you have hundreds of administrators who each need two computers.
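To see why a stolen hash is as good as the password itself, consider this deliberately simplified sketch. The function names and the single SHA-256 round are my own illustrative assumptions; real Windows systems cache NTLM hashes and Kerberos tickets, but the principle is the same:

```python
import hashlib

def derive_hash(username: str, password: str) -> str:
    # Simplified stand-in for the credential hash cached after logon.
    return hashlib.sha256(f"{username}:{password}".encode()).hexdigest()

# At logon the hash is computed once and kept in memory for reuse.
stored = derive_hash("DOMAIN\\admin", "S3cret!")

def authenticate(presented_hash: str) -> bool:
    # Later transactions validate the cached hash, not the password.
    return presented_hash == stored

cached = derive_hash("DOMAIN\\admin", "S3cret!")
assert authenticate(cached)   # legitimate reuse: no retyping needed

# An attacker who reads the cached hash from memory never needs the
# password at all; replaying the stolen hash is enough.
stolen = cached
assert authenticate(stolen)
```

This is why it matters *where* the hash sits in memory: every machine holding it is a place the password can effectively be stolen from.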

Defining Tier 0

Credential theft is a big problem today. Many of the attacks we see target accounts rather than individual computers. This is due to the cost of exploitation: as soon as you have a valid account, it is much easier to move around and try to find a domain admin account. And as soon as you have domain admin, you have it all (this goes for root etc. as well).

One of the problems I'm challenged with is the definition of Tier 0. What is that, then? How do you define Tier 0? The simple definition is: every computer that either defines or manages domain administrator accounts, or can manage those computers in such a way that it has either physical access or administrator access to them. Compare with the PCI DSS definition of system components.

This means, for example, that any computer where a domain admin has logged in recently, or where a service account runs with domain admin privileges, is also part of Tier 0. On a sales call recently, after a brief chat, we identified that two thirds of the client's computers were part of Tier 0.
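As a thought experiment, the definition can be turned into a small computation: start from the domain controllers, pull in every machine where a domain admin has logged in or where a service runs with those privileges, and then add every system with admin rights over any of those. All host names and data below are invented for illustration:

```python
# Hypothetical inventory: where domain admin credentials are exposed.
domain_controllers = {"DC01", "DC02"}
admin_logons = {"DC01", "JUMP01", "FILESRV03", "LAPTOP-CEO"}  # recent DA logons
da_service_accounts = {"BACKUPSRV", "MONITOR01"}              # services run as DA
mgmt_access = {"SCCM01": {"DC01", "DC02"}}                    # host -> hosts it can administer

tier0 = domain_controllers | admin_logons | da_service_accounts

# Any system with admin rights over a Tier 0 machine is itself Tier 0.
# (A full analysis would iterate this to a fixed point, since the newly
# added hosts may in turn be administered from somewhere else.)
for host, controls in mgmt_access.items():
    if controls & tier0:
        tier0.add(host)

print(sorted(tier0))
```

Note how quickly the set grows beyond the domain controllers themselves, which is exactly how a client ends up with two thirds of their estate in Tier 0.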

WPA2 breached

May you live in interesting times! Using WPA2 is apparently not a good idea anymore. This caught my interest, as it is a breach at the protocol level rather than of just a function, and there are many companies that have moved to WiFi and rely on WPA2-Enterprise to secure the communication. There are so many WiFi units out there that need to be upgraded to keep companies secure, not to mention all the POS systems that rely on WiFi to manage credit card payments.

We will most probably see a surge of attacks against payment systems to harvest credit card information, and a new tool for cracking corporate networks to look for vulnerabilities to expose.

We should, however, keep in mind that no network is secure, so if you assume breach in the first place and have implemented protection in layers, this will not be one of your bigger problems.

Here in Sweden GDPR is one of the hottest topics within security. There is a lot of confusion regarding what is needed to be done and what different parties need to do.

First of all, GDPR is a law. Any lawyers out there would probably want to correct me, as strictly speaking it is an EU regulation, but in essence it needs to be abided by. Hence, the company lawyer has the last say about what is right or wrong. For me, as an architect, the goal is to deliver possible solutions to the problems that arise when you try to abide by the law.

That said, GDPR is not the only law to follow. On the contrary, for a public entity there are a number of laws that are normally a lot stricter. GDPR, more or less, just enforces those laws further, meaning that security needs to be ramped up quite a bit to ensure that the risks are managed.

GDPR, from a public entity's point of view, applies to all information that is not managed by other laws, and adds security and operational requirements for the more sensitive information residing in processes governed by the stricter laws on secrecy or health data.

As many public entities, especially municipalities, have a rather complex setup of services, including systems partly outsourced to other municipalities, the amount of information that could potentially fall under GDPR is large. From a threat perspective, this means there is a large need for security services.

From my point of view as a security architect, the first action to take is to conduct a risk analysis, either on a larger scale or focused on the technology. In the end GDPR demands that a lot of legal issues are managed as well, but as stated before, I only try to sort out the problems that have a technical or procedural solution.

The second step is to create a governance structure for managing changes, one that makes sure that GDPR in general, and security in particular, is adhered to, and that mapping to the risk analysis is done. GDPR mandates security by design and privacy by design. In operational terms this translates to mapping the risk analysis to the solution, so that you can show that the risks are managed in a correct way.

The third step is to manage the standard risks that exist simply because you have an infrastructure to manage, no matter the type of technology used; those are the risks classified as You Have if you follow my You3 model. One of the most important measures here is to secure your environment against credential theft and lateral movement. You can find an official Microsoft description of Securing Privileged Access here, but I strongly advise you to take external help, as it is rather complex to do it yourself.

The fourth step is to secure your databases with encryption. There are many ways of encrypting your databases, and if you are running Microsoft SQL Server you will find the process rather straightforward, with only a small amount of time needed to implement the technology. The processes for key management are a rather different thing, however.
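For SQL Server, the straightforward route is Transparent Data Encryption. As an illustration, a small helper like this (all names are hypothetical placeholders) could generate the T-SQL steps involved; note that it is precisely the master key password and certificate handling around these statements that demand a real key-management process:

```python
def tde_statements(database: str, cert_name: str = "TDECert") -> list:
    """Generate the T-SQL steps for enabling Transparent Data Encryption.

    The master key password below is a placeholder; in practice it must be
    generated, stored and backed up under a proper key-management process,
    and the certificate must be backed up before use.
    """
    return [
        "USE master;",
        "CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';",
        "CREATE CERTIFICATE %s WITH SUBJECT = 'TDE certificate';" % cert_name,
        "USE [%s];" % database,
        "CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256 "
        "ENCRYPTION BY SERVER CERTIFICATE %s;" % cert_name,
        "ALTER DATABASE [%s] SET ENCRYPTION ON;" % database,
    ]

for stmt in tde_statements("CustomerDB"):
    print(stmt)
```

The point of the sketch: the technical enablement is a handful of statements, while everything that makes it safe (password strength, certificate backup, key rotation) lives outside them.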

The fifth step is to ensure fully working backups of all important data, and that all of those backups are secured as well. Remember to back up your keys too; otherwise you have built do-it-yourself wiperware.

The sixth step is to ensure that you monitor your environment and catch any attackers sooner rather than later. There are many tools on the market, but make sure you go for those with built-in machine learning and those that utilise global attack intelligence to protect your organisation. Add a good antimalware product with ransomware-blocking capabilities as well.

There are many more steps to take into account when building a secure environment, but the steps above will be more than enough to keep you busy for months to come.

Yet another former client of mine has been hit by ransomware. They used an online backup system with a mapped drive, so the backups were partly encrypted as well. Still, they were immensely lucky to have tried Azure Recovery Vault. Before joining Microsoft I had very little knowledge of the inner workings of Azure. Currently I'm supposed to attend mandatory Azure training and have to sit through 60 hours of it at http://openedx.microsoft.com (yes, free for all). That is how I came in contact with Azure Recovery Vault and tried it out with a free account. I called a friend of mine and asked him about it and whether they had tried it. He started laughing and told me that he had tried it for backup purposes about three weeks earlier, and that they were hit by ransomware the week after. But thanks to this they could recreate their data. I think I'll start looking at using Azure as a backup solution to protect against ransomware.
