Windows infrastructure suffers due to weak data protection

Keeping your data safe is a must when it comes to your organization’s data and IT infrastructure. The most effective way to meet that goal is to implement multiple levels of data protection.

This post will explore what happens when a Windows infrastructure is not adequately protected from data loss. If you are running a production environment on Windows, you must implement multiple levels of data protection between your source and destination computers.

Different levels of data protection

There are three levels of data protection that you need to implement to protect your sensitive data:

  1. Destination-to-source protection – If you are using a *nix server or workstation (e.g., Linux or Mac) as your source computer, this level of security is required to prevent data loss when a machine is turned off or goes offline unexpectedly. It is especially important if you back up your source computer with an automated tool like Rsync, because the source files are at risk when the network card on the computer goes offline.
  2. Source-to-destination protection – This is required only when the source computer goes offline. If the machine is turned off or cannot be accessed, no changes should be made to any files on that server until it has been back online for more than 30 days and has received both a successful network reboot and an NTLM restart. Minor script changes can be made, but significant user configuration changes (e.g., passwords, file ownership, etc.) should only be made after an NTLM restart. This level of protection prevents data loss on the source computer when it is turned off or unreachable, and it is required if your source computer is a desktop or laptop that goes offline unpredictably (e.g., after a power outage or theft).
  3. Destination-to-destination protection – This form of data protection defers any changes to files until the destination computer has been online for at least 60 days and has received an NTLM restart with a successful reboot. It is required on any server accessible to multiple users, including Windows file servers, print servers, Exchange servers, SQL servers, and any other service that accepts remote connections. It prevents changes to your files if ownership or permissions change on the destination computer before the network restart. (The 30- and 60-day gating logic is sketched just after this list.)
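
As an illustration of the 30- and 60-day rules above, here is a minimal Python sketch of the gating logic. The thresholds, role names, and the is_change_allowed helper are assumptions made for illustration only; they are not part of any Windows API.

    from datetime import datetime, timedelta

    # Minimum continuous uptime (per the rules above) before changes may be applied.
    SOURCE_THRESHOLD = timedelta(days=30)       # source-to-destination rule
    DESTINATION_THRESHOLD = timedelta(days=60)  # destination-to-destination rule

    def is_change_allowed(online_since: datetime, role: str, restarted: bool) -> bool:
        """Return True if a significant change may be applied to this machine.

        online_since -- when the machine last came back online
        role         -- "source" or "destination"
        restarted    -- whether the required restart completed successfully
        """
        threshold = SOURCE_THRESHOLD if role == "source" else DESTINATION_THRESHOLD
        return datetime.now() - online_since >= threshold and restarted

    # Example: a destination server online for 75 days that has been restarted.
    print(is_change_allowed(datetime.now() - timedelta(days=75), "destination", True))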

How does this affect your Windows environment?

If you’re running a Windows environment, you need to ensure that all of your servers use a multi-level data protection strategy. Still, different levels of protection are required depending on how critical each server is to your organization. Given that most servers in your environment run multiple services (e.g., DHCP, DNS, Exchange, and SQL Server), you need to ensure that each of these services has its own level of protection.

The value of having different levels of protection becomes apparent when you realize that it’s not necessary to reboot the entire server after an update. The reboot requirement exists because a change to one service can affect another service on the same computer. So, if you have an Exchange server, the first level of protection may be required for DHCP but not for DNS.
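
If you only need to cycle a single service rather than reboot the whole machine, a Python sketch like the one below can do it through the standard sc.exe tool that ships with Windows. The service name "DHCPServer" is an assumption; verify the names in your environment with sc query.

    import subprocess
    import time

    def restart_windows_service(name: str, timeout: int = 60) -> None:
        """Stop and start one Windows service via sc.exe instead of rebooting."""
        subprocess.run(["sc", "stop", name], check=True)
        # sc stop only requests the stop; poll until the service reports STOPPED.
        for _ in range(timeout):
            query = subprocess.run(["sc", "query", name],
                                   capture_output=True, text=True)
            if "STOPPED" in query.stdout:
                break
            time.sleep(1)
        subprocess.run(["sc", "start", name], check=True)

    # Example: restart only the DHCP Server service, leaving DNS untouched.
    restart_windows_service("DHCPServer")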

However, if you’re using a client-side backup tool like Rsync to sync your source computer with a destination computer (like a remote file server), there is no need to use the same level of protection on both computers, because any change on the source computer affects both the source and destination when you sync your local files to the remote server or vice versa.
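
For example, a one-way Rsync push from a source machine to a remote file server might look like the sketch below. The paths and host are placeholders, and this assumes rsync is available on the PATH (on Windows, typically via WSL or Cygwin).

    import subprocess

    def sync_to_server(source_dir: str, dest: str) -> None:
        """Push local changes to the destination; -a preserves metadata, -z compresses."""
        # --delete mirrors removals too, so run this only in the
        # source-to-destination direction; mirroring both ways with --delete
        # risks exactly the data loss described above.
        subprocess.run(["rsync", "-az", "--delete", source_dir, dest], check=True)

    # Example with placeholder paths and host.
    sync_to_server("/home/user/projects/", "backup@fileserver:/srv/backups/projects/")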

How to implement this in your environment?

The easiest way is to have everyone use the same level of protection, because they all share the same NTLM pool. However, if you have an environment where each server may be running a different operating system or a different level of security, separate source-to-destination NTLM pools are required for each machine. This means two or more NTLM pools for each service running on your servers, but it ensures no cross-over between devices in the event of an update or hardware failure.
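
There is no standard tooling for the per-machine pools described here, so any implementation will be site-specific; a simple configuration structure like the following Python sketch, with purely illustrative server, service, and pool names, is one way to track the assignments.

    # Illustrative only: map each service on each server to its own pool,
    # so an update or hardware failure on one machine cannot cross over to another.
    ntlm_pools: dict[str, dict[str, str]] = {
        "fileserver01": {"SMB": "pool-file-a", "DNS": "pool-file-b"},
        "mailserver01": {"Exchange": "pool-mail-a", "SQL": "pool-mail-b"},
    }

    def pool_for(server: str, service: str) -> str:
        """Look up the pool assigned to a service on a given server."""
        return ntlm_pools[server][service]

    print(pool_for("mailserver01", "Exchange"))  # -> pool-mail-a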

What other types of protection do you need?

This is just the beginning. Beyond the three levels of data protection above, there are many other things you should do to keep your data safe:

  • Use a different password for each user and service, so that a password compromised on one computer can’t be used to access another computer or service. It’s also essential to use strong passwords, because weak passwords can be cracked by brute-force attacks in seconds or minutes rather than days or weeks (a quick generator is sketched after this list).
  • If you have sensitive information on your servers, make sure those directories are encrypted so that only authorized people can access them.
  • Make sure your phone apps do not handle any sensitive data; this adds another layer of security.
  • Back up your data regularly, because if you lose a server, there is a good chance your data will be affected.
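
As a small aid for the first bullet, here is a minimal sketch using Python’s standard secrets module to generate a distinct strong password for each account; the account names are placeholders.

    import secrets
    import string

    def generate_password(length: int = 20) -> str:
        """Generate a cryptographically strong random password."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    # One distinct password per user and service, per the first bullet above.
    for account in ("alice", "backup-service", "sql-admin"):
        print(account, generate_password())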

Keep these simple data protection strategies in mind the next time you’re faced with a new system, HDD, or server upgrade. It’s better to be safe than sorry, and you’ll be happier for it.

Conclusion

Windows administrators are often tasked with managing multiple servers, storage devices, and other computing resources, and it’s easy to let parts of your environment slip out of mind. The consequences of not maintaining proper data protection can be significant. We hope this post has given you new ways to approach data protection in your environment.