Security


Script to clear browsing history and cache for IE, Firefox and Chrome

Recently I wrote a couple of scripts to clear the browsing history and cache for IE, Firefox, and Chrome.

Internet Explorer (PowerShell)

# Paths for temporary internet files, cache, and downloads
$t_path_7 = "C:\Users\$env:username\AppData\Local\Microsoft\Windows\Temporary Internet Files"
$c_path_7 = "C:\Users\$env:username\AppData\Local\Microsoft\Windows\Caches"
$d_path_7 = "C:\Users\$env:username\Downloads"

$check_temporary = Test-Path $t_path_7
$check_cache     = Test-Path $c_path_7
$check_download  = Test-Path $d_path_7

if($check_temporary -eq $True -and $check_cache -eq $True -and $check_download -eq $True)
{
    echo "Clean history"
    RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 1

    echo "Clean Temporary internet files"
    RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 8
    (Remove-Item $t_path_7\* -Force -Recurse) 2> $null
    RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 2

    echo "Clean Cache"
    (Remove-Item $c_path_7\* -Force -Recurse) 2> $null

    echo "Clean Downloads"
    (Remove-Item $d_path_7\* -Force -Recurse) 2> $null

    echo "Done"
}

Firefox (batchfile)

@echo off
rem Remove the local Firefox profile data (cache)
set DataDir=C:\Users\%USERNAME%\AppData\Local\Mozilla\Firefox\Profiles
del /q /s /f "%DataDir%"
rd /s /q "%DataDir%"
rem Remove the history/cookie databases (*.sqlite) from the roaming profiles
for /d %%x in ("C:\Users\%USERNAME%\AppData\Roaming\Mozilla\Firefox\Profiles\*") do del /q /s /f "%%x\*sqlite"

and Chrome (batchfile)

@echo off
rem Remove the entire Chrome user data folder (history, cache, etc.)
set ChromeDir=C:\Users\%USERNAME%\AppData\Local\Google\Chrome\User Data
del /q /s /f "%ChromeDir%"
rd /s /q "%ChromeDir%"
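Note: run these scripts while the browsers are closed. An open browser keeps its history and cache files locked, so the deletions will fail for those files.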

The (dis)advantages of Cloud backup

Cloud backup (online data backup) is a service where data is remotely maintained, managed, and backed up. It lets you store backup files online, so that they can be accessed from any location over the internet.

More and more people are backing up their data to the Cloud, and many more are thinking about it. What are the advantages and disadvantages?


Difference between an offline/offsite back-up and a Cloud backup

A cloud backup uses storage in the cloud (provided by a cloud storage provider) to store the data and is remotely accessible. An offsite backup is a copy of your local backup that is kept offline and offsite. When you need the offsite backup you have to pick up that storage device, mount it and/or do some configuration, and only then are you able to restore the data on it.

Disadvantages: the device (and its data) can be stolen or corrupted (no RAID, for example), and it takes time to use the storage because it's offsite (you first have to collect the storage device).

Advantages: the data is offsite, so anything that happens at your location (storm, theft, fire) does not affect it. The data can't be altered and is safe from malware and corruption.

Advantages of Cloud backups

  • Efficiency, usability and reliability
    Cloud backups are extremely reliable. Because the data is stored in a cloud environment, redundant configurations compensate for possible hardware failures and improve data integrity.
  • 24/7 monitoring
    Any decent cloud storage provider has a 24×7 monitoring solution, so any flaws concerning their (Cloud) service are noticed immediately.
  • Scalability & Accessibility
    You can easily scale up and down (pay for usage), and the backup service can be accessed from anywhere, as long as there is access to the internet.
  • Recovery time improvement
    You can reduce your recovery time, because everything is in place all the time (you don't have to load tapes, etc.); it's as simple as the push of a button.
  • Disaster Recovery out of the box & accessibility
    You don't have to build a disaster recovery infrastructure; it's right there for you to use. It's important to keep a copy of your backup offsite: even when all your local backups are in order, a hurricane or flood could prevent access to your servers. (Cloud backups are offsite too!)
  • Cost savings
    Businesses and organizations can often reduce operating costs by using cloud storage. But be aware: cloud backup can also be more expensive. It's important to use a solution that makes sense and won't require you to incur a capital expenditure.
  • Your Cloud partner or backup vendor takes care of things for you
    Most of the time it's a simple one-time configuration, and afterwards you don't have to pay any further attention to it.
  • Protection against physical theft and natural disasters
    A tornado could hit your office, or you could be the victim of burglary or theft. You can still rest easy knowing that your personal data is safe offsite in the Cloud.

Disadvantages of Cloud backup

  • A full backup or recovery job can be too time-consuming
    Even if you have a large-bandwidth internet connection, it will almost certainly take some time to initially upload and back up your data (see the example after this list).
    The same goes for restoring. Slow speeds are, without a doubt, most people's biggest gripe with online data backup.
  • Limits on the amount of data that can be uploaded to the cloud, depending on bandwidth availability
    You are completely reliant on your internet connection and on the connection of your cloud provider. If your internet connection goes down, so does your ability to back up or restore data from the Cloud.
  • No direct control
    When you send your data up into the Cloud you have less control over it than over the storage you have onsite.
  • Discontinuation of the service
    Providers can stop their service. This is one reason you can't rely on cloud storage alone; you always need a local backup solution. For legal reasons you may have to keep your backups for several years. What if your cloud storage provider cancels their service? Then you have to rely on your local backups!
  • Bad or nonexistent service-level agreements
    If a solid agreement with the cloud storage provider is not in place, it could result in disappointment. Make sure your expectations and the provider's capabilities are clearly spelled out in the contract (the SLA!).
  • Data security & encryption
    There are concerns about the safety and privacy of important data stored remotely.
    If a cloud storage provider doesn't follow adequate data security practices, your data will be exposed to greater risk than with offsite backups. Any online backup provider worth mentioning will encrypt its customers' data during both transmission and storage, using high-grade encryption algorithms such as AES or Blowfish, the same ones used by banks and government agencies. However, some providers don't use those techniques and may pose a security risk.
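To get a feel for the bandwidth numbers, an illustrative calculation: an initial backup of 500 GB over a 50 Mbit/s uplink is about 4,000,000 megabits (500 × 8,000), which at 50 Mbit/s takes 80,000 seconds, roughly 22 hours. And that assumes the line runs at full speed the whole time.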

Conclusion

Backup to the Cloud is easily configured and often works like a charm. But don't forget to investigate the different vendors and platforms (Amazon, Azure, Google, etc.) and to check the support from your backup application/solution; there are multiple vendors, for example Nakivo Backup & Replication. There are gigantic differences in costs, possibilities, liabilities, and support.


How to fight Ransomware using Backup Technology

With the number of ransomware cases seeming to increase every day, this is becoming more and more of a problem. Ransomware causes hundreds of millions in damages worldwide and is increasing rapidly.

Modern total data protection solutions take snapshot-based incremental backups on a frequent schedule.

If your business suffers a ransomware attack, this technology allows you to roll back your data to a point in time before the corruption occurred. When it comes to ransomware, the benefit of this is twofold. First, you don't need to pay the ransom to get your data back.

Second, since you are restoring to a point in time before the ransomware infected your systems, you can be certain everything is clean and the malware cannot be triggered again.

Recent surveys illustrate how extensive the ransomware threat has become, and recent studies show that an adequate backup solution is the best remedy. Therefore you need an adequate disaster recovery plan. This blog post is about recovery, not about preventing ransomware; I will blog about that later.

A great backup solution is not the answer for preventing ransomware, but it is the best way to provide a fast recovery. This way downtime and data loss are minimized.

While it may seem basic, experts agree that a solid backup plan is still the best prescription for addressing the threat of ransomware.

But what exactly does implementing a backup plan really mean, and what does a well-executed plan look like?

Working backup

Make sure your backups are working. Test them! A green check mark isn't enough!

According to a recent study by Symantec, most large companies test their backup plans on average once a year.

Simple backups should be tested much more frequently — at least once a quarter and whenever there is a major hardware or software change to your backup system. It’s particularly important to run a test after upgrading or changing major components in your backup system (for example the firmware version) to make sure everything works properly with the rest of your system.

Testing should consist of more than just a few simple file restores. For example, if you just restore a couple of files you can't be sure that your directory trees and other features are working as they are supposed to.

When you test a restore, take a minute to study the directories to make sure everything that should be backed up is actually backed up. The test should include restoring entire folders, complete with subfolders, as well as one or more critical applications.
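On a Linux host, one quick, scriptable way to do such a check is to compare the restored tree against the source. A minimal sketch (the paths /data and /restore-test are hypothetical; adjust them to your environment):

# Compare directory trees: file names and contents
diff -r /data /restore-test && echo "Restore matches source"

# Or verify with per-file checksums
( cd /data && find . -type f -exec md5sum {} + | sort > /tmp/source.md5 )
( cd /restore-test && md5sum --quiet -c /tmp/source.md5 ) || echo "Checksum mismatch!"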

Don't forget your application-aware backups (SQL, Exchange, etc.). Some things are very difficult to test in their natural environment; for those you can use an OTAP (development, test, acceptance, production) environment. You can take advantage of your hypervisor and your backup solution for that purpose.

Retention

Good data retention policies are necessary: you need to be able to restore data that is at least two weeks old, preferably a month. Recent studies discovered that large companies are often infected months before they notice it! How long are you keeping your backups? 14 days? 7 weeks? 6 months? Review, validate and, if needed, modify the retention policy (as defined in your backup policy) to ensure a sufficient Recovery Point Objective (RPO).

This may vary depending on your particular industry, regulations, and internal IT policies; IT, legal, and compliance teams will make the call on data retention needs. Rest assured that no matter what length you choose, the longer the better. Using cloud storage like Azure or Amazon can help you keep the costs acceptable.

Offsite backup

A necessary part of the DR plan is to create an offsite backup as part of your backup strategy. Backups are critical, but if you're just performing regular backups to a single location, you're missing an important part of your backup strategy: you need your files stored in separate physical locations.

Copy Backups Offsite

By using Nakivo Backup & Replication you can store copies of your backups locally while keeping at least one copy of your most critical backups offsite. This can save you a lot of trouble in case a local disaster wipes out your primary backups. The secondary backup repository can be placed in any location that has a connection to the internet, because backup data can be transferred over an AES 256 encrypted link, and your secondary backup repository can be encrypted as well:

[Screenshot: nakivo_offsite]

Copy Backups to the Cloud (for example Amazon or Azure)

By using Nakivo Backup & Replication you can create fast, reliable, and affordable copies of your backups in the Cloud. This way your backup files are stored safely.

[Screenshot: nakivo_offsite_amazon]

More information about Nakivo Integration with Azure Cloud here.

Conclusion

Testing your backup strategy on a regular basis is essential to make sure your backup solution does what it's supposed to do! Offsite backups are a necessary failsafe to make sure backups are safe and can be relied on.

If an organization has no offsite disaster recovery facility, then backups to the Cloud should be considered as a means to safely store data outside the scope of a potential malware infection. Retention policies can also be leveraged to make sure data is kept for the period that makes sense to the business and that allows recovery point objectives to be met.


Top tip: Linux security & auditing tool Lynis

For my work I often deploy Linux VMs. I use Lynis to check my systems for security issues and against my baseline(s). Lynis is a security auditing tool for UNIX derivatives like Linux, macOS, BSD, Solaris, AIX, and others. It performs an in-depth security scan and delivers extensive reports in HTML and TXT. The company behind Lynis (CISOfy) delivers great support and has a community of people working together.

Screenshot of Lynis:

[Screenshot: Lynis]

Installation is very simple (if you know your way around Linux).

Ensure that cURL, NSS, OpenSSL, and the CA certificates are up to date:

yum update ca-certificates curl nss openssl

Create /etc/yum.repos.d/cisofy-lynis.repo with the following content:

[lynis]
name=CISOfy Software - Lynis package
baseurl=https://packages.cisofy.com/community/lynis/rpm/
enabled=1
gpgkey=https://packages.cisofy.com/keys/cisofy-software-rpms-public.key
gpgcheck=1

The next step is installing Lynis with yum:

yum makecache fast
yum install lynis

The first time, it might ask you to import the GPG key; this ensures that updates are only received from CISOfy.

Now you can start using Lynis. First-time users are advised to use the Get Started guide.

lynis audit system

You will see something like this (DONE/FOUND/YES/NO, etc.). Afterwards you can open the log files in /var/log. Personally, I prefer to also pipe the output to a file (lynis audit system >> output_file).

[Screenshot: lynis-check]

Download Lynis here.

It is also possible to add extra checks (plugins) and/or change the default ones. I created my own baseline which I can use every time.
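To run Lynis on a schedule instead of by hand, a small cron script works well. A minimal sketch, assuming the --cronjob flag of recent Lynis versions (it disables colors and prompts for unattended runs); the file location and log name are my own choice:

#!/bin/sh
# /etc/cron.weekly/lynis-audit (hypothetical location)
lynis audit system --cronjob > /var/log/lynis-weekly-$(date +%F).log 2>&1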

Good luck with scanning your system! (And securing it afterwards :-))


NITCtxPatcher, a patch manager for Citrix XenApp and XenDesktop 7.x

NITCtxPatcher is a tool for downloading (and installing) updates for Citrix products.

Just install the tool using next-next-next and enter your MyCitrix account details:


Select the products, hit search, and all available patches are ready to download. When you download the patches, an install script is also created for easy installation:


Requirements
Features
  • Quickly select and download patches for XenApp and XenDesktop 7.1 and higher
  • Quickly select and download LIMITED patches for XenApp/XenDesktop 7.1 and higher with one click
  • Automatic detection of superseded patches
  • Automatic download of LIMITED patches
  • Generate silent installation scripts for your patches
  • Proxy support
  • Support for automatic MyCitrix login
  • HTML reports for your downloaded patches
  • Unzip hotfix archives (for example for the DDC and Citrix Studio)
  • Run custom scripts after the download
  • Command-line support
  • Fully silent run of the tool

Download the tool here.


How to protect your Linux server using iptables

In this post I will describe how to configure the basic Linux firewall, iptables. Using iptables you can easily protect your server from intruders. Nowadays many people host virtual machines on Linux (for hosting purposes, for example). Of course you can configure an external firewall, but you can also use the internal firewall (the Linux equivalent of the Windows firewall).

Step 1: Check if iptables is installed using the following command:

# yum info iptables

You should see all kinds of information; make sure the "Repo" field shows "installed".

Not installed? Use the following command to install iptables:

# yum install iptables

(We are not using IPv6.)

Step 2: Flush all existing rules using the following command:

# iptables -F

Now all the existing firewall rules are cleared and everything is clean.

Step 3: Block null packets

We can then add a few simple firewall rules to block the most common attacks, to protect our VPS from script kiddies. We can't really count on iptables alone to protect us from a full-scale DDoS or similar, but we can at least put off the usual network-scanning bots that will eventually find our VPS and start looking for security holes to exploit. First, we start with blocking null packets. Null packets are, simply said, recon packets; attack patterns use them to probe how the VPS is configured and to find weaknesses.

# iptables -A INPUT -p tcp --tcp-flags ALL NONE -j DROP

Step 4: Block syn-flood packets

The rule above tells the firewall to take all incoming packets with TCP flags NONE and just DROP them. The next pattern to reject is the syn-flood attack, in which attackers open a new connection but never state what they want (i.e. no SYN, ACK follow-up); they just want to take up our server's resources. We won't accept such packets:

# iptables -A INPUT -p tcp ! --syn -m state --state NEW -j DROP

Step 5: Block XMAS packets

Now we move on to one more common pattern: XMAS packets, which are also recon packets, this time with all TCP flags set. We won't accept those either:

# iptables -A INPUT -p tcp --tcp-flags ALL ALL -j DROP

We have ruled out at least some of the usual patterns that find vulnerabilities in our Linux environment.

Step 6: Add the localhost interface to the firewall filter

Now we can start adding selected services to our firewall filter. The first such thing is a localhost interface:

# iptables -A INPUT -i lo -j ACCEPT

We tell iptables to add (-A) a rule to the incoming (INPUT) filter table that accepts (-j ACCEPT) any traffic that comes in on the localhost interface (-i lo). Localhost is often used by, for example, your website or email server to communicate with a locally installed database. That way our VPS can use the database, but the database is closed to exploits from the internet.

Step 7: Now we can allow web server traffic:

# iptables -A INPUT -p tcp -m tcp --dport 80 -j ACCEPT

# iptables -A INPUT -p tcp -m tcp --dport 443 -j ACCEPT

In this example we only allow ports 80 (HTTP) and 443 (HTTPS) through the firewall. Add your own (custom) ports as needed.

Step 8: Saving the configuration

Now that we have all the configuration in place, we can list the rules to see if anything is missing:

# iptables -L -n

Now we can finally save our firewall configuration:

# iptables-save | sudo tee /etc/sysconfig/iptables

The iptables configuration file on CentOS is located at /etc/sysconfig/iptables.
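Note that rules added with iptables live in memory only. To have them loaded again at boot, you can use the iptables service scripts (a sketch, assuming the iptables-services tooling of CentOS; other distributions handle this differently):

# service iptables save
# chkconfig iptables on

The first command writes the running rules to /etc/sysconfig/iptables; the second makes the iptables service start at boot and restore them.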

You can now use a portscanner to test your firewall.
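For example with nmap from another machine (203.0.113.10 is a placeholder for your server's address; a SYN scan usually requires root):

# nmap -sS -p 1-1024 203.0.113.10

If your web server is running, ports 80 and 443 should be reported as open.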


Updating mailcleaner.org (updated post)

I use the Mailcleaner community version for anti-spam/antivirus. Mailcleaner is a Linux appliance that's easy to set up and configure. The community version is free of charge and can be used to block spam and viruses before they enter your mail server (in my case, Exchange servers). If you have any questions or want to hear about my experiences with the community version, please leave a comment. Here is how you can update to the latest version:

cd /usr/mailcleaner
cvs update -dP updates

For a full update it is better to use the following steps:

  • aptitude update
  • aptitude safe-upgrade
  • cd /usr/mailcleaner
  • cvs -q update -dP
  • source lib/updates/update_binaries.sh
  • stabilizeBinaries
  • install/install_perl_libs.sh
  • install/install_sa.sh
  • /usr/mailcleaner/bin/check_db.pl --update
  • /etc/init.d/mailcleaner restart

Warning: before and after the above steps it is recommended to restart your server.


Unable to manually install Windows Updates (WSUS)

Sometimes you don't want to wait for or rely on your WSUS environment for specific updates.

But when you try to install updates manually, you might see the following message:

[Screenshot: windows updates before]

It's easy to work around this: copy and paste the following code into a .reg file:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:00000095
"NoWindowsUpdate"=dword:00000000
"NoAutoUpdate"=dword:00000001

When you apply the above reg file and go to Windows Update, you'll notice that updates are possible again:

[Screenshot: windows updates after]


Review: Mailcleaner(.org)

We all know there are lots of anti-spam/antivirus appliances, software solutions, etc. out there. But somehow for my production & test environment it was all too expensive or too much hassle. All the products I found interesting were too expensive or didn't work like they're supposed to. Because I didn't want to spend thousands of dollars on hardware, software, and licenses, I started looking for another solution.

I ran into mailcleaner.org. They have a community-driven anti-spam & antivirus solution with a nice feature pack:


SMTP gateway

  • fully compatible with any SMTP server
  • routes mail on a per-domain basis
  • per recipient/host whitelist and blacklist
  • SMTP and LDAP/Active Directory callout for e-mail address validation
  • temporary storage with retries in case of final server failure
  • outgoing load balancing and/or failover

Anti SPAM

  • SpamAssassin base ruleset and additional rules
  • Bayesian controls, with auto-learn
  • RBL checks
  • Checksum protocols such as Razor/DCC/Pyzor
  • URL RBLs
  • SPF checks

Administration Web GUI

  • Complete configuration of filtered domains, with forwarding and preferences
  • configuration of the base system (network, proxy,…)
  • access to all users’ configurations and quarantine
  • full real-time spam quarantine access
  • full real-time blocked content protection policy configuration
  • advanced administration access list, so you can delegate light administrative access to other people, such as a hotline for example
  • monitoring of the whole mailcleaner filter farm
  • access to all logs
  • mail queue access and control

User Web GUI

  • Authentication connectors so your users won't have to remember another credential
  • Full access to their quarantine, with message release and analysis options
  • per e-mail address configuration options, such as delivery mode (tag, quarantine, drop) and periodic summaries

Installer

  • support for almost any x86-compatible hardware
  • automatic software RAID generation

and much more…

For your impression, some screenshots:

[Screenshots: MailCleaner interface]

Some details I left out for security reasons. But it took me about 10 minutes to install, another 10 to configure, and I totally forgot I was running it for some time until now. :-) Just check it out at http://www.mailcleaner.org