The most important piece to getting hired isn’t the certs or the degree. It’s the interview.

It’s an unfortunate reality that you can be, and maybe already have been, rejected for a position you applied for. Certifications in the relevant fields in hand, a degree to your name…would you be gutted to know that “warm and fuzzies” carries equal weight to the other two?

Improving your warmth and fuzziness is key not only to getting a foot in the door, but to advancement in your career. This is one of a few things that go into the nebulous area of “soft skills,” and I have passed on a lot of candidates because the soft skills aren’t there; they take a long, long time to train. Lacking IT skills, in comparison, isn’t a big deal: I can tell an admin that if they aren’t comfortable with the next step on a project, stop and come find me. When you lack soft skills, I can’t tell you to change who you are.

What goes into “warm and fuzzies” from the perspective of a technical manager? It’s a few things for me, and I’ll lay them out to the best of my ability:

Confidence is one. It manifests as high self-esteem, eye contact, and being relaxed enough to express a range of emotions in the interview. If you’re shrinking away the whole time, looking down, giving fearful responses that you’re very obviously hoping are the ones I want to hear, it gives me concern that you’re not going to be composed in an actual emergency, or that you’re going to embarrass the both of us if some higher-up puts you on the spot.

Another is the ability to speak to topics in your resume, on the spot. I am not a quizmaster-style interviewer. However, if your resume mentions “Configured and maintained Active Directory and Group Policy,” that is highly relevant to the position here, so I’m going to say “Tell me about your work with AD and Group Policy.” A bad answer would be “Uh, you know, all sorts of stuff.” It’s a bad answer for both of us, because now I’m going to have to ask some more direct, quizmaster-style questions to understand where you’re at with understanding Windows infrastructure. A good answer would be “Sure, before I arrived everyone was having to add network printers by hand, so I set up security groups for different areas of the building, put the computers in the group that made sense, and set up GPOs to do the printer installation depending on the security group.” That answers a lot of questions at once and shows knowledge of multiple areas at the same time. It probably knocks five minutes off the interview that would’ve only been an opportunity for you to dig yourself into a hole.

It’s really important to me that if you are going to put a technology on your resume, you’d better be prepared to explain exactly what you know about it. I have lost count of the times I’ve had someone say on their resume that they know “networking”, but the reality of the situation is that they know what an IP address is and what a MAC address is. They don’t know why they have to enter a subnet mask, they have no concept of basic switching and routing, they don’t know TCP vs. UDP. They would’ve been better off leaving it off entirely, because it feels very dishonest to me, and it puts you in a huge hole for the rest of the interview.

But that’s me, I’m a technical manager. When I had my interview here with 3 other C-levels, I could give a nonsensical answer like “An IP address is what identifies you to the world,” but as soon as the words “IP address” hit their ears, I may as well be talking about tachyon emitters and flux capacitors. My predecessor told me, “The best thing about this position is nobody here knows what you do. The worst thing about this position is nobody here knows what you do.”

So think about this. You’ve got non-technical management evaluating you. Your certifications may as well be from Starfleet Academy. They don’t know, nor do they particularly care, what a hypervisor is. Their needs are simpler:

  • The computers and internet must not break.
  • If the computers or internet break, I must feel comfortable with you being the one handling it.
  • I must be able to have a human interaction with you when things are not broken if I’m ever to let you out of your cage.

And guys and girls, we have a bad reputation when it comes to that third bullet point. Some of it is on our predecessors, some of it is on us, some of it is on the media’s portrayal of us. TV sitcoms involving “nerdy guys” have not done us any favors. But you don’t need to fix the whole world’s vision of IT professionals. You need to fix the hiring manager’s vision of you. This is more for advancement potential than getting in, because many organizations aren’t expecting that human interaction from an IT person. But they must feel comfortable with you if you’re going to get in.

In the long run, spending $300 on a public speaking course at your local junior college will do more for your career than $300 on another certification.

You may be in the majority of people that say “Ugh, I hate giving speeches.” And that’s totally understandable. Honestly though, the real world doesn’t have a lot of people giving speeches like you had to give in 8th grade History. Sales professionals and executives need that skillset, but that’s not the extent of what you get out of public speaking classes. What you just might get is a bit of confidence as you speak around a table, going over a plan of action, or answering questions in an interview.

What else goes into comfort for a non-technical manager? I think comfort and trust are very tightly linked. You can never be entirely comfortable around someone you deem untrustworthy. Trust, in turn, goes to integrity, it goes to reliability, and it goes to rationality. You won’t be fully comfortable around someone, and you won’t fully trust them, if they aren’t fully honest, reliable, or rational. Consider how each of those can manifest in an interview.

If a little probing reveals that you really oversold your abilities, I must conclude that you are not fully honest. If your previous employers indicate that you had issues with absences or tardiness, or that you take far longer than you should to complete tasks, I have to question your reliability. Your overall behavior in the interview is being judged for rationality, but it is gauged mainly by questions about actions you’ve taken in the past, and whether they match up with what I would consider a rational action. I tend to ask candidates about the biggest mistake they’ve made in IT, and the actions they took after they made it. There are two main points to this question. One, if the candidate says “I’ve never made any big mistakes,” they’re either very green or being dishonest. We all make mistakes, and everyone has their own context for what a big mistake is. Two, I’m not particularly interested in what the mistake was. What I’m after is the response. Letting your manager know immediately after the mistake has been made is rational and honest. Fixing a self-inflicted outage and walking away whistling is rational, but not honest. Quitting your job because you didn’t want to be seen after making the mistake is honest, but not rational, and it raises huge issues about reliability. (This did happen with a candidate, and they were not remotely equipped to be unemployed.)

Oh, and saying “I don’t make mistakes” is neither honest nor rational, and I’ve heard it more than once. Really need to knock that off.

I’m hoping this is helpful for a few of you. I know it can be disheartening to spend so much time and money (and money and money and) on the papers that show you know your stuff, but still not be given a chance. Spend a little time thinking about how you can up your warmth and fuzziness, and I’m confident it’ll help you in the long run.

Use nested traversal groups to allow access to ABE-enabled grandchild folders.

Say you have the following structure:

  • \\DOMAIN\DFS\Folder1\ <– UserA can access this folder.
  • \\DOMAIN\DFS\Folder1\Folder2 <– UserA has no privileges on this folder.
  • \\DOMAIN\DFS\Folder1\Folder2\Folder3 <– UserA has modify access to this folder, subfolders and files.
  • \\DOMAIN\DFS\Folder1\Folder2\Folder4 <– UserA has no privileges on this folder.

With ABE, UserA will not see Folder2 even though Folder3 is something they do have access to. So permissions are added for UserA on Folder2, but because they were done incorrectly, UserA can now also enumerate Folder4, which they should not even have been aware of.

This comes up a lot, and there is a good way to handle it:

  1. Create the ACL Group: ACL_Folder1-Folder2_TRAVERSE and make UserA a member.
  2. Add the (Advanced) ACL on Folder2: ACL_Folder1-Folder2_TRAVERSE -> Read -> This Folder Only
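If you prefer to script these two steps, a minimal sketch with the ActiveDirectory module and icacls follows. The folder path is a hypothetical local path for the DFS target; group and user names come from the example above.

```powershell
# Sketch: create the traverse group, add the user, and grant Read on "This folder only".
# D:\DFSRoots\Folder1\Folder2 is a hypothetical path; adjust to your DFS target.
Import-Module ActiveDirectory

New-ADGroup -Name "ACL_Folder1-Folder2_TRAVERSE" -GroupScope DomainLocal -GroupCategory Security
Add-ADGroupMember -Identity "ACL_Folder1-Folder2_TRAVERSE" -Members "UserA"

# No (OI)(CI) flags means the ACE applies to this folder only and does not inherit downward.
icacls "D:\DFSRoots\Folder1\Folder2" /grant "ACL_Folder1-Folder2_TRAVERSE:R"
```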

Now, let’s extend the scenario a level deeper:

  • \\DOMAIN\DFS\Folder1\ <– UserA can access this folder.
  • \\DOMAIN\DFS\Folder1\Folder2 <– UserA has no privileges on this folder.
  • \\DOMAIN\DFS\Folder1\Folder2\Folder3 <– UserA has no privileges on this folder.
  • \\DOMAIN\DFS\Folder1\Folder2\Folder4 <– UserA has no privileges on this folder.
  • \\DOMAIN\DFS\Folder1\Folder2\Folder3\Folder5 <– UserA has modify rights on this folder, subfolders and files.

So, this is handled in much the same way, but it would be smart in this case to nest the security groups.

  1. Create the ACL Group: ACL_Folder1-Folder2_TRAVERSE.
  2. Create the ACL Group: ACL_Folder1-Folder2-Folder3_TRAVERSE.
  3. Make the Folder3 traverse group a member of the Folder2 group.
  4. Make UserA a member of the Folder3 traverse group.
  5. Add the (Advanced) ACL on Folder2: ACL_Folder1-Folder2_TRAVERSE -> Read -> This Folder Only
  6. Add the (Advanced) ACL on Folder3: ACL_Folder1-Folder2-Folder3_TRAVERSE -> Read -> This Folder Only
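The group nesting from steps 3 and 4 can be sketched in PowerShell like so; membership in the deeper group then implies traverse rights on Folder2 as well.

```powershell
# Sketch: nest the Folder3 traverse group inside the Folder2 traverse group,
# then put the user only in the deepest group they need.
Import-Module ActiveDirectory

New-ADGroup -Name "ACL_Folder1-Folder2_TRAVERSE" -GroupScope DomainLocal -GroupCategory Security
New-ADGroup -Name "ACL_Folder1-Folder2-Folder3_TRAVERSE" -GroupScope DomainLocal -GroupCategory Security

Add-ADGroupMember -Identity "ACL_Folder1-Folder2_TRAVERSE" -Members "ACL_Folder1-Folder2-Folder3_TRAVERSE"
Add-ADGroupMember -Identity "ACL_Folder1-Folder2-Folder3_TRAVERSE" -Members "UserA"
```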

This keeps the ACLs clean and lets you attach the most explicit permission needed to the user.

Use GPO to change the default behavior of potentially malicious file extensions.

If you’re like me and don’t have direct control of your own email filtering, or want to go a step beyond, you’re going to want a way to prevent non-PE viruses from running. Software Restriction Policies are good for this if you’re using them in a whitelist capacity, provided that you’ve also added the extension to the Designated File Types. From a blacklist standpoint it’s tougher.

Another way to approach the problem is to change the default program to open a particular file extension, and if nothing else it’s another layer of security. This has come up in response to .js-powered ransomware variants and, most recently, a .hta variant of Locky.

The policy is as follows:

User Configuration -> Preferences -> Control Panel Settings -> Folder Options -> Open With

Action: Replace
File Extension: hta
Associated Program: %windir%\system32\notepad.exe
Set as default: Enabled.

These policies do require you to either be aware of any applications this will impact, or be willing to find out after it’s kicked in. Most organizations can modify .hta and .wsh with no negative repercussions. Many can modify .vbs, but many cannot, because old logon scripts still in use rely on it. Many can modify .js, as even those in web development are likely using an IDE to work with JavaScript files rather than double-clicking them. Just be aware of your environment’s needs when implementing this.
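Before rolling the policy out, it can help to check what an extension currently maps to. The cmd built-ins assoc and ftype show the registered file type and its handler command:

```powershell
# Check the current registration for .hta before overriding it.
cmd /c "assoc .hta"      # shows the file type, e.g. .hta=htafile
cmd /c "ftype htafile"   # shows the command line that type currently runs
```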

Enabling this policy and setting the extension to open in Notepad also benefits you because the end-user will still have something unusual (but now harmless) pop up: a Notepad window full of dangerous-looking stuff. With any luck, they’ll notify the helpdesk so the system can be cleaned up.

DangItBobby.ps1 – Remotely disable a NIC given only a username.

So I have a code offering today, which I’m calling DangItBobby.ps1. It lets you remotely disable the NIC of a computer given only the username that is logged in. In essence: you’re in the middle of a ransomware infection, and you see that the owner of all the files is changing to Bobby. You run the script and provide credentials for a local admin account, then tell it you’re looking for Bobby. It checks AD to make sure that’s a valid account, then uses WMI to see if there’s an explorer.exe process running under Bobby’s context on each computer (you can narrow the search with the first few characters of the workstation name). If he’s logged into multiple workstations, it lets you choose which one to work with. Then it gives you a list of NICs and a little information about each one, and lets you choose which one to disable.
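The core of that lookup boils down to a couple of WMI calls. This is a simplified sketch, not the full script; the computer name, username, and the "first NIC" choice are placeholders for what the real script prompts you for.

```powershell
# Sketch: find an explorer.exe owned by the target user, then disable a NIC.
# $computer and "Bobby" are placeholders; the full script enumerates machines from AD.
$computer = "WS-042"
$cred     = Get-Credential

# Is there an explorer.exe running in the target user's context on this machine?
$proc = Get-WmiObject Win32_Process -ComputerName $computer -Credential $cred -Filter "Name='explorer.exe'" |
    Where-Object { $_.GetOwner().User -eq "Bobby" }

if ($proc) {
    # List enabled NICs and disable the first; the real script lets you choose.
    $nic = Get-WmiObject Win32_NetworkAdapter -ComputerName $computer -Credential $cred -Filter "NetEnabled=TRUE" |
        Select-Object -First 1
    $nic.Disable()
}
```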

I hope I don’t need to tell you to be careful running this.

Quick Tips: Programmatically emptying the Temp folder for all user profiles in a terminal server.

I ended up needing to do this last week. We have a LOB application that people access via Terminal Services, and it doesn’t clean up after itself in the Temp folder, which causes the application to act up. We can’t get the developers to fix the problem, so it’s on us. The existing fix was one batch file, tied to one scheduled task, for every user (50+) of the terminal server: a nightmare to keep maintained.

So I built a simple PowerShell script: one script for all user profiles.

for(;;) {
    try {
        Set-Location "C:\Users"
        # Remove-Item errors are non-terminating by default; -ErrorAction Stop makes the catch reachable.
        Remove-Item ".\*\AppData\Local\Temp\*" -Recurse -Force -ErrorAction Stop
    }
    catch {
        # EventSentry will watch for PowerShell dying.
    }

    # Wait for a minute between passes.
    Start-Sleep -Seconds 60
}

Then I created a scheduled task to have Powershell run on startup with the argument -file Path:\to\script.ps1 and had it run as SYSTEM with highest privileges. Since this was the first time using ps1 files on this server I also needed to Set-ExecutionPolicy RemoteSigned.
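That scheduled task can also be registered from PowerShell (Server 2012+). A sketch, assuming the script was saved to a hypothetical C:\Scripts path:

```powershell
# Sketch: run the cleanup loop at startup as SYSTEM with highest privileges.
# C:\Scripts\Clear-UserTemp.ps1 is a hypothetical path; use your own.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -File C:\Scripts\Clear-UserTemp.ps1"
$trigger = New-ScheduledTaskTrigger -AtStartup

Register-ScheduledTask -TaskName "Clear-UserTemp" -Action $action -Trigger $trigger `
    -User "SYSTEM" -RunLevel Highest
```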

Quick Tips: Share Permissions do more than you think.

While rebuilding a piece of my lab for file server and DFS services, I ran into an odd set of symptoms. I had a user in a security group that, per the NTFS permissions, had no ability to change permissions or take ownership. Yet they were able to add permissions to give others elevated access, or even elevate their own access.

It turns out I’d forgotten the share permission side, where this still had some debug settings; in particular, “Authenticated Users” had Full Control. Have you ever really dug into those share permissions? Usually we rush right on, do Everyone -> Full Control, and lock it down with NTFS permissions later. But have you ever tried granting just Change and Read and leaving Full Control off? That’s usually what you’re actually trying to accomplish, and it gives you a little head start: you don’t have to hope you never get a clever end-user who elevates their own NTFS permissions.
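On Server 2012 and later, the SmbShare module can set this without touching the GUI. A sketch ("Data" is a hypothetical share name):

```powershell
# Sketch: replace Everyone/Full Control with Change at the share level;
# NTFS permissions then do the fine-grained work. "Data" is a hypothetical share.
Revoke-SmbShareAccess -Name "Data" -AccountName "Everyone" -Force
Grant-SmbShareAccess  -Name "Data" -AccountName "Authenticated Users" -AccessRight Change -Force
```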

Ransomware is the future.

When I first started fighting ransomware in late 2013, I had a premonition that this was something serious. While CryptoLocker was rather easily defeated in the enterprise and ultimately killed by killing the botnet, media outlets and tech sites ran with the story. It showed this small group making millions and millions of dollars. Guess what? More people started writing ransomware.

Cut to now. Ransomware-as-a-service is a real thing you can buy, some variants have live chat support to receive payment, and we routinely see new versions with bugfixes and feature-adds. One of the last major flaws in ransomware, the inability to enumerate non-mapped network shares, was overcome in the latest Locky build that calls WNetOpenEnum() to attempt to traverse every share on the network.

For several years, the endgame was more abstract. Hit the PC with your rootkit, join it to your botnet, sell botnet access to spammers for a fee. Now they can cut out the middle man and have less overhead, since there’s not a need for constant command-control oversight. It’s a path to riches hampered only by the still-high knowledge barrier to acquire and send bitcoin. And by affecting the user’s files, rather than a popup about the FBI or TotesLegit AntiVirus which can simply be fixed by backing up the data and nuking the PC, you’ve got them at your mercy.

This is going to get worse way before it gets better. You’ll see builds that try to invoke APIs for popular cloud storage providers to delete the versioning. They’ll find ways to avoid taking ownership of files, so you can’t quickly spot the vector of infection. While it’s almost impossible to be truly proactive enough to effectively block ransomware, there are things that can be done.

  • Avoid mapping your drives and hide your network shares. WNetOpenEnum() will not enumerate hidden shares. This is as simple as appending a $ to your share name.
  • Work from the principle of least permission. Very few organizations need a share whereby the Everyone group has Full Control. Delegate write access only where it’s needed, don’t allow them to change ownership of files unless it’s a must.
  • Be vigilant and aggressive in blocking file extensions via email. If you’re not blocking .js, .wsf, or scanning the contents of .zip files, you’re not done. Consider screening ZIP files outright. Consider if you can abolish .doc and .rtf in favor of .docx which cannot contain macros.
  • Install ad-blockers and script-blockers as standard loadout. Drive-by malware is out of control right now. Cut off the vector of infection. I use uBlock Origin which is easy to disable case-by-case and offers niceties like element blockers.
  • Install the old CryptoLocker Software Restriction Policies, which will block some rootkit-based malware from working effectively. You can create similar rules for %LocalAppData%\*.exe and %LocalAppData%\*\*.exe as well. It was pointed out in the reddit comments that, if it’s at all feasible, you should run a whitelist approach instead of a blacklist. It’s more time-intensive but much safer.
  • Stay up-to-date on the latest ransomware news, how they operate, and what the decrypt instruction filenames are. These can be added to file screens with FSRM to execute a command to kill the share. Simply disabling the affected user is not enough without also forcing them to log off.
  • Backups. Having good, working, versionable, cold-store, tested backups makes this whole thing a minor irritation rather than a catastrophe. Even Windows Server Backup on a Wal-Mart External USB drive is better than nothing. Crashplan does unlimited versioned backups with unlimited retention at a flat rate, and there’s a Linux agent as well. Hell, Dropbox does versioned backups. Get something.
  • [Added 5/12/2016] For some non-PE (non-executable) viruses, you can change the default behavior of some extensions to open in Notepad rather than the original vulnerable target. A guide is in the section above on using GPO to change the default behavior of potentially malicious file extensions.
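The FSRM file-screen idea from the list above can be scripted with the FileServerResourceManager module. This is a sketch under assumptions: the note-name patterns, share path, and the choice of stopping the Server service are all examples you'd tailor, and the pattern list needs regular updating.

```powershell
# Sketch: screen for known ransom-note filenames and kill file sharing on a hit.
# Patterns and D:\Shares are examples; keep the pattern list current.
Import-Module FileServerResourceManager

New-FsrmFileGroup -Name "Ransomware Canaries" `
    -IncludePattern @("_HELP_instructions.*", "*decrypt_instruction*")

# On a match, stop the Server service so the shares go offline immediately.
$action = New-FsrmAction -Type Command -Command "C:\Windows\System32\net.exe" `
    -CommandParameters "stop lanmanserver /y" -SecurityLevel LocalSystem

New-FsrmFileScreen -Path "D:\Shares" -IncludeGroup "Ransomware Canaries" -Active -Notification $action
```

Remember the point from the list above: pair this with forcing the affected user to log off, since disabling the account alone isn't enough.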

Ransomware has evolved a lot in just a couple of years, and the end is nowhere in sight. Get your defenses up, and know the enemy. There’s a lot less heartache doing the prevention before you’re hit than after.

Setting up secure Home Folders without touching AD.

In preparing for a new file server schema, I’ve been playing with home folders and quotas. I wanted to come up with a “home folder” solution that met a couple needs:

  • Worked well with Windows 7 through 10
  • Did not involve the client support team having to set the home folder in each user’s AD profile
  • Did not rely on folder redirection or roaming profiles
  • Set-and-forget implementation and minimal headache to manage in the event of data going missing out of a directory
  • Secure, with no ability for users to see other users’ shares or even be aware of them, or to delegate permissions to other users
  • Does not use a mapped drive, to mitigate against current ransomware trends
  • Can be managed with File Server Resource Manager for quotas and file screening.

After three days in the lab, I came up with a solution that accomplishes all of these goals. It takes a particular combination of NTFS permissions, Access Based Enumeration, and some group policy, and then whatever you want to do with FSRM. The core concept will work without ABE and FSRM, that’s just a cleanliness thing for other related projects we’re planning.

Setting up the share:

Set up your share however you normally do it. I like making shares for home folders hidden, e.g., \\FS01\UserShares$. Give the Authenticated Users object Full Control. We’ll get more restrictive with NTFS permissions.
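If you'd rather script the share creation, a sketch with the SmbShare module; the local path is hypothetical:

```powershell
# Sketch: hidden share, Authenticated Users Full Control at the share level.
# The NTFS permissions in the next section do the real restriction.
# D:\UserShares is a hypothetical path.
New-SmbShare -Name "UserShares$" -Path "D:\UserShares" `
    -FullAccess "NT AUTHORITY\Authenticated Users"
```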

The NTFS permissions:

This is the secret sauce to get the security piece working correctly. We’re going to stack a few permissions to balance functionality and security.

Working from the base of your User Shares folder, right click, Properties, Security, Advanced.

  1. Disable Inheritance and remove all inherited permissions.
  2. Add SYSTEM and give it Full Control over “This folder, subfolders and files”.
  3. Optionally, have a security group that will have permissions to browse all shares for the purposes of investigation or data recovery, and give it Full Control over “This folder, subfolders and files.”
  4. Add Authenticated Users, specify that it applies to this folder only, and switch to advanced permissions. Give the object: Traverse folder / execute file, List folder / read data, Read attributes, Read extended attributes, Create folders / append data, Write attributes, Write extended attributes, Read permissions.
  5. Add the OWNER RIGHTS object, specify that it applies to subfolders and files only, and check Modify on top of the existing check boxes.
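The same permission stack can be applied with icacls; a sketch, with D:\UserShares as a hypothetical path. The specific-rights letters map to the advanced checkboxes named in step 4, and S-1-3-4 is the well-known SID for OWNER RIGHTS.

```powershell
# Step 1: break inheritance and drop inherited ACEs.
icacls "D:\UserShares" /inheritance:r

# Step 2: SYSTEM, Full Control over this folder, subfolders and files.
icacls "D:\UserShares" /grant "SYSTEM:(OI)(CI)F"

# Step 4: Authenticated Users, this folder only:
# traverse/execute (X), list/read data (RD), read attrs (RA), read EAs (REA),
# create folders/append (AD), write attrs (WA), write EAs (WEA), read perms (RC).
icacls "D:\UserShares" /grant "Authenticated Users:(X,RD,RA,REA,AD,WA,WEA,RC)"

# Step 5: OWNER RIGHTS (S-1-3-4), Modify on subfolders and files only.
icacls "D:\UserShares" /grant "*S-1-3-4:(OI)(CI)(IO)M"
```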

Before I go any further I want to point out two negatives to the implementation:

  • It does not prevent the user from deleting their home folder, assuming they figure out how to get to the parent folder.
  • It does not prevent the user from creating another folder in the UserShares$ share, assuming they could figure out how to get there.

In our case, those two caveats are acceptable for us. If you’re wondering why we’re using OWNER RIGHTS instead of the usual CREATOR OWNER object, doing the latter will allow the users to delegate permissions to any valid AD object (i.e., letting them have others browse their home folder). OWNER RIGHTS does not. OWNER RIGHTS is available as an object starting with Server 2008 and Windows Vista.

Building the Group Policy:

So now that folders will have the right permissions when a user makes them, let’s have them get created automatically, and put in a convenient place on the workstation.

  1. In Group Policy Management Center, create a new policy for users.
  2. User Configuration -> Preferences -> Windows Settings -> Folders -> New Folder
  3. Action: Create, Path: \\FS01\UserShares$\%LOGONUSER%
  4. Under the Common tab, check the box for “Run in logged-on user’s security context (user policy option)” or else the computer account will own the folder.
  5. User Configuration -> Preferences -> Windows Settings -> Shortcuts -> New Shortcut
  6. Action: Update, Name: Home Folder, Target type: File System Object, Location: My Network Places, Target path: \\FS01\UserShares$\%LOGONUSER%, Icon index: 9
  7. Under the Common tab, check the box for “Run in logged-on user’s security context (user policy option)”

Run a gpupdate /force from the DC and restart your workstation to test. What you should see on logon is a new item in Network Locations called Home Folder. If you check the file server, you should see a new folder under your share with the username of your account you’re testing with.

Observe a few things here:

  • The user can’t access any other shares, but they can see them. We will change this later.
  • The user can’t drag their folder into another user’s folder.
  • The user can’t change the permissions of their folder, or any subfolder or file within their folder.

At this point you have a working solution if you don’t care about file quotas, file screening, or hiding folders the user can’t read. For the rest of it, read on.

Feature add: Access-Based Enumeration

ABE is an under-loved feature of Windows Server and has been in since 2003. In a nutshell, if a user has no access to a folder within a share, it will not be displayed to the user if ABE is turned on for the share. The exact method to enable it varies by the OS but in Server 2012 and R2, it requires the File Server role to be enabled. Though there’s a gray checkbox indicating some of the roles are already on, this is not one out of the box. Once it’s added, you’ll be able to go to Server Manager -> File and Storage services -> Shares. Right click your UserShares$ share and go to Properties. Under the tab for Settings, check the box for “Enable access-based enumeration” and hit OK. That’s all there is to it. Create another test logon, and then hit your UserShares$ share with Explorer. You will only see your own share, and not any other user home folders.
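On 2012/R2 the same checkbox can be flipped from PowerShell:

```powershell
# Enable Access-Based Enumeration on the home folder share.
Set-SmbShare -Name "UserShares$" -FolderEnumerationMode AccessBased -Force
```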

Feature add: Quotas

I feel like most people are going to want some sort of quota system in place for this. This requires the File Server Resource Manager role enabled. You could manage this from Server Manager but that way is very rigid and doesn’t allow for exceptions for different users.

Once it’s enabled, you’re going to want to start with a Quota Template in FSRM. The big decisions are the amounts, whether it’s a hard or soft cap, and if there are actions you want taken (an email, something in the event log, or an arbitrary command) at a certain percentage full. Once the template is completed, go ahead and add a quota. Under Quota path, browse to the root folder where your home folders reside. Hit the radio button that reads “Auto apply template and create quotas on existing and new subfolders.” Then under quota properties, choose to “Derive properties from this quota template” and select the template you just created. Then hit Create. That’s about all you’ve gotta do.
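The template-plus-auto-apply pattern is also exposed by the FSRM cmdlets. A sketch; the 5 GB size and path are examples:

```powershell
# Sketch: a hard quota template, auto-applied to every existing and future home folder.
# Size and path are examples; quota templates are hard limits unless -SoftLimit is used.
Import-Module FileServerResourceManager

New-FsrmQuotaTemplate -Name "Home Folders 5GB" -Size 5GB
New-FsrmAutoQuota -Path "D:\UserShares" -Template "Home Folders 5GB"
```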

You may also wish to make a more relaxed quota for power users. Make another template, then under Quotas, hit Filter, choose All and All, and hit OK. You should see all your home folders individually laid out. Choose your example power user and right click, then Edit Quota Properties. At the top, choose to “Copy properties from quota template,” select your relaxed template, and hit Copy. If you don’t hit copy, nothing happens. Then hit OK.

If you wish to disable quotas altogether for a user, choose that user in Quotas, and at the action bar on the right, hit Disable Quotas. Simple.

Feature add: File Screening

File screening lets you block certain extensions from being placed in a folder. It’s best to do this early and adjust later if you’re rolling out home folders and are considering implementing this. For us, we don’t wish to use expensive server storage backing up Jimmy Buffett’s Greatest Hits and the vacation pictures that Stan from Accounting took eight years ago. You can use File Screening in either a whitelist or blacklist function. Both ways involve the File Screening Management section of File Server Resource Manager (FSRM), which you will need to enable as a Role.

Option A: Blacklist

This may be easier in the long run, albeit less secure. Head to File Groups under File Screening Management; you’ll see a number of groups already made. A fair warning: under Audio and Video Files, *.m4a is not included by default and should probably be added, and *.mp4 was not included by default in WS2008 R2 when I last looked. Also, Executable Files includes *.ps1 and *.js, which may give your developers some grief. Once you’ve got your groups dialed in, you can build the template to screen against. Under File Screen Templates, choose Create File Screen Template via the Action bar. Here, you can choose multiple file groups to add to the template, whether to hard-screen or simply monitor, and also how to handle exceptions. You have to fill out one tab at a time because reasons. Once that’s done, you can actually do the screening under File Screens -> Create File Screen. Browse to the base directory for your home folders, and choose the template you created under “Derive properties from this file screen template.” Then hit Create. It should take effect immediately.

File Screen Exceptions are for power users that need access to one or more of your blocked extensions. Choose their home folder, and either create the file group on the fly or build it out with the extensions to allow.

Option B: Whitelist

This may take more care and feeding but will result in a more secure environment, and this can easily be extended to larger file shares to help alert about malware. Start in File Groups under File Screening Management in FSRM. Create a File Group and call it Everything. Enter * in “Files to include” and hit Add. Then go to File Screens -> Create File Screen. Browse to your base home folders directory and choose to “Define custom file screen properties” and hit Custom Properties. Check the box for your “Everything” file group, and determine what notifications you wish to be sent in the other tabs, then hit OK.

Now you’re going to create the exceptions, which in this case serve as a whitelist. Use the same base home folders directory as the Exception Path, and then check the file groups you want to allow through. I highly recommend you allow the Temporary Files group, or Office files will not work right. Also bear in mind, *.pdf is not specified in any existing group by default. You may wish to go the route of making an Exception Group in File Groups to have everything in one place that all employees will need access to.

When power users need exceptions, you’ll follow the same process, just with the exception path as their home folder. You could theoretically not screen someone at all by choosing your Everything object as an exception group.
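The whitelist variant scripts out in a few lines with the FSRM cmdlets. A sketch; the path and the allowed groups are examples ("Temporary Files" and "Office Files" are among the default FSRM file groups):

```powershell
# Sketch: screen everything, then whitelist via an exception on the same path.
# D:\UserShares and the allowed groups are examples; tailor to your environment.
Import-Module FileServerResourceManager

New-FsrmFileGroup -Name "Everything" -IncludePattern "*"
New-FsrmFileScreen -Path "D:\UserShares" -IncludeGroup "Everything" -Active
New-FsrmFileScreenException -Path "D:\UserShares" -IncludeGroup @("Temporary Files", "Office Files")
```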


What you’re left with is a home folder structure that’s very secure, with entirely automated provisioning and screening, accessible to end-users, compatible with all supported versions of Windows, and minimizes your exposure to ransomware by avoiding mapping a drive.

I hope this was informative; it was quite a constructive lab session.