Active Directory and its uses

Gavin Livingstone, Bryley Systems Inc.

Microsoft’s Active Directory (AD) is not widely understood outside IT circles, but it is a critical component in securing Windows Server-based networks.

Active Directory, introduced with Windows 2000 Server, is included with most versions of Windows Server, but is also available as a service1.  Its primary function is to facilitate authentication and authorization of users (members) and resources within an AD domain.  (An AD domain is a logical collection of users, computers, groups, and other objects; multiple domains can be created for different sites or groups, and trust relationships can be established between these domains.)

One of AD’s greatest strengths is that it permits the centralized creation of user- and group-based policies; it can then enforce these policies, ensuring that members comply with login and usage requirements.  It also logs policy violations and login attempts, supporting the automation of error-log-checking solutions.

Basic AD services include:

  • Domain Services (AD DS) – Stores and verifies member credentials
  • Lightweight Directory Services (AD LDS) – A limited-feature version of AD DS
  • Certificate Services (AD CS) – Public-key certificates supporting encryption
  • Federation Services (AD FS) – Single sign-on functionality; AD and non-AD
  • Rights Management Services (AD RMS) – Management of access rights

An instance of AD DS runs on a server; once AD DS is deployed, that server is known as a domain controller (DC).  Most Windows Server-based networks have two or more domain controllers: a primary DC and one or more secondary DCs to provide directory failover (via replication) and location-based access to the directory.
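
As an illustration only, the sketch below shows how a server might be promoted to the first domain controller of a new domain using the built-in ServerManager and ADDSDeployment PowerShell modules; the domain name corp.example.com is a placeholder, and a real deployment should follow a design plan like the one described later in this article.

  # Hedged sketch: install the AD DS role, then create a new forest/domain.
  # "corp.example.com" is a placeholder; Install-ADDSForest prompts for the
  # Directory Services Restore Mode (DSRM) password and reboots the server.
  Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
  Install-ADDSForest -DomainName "corp.example.com" -InstallDns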

During login, users authenticate to the primary DC or to a secondary DC.

Active Directory is managed through a series of tools; most are included within Windows Server, but third-party tools2 exist that provide better control and automation, particularly for larger organizations managing complex environments.
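
One of the included tools is the Active Directory module for Windows PowerShell.  As a hedged example (the OU path, user name, and group name below are hypothetical), routine directory tasks can be scripted rather than performed through the GUI:

  # List the users in an OU, create a new user, and add that user to a group.
  Get-ADUser -Filter * -SearchBase "OU=Staff,DC=corp,DC=example,DC=com"
  New-ADUser -Name "Jane Doe" -SamAccountName "jdoe" -Enabled $true `
      -AccountPassword (Read-Host -AsSecureString "Initial password")
  Add-ADGroupMember -Identity "Helpdesk" -Members "jdoe"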

Best practices for AD design include3 (a brief sketch follows the list):

  • Build a logical structure based on a hierarchical, tree-like approach:
    • Forests – Top-level container (not always used)
    • Domains – Second-level containers within forests
    • Organizational units – Third-level containers within domains
  • Construct a physical model to address location requirements/constraints:
    • Place at least one domain controller (preferably two) at each site
    • Determine placement of replicas of domain data
    • Describe network topology
    • Consider traffic limitations
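
To make the logical hierarchy and site concepts above concrete, here is a minimal sketch using the Active Directory PowerShell module; the OU, site, and subnet names are hypothetical:

  # Create a third-level container (OU) within the domain, then define a site
  # and associate a subnet with it so clients authenticate to a nearby DC.
  New-ADOrganizationalUnit -Name "Workstations" -Path "DC=corp,DC=example,DC=com"
  New-ADReplicationSite -Name "Boston"
  New-ADReplicationSubnet -Name "10.1.0.0/16" -Site "Boston"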

AD design tips4 include:

  • Keep it simple
  • Match site topology to network topology
  • Ensure you have at least two DNS servers
  • Try to dedicate a server as a domain controller

Security best practices for AD include5 (a brief PowerShell sketch follows the list):

  • Rename or disable the Administrator account
  • Physically secure domain controllers and servers
  • Apply Group Policy settings to restrict user, group, and computer access
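
As a hedged sketch of the first and third items (the new account name and GPO name are hypothetical, and the GPO itself must already exist), these steps can be scripted with the ActiveDirectory and GroupPolicy modules:

  # Rename the built-in Administrator logon name (or disable the account outright),
  # then link an existing restrictive GPO to an OU.
  Set-ADUser -Identity "Administrator" -SamAccountName "corp-mgmt01"
  # Disable-ADAccount -Identity "Administrator"   # alternative: disable instead of rename
  New-GPLink -Name "Workstation Restrictions" -Target "OU=Workstations,DC=corp,DC=example,DC=com"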

Basically, Active Directory forms the heart of any Windows Server-based network; it is a critical component, even when using Cloud-based resources.  (Cloud-based resources can often be integrated with AD through Federation Services.)

References

1 Active Directory as a service is available through Microsoft’s Azure Active Directory, Bryley Systems’ Hosted Cloud Server™, and other providers.

2 10 Must-Have Active Directory Tools by Walker Rowe of Anturis, 4/14/2015.

3 Best Practice Active Directory Design for Managing Windows Networks and Best Practice Active Directory Deployment for Managing Windows Networks from the Microsoft Developer Network.  (These are dated, but extremely detailed.)

4 10 Tips for effective Active Directory design by Brien Posey of TechRepublic, 8/23/2010.

5 Active Directory Best Practices at Microsoft TechNet, 1/21/2005.

Nicole Sawitz joins Bryley Systems!

Nicole Sawitz is the newest member of Bryley Systems’ Business Development team.  In her role as Administrative Assistant, Nicole will assist with proposals, presentations, and general business-development functions.

Mrs. Sawitz has a BA from Goucher College and a Certificate for Program Evaluation from Tufts University.  Prior to joining Bryley Systems, Nicole was an Administrative Assistant at Newton Scientific, Inc., in Cambridge, MA.

Bryley Basics: How ransomware attacks

Eric Rainville and Gavin Livingstone, Bryley Systems Inc.

Most ransomware arrives through email: an end-user unwittingly opens a seemingly innocuous attachment, which then loads software that quietly encrypts all data files.  Once finished, it announces its accomplishment (Hooray!) and provides instructions on how to pay the ransom (through an anonymous, online payment method) to receive the key that removes the encryption.

The only effective ways to prevent ransomware:

  • Block the email before it is distributed to the email recipient.
  • Train email recipients not to open email attachments from uncertain sources.

Once infected, the recommended recourse is to restore the encrypted files from backup. (We recommend that you do NOT pay the ransom; this will likely put you at risk for future infections.)

An example of a recent ransomware email:

From: info@yortax.com

Sent: Wednesday, February 17, 2016 10:48 AM

To: XXX

Subject: February payment

Importance: High

We’re ready to pay, just need you to confirm the payment details.

Check the invoice, it’s attached, and let me know if everything is correct.

We will remit the payment as soon as we hear from you.

Thank you

This variant of ransomware infects a computer in a step-by-step fashion (a mitigation sketch follows the list):

  • The email recipient opens the attached Microsoft Word (.doc) file.
  • The body of the .doc file is a picture that tells the email recipient to “Enable editing features” and shows how to do so.
  • The email recipient follows this instruction and enables editing features.
  • Once editing features are enabled, the original .doc file downloads a second document to a temporary folder under the user’s AppData directory and opens it from there. (It looks like the exact same document, but with a different name.)
  • As requested, the email recipient again enables editing features, which causes an executable (.exe) file to be downloaded to the same location; the .exe either runs right away or runs at the next startup. (Sometimes the .exe does not start encrypting files immediately; it may lie dormant for a period of time before it begins.)
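
Because this chain depends on the recipient enabling active content in Word, one mitigation, shown below as a hedged sketch, is to disable Word macros without notification.  (The sketch assumes Office 2016 and a per-user registry policy; in practice this setting would normally be pushed centrally through Group Policy.)

  # Set the Office 2016 (16.0) Word policy value "VBAWarnings" to 4,
  # which disables all macros without notification for the current user.
  $key = "HKCU:\Software\Policies\Microsoft\Office\16.0\Word\Security"
  New-Item -Path $key -Force | Out-Null
  New-ItemProperty -Path $key -Name "VBAWarnings" -Value 4 -PropertyType DWord -Force | Out-Null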

For remediation tips, see Dealing with CryptoLocker from the July 2015 issue of Bryley Information and Tips (BITs).

Windows Server 2016

Lawrence Strauss of Strauss and Strauss

This is an exciting time in business computing. We’ve witnessed dramatic improvements in systems, architecture, storage, and networking. Windows Server 2016 offers the promise of helping organizations deal with all these rapid changes within the entirely familiar Windows environment.

Windows Server 2016 (expect a fall release; as of this writing it is in Technical Preview 4) represents developments that ensure stability and easy adaptability, providing a software environment that can help organizations weather the pace of change. Stability is achieved by delivering more ways of isolating data on your servers and in the Cloud. Easy adaptability comes from moving more and more functionality to the Cloud, where both software and its underlying hardware can continue to develop; your organization sees only the benefits of these changes, not the costly interruptions.

Today’s Windows Server is a Swiss army knife with the ability to run millions of different applications, which is where the problem lies: the base operating system (OS) continues to grow in size and complexity. (The overhead of a traditional Windows Server providing a single core service is staggering: simple roles, such as DNS or DHCP, require a 20 GB server installation.)

Windows Server Core, a full Windows Server OS without the GUI, was first introduced with Windows Server 2008 and helped address this issue. Now, Nano Server is the next step in the evolution toward a small-footprint base OS.

Nano Server is possibly the most revolutionary element of this Windows Server release. As its name indicates, Nano Server is a very lightweight OS that can host applications built on frameworks like .Net, or Microsoft’s Hyper-V virtual machines.

Nano Server is made for remote management and scripted automation through small pieces of modular code, rather than traditional GUI-based OS management techniques; it is managed with PowerShell. Nano Server is remarkably efficient: according to Microsoft, it shrinks the OS footprint by 93 percent, patching and maintenance by 92 percent, and the number of reboots by 80 percent. These efficiencies make it ideal for Cloud-based implementations.
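
For example, day-to-day management of a headless Nano Server might look like the hedged sketch below; nano01 is a hypothetical host name, and PowerShell remoting (WinRM) is assumed to be reachable.

  # Open a remote PowerShell session to the Nano Server and run a command in it.
  $session = New-PSSession -ComputerName "nano01" -Credential (Get-Credential)
  Invoke-Command -Session $session -ScriptBlock {
      Get-Process | Sort-Object CPU -Descending | Select-Object -First 5
  }
  Remove-PSSession $session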

Microsoft’s Nano Server is a unique departure for Microsoft and, according to the company, the future of the Windows Server platform. Linux has a head start on its microservices journey, but Microsoft has shown an uncanny ability to turn on a dime when needed. If Microsoft can deliver an agile, quick, streamlined container platform that is still versatile enough to support the gigantic Windows developer community, all while allowing straightforward administration, Nano Server could be a game changer. While this sounds like a lot to balance (and it is), let’s not forget the improvements Microsoft made to Server Core between Windows Server 2008 and Windows Server 2012, which put Server Core into the enterprise with the proper balance of performance, versatility, and management features. Nano Server looks to be that evolutionary and revolutionary step for Windows Server.

It is very unlikely that Nano Server will replace the traditional server OS overnight; Microsoft is still working on tools for the administrator to support it. (Server Core in Windows Server 2008 saw slow adoption due to the lack of remote tools for the administrator, a problem that was addressed in Windows Server 2012.) The other challenge will be developing applications for Nano Server. (Since these containers do not run a full installation of the .NET Framework, developers will need to redesign at least part of their applications to take advantage of the .NET Core framework.) While this may seem troubling, streamlining the server to focus only on exactly what it needs to do is ideal in today’s world, where a system administrator’s time is so heavily focused on administration duties, such as patching and security hardening.

The ideal target for Nano Server is the infrastructure of native, Cloud-based applications. The small footprint in disk space and code helps to make Nano Server a platform that should require little patching or maintenance, making it ideal for Cloud-based environments.

Nano Server isn’t Microsoft starting over, but it is pretty close. Without the traditional .NET Framework or a local GUI, remote management is required; even many of the traditional hooks that allow GUI-equipped servers to be managed remotely are missing.

This move toward miniaturization is still based on the Microsoft server platform, but much of the interface, application stack, and traditional .NET Framework has been removed. Nano Server becomes a lightweight host for Hyper-V VMs or for applications designed to run on the .NET Core framework.

The other important roles for Nano Server are Hyper-V and the scale-out file server. Both of these roles fit very well within Azure and the Cloud-based strategy that Microsoft is pursuing.

The Hyper-V role should be of particular interest to many administrators looking to use Hyper-V as an alternative to VMware. While Nano Server is still not as streamlined as VMware’s ESXi, it is a great step in the right direction and an improvement over Windows Server Core. However, the unique thing about Nano Server is that it can run on bare metal, as a virtual machine, or even as a container, something VMware’s ESXi cannot do, giving the developer and administrator the ultimate in flexibility.

Windows Server 2016 also offers robust support for containers and virtualization. Containers are isolated environments that package an application together with the OS components needed to run it. This allows applications with conflicting dependencies to coexist easily on the same server. Windows Server 2016 supports open-source Docker containers, which offer the promise of a more efficient, lightweight approach to application deployment than most organizations are currently implementing.
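
As a hedged illustration (this assumes the Containers feature and the Docker engine are installed on Windows Server 2016, and the base-image name reflects what Microsoft published at the time), pulling and starting a Windows container takes two commands at a PowerShell prompt:

  # Download a Windows Server Core base image and start an interactive container.
  docker pull microsoft/windowsservercore
  docker run -it microsoft/windowsservercore powershell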

Unlike virtual machines (VMs), however, containers still expose the underlying operating-system version and capabilities. New Hyper-V Containers offer a blend of features from Hyper-V virtual machines and Windows Containers: like a VM, a Hyper-V Container provides isolation from the underlying operating system, but like a container, it uses a layered filesystem image for deploying single applications. The benefits of this isolation include increased security, the ability to address problems without having them affect other operations, and an increase in the amount of entirely independent functionality handled on the same hardware; additionally, in DevOps situations, everyone involved has the exact same conditions in which to write, test, and run applications.
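
Under the same assumptions as the previous sketch, the extra isolation of a Hyper-V Container is requested with a single Docker flag:

  # Run the same image inside a lightweight Hyper-V partition rather than as a
  # process-isolated Windows container.
  docker run -it --isolation=hyperv microsoft/windowsservercore powershell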

To aid in disaster recovery and to speed failover, Microsoft has introduced Storage Replica, which gives you the ability to replicate entire volumes at the block level in either synchronous or asynchronous modes.
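
A hedged sketch of creating such a partnership follows; the server names, replication-group names, and drive letters are placeholders, and the Storage Replica feature is assumed to be installed on both servers.

  # Replicate volume D: on srv01 to srv02 synchronously, using L: for replication logs.
  New-SRPartnership -SourceComputerName "srv01" -SourceRGName "rg01" `
      -SourceVolumeName "D:" -SourceLogVolumeName "L:" `
      -DestinationComputerName "srv02" -DestinationRGName "rg02" `
      -DestinationVolumeName "D:" -DestinationLogVolumeName "L:" `
      -ReplicationMode Synchronous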

Storage Spaces Direct is an advancement over Storage Spaces’ high-availability, storage-management software. Storage Spaces Direct gives you the ability to build a highly available storage system using only directly attached disks on each node. It also enables organizations to make use of newer hardware like NVMe (NVM Express) SSDs alongside older HDDs; locally attached node storage can be used as shared storage.
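
A minimal sketch of standing this up (the cluster and node names are hypothetical, each node is assumed to have eligible local disks, and the enable command is run on one of the cluster nodes):

  # Build a cluster from the nodes without shared storage, then pool their local disks.
  New-Cluster -Name "s2d-cluster" -Node "node1","node2","node3","node4" -NoStorage
  Enable-ClusterStorageSpacesDirect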

The Resilient File System (ReFS) is another feature that was introduced with Windows 8 and Windows Server 2012. Designed from the beginning to be more resistant to corruption than its predecessor, ReFS brings many advantages over the NTFS on-disk format. Microsoft has elevated both the usefulness and the importance of ReFS in Windows Server 2016 by making it the preferred file system for Hyper-V workloads.

This has huge performance implications for Hyper-V. For starters, you should see new virtual machines with a fixed-size VHDX created almost as fast as you hit return. The same advantages apply to creating checkpoint files and to merging VHDX files created when you make a backup. These capabilities resemble what ODX (Offloaded Data Transfers) can do on larger storage appliances. One point you need to keep in mind is that ReFS allocates the storage for these operations without initializing it, meaning there could be residual data left over from previous files.
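
A hedged sketch of putting this to use (the drive letter and path are placeholders, and the Hyper-V module is assumed for New-VHD): format a data volume with ReFS, then create a fixed-size VHDX on it, which ReFS can provision almost instantly.

  # Format the volume with ReFS, then create a 100 GB fixed-size virtual disk on it.
  Format-Volume -DriveLetter E -FileSystem ReFS -NewFileSystemLabel "HyperV-Data"
  New-VHD -Path "E:\VMs\data01.vhdx" -SizeBytes 100GB -Fixed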

With Windows Server 2016, your organization gets the functionality to build a Cloud infrastructure and to run a self-service, high-density Cloud. In microservices implementations, Nano Server dramatically cuts the weight of OS services and is, per Microsoft, the future of the Windows Server platform. Containerization and improved virtualization allow you to create protected environments, making issues easier to address. There’s a lot in this important, evolutionary step for Windows Server.

Microsoft’s focus on delivering a hybrid Cloud platform is clearly dictating the direction it’s taking in Windows Server 2016. Improvements to Hyper-V mean it’s easier to host and manage virtual machines as you upgrade your host environment, while PowerShell takes center stage with the arrival of the headless Nano Server option.

The value of a computer-network assessment

Gavin Livingstone, Bryley Systems Inc.

Most situations benefit from an assessment: firefighters assess the structure, locale, and availability of resources (water) before rushing into a burning building; politicians (hopefully) assess the potential consequences before stating their position on a controversial topic; my insurance company wants to assess the damage before they fix my car.

Business owners and decision-makers use assessments continuously:  Useful, structured information is key to reducing risks and to measuring these risks against the intended result.  An assessment simply captures the current state, considers the desired outcome and potential consequences, and provides (hopefully) all of the information needed to make a superior decision.

In order to make an informed business decision on your IT investment and future, you need comprehensive, factual information on the current state of your IT infrastructure, focusing on at least these topics:

  • Business goals, needs, and budget
  • Applications, Cloud or on-premise, and their operating environments
  • End-user devices (workstations, notebooks, mobile devices, printers, etc.)
  • Network equipment (servers, SANs, firewalls, switches) and Cloud options
  • Exceptions to best and standard practices

To do so, you would request a computer-network assessment, which identifies network-based and Cloud-based assets; it should also expose security gaps and all other issues that could impact uptime.

Done right, an assessment should include:

  • Business:
    • Review business goals relative to mission-critical technology.
    • Determine current and future needs in terms of applications, users, network capacity, and Cloud options.
    • Define the available budget to address these goals and needs.
  • Applications:
    • List each application; include vendor-contact information.
    • Identify all users of each application and their operating environment.
    • Assess each application’s environment against current and future needs.
  • End-user devices:
    • Create a configuration sheet for each device with relevant details (see the sketch after this list).
    • Assess capacity of device compared to current and future needs.
  • Network equipment and Cloud options:
    • Create a configuration sheet for each on-premise item with full details.
    • Assess capacity of equipment compared to current and future needs.
    • Identify software licenses:
      • Review non-OEM licensing.
      • Verify license counts against server settings and actual users.
    • Identify and assess Cloud options.
  • Exceptions:
    • Identify exceptions to standard practices.
    • Identify environmental exceptions.
    • Create exceptions document.
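
As a small, hedged illustration of the configuration-sheet items above (it gathers only a few basics and assumes PowerShell is available on the device), a snippet like the following can start the per-device data collection:

  # Collect basic hardware, OS, and local-disk details for a configuration sheet.
  Get-CimInstance Win32_ComputerSystem | Select-Object Name, Manufacturer, Model, TotalPhysicalMemory
  Get-CimInstance Win32_OperatingSystem | Select-Object Caption, Version, OSArchitecture
  Get-CimInstance Win32_LogicalDisk -Filter "DriveType=3" | Select-Object DeviceID, Size, FreeSpace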

Bryley Systems offers an entry-level service, Network Assessment/Basic™, but also offers two higher-value network-assessment options:

  • Network Assessment/Plus™
  • Network Assessment/Pro™

Network Assessment/Basic™ provides basic information at a modest investment.  It includes the following:

  • Deployment of our secure, non-invasive, network-assessment tool for one-time collection of data (followed by immediate removal of this tool).
  • Brief, non-client-facing review of these reports by a Bryley Engineer.
  • Presentation of these network-infrastructure reports:
    • Network Assessment PowerPoint – Summary with risk and issue scores
    • External Network Vulnerabilities Summary – External vulnerabilities
    • Client Risk Report – Overall risk score with risk-area charts
    • Site Diagram – A Visio-style graphic of network assets

Network Assessment/Plus™ is a mid-level approach with additional reports and an in-depth review with written comments.  It includes the deliverables above plus these additional items:

  • In-depth, non-client-facing review with comments from a Bryley Engineer.
  • Presentation of these additional, network-infrastructure and security reports:
    • Full Detail Report – Unfiltered details on configurations and activity
    • Internal Vulnerabilities Report – Deviations from industry standards
    • Network Security reports – Proprietary Security Risk Score
    • Security Assessment reports – Security policies and login

Network Assessment/Pro™ is an all-inclusive effort with an onsite, client-facing presence by a Bryley engineering team and a complete, detailed write-up with recommendations.  Its purpose is to review and document all assets, security gaps, and related issues identified via our network-assessment tool and an onsite, visual examination.  These findings are documented, along with all relevant reports from our network-assessment tool, and are presented onsite to the recipient.

Click here to see our current promotion on Network Assessment/Basic™.  You can also email ITExperts@Bryley.com or call 978.212.5806.